Dataset schema (one row per GitHub issue/PR in psf/requests):

| column | type | range / classes |
|---|---|---|
| url | string | length 50–53 |
| repository_url | string | 1 class |
| labels_url | string | length 64–67 |
| comments_url | string | length 59–62 |
| events_url | string | length 57–60 |
| html_url | string | length 38–43 |
| id | int64 | 597k–2.65B |
| node_id | string | length 18–32 |
| number | int64 | 1–6.83k |
| title | string | length 1–296 |
| user | dict | |
| labels | list | length 0–5 |
| state | string | 2 classes |
| locked | bool | 2 classes |
| assignee | dict | |
| assignees | list | length 0–4 |
| milestone | dict | |
| comments | int64 | 0–211 |
| created_at | string | length 20 |
| updated_at | string | length 20 |
| closed_at | string | length 20, nullable (⌀) |
| author_association | string | 3 classes |
| active_lock_reason | string | 4 classes |
| body | string | length 0–65.6k, nullable (⌀) |
| closed_by | dict | |
| reactions | dict | |
| timeline_url | string | length 59–62 |
| performed_via_github_app | null | |
| state_reason | string | 3 classes |
| draft | bool | 2 classes |
| pull_request | dict | |
| is_pull_request | bool | 2 classes |
| issue_comments | list | length 0–30 |
https://api.github.com/repos/psf/requests/issues/1830
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1830/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1830/comments
|
https://api.github.com/repos/psf/requests/issues/1830/events
|
https://github.com/psf/requests/pull/1830
| 24,812,998 |
MDExOlB1bGxSZXF1ZXN0MTExMDg4Njk=
| 1,830 |
Requests must be installed as root when using pip to avoid errors
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3462055?v=4",
"events_url": "https://api.github.com/users/NickGeek/events{/privacy}",
"followers_url": "https://api.github.com/users/NickGeek/followers",
"following_url": "https://api.github.com/users/NickGeek/following{/other_user}",
"gists_url": "https://api.github.com/users/NickGeek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/NickGeek",
"id": 3462055,
"login": "NickGeek",
"node_id": "MDQ6VXNlcjM0NjIwNTU=",
"organizations_url": "https://api.github.com/users/NickGeek/orgs",
"received_events_url": "https://api.github.com/users/NickGeek/received_events",
"repos_url": "https://api.github.com/users/NickGeek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/NickGeek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NickGeek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/NickGeek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-27T08:37:25Z
|
2021-09-08T22:01:07Z
|
2013-12-27T09:39:03Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1830/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1830/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1830.diff",
"html_url": "https://github.com/psf/requests/pull/1830",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1830.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1830"
}
| true |
[
"I wouldn't recommend doing this at all. If at all possible always install your packages to virtualenvs and avoid using sudo as much as you can.\n",
"@NickGeek This is simply not true. _Sometimes_ Requests must be installed using `sudo`: if you install directly into the site packages and you don't have the correct permissions for them. But if you're using a virtualenv or a copy of Python where you have permission to the site packages, you do not need `sudo`.\n\nFor that reason, we assume that people who see errors out of `pip` that relate to lack of permissions will just elevate, running `sudo !!`. Better to leave the `sudo` out of the docs.\n\nThanks for this though!\n",
"Okay, no problem.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/1829
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1829/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1829/comments
|
https://api.github.com/repos/psf/requests/issues/1829/events
|
https://github.com/psf/requests/issues/1829
| 24,775,814 |
MDU6SXNzdWUyNDc3NTgxNA==
| 1,829 |
r.text exception in Python 3.x
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1488134?v=4",
"events_url": "https://api.github.com/users/douglarek/events{/privacy}",
"followers_url": "https://api.github.com/users/douglarek/followers",
"following_url": "https://api.github.com/users/douglarek/following{/other_user}",
"gists_url": "https://api.github.com/users/douglarek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/douglarek",
"id": 1488134,
"login": "douglarek",
"node_id": "MDQ6VXNlcjE0ODgxMzQ=",
"organizations_url": "https://api.github.com/users/douglarek/orgs",
"received_events_url": "https://api.github.com/users/douglarek/received_events",
"repos_url": "https://api.github.com/users/douglarek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/douglarek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/douglarek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/douglarek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-26T02:05:39Z
|
2021-09-09T00:28:22Z
|
2013-12-26T02:16:21Z
|
NONE
|
resolved
|
I used `r.content`; it is of type `bytes`. Then I used `r = requests.get('http://github.com'); print(r.text)`, but an error occurs as below:
```
UnicodeEncodeError: 'ascii' codec can't encode character '\xb7' in position 235: ordinal not in range(128)
```
Is it a bug in Python 3? I use requests version 2.1.0 on openSUSE 13.1 x86_64.
Thanks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1829/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1829/timeline
| null |
completed
| null | null | false |
[
"This is not the place to ask questions. Please ask your questions on [StackOverflow](http://stackoverflow.com/questions/tagged/python-requests).\n",
"@sigmavirus24 Ok, maybe my title confused you, now it is ok.\n",
"I was not confused and it's not a bug in `requests`. Your standard out encoding is `ascii`. StackOverflow is the place to seek help about that.\n"
] |
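The failure mode in this thread (printing a non-ASCII response body when stdout's encoding is ASCII) can be reproduced without requests at all; this is a hedged, stdlib-only sketch of what happens and the usual remedies:

```python
# Reproduce the reported failure without requests: encoding text that
# contains U+00B7 ("·") to ASCII fails, which is exactly what print()
# does implicitly when sys.stdout's encoding is ascii.
text = "caf\xb7"

try:
    text.encode("ascii")
    raised = False
except UnicodeEncodeError:
    raised = True  # 'ascii' codec can't encode character '\xb7' ...

print(raised)  # → True

# Typical remedies: run with PYTHONIOENCODING=utf-8, or encode explicitly
# and write the bytes yourself:
print(text.encode("utf-8"))  # → b'caf\xc2\xb7'
```

As the maintainer notes, this is an environment issue (the terminal/stdout encoding), not a requests bug.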
https://api.github.com/repos/psf/requests/issues/1828
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1828/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1828/comments
|
https://api.github.com/repos/psf/requests/issues/1828/events
|
https://github.com/psf/requests/issues/1828
| 24,761,644 |
MDU6SXNzdWUyNDc2MTY0NA==
| 1,828 |
Incorrect Timezone
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2215059?v=4",
"events_url": "https://api.github.com/users/intrixs/events{/privacy}",
"followers_url": "https://api.github.com/users/intrixs/followers",
"following_url": "https://api.github.com/users/intrixs/following{/other_user}",
"gists_url": "https://api.github.com/users/intrixs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/intrixs",
"id": 2215059,
"login": "intrixs",
"node_id": "MDQ6VXNlcjIyMTUwNTk=",
"organizations_url": "https://api.github.com/users/intrixs/orgs",
"received_events_url": "https://api.github.com/users/intrixs/received_events",
"repos_url": "https://api.github.com/users/intrixs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/intrixs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/intrixs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/intrixs",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-25T05:15:22Z
|
2021-09-09T00:28:21Z
|
2013-12-25T09:13:36Z
|
NONE
|
resolved
|
Recently, I tried to crawl some data from www.soccerway.com using 'requests' + 'beautifulsoup'.
Crawling went very smoothly until I found out that all of the match times are offset by 7 hours (see the comparison below).

So I was thinking maybe 'requests' didn't have the right time zone settings, because the website says that match times are converted to the visitor's local time.

Any idea on how I can change the time-zone settings in 'requests' would be much appreciated.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1828/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1828/timeline
| null |
completed
| null | null | false |
[
"Sorry, I just found out this issue has nothing to do with 'requests'.The same thing happened when I tried to crawl page with 'urllib2'.\n\nTo anyone who know how to solve the time-zone difference when crawling a webpage, please help.\n\nThanks.\n",
"The best place to ask that question is probably StackOverflow. I'm not sure what the answer is, but usually websites do this based on source IP address geolocation. Requests shouldn't be doing anything different here, which makes that a bit confusing. My recommendation would be to examine the headers your browser is sending to see if there's any information in there that the site could be using.\n",
"Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/1827
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1827/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1827/comments
|
https://api.github.com/repos/psf/requests/issues/1827/events
|
https://github.com/psf/requests/pull/1827
| 24,742,200 |
MDExOlB1bGxSZXF1ZXN0MTEwNzIwNDk=
| 1,827 |
Use adapter pool size for proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/168609?v=4",
"events_url": "https://api.github.com/users/pepijndevos/events{/privacy}",
"followers_url": "https://api.github.com/users/pepijndevos/followers",
"following_url": "https://api.github.com/users/pepijndevos/following{/other_user}",
"gists_url": "https://api.github.com/users/pepijndevos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pepijndevos",
"id": 168609,
"login": "pepijndevos",
"node_id": "MDQ6VXNlcjE2ODYwOQ==",
"organizations_url": "https://api.github.com/users/pepijndevos/orgs",
"received_events_url": "https://api.github.com/users/pepijndevos/received_events",
"repos_url": "https://api.github.com/users/pepijndevos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pepijndevos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pepijndevos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pepijndevos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-12-24T10:25:28Z
|
2021-09-08T23:08:29Z
|
2014-01-08T18:50:00Z
|
CONTRIBUTOR
|
resolved
|
I was doing hundreds of parallel requests and kept seeing warnings about dropped connections because the connection pool was full.
So I increased the connection pool size on the adapter, and increased it some more. And then even more. It did not help.
Then I noticed that only my proxy connections were being dropped.
Some investigation revealed that proxy connections use their own pool subclass.
However, this pool is created without respecting the pool size setting.
This PR simply propagates the pool kwargs to the proxy pool.
I contemplated adding additional kwargs for the proxy pool size, but that seems like unnecessary complexity.
If there is a use case for using one Session for both proxied and direct requests, where the proxies need a different pool size than the direct connections, I could add that.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1827/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1827/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1827.diff",
"html_url": "https://github.com/psf/requests/pull/1827",
"merged_at": "2014-01-08T18:50:00Z",
"patch_url": "https://github.com/psf/requests/pull/1827.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1827"
}
| true |
[
"Thanks for this!\n\nI like this solution, I think it's the correct approach. We can quite safely overload these kwargs, and I think doing so provides the nicer interface.\n\nOnce again, it's annoying that we can't easily test this, but I'm happy with this solution as-is. @sigmavirus24?\n",
"Yeah, I looked at writing a test case, but instead just hammered it with proxy requests and looked for warnings; there were none.\n",
"If you want to write a test (that would be just the slightest bit brittle) we could introduce mock as a testing dependency. We could then do something like:\n\n``` python\ndef test_proxy_receives_pool_settings(self):\n    with mock.patch('requests.packages.urllib3.poolmanager.proxy_from_url') as proxy_from_url_mock:\n        http = HTTPAdapter()\n    proxy_from_url_mock.assert_called_once_with(\n        # ...\n    )\n```\n\nIt's quite simple and it tests the intended behaviour. I'm just a bit wary of introducing mock without other tests relying more strongly on it. And I still need to re-write the tests so that they're fast, offline and thorough enough.\n",
"> If you want to write a test\n\nNot terribly excited. But if it's the right thing to do, I'll do it. I'd really like this merged.\n",
"> Not terribly excited\n\nYeah testing is boring. I guess I'll cover this when I pull an all-nighter on the tests in the near future.\n\n:+1:\n",
"Meh proxies suck.\n"
] |
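The fix this PR describes can be sketched in plain Python. This is a simplified stand-in, not the actual `HTTPAdapter` code; `proxy_from_url` here is a hypothetical factory that just records what it was called with:

```python
# Simplified sketch of the fix: the adapter remembers its pool settings
# and reuses the same kwargs when it lazily creates a proxy pool manager,
# instead of letting the proxy pool fall back to defaults.
def proxy_from_url(url, **kwargs):
    # hypothetical stand-in for urllib3's proxy pool factory
    return {"proxy": url, **kwargs}

class Adapter:
    def __init__(self, pool_connections=10, pool_maxsize=10):
        # the same kwargs urllib3 pool managers accept
        self._pool_kwargs = {"num_pools": pool_connections, "maxsize": pool_maxsize}
        self.proxy_managers = {}

    def proxy_manager_for(self, proxy_url):
        if proxy_url not in self.proxy_managers:
            # the bug was equivalent to calling proxy_from_url(proxy_url)
            # with no kwargs here; the fix propagates the pool settings
            self.proxy_managers[proxy_url] = proxy_from_url(proxy_url, **self._pool_kwargs)
        return self.proxy_managers[proxy_url]

adapter = Adapter(pool_connections=100, pool_maxsize=100)
mgr = adapter.proxy_manager_for("http://proxy.example:3128")
print(mgr["maxsize"])  # → 100
```

The design discussion in the comments is about exactly this: overloading the existing kwargs rather than adding separate proxy-specific ones.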
https://api.github.com/repos/psf/requests/issues/1826
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1826/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1826/comments
|
https://api.github.com/repos/psf/requests/issues/1826/events
|
https://github.com/psf/requests/issues/1826
| 24,699,060 |
MDU6SXNzdWUyNDY5OTA2MA==
| 1,826 |
SSLError exception leak
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/303476?v=4",
"events_url": "https://api.github.com/users/zbb42/events{/privacy}",
"followers_url": "https://api.github.com/users/zbb42/followers",
"following_url": "https://api.github.com/users/zbb42/following{/other_user}",
"gists_url": "https://api.github.com/users/zbb42/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zbb42",
"id": 303476,
"login": "zbb42",
"node_id": "MDQ6VXNlcjMwMzQ3Ng==",
"organizations_url": "https://api.github.com/users/zbb42/orgs",
"received_events_url": "https://api.github.com/users/zbb42/received_events",
"repos_url": "https://api.github.com/users/zbb42/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zbb42/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zbb42/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zbb42",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-23T09:18:56Z
|
2021-09-09T00:10:13Z
|
2014-01-29T19:27:06Z
|
NONE
|
resolved
|
My program caught an `ssl.SSLError`, which is not a `RequestException`, so I think it may be a leaked (uncaught) exception. The requests version is 2.0.0.
Traceback:
```
......
r = requests.post(self._token_url, data=data, timeout=self._timeout)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/sessions.py", line 357, in request
resp = self.send(prep, **send_kwargs)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/sessions.py", line 460, in send
r = adapter.send(request, **kwargs)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/adapters.py", line 367, in send
r.content
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/models.py", line 633, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/models.py", line 572, in generate
decode_content=True):
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 225, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/data/poi-op/env/python/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 174, in read
data = self._fp.read(amt)
File "/usr/local/python27/lib/python2.7/httplib.py", line 543, in read
return self._read_chunked(amt)
File "/usr/local/python27/lib/python2.7/httplib.py", line 585, in _read_chunked
line = self.fp.readline(_MAXLINE + 1)
File "/usr/local/python27/lib/python2.7/socket.py", line 476, in readline
data = self._sock.recv(self._rbufsize)
File "/usr/local/python27/lib/python2.7/ssl.py", line 241, in recv
return self.read(buflen)
File "/usr/local/python27/lib/python2.7/ssl.py", line 160, in read
return self._sslobj.read(len)
SSLError: The read operation timed out
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1826/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1826/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nThis was originally spotted in #1787, and should be fixed in `master`. Try using the GitHub version of Requests to see if you can still hit it.\n",
"Closed due to inactivity.\n"
] |
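Until the fix landed, a defensive caller had to catch the leaked `ssl.SSLError` alongside the library's own base exception. A stdlib-only sketch of that pattern (both `fetch` and `RequestException` are illustrative stand-ins, not the real requests API):

```python
import ssl

class RequestException(Exception):
    """Stand-in for requests.exceptions.RequestException."""

def fetch():
    # Stand-in for a requests call whose body read times out and leaks
    # a raw ssl.SSLError, as reported here for requests 2.0.0.
    raise ssl.SSLError("The read operation timed out")

try:
    fetch()
except (RequestException, ssl.SSLError) as exc:
    # Catching both the library's base exception and the raw SSL error
    # keeps the program alive on either failure mode.
    caught = exc

print("caught:", caught)
```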
https://api.github.com/repos/psf/requests/issues/1825
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1825/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1825/comments
|
https://api.github.com/repos/psf/requests/issues/1825/events
|
https://github.com/psf/requests/issues/1825
| 24,690,993 |
MDU6SXNzdWUyNDY5MDk5Mw==
| 1,825 |
Session hook is ignored
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1894766?v=4",
"events_url": "https://api.github.com/users/akhomchenko/events{/privacy}",
"followers_url": "https://api.github.com/users/akhomchenko/followers",
"following_url": "https://api.github.com/users/akhomchenko/following{/other_user}",
"gists_url": "https://api.github.com/users/akhomchenko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akhomchenko",
"id": 1894766,
"login": "akhomchenko",
"node_id": "MDQ6VXNlcjE4OTQ3NjY=",
"organizations_url": "https://api.github.com/users/akhomchenko/orgs",
"received_events_url": "https://api.github.com/users/akhomchenko/received_events",
"repos_url": "https://api.github.com/users/akhomchenko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akhomchenko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akhomchenko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akhomchenko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-12-23T00:36:25Z
|
2021-09-09T00:28:22Z
|
2013-12-23T07:49:31Z
|
CONTRIBUTOR
|
resolved
|
Hello.
My English is not so good, sorry.
Problem example:
```
def fail_on_error(r, *args, **kwargs):
r.raise_for_status()
s = requests.session()
s.hooks.update(dict(response=fail_on_error))
r = s.get('http://httpbin.org/status/404')
```
This code doesn't raise an exception. It wasn't too obvious to me. Is it OK?
If this is a problem, it is caused by `merge_setting`, which overrides the `Session` hooks with the default `Request` hooks.
Thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1825/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1825/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nI can't reproduce this on my machine. Can you tell me what version of Requests you're using? You can find this out by getting `requests.__version__` in a Python shell.\n",
"This was fixed in version 2.1.0 of requests. \n\nI, like @Lukasa, see this work properly when tried interactively:\n\n``` pycon\n>>> import requests\n>>> def fail_on_error(r, *args, **kwargs):\n... r.raise_for_status()\n... \n>>> s = requests.session()\n>>> s.hooks.update(dict(response=fail_on_error))\n>>> r = s.get('http://httpbin.org/status/404')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"requests/sessions.py\", line 395, in get\n return self.request('GET', url, **kwargs)\n File \"requests/sessions.py\", line 383, in request\n resp = self.send(prep, **send_kwargs)\n File \"requests/sessions.py\", line 491, in send\n r = dispatch_hook('response', hooks, r, **kwargs)\n File \"requests/hooks.py\", line 41, in dispatch_hook\n _hook_data = hook(hook_data, **kwargs)\n File \"<stdin>\", line 2, in fail_on_error\n File \"requests/models.py\", line 773, in raise_for_status\n raise HTTPError(http_error_msg, response=self)\nrequests.exceptions.HTTPError: 404 Client Error: NOT FOUND\n```\n",
"Excellent, OK. This can be fixed by updating your version of Requests.\n",
"Thanks. Sorry for the outdated bug.\nMy version was 2.0.1.\n"
] |
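The root cause the reporter points at can be illustrated with a simplified model of `merge_setting` (a sketch, not the real requests code): when the per-request default hooks (`{'response': []}`) take precedence over the session's hooks, the session hook is silently dropped.

```python
# Simplified model of how requests merges session- and request-level
# settings: request-level values win over session-level ones.
def merge_setting(request_setting, session_setting):
    if session_setting is None:
        return request_setting
    if request_setting is None:
        return session_setting
    merged = dict(session_setting)
    merged.update(request_setting)  # request-level keys shadow session keys
    return merged

def fail_on_error(r, *args, **kwargs):
    r.raise_for_status()

session_hooks = {"response": [fail_on_error]}
request_default_hooks = {"response": []}  # what every Request started with

merged = merge_setting(request_default_hooks, session_hooks)
print(merged)  # → {'response': []} — the session hook is gone
```

The fix in requests 2.1.0 made the merge treat hooks specially so an empty request-level default no longer clobbers session hooks.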
https://api.github.com/repos/psf/requests/issues/1824
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1824/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1824/comments
|
https://api.github.com/repos/psf/requests/issues/1824/events
|
https://github.com/psf/requests/issues/1824
| 24,618,972 |
MDU6SXNzdWUyNDYxODk3Mg==
| 1,824 |
Invalid ~/.netrc file does not raise any warnings or errors from requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/102495?v=4",
"events_url": "https://api.github.com/users/ssbarnea/events{/privacy}",
"followers_url": "https://api.github.com/users/ssbarnea/followers",
"following_url": "https://api.github.com/users/ssbarnea/following{/other_user}",
"gists_url": "https://api.github.com/users/ssbarnea/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssbarnea",
"id": 102495,
"login": "ssbarnea",
"node_id": "MDQ6VXNlcjEwMjQ5NQ==",
"organizations_url": "https://api.github.com/users/ssbarnea/orgs",
"received_events_url": "https://api.github.com/users/ssbarnea/received_events",
"repos_url": "https://api.github.com/users/ssbarnea/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssbarnea/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssbarnea/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssbarnea",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-12-20T12:17:49Z
|
2021-09-09T00:10:14Z
|
2014-01-29T19:26:14Z
|
CONTRIBUTOR
|
resolved
|
I ended up with a case where requests behaves very strangely: it does not load the credentials from the .netrc file, and it's extremely hard to discover when it is using credentials and where it is taking them from.
The debug-level output is identical in all 3 cases: no auth, auth from .netrc, or auth manually specified.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1824/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1824/timeline
| null |
completed
| null | null | false |
[
"I've explained to you, at least twice ([1st](https://github.com/kennethreitz/requests/issues/1297#issuecomment-16130288), [2nd](https://github.com/kennethreitz/requests/pull/1302#commitcomment-2984789)) that requests performs **no logging** whatsoever. With those reminders brought up, I'll reiterate once more that requests will not and does not perform any logging.\n\nIf you want to, you can monkey-patch `requests.utils.get_netrc_auth` so that it will perform logging for you. Otherwise, you'll have to reason about the conditions [here](https://github.com/kennethreitz/requests/blob/0eb6dc36f088e6e5c0baaac3334dc5f0e4f07942/requests/sessions.py#L275).\n",
"Probably it is useless to ask for proper logging in requests, so I will skip to the point: with debugging I discovered that the .netrc file stopped working because one new line couldn't be parsed by requests.\n\nThis new line was not related to the one that was supposed to be loaded, but the way parsing is implemented, it fails and no information at all is loaded from the file (strange!?). In fact, the desired line came before the one with the parse error.\n\nAlso, there is zero information about the failure to parse the .netrc file.\n\nIn case you are wondering: add a line like \"machine example.com john johnpass\" to the end of the file and .netrc will stop working with requests, while it will still be loadable by other tools like `curl` or `wget`.\n",
"We don't parse `netrc` files ourselves, we use the standard library `netrc` library to do it. Check the function [here](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L67-L102), and confirm whether it works on your system.\n",
"In this case there are 1½ bugs here: requests does silently bypass any error in parsing .netrc (which it shoudn't), and ½ bug inside netrc in stdlib, which fails to load if parsing one line fails (other tools do not).\n",
"@ssbarnea That's interesting, you seem to have contradictory views about these two issues. In the first case, we're continuing in the face of errors, which you deem to be a bug. In the second case, Python aborts in the face of errors, which you deem to be a bug.\n\nFurthermore, if you fixed the second issue (Python does not fail loading netrc if parsing one line fails), Requests will never silently bypass errors in parsing `.netrc` because it will never find out about those errors!\n\nI'm +0 on the idea of no longer catching the `Netrc` exception. We could do it. Some people will find unexpected failures because they've never noticed that they even had anything in `.netrc`, but they'll be able to fix that in time. However, we certainly can't do it until at least 2.2, because we cannot start throwing exceptions where previously we worked just fine.\n",
"> Furthermore, if you fixed the second issue (Python does not fail loading netrc if parsing one line fails), Requests will never silently bypass errors in parsing .netrc because it will never find out about those errors!\n\nYeah this sounds like a bug that should be reported on bugs.python.org. The fact that we catch an exception is not an issue since we shouldn't be bubbling up exceptions especially if we can't be certain the user wants us to be using their `.netrc` file.\n\nI'm -0 on the idea of allowing it to bubble up though.\n\nThat aside, I'm :+1: on closing this issue since the work to stop catching that exception is trivial.\n",
"Guys, thank you for your help. I would vote for raising the exception in at future release. It's quite easy to fix such a problem if the user sees the problem. Also, I will raise a python bug regarding the parsing.\n\n`.netrc` is a wonderful thing and I like the default behavior of using it if it exists, the only remark was that if it is invalid it doesn't say anything. Imagine that I am calling some 3rd party libraries that do also work with anonymous but they return different results... it took me some time to find the cause.\n"
] |
https://api.github.com/repos/psf/requests/issues/1823
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1823/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1823/comments
|
https://api.github.com/repos/psf/requests/issues/1823/events
|
https://github.com/psf/requests/pull/1823
| 24,611,868 |
MDExOlB1bGxSZXF1ZXN0MTEwMDQ4MjE=
| 1,823 |
Update docs to highlight the value of r.encoding.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-12-20T09:16:45Z
|
2021-09-08T23:11:01Z
|
2013-12-20T14:21:04Z
|
MEMBER
|
resolved
|
This should help address the concerns raised in #1795.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1823/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1823/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1823.diff",
"html_url": "https://github.com/psf/requests/pull/1823",
"merged_at": "2013-12-20T14:21:04Z",
"patch_url": "https://github.com/psf/requests/pull/1823.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1823"
}
| true |
[
"LGTM. :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1822
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1822/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1822/comments
|
https://api.github.com/repos/psf/requests/issues/1822/events
|
https://github.com/psf/requests/issues/1822
| 24,602,748 |
MDU6SXNzdWUyNDYwMjc0OA==
| 1,822 |
UnicodeEncodeError: 'latin-1' codec can't encode characters
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1450977?v=4",
"events_url": "https://api.github.com/users/xjsender/events{/privacy}",
"followers_url": "https://api.github.com/users/xjsender/followers",
"following_url": "https://api.github.com/users/xjsender/following{/other_user}",
"gists_url": "https://api.github.com/users/xjsender/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xjsender",
"id": 1450977,
"login": "xjsender",
"node_id": "MDQ6VXNlcjE0NTA5Nzc=",
"organizations_url": "https://api.github.com/users/xjsender/orgs",
"received_events_url": "https://api.github.com/users/xjsender/received_events",
"repos_url": "https://api.github.com/users/xjsender/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xjsender/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xjsender/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xjsender",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-12-20T02:56:35Z
|
2021-09-08T02:10:09Z
|
2013-12-20T08:45:48Z
|
NONE
|
resolved
|
Requests is the latest version.
When I try to post data which contains Chinese characters, this exception is thrown.
```
Traceback (most recent call last):
File "X/threading.py", line 639, in _bootstrap_inner
File "X/threading.py", line 596, in run
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\salesforce\api.py", line 546, in execute_anonymous
headers=headers)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\sessions.py", line 338, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\sessions.py", line 441, in send
r = adapter.send(request, **kwargs)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\adapters.py", line 292, in send
timeout=timeout
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\packages\urllib3\connectionpool.py", line 428, in urlopen
body=body, headers=headers)
File "C:\Users\Administrator\Dropbox\Sublime3056\Data\Packages\SublimeApex\requests\packages\urllib3\connectionpool.py", line 280, in _make_request
conn.request(method, url, **httplib_request_kw)
File "X/http/client.py", line 1049, in request
File "X/http/client.py", line 1086, in _send_request
UnicodeEncodeError: 'latin-1' codec can't encode characters in position 1632-1633: ordinal not in range(256)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1822/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1822/timeline
| null |
completed
| null | null | false |
[
"> File \"X/http/client.py\"\n\nDid you write `X` because that's a path to a local file? If so, your directory structure may be confusing urllib3. If not, then you should probably raise this with on bugs.python.org since this is not something I think requests should be handling. This looks like it's rising from `httplib` (or `http` on Python 3 which I'm guessing you're using).\n",
"@sigmavirus24 , \n\nI used requests in sublime plugin, if the soap_body in below statement didn't contains any Chinese characters, there will be no exception.\n\n`response = requests.post(self.apex_url, soap_body, verify=False, headers=headers)`\n",
"Firstly, unless you're using a different version of Sublime Apex to the one in their public repository, Requests is _not_ the latest version, it's version 1.2.3. Can you tell me what version of Sublime Text you're using?\n",
"It's sublime text 3056\n",
"So, ST 3, but not the most recent revision. Ok, that gives us something. Specifically, Sublime Text 3 uses Python 3.3, not Python 2.7 (which Sublime Text 2 used). This means all the default strings in Sublime Apex are unicode strings.\n\nIf you open up the Python 3.3 `http.client` file, you'll find that the `_send_request()` function looks like this:\n\n``` python\n# Honor explicitly requested Host: and Accept-Encoding: headers.\nheader_names = dict.fromkeys([k.lower() for k in headers])\nskips = {}\nif 'host' in header_names:\n skips['skip_host'] = 1\nif 'accept-encoding' in header_names:\n skips['skip_accept_encoding'] = 1\n\nself.putrequest(method, url, **skips)\n\nif body is not None and ('content-length' not in header_names):\n self._set_content_length(body)\nfor hdr, value in headers.items():\n self.putheader(hdr, value)\nif isinstance(body, str):\n # RFC 2616 Section 3.7.1 says that text default has a\n # default charset of iso-8859-1.\n body = body.encode('iso-8859-1')\nself.endheaders(body)\n```\n\nNow, ISO-8859-1 is an alias for Latin-1, which is the codec we're having trouble with. The problem we've got is that Sublime Apex is providing a unicode string body to Requests, which httplib needs to encode into bytes. Taking the default from RFC 2616, it concludes you want Latin-1, which doesn't include any Chinese characters. Clearly then, encoding fails, and you get the exception in question.\n\nConsidering that Sublime Apex claims in the headers it sends to be sending UTF-8 encoded data (which is a lie currently), Sublime Apex wants to be encoding the data as UTF-8 before sending it. 
This means any line sending data (in this case line 545 of `salesforce/api.py`) should read like this:\n\n``` python\nresponse = requests.post(self.apex_url, soap_body.encode('utf-8'), verify=False, headers=headers)\n```\n\nFor the sake of anyone else who wants to confirm my diagnosis, here's a quick bit of sample code that confirms the problem:\n\n``` python\na = \"\\u13E0\\u19E0\\u1320\"\na.encode('latin1') # Throws UnicodeEncodeError, proves that this can't be expressed in ISO-8859-1.\na.encode('utf-8') # Totally fine.\nr = requests.post('http://httpbin.org/post', data=a) # Using unicode string, throws UnicodeEncodeError blaming Latin1.\nr = requests.post('http://httpbin.org/post', data=a.encode('utf-8')) # Works fine.\n```\n\nThanks for raising this with us, but this is not a Requests bug. =)\n",
"Thanks.\n",
"r = requests.post('http://httpbin.org/post', data=a.encode('utf-8')) \r\nvery usefull,\r\nthank you!"
] |
https://api.github.com/repos/psf/requests/issues/1821
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1821/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1821/comments
|
https://api.github.com/repos/psf/requests/issues/1821/events
|
https://github.com/psf/requests/issues/1821
| 24,583,485 |
MDU6SXNzdWUyNDU4MzQ4NQ==
| 1,821 |
SSL verify failed though cert appears to be OK
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/82214?v=4",
"events_url": "https://api.github.com/users/sradu/events{/privacy}",
"followers_url": "https://api.github.com/users/sradu/followers",
"following_url": "https://api.github.com/users/sradu/following{/other_user}",
"gists_url": "https://api.github.com/users/sradu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sradu",
"id": 82214,
"login": "sradu",
"node_id": "MDQ6VXNlcjgyMjE0",
"organizations_url": "https://api.github.com/users/sradu/orgs",
"received_events_url": "https://api.github.com/users/sradu/received_events",
"repos_url": "https://api.github.com/users/sradu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sradu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sradu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sradu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-19T19:55:17Z
|
2021-09-09T00:01:10Z
|
2013-12-19T20:01:36Z
|
NONE
|
resolved
|
I'm trying to call a simple URL from a Debian stable server and I'm getting a certificate error:
```
>>> import requests
>>> r = requests.get('https://mobile.amber.io')
[..]
requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
```
The certificate should be valid: I've tried several online SSL checkers and it works on all platforms.
On my Mac (though it might be unrelated) I'm getting an "SSL hosts don't match" error; there are a couple of hosts on that server with different certs.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/82214?v=4",
"events_url": "https://api.github.com/users/sradu/events{/privacy}",
"followers_url": "https://api.github.com/users/sradu/followers",
"following_url": "https://api.github.com/users/sradu/following{/other_user}",
"gists_url": "https://api.github.com/users/sradu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sradu",
"id": 82214,
"login": "sradu",
"node_id": "MDQ6VXNlcjgyMjE0",
"organizations_url": "https://api.github.com/users/sradu/orgs",
"received_events_url": "https://api.github.com/users/sradu/received_events",
"repos_url": "https://api.github.com/users/sradu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sradu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sradu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sradu",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1821/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1821/timeline
| null |
completed
| null | null | false |
[
"My psychic powers tell me that you're using Python 2.7, which doesn't support SNI out of the box. See [this StackOverflow answer](http://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484) for instructions on how to add SNI support.\n\nAm I right?\n",
"That fixed it for me (installing urllib3 and those three packages). Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/1820
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1820/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1820/comments
|
https://api.github.com/repos/psf/requests/issues/1820/events
|
https://github.com/psf/requests/issues/1820
| 24,582,960 |
MDU6SXNzdWUyNDU4Mjk2MA==
| 1,820 |
How to set the post parameters order?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/127591?v=4",
"events_url": "https://api.github.com/users/ogrishman/events{/privacy}",
"followers_url": "https://api.github.com/users/ogrishman/followers",
"following_url": "https://api.github.com/users/ogrishman/following{/other_user}",
"gists_url": "https://api.github.com/users/ogrishman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ogrishman",
"id": 127591,
"login": "ogrishman",
"node_id": "MDQ6VXNlcjEyNzU5MQ==",
"organizations_url": "https://api.github.com/users/ogrishman/orgs",
"received_events_url": "https://api.github.com/users/ogrishman/received_events",
"repos_url": "https://api.github.com/users/ogrishman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ogrishman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ogrishman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ogrishman",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-12-19T19:47:01Z
|
2021-09-09T00:28:25Z
|
2013-12-19T19:55:42Z
|
NONE
|
resolved
|
This is part of my code:
```
files = {'FileData': ("01.avi", open("H:\\Dropbox\\temp\\test\\videos\\01.avi", 'rb'), "application/octet-stream")}
url = "http://upload.test.com/api/upload_form_data"
headers = {
"User-Agent": "Shockwave Flash",
"Accept": "text/*",
"Connection": "Keep-Alive",
"Pragma": "no-cache",
"Host": "upload.test.com"
}
data = {
"Filename": upload_me_name,
"upload_token": upload_token,
"client": "flash",
"Upload": "Submit Query"
}
r = s.post(url, proxies = proxies, headers = headers, files = files, data = data)
```
I use fiddler to trace the post process and I got the following result.
```
POST http://upload.test.com/api/upload_form_data HTTP/1.1
Accept-Encoding: gzip, deflate, compress
Accept: text/*
Content-Type: multipart/form-data; boundary=35b35bbdadc54949a1e280d19fa85850
Connection: Keep-Alive
Pragma: no-cache
Host: upload.test.com
Cookie: yktk=1%7C1387481091%7C15%7CaWQ6NjMxMDQwMjksbm465ZCR5L2g6IKa5a2Q5LiK5ouN5LiA5o6MLHZpcDpmYWxzZSx5dGlkOjAsdGlkOjA%3D%7Ce2af70d5dc486fb7daf96eb31011bf65%7Cda22 5886e18eff748b67f650accb1b75810e42d1%7C1; u=%E5%90%91%E4%BD%A0%E8%82%9A%E5%AD%90%E4%B8%8A%E6%8B%8D%E4%B8%80%E6%8E%8C; ykss=fd47b352415e60cf5699819e; _l_lgi=63104029
User-Agent: Shockwave Flash
Content-Length: 162469
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="upload_token"
MTg0ODQ4NjgwXzAxMDA2NDNBQTI1MkFGNTQ1QUJBRUMwM0MyRTQxRERFNjA0NUUyLTg4QUEtQ0Q5Ny1BNkRBLUQ2MkIyOTA3Rjk4N18xXzdmYWI3ZGU0MDkzYzdhMTA0OTkyMmM1NjczNThkMWYw
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="client"
flash
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="Upload"
Submit Query
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="Filename"
01.avi
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="FileData"; filename="01.avi"
Content-Type: application/octet-stream
<File data .....................>
```
However, that's not what exactly I want. I need the parameters in different order like this:
```
POST http://upload.test.com/api/upload_form_data HTTP/1.1
Accept-Encoding: gzip, deflate, compress
Accept: text/*
Content-Type: multipart/form-data; boundary=35b35bbdadc54949a1e280d19fa85850
Connection: Keep-Alive
Pragma: no-cache
Host: upload.test.com
Cookie: yktk=1%7C1387481091%7C15%7CaWQ6NjMxMDQwMjksbm465ZCR5L2g6IKa5a2Q5LiK5ouN5LiA5o6MLHZpcDpmYWxzZSx5dGlkOjAsdGlkOjA%3D%7Ce2af70d5dc486fb7daf96eb31011bf65%7Cda22 5886e18eff748b67f650accb1b75810e42d1%7C1; u=%E5%90%91%E4%BD%A0%E8%82%9A%E5%AD%90%E4%B8%8A%E6%8B%8D%E4%B8%80%E6%8E%8C; ykss=fd47b352415e60cf5699819e; _l_lgi=63104029
User-Agent: Shockwave Flash
Content-Length: 162469
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="upload_token"
MTg0ODQ4NjgwXzAxMDA2NDNBQTI1MkFGNTQ1QUJBRUMwM0MyRTQxRERFNjA0NUUyLTg4QUEtQ0Q5Ny1BNkRBLUQ2MkIyOTA3Rjk4N18xXzdmYWI3ZGU0MDkzYzdhMTA0OTkyMmM1NjczNThkMWYw
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="client"
flash
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="Filename"
01.avi
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="FileData"; filename="01.avi"
Content-Type: application/octet-stream
<File data .....................>
--35b35bbdadc54949a1e280d19fa85850
Content-Disposition: form-data; name="Upload"
Submit Query
```
How to do this in requests? Thank you!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1820/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1820/timeline
| null |
completed
| null | null | false |
[
"Firstly, the correct place to ask this question is not here: this is for tracking bugs only. You should ask your question at StackOverflow, and use the \"python-requests\" tag.\n\nSecondly, why does the order matter? It shouldn't affect the remote web site at all.\n\nFinally, the correct code is this:\n\n``` python\nfiles = [\n ('upload_token', upload_token),\n ('client', 'flash'),\n ('Filename', upload_me_name),\n ('FileData', (\"01.avi\", open(\"H:\\\\Dropbox\\\\temp\\\\test\\\\videos\\\\01.avi\", 'rb'), \"application/octet-stream\")),\n ('Upload', 'Submit Query'),\n]\n\nurl = \"http://upload.test.com/api/upload_form_data\"\nheaders = {\n \"User-Agent\": \"Shockwave Flash\",\n \"Accept\": \"text/*\",\n \"Connection\": \"Keep-Alive\",\n \"Pragma\": \"no-cache\",\n}\n\nr = s.post(url, proxies = proxies, headers = headers, files = files)\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/1819
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1819/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1819/comments
|
https://api.github.com/repos/psf/requests/issues/1819/events
|
https://github.com/psf/requests/issues/1819
| 24,577,571 |
MDU6SXNzdWUyNDU3NzU3MQ==
| 1,819 |
How to disable (not send it at all) a request header?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/127591?v=4",
"events_url": "https://api.github.com/users/ogrishman/events{/privacy}",
"followers_url": "https://api.github.com/users/ogrishman/followers",
"following_url": "https://api.github.com/users/ogrishman/following{/other_user}",
"gists_url": "https://api.github.com/users/ogrishman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ogrishman",
"id": 127591,
"login": "ogrishman",
"node_id": "MDQ6VXNlcjEyNzU5MQ==",
"organizations_url": "https://api.github.com/users/ogrishman/orgs",
"received_events_url": "https://api.github.com/users/ogrishman/received_events",
"repos_url": "https://api.github.com/users/ogrishman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ogrishman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ogrishman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ogrishman",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-19T18:54:24Z
|
2021-09-09T00:28:24Z
|
2013-12-19T18:58:38Z
|
NONE
|
resolved
|
I'm using requests to write a program, and it added the following header to my custom headers:
```
Accept-Encoding: gzip, deflate, compress
```
My custom header is:
```
headers = {
"User-Agent": "Shockwave Flash",
"Accept": "text/*",
"Connection": "Keep-Alive",
"Pragma": "no-cache",
"Host": "upload.youku.com"
}
```
Is it possible to disable the added header at all?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1819/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1819/timeline
| null |
completed
| null | null | false |
[
"Yup. Like this:\n\n``` python\nr = requests.get(url, headers={'Accept-Encoding': None})\n```\n",
"@ogrishman, @Lukasa explained on #1820 that this is a bug tracker and not a question and answer forum. Further questions you ask here will be closed without answer.\n"
] |
https://api.github.com/repos/psf/requests/issues/1818
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1818/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1818/comments
|
https://api.github.com/repos/psf/requests/issues/1818/events
|
https://github.com/psf/requests/issues/1818
| 24,571,748 |
MDU6SXNzdWUyNDU3MTc0OA==
| 1,818 |
Removal of TLSv1 from the webserver gives socket.error: [Errno 54] Connection reset by peer error to python requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/102495?v=4",
"events_url": "https://api.github.com/users/ssbarnea/events{/privacy}",
"followers_url": "https://api.github.com/users/ssbarnea/followers",
"following_url": "https://api.github.com/users/ssbarnea/following{/other_user}",
"gists_url": "https://api.github.com/users/ssbarnea/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssbarnea",
"id": 102495,
"login": "ssbarnea",
"node_id": "MDQ6VXNlcjEwMjQ5NQ==",
"organizations_url": "https://api.github.com/users/ssbarnea/orgs",
"received_events_url": "https://api.github.com/users/ssbarnea/received_events",
"repos_url": "https://api.github.com/users/ssbarnea/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssbarnea/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssbarnea/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssbarnea",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-19T17:40:43Z
|
2021-09-09T00:28:24Z
|
2013-12-20T02:40:49Z
|
CONTRIBUTOR
|
resolved
|
I discovered that if I force nginx to use only the TLSv1.2 and TLSv1.1 protocols, it prevents the requests library from working.
Other clients are able to deal with this, including curl, but requests isn't, at least when it is used on OS X with Python 2.7.
To reproduce, reconfigure the webserver to use only recent TLS protocols:
ssl_protocols TLSv1.2 TLSv1.1;
```
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='example.com', port=443): Max retries exceeded with url: /some/url (Caused by <class 'socket.error'>: [Errno 54] Connection reset by peer)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1818/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1818/timeline
| null |
completed
| null | null | false |
[
"This sounds like a bug in urllib3. @shazow opinions?\n",
"Actually following the conversation on #171 this is not a bug in requests but in Python as @Lukasa explained.\n",
"My favourite.\n"
] |
https://api.github.com/repos/psf/requests/issues/1817
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1817/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1817/comments
|
https://api.github.com/repos/psf/requests/issues/1817/events
|
https://github.com/psf/requests/issues/1817
| 24,556,852 |
MDU6SXNzdWUyNDU1Njg1Mg==
| 1,817 |
Losing cookies in redirects after HTTPS POST
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5185185?v=4",
"events_url": "https://api.github.com/users/rbock/events{/privacy}",
"followers_url": "https://api.github.com/users/rbock/followers",
"following_url": "https://api.github.com/users/rbock/following{/other_user}",
"gists_url": "https://api.github.com/users/rbock/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rbock",
"id": 5185185,
"login": "rbock",
"node_id": "MDQ6VXNlcjUxODUxODU=",
"organizations_url": "https://api.github.com/users/rbock/orgs",
"received_events_url": "https://api.github.com/users/rbock/received_events",
"repos_url": "https://api.github.com/users/rbock/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rbock/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rbock/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rbock",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-19T13:45:42Z
|
2021-09-09T00:28:25Z
|
2013-12-19T13:53:44Z
|
NONE
|
resolved
|
Hi,
in a session, I am sending HTTPS GET request, followed by a HTTPS POST request which results in a series of 302-redirects. I monitor traffic via mitmproxy, but the results are the same whether I use the proxy or not.
With allow_redirect=True, most of the cookies get lost between the first and second redirect (4 out of 6 got lost in my case).
With allow_redirect=False for the POST, I initiate the GET with allow_redirect=True and everything works just fine.
Here's the boiled-down code:
```
self.session = requests.Session()
self.session.get(get_url, headers=self.headers, proxies=self.proxies, verify=False)
self.session.post(post_url, data=payload, headers=self.headers, proxies=self.proxies, verify=False)
```
I cannot give you all details about the URL or the parameters since it is a login procedure, but I can certainly answer questions or experiment if you tell me what to do.
Edit: This happens in 2.1.0. But it works fine in 2.0.1.
Regards,
Roland
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1817/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1817/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue! In future, however, please take a quick look through both the open and the closed issues before opening yours. This issue has been reported twice already (in #1805 and #1809) and is fixed in the current master branch by #1808. You can either wait until the next release for the fix, or you can use the current GitHub code if you need the fix immediately.\n",
"Thanks for the quick answer. I did not make the connection, when I read the issue titles.\n"
] |
https://api.github.com/repos/psf/requests/issues/1816
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1816/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1816/comments
|
https://api.github.com/repos/psf/requests/issues/1816/events
|
https://github.com/psf/requests/pull/1816
| 24,544,211 |
MDExOlB1bGxSZXF1ZXN0MTA5NjY3OTg=
| 1,816 |
Re-raise DecodeError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3482660?v=4",
"events_url": "https://api.github.com/users/daftshady/events{/privacy}",
"followers_url": "https://api.github.com/users/daftshady/followers",
"following_url": "https://api.github.com/users/daftshady/following{/other_user}",
"gists_url": "https://api.github.com/users/daftshady/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/daftshady",
"id": 3482660,
"login": "daftshady",
"node_id": "MDQ6VXNlcjM0ODI2NjA=",
"organizations_url": "https://api.github.com/users/daftshady/orgs",
"received_events_url": "https://api.github.com/users/daftshady/received_events",
"repos_url": "https://api.github.com/users/daftshady/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/daftshady/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/daftshady/subscriptions",
"type": "User",
"url": "https://api.github.com/users/daftshady",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 1 |
2013-12-19T09:07:16Z
|
2021-09-08T23:06:02Z
|
2013-12-19T20:38:34Z
|
CONTRIBUTOR
|
resolved
|
Re-raise DecodeError in urllib3 for https://github.com/kennethreitz/requests/issues/1815.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1816/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1816/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1816.diff",
"html_url": "https://github.com/psf/requests/pull/1816",
"merged_at": "2013-12-19T20:38:34Z",
"patch_url": "https://github.com/psf/requests/pull/1816.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1816"
}
| true |
[
"LGTM! Thanks so much @daftshady! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1815
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1815/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1815/comments
|
https://api.github.com/repos/psf/requests/issues/1815/events
|
https://github.com/psf/requests/issues/1815
| 24,536,184 |
MDU6SXNzdWUyNDUzNjE4NA==
| 1,815 |
urllib3.exceptions.DecodeError being raised
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2161601?v=4",
"events_url": "https://api.github.com/users/adamsc/events{/privacy}",
"followers_url": "https://api.github.com/users/adamsc/followers",
"following_url": "https://api.github.com/users/adamsc/following{/other_user}",
"gists_url": "https://api.github.com/users/adamsc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/adamsc",
"id": 2161601,
"login": "adamsc",
"node_id": "MDQ6VXNlcjIxNjE2MDE=",
"organizations_url": "https://api.github.com/users/adamsc/orgs",
"received_events_url": "https://api.github.com/users/adamsc/received_events",
"repos_url": "https://api.github.com/users/adamsc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/adamsc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adamsc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/adamsc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-19T04:18:21Z
|
2021-09-09T00:28:23Z
|
2013-12-20T09:01:37Z
|
NONE
|
resolved
|
I have been getting a `requests.packages.urllib3.exceptions.DecodeError` exception raised. From my understanding of the documentation, shouldn't this be translated to a Requests exception which inherits from `requests.exceptions.RequestException`?
This occurred using Requests 2.1.0 when making a GET request through a misconfigured HTTP proxy.
Here is the stack trace:
```
Traceback (most recent call last):
File "proxy-manager.py", line 94, in <module>
good, comment = is_good(proxy, base_headers, base_latency * 10)
File "proxy-manager.py", line 42, in is_good
headers = get_header_data(proxy)
File "proxy-manager.py", line 14, in get_header_data
r = requests.get('http://proxyjudge.us/azenv.php', proxies=proxies, timeout=timeout)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 382, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 485, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 388, in send
r.content
File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 676, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 615, in generate
decode_content=True):
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/response.py", line 236, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/response.py", line 204, in read
e)
requests.packages.urllib3.exceptions.DecodeError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing: incorrect header check',))
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1815/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1815/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this! Yeah, it probably should.\n",
"Fixed by #1816.\n"
] |
https://api.github.com/repos/psf/requests/issues/1814
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1814/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1814/comments
|
https://api.github.com/repos/psf/requests/issues/1814/events
|
https://github.com/psf/requests/pull/1814
| 24,535,176 |
MDExOlB1bGxSZXF1ZXN0MTA5NjE3ODM=
| 1,814 |
make requests compatible with SAE
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/170430?v=4",
"events_url": "https://api.github.com/users/tg123/events{/privacy}",
"followers_url": "https://api.github.com/users/tg123/followers",
"following_url": "https://api.github.com/users/tg123/following{/other_user}",
"gists_url": "https://api.github.com/users/tg123/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tg123",
"id": 170430,
"login": "tg123",
"node_id": "MDQ6VXNlcjE3MDQzMA==",
"organizations_url": "https://api.github.com/users/tg123/orgs",
"received_events_url": "https://api.github.com/users/tg123/received_events",
"repos_url": "https://api.github.com/users/tg123/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tg123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tg123/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tg123",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-12-19T03:36:49Z
|
2021-09-08T22:01:07Z
|
2013-12-19T03:40:45Z
|
NONE
|
resolved
|
make requests compatible with SAE
Sina App Engine conn has a None valued sock
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1814/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1814/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1814.diff",
"html_url": "https://github.com/psf/requests/pull/1814",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1814.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1814"
}
| true |
[
"Hey @tg123 this PR belongs on [shazow/urllib3](/shazow/urllib3) as was noted in the [README in the packages directory](https://github.com/tg123/requests/tree/fa34ab816aa03a009ceb86db7e55f16ca6687ba4/requests/packages). Please be more careful next time.\n",
"sorry about that\n",
"No worries! Thanks for sending the PR though and fixing up urllib3! :cake:\n",
"in SAE, try:\n\n```\nimport os\nos.environ['disable_fetchurl'] = True\n```\n\nhttp://sae.sina.com.cn/?m=devcenter&catId=291#anchor_24be84a1512409fbeb5363421a1a19fe\n"
] |
https://api.github.com/repos/psf/requests/issues/1813
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1813/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1813/comments
|
https://api.github.com/repos/psf/requests/issues/1813/events
|
https://github.com/psf/requests/issues/1813
| 24,522,453 |
MDU6SXNzdWUyNDUyMjQ1Mw==
| 1,813 |
Allow Content-Disposition header to be overridden per-field
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/97720?v=4",
"events_url": "https://api.github.com/users/orokusaki/events{/privacy}",
"followers_url": "https://api.github.com/users/orokusaki/followers",
"following_url": "https://api.github.com/users/orokusaki/following{/other_user}",
"gists_url": "https://api.github.com/users/orokusaki/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/orokusaki",
"id": 97720,
"login": "orokusaki",
"node_id": "MDQ6VXNlcjk3NzIw",
"organizations_url": "https://api.github.com/users/orokusaki/orgs",
"received_events_url": "https://api.github.com/users/orokusaki/received_events",
"repos_url": "https://api.github.com/users/orokusaki/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/orokusaki/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/orokusaki/subscriptions",
"type": "User",
"url": "https://api.github.com/users/orokusaki",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2013-12-18T22:12:35Z
|
2021-09-09T00:28:26Z
|
2013-12-18T22:15:37Z
|
NONE
|
resolved
|
`make_multipart` overrides the `Content-Disposition` header (see https://github.com/kennethreitz/requests/blob/54ad646067a3e023fd4dcf40c0937ec46069871f/requests/packages/urllib3/fields.py#L165), even when explicitly set (using the new per-field headers feature).
Is there any reason why the code couldn't be updated to allow this header to be explicitly set when providing files as 4-tuples containing the filename, content, content type and content disposition? The reason for wanting this feature is to allow working with REST APIs that do it all wrong (can't name names, sorry).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1813/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1813/timeline
| null |
completed
| null | null | false |
[
"@orokusaki In principle there's no reason. I suggest raising the issue on urllib3 and seeing what @shazow thinks. =)\n",
"+1 on moving this discussion to urllib3, but a cursory comment: The tuple format is generally deprecated in urllib3 (only remains there as a shortcut, for now), constructing RequestField objects is the preferred way.\n",
"@shazow - interesting... is there currently a way to manually use a `RequestField` while using `request.post(...)`? If not, that would surely be something worth my contributing when I get a moment.\n\nCC @Lukasa \n",
"@orokusaki Yea, you should be able to pass a list of `RequestField` objects same way you pass a list of tuples. They should behave interchangeably. If that's not the case, please open a bug. :)\n\nMight be worth updating the docs accordingly.\n",
"@shazow No docs changes on our side. =) There's no plan to officially document the crazy things you can do with multipart encoding because they make us sad. There's plans afoot to move some multipart stuff outside the main library, which will likely become the preferred way of doing anything non-trivial with multipart encoding.\n",
"@shazow - sorry to pester, but I'm not sure I follow. I normally use `post` like this:\n\n```\nrequests.post(\n 'http://www.example.com/',\n data={'field': 'value'},\n files=[('filename.txt', 'file content', 'text/plain', {'some-header': 'some value'})]\n)\n```\n\nWhere would manually creating a `RequestField` fit into that equation, simply replacing the list of `files` with a list of `RequestField` instances? If so, it would seem that the header would still be overwritten per https://github.com/kennethreitz/requests/blob/d88cd02fd86c6e74ef8a2d1928db78b8976ce00f/requests/models.py#L141 - also would result in a likely error due to the lines preceding that line, right?\n\nI might simply be confused as to what you mean though.\n\nCC @Lukasa \n",
"TIL requests has its own code for this. I believe your understanding is correct, and that is indeed the case.\n\nSoooo, the bug is back in requests land.\n",
"Ah, so it sounds like maybe a `requests==3.0.0` type of feature, if it were to be added. Thanks for checking on that for me @shazow.\n\n@Lukasa @kennethreitz If more things were delegated to urllib3, it would simplify the requests codebase even more. If either of you have suggestions pertaining a change like this, I'd be more than happy to take a stab at it.\n",
"@orokusaki That's true, but not relevant. In our list of priorities, the simplicity of the API is vastly higher than the simplicity of the codebase. We're already not hugely happy with the interface here, and using `RequestField` classes is not going to make us happier.\n\nAs mentioned above, we're more likely to suggest that anyone who needs to do something interesting with multipart encoding use a third-party package that interoperates with Requests. Right now such a package doesn't exist, but there are plans to work on one.\n",
"You can just write your own class with it's own `__iter__` that does exactly what you wants. Very simple :)\n"
] |
https://api.github.com/repos/psf/requests/issues/1812
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1812/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1812/comments
|
https://api.github.com/repos/psf/requests/issues/1812/events
|
https://github.com/psf/requests/pull/1812
| 24,496,949 |
MDExOlB1bGxSZXF1ZXN0MTA5Mzg3NjY=
| 1,812 |
Removed the bundled charade and urllib3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/151924?v=4",
"events_url": "https://api.github.com/users/sontek/events{/privacy}",
"followers_url": "https://api.github.com/users/sontek/followers",
"following_url": "https://api.github.com/users/sontek/following{/other_user}",
"gists_url": "https://api.github.com/users/sontek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sontek",
"id": 151924,
"login": "sontek",
"node_id": "MDQ6VXNlcjE1MTkyNA==",
"organizations_url": "https://api.github.com/users/sontek/orgs",
"received_events_url": "https://api.github.com/users/sontek/received_events",
"repos_url": "https://api.github.com/users/sontek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sontek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sontek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sontek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 20 |
2013-12-18T15:34:21Z
|
2014-09-27T19:52:56Z
|
2013-12-18T15:55:40Z
|
NONE
| null |
This removes the bundled versions of charade and urllib3, allowing
users to upgrade those packages separately. This also provides a
nice way for Linux distro packages to be able to package requests
and have it depend on their packaged version of those libraries.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1812/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1812/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1812.diff",
"html_url": "https://github.com/psf/requests/pull/1812",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1812.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1812"
}
| true |
[
":+1: \n",
"These projects are vendored for a reasons that I discuss publicly on a regular basis.\n\nAlso, my code decisions are not be influenced by the random decisions of Linux distros :)\n",
"Although, the work of said distros is much appreciated :)\n",
"@kennethreitz Any write-ups/blogs you can point to on the reasons?\n",
"Not that I know of — mostly in Q/A form.\n\nThe tl;dr is:\n- I treat requests as a product. I want to be in 100% control of all aspects of the user experience. \n- I believe these dependencies are implementation details. If there's something wrong with one of them, I can change it arbitrarily. I can test new patches before they are merged. I can remove them entirely.\n- I want my users to be free to not use packaging if they don't won't to. They can just grab the tarball and go.\n- Requests itself is therefore vendorable. This is particularly useful for system-level applications because Python does not allow for multiple versions of dependencies to be installed. If all the CLI tools you use written in Python used different versions of requests, and you had to use site-packages, that'd be a bad situation. \n",
"I could produce a patch that addressed the third of those points very effectively. The others only partially. The strategy would be that requests.packages would check for the existence of bundled versions of urllib3, charade, etc. If those were available, it would implement requests.packages.$foo via the bundled versions. If they weren't available, it would implement them via the system packages. Distros could then simply rm the bundled versions in their packages in order to use system packages.\n\nThe tarball downloaded from pypi (since it included the bundled versions of the libraries) would continue to use the bundled copies.\n\nWould that be an acceptable patch? Or should I not waste my time working on something like that?\n",
"I hope this doesn't come across as abrasive (it's not meant to be!) — but, Requests has been through all of these cycles already. We supported exactly what you're proposing for a long time. It was a waste of a time, and a big headache for me. The less branching and indeterminacy the codebase has, the better.\n\nBelieve it or not, we also removed the bundled versions altogether for a good year! So many wasted lines in requirements.txt files around the world :)\n\nThis is the only solution that I'm happy with. As a matter of fact, I'm extremely happy with the current design :) :cake:\n\n---\n\nAt the end of the day, Requests is valuable purely because of its opinions — and this is one of its strongest ones. It's the little things that matter the most — the most subjective things. This is the type of thing that I find we teach ourselves to trivialize — as it's the source of endless bikeshed. \n\n> Just simply make it a dependency and the world will go about it's way.\n\nThe architectural decisions of this project are a personal decision. I am simply choosing to make my own decision. \n\nOpen source is all about freedom — and if one of the founding forces of the open source movement, the linux distros, are encouraging developers to contort their works of art to acclimate to their packaging ideologies — I'm a little concerned for the state of things :)\n",
"No worries. The distros are going to unbundle the bundled code anyway -- I was just seeing if you'd take a patch to make that easier on them. Since you won't it saves me some coding time at the expense of the individual maintainers' times maintaining their patches to implement the unbundling. :-)\n",
"Thank you for requests.\n\nStandard questions about bundling dependencies:\n- How far behind upstream are the bundled versions?\n- How frequently are checks for updated versions made?\n - (<3 _pip-review_ from https://github.com/nvie/pip-tools)\n- What is the process for reviewing upstream changes?\n- What is the process for reporting/notifying re: updated versions of upstream packages?\n- If there are urgent patches upstream, how long does it take for those to propagate downstream?\n - OS packaging\n - Vendor-bundles\n - End-user upgrades\n\n...\n- Why don't all packages bundle their dependencies for convenience [despite community standards]?\n",
"Everything is at my discretion, as it is all an implementation detail. None of any of this should matter to any user. \n",
"Also, it's worth noting as I did on the superfluous issue created before this PR that this has been discussed several times before. No amount of discussion will sway us.\n",
"@westurner \nYou compiled a nice set of questions there :) Statements from https://github.com/kennethreitz/requests/pull/1812#issuecomment-30871660 neither answer nor invalidate any of these questions to be clear. However that's in theory. In practice, most of the time, bundling just works and is better than all existing alternatives. Sharing dependencies, while sound in theory, is very difficult to do right and that's why people chose alternatives which in practice just work better. I have a strong feeling that sharing dependencies, which brings the need to manage them, once a must due to scarce computer resources (I suppose), nowadays creates more problems than it solves.\nI also do not understand why packagers waste their time unbundling bundled packages. Do they also reverse monkey patching :) ? I mean if some piece of software chooses to bundle something then it does it for a reason. It's not like authors just don't know any better.\nReturning to your questions; while there are no policies set, from which one could arrive at clear answers you expected, general answer is that contributors make whatever they are able and willing to do to make complaining users (more) happy. You know, because in life, people need working software more than official policies.\n",
"How many outdated bundled/embedded copies of OpenSSL could there be? Are\nthere standard mechanisms for ensuring that security updates reach end\nusers?\n\nHere's this: https://news.ycombinator.com/item?id=7385150\n\nWes Turner\nOn Mar 19, 2014 2:38 PM, \"Piotr Dobrogost\" [email protected] wrote:\n\n> @westurner https://github.com/westurner\n> You compiled a nice set of questions there :) Statements from #1812\n> (comment)https://github.com/kennethreitz/requests/pull/1812#issuecomment-30871660neither answer nor invalidate any of these questions to be clear. However\n> that's in theory. In practice, most of the time, bundling just works and is\n> better than all existing alternatives. Sharing dependencies, while sound in\n> theory, is very difficult to do right and that's why people chose\n> alternatives which in practice just work better. I have a strong feeling\n> that sharing dependencies, which brings the need to manage them, once a\n> must due to scarce computer resources (I suppose), nowadays creates more\n> problems than it solves.\n> I also do not understand why packagers waste their time unbundling bundled\n> packages. Do they also reverse monkey patching :) ? I mean if some piece of\n> software chooses to bundle something then it does it for a reason. It's not\n> like authors just don't know any better.\n> Returning to your questions; while there are no policies set, from which\n> one could arrive at clear answers you expected, general answer is that\n> contributors make whatever they are able and willing to do to make\n> complaining users (more) happy. You know, because in life, people need\n> working software more than official policies.\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1812#issuecomment-38097168\n> .\n",
"From the Python programmer's perspective, embedded copies of OpenSSL are less dangerous than using the standard library's SSL module, which is a broken, insecure mess in any version of Python less than 3.3.\n",
"- http://www.cvedetails.com/vulnerability-list/vendor_id-10210/product_id-18230/Python-Python.html\n- http://www.cvedetails.com/vulnerability-list/vendor_id-217/Openssl.html\n",
"@westurner while I appreciate your concern, your efforts are for naught. Allow me to relay to you what I recently learned, security is not the foremost concern of requests (not @Lukasa's or my opinion). Its API is the first concern. Trying to support your point by making valid points about security will do nothing. Since little is likely to change, I'm unsubscribing from this issue and would encourage others who value their time to do the same.\n",
"I share abadgers' interest in making it easier to devendorize, even as I sympathize with upstream's strong desire to vendorize known-good versions of packages. It's a tough bit of tension between upstream's requirements and a distro's very real requirements. I just devendorized pip, and with good upstream discipline, grep, and a decent editor it wasn't _too_ hard. The pip developers are going to make it easier (as piotr-dobrogost's reference implies), but as long as we can identify and easily hack the import lines, I'm okay with it.\n",
"> In practice, most of the time, bundling just works and is better than all existing alternatives.\n\nHaving said that, I agree with @abadger and I see making it easy to use non-bundled dependencies as something the maintainers of every piece of software should do.\n",
"@kennethreitz totally agree with the decisions made here. If anyone has ever tried to ship Python software to be standalone you'll know why it's done like this. \n",
"@whalesalad exxxactly :)\n"
] |
https://api.github.com/repos/psf/requests/issues/1811
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1811/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1811/comments
|
https://api.github.com/repos/psf/requests/issues/1811/events
|
https://github.com/psf/requests/issues/1811
| 24,495,646 |
MDU6SXNzdWUyNDQ5NTY0Ng==
| 1,811 |
Stop bundling urllib3, just add it to install_requires
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/151924?v=4",
"events_url": "https://api.github.com/users/sontek/events{/privacy}",
"followers_url": "https://api.github.com/users/sontek/followers",
"following_url": "https://api.github.com/users/sontek/following{/other_user}",
"gists_url": "https://api.github.com/users/sontek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sontek",
"id": 151924,
"login": "sontek",
"node_id": "MDQ6VXNlcjE1MTkyNA==",
"organizations_url": "https://api.github.com/users/sontek/orgs",
"received_events_url": "https://api.github.com/users/sontek/received_events",
"repos_url": "https://api.github.com/users/sontek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sontek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sontek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sontek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-12-18T15:14:38Z
|
2021-09-08T23:00:55Z
|
2013-12-18T16:07:39Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1811/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1811/timeline
| null |
completed
| null | null | false |
[
"We have discussed this issue with users several times in the past. A search of the issues that are closed would show that. Our opinion has not changed, nor has our reasoning. By vendoring urllib3 we have a very specific version that we have tested against that may include unreleased bug fixes and does not put pressure on @shazow. It also allows us to checkout a specific version of the repository and just work on it with what existed at the time. It's reliable and it will be how we work for the foreseeable future. I'll leave this open until @Lukasa can make his way around to it, but I'm strongly in favor of closing this.\n",
"Fixed by https://github.com/kennethreitz/requests/pull/1812\n",
"@sigmavirus24 The big problem is that distros remove the bundled version anyways, making conflicts when you want to handle exceptions / use urllib3 directly, since requests is using a different version than everything else.\n",
"For example, if you look at the patch Fedora keeps around for requests:\n\nhttp://pkgs.fedoraproject.org/cgit/python-requests.git/commit/?id=2f898f274c560a0fb5ac48719a9529f68688fb7a\n\nI think doing a requires>=<ver1> <=<ver2> would also be acceptable practice, allowing you to pin a range of versions you know are working with requests. \n",
"Since #1812 has been closed, I feel it's fair to close this and end the discussion. We've discussed this publicly before and we have discussed it with many of the distro package maintainers who package requests for you and everyone else. Our stance has also always been to use PyPI and not the distro packages because they remove the vendored dependencies.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/1810
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1810/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1810/comments
|
https://api.github.com/repos/psf/requests/issues/1810/events
|
https://github.com/psf/requests/pull/1810
| 24,493,853 |
MDExOlB1bGxSZXF1ZXN0MTA5MzY5MTU=
| 1,810 |
Swap charade for chardet
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 3 |
2013-12-18T14:47:36Z
|
2021-09-08T23:08:31Z
|
2013-12-18T15:52:55Z
|
CONTRIBUTOR
|
resolved
|
Fixes #1800
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1810/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1810/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1810.diff",
"html_url": "https://github.com/psf/requests/pull/1810",
"merged_at": "2013-12-18T15:52:55Z",
"patch_url": "https://github.com/psf/requests/pull/1810.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1810"
}
| true |
[
"This shouldn't affect any of the tests @kennethreitz and should be merged before the next release.\n",
"What's the difference between the projects?\n\nAlso, NOTICES needs to be updated.\n",
"oh I see your explanation in the history now. :)\n"
] |
https://api.github.com/repos/psf/requests/issues/1809
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1809/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1809/comments
|
https://api.github.com/repos/psf/requests/issues/1809/events
|
https://github.com/psf/requests/issues/1809
| 24,475,554 |
MDU6SXNzdWUyNDQ3NTU1NA==
| 1,809 |
Redirection fails if redirect url contains semicolon
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/71184?v=4",
"events_url": "https://api.github.com/users/adriaant/events{/privacy}",
"followers_url": "https://api.github.com/users/adriaant/followers",
"following_url": "https://api.github.com/users/adriaant/following{/other_user}",
"gists_url": "https://api.github.com/users/adriaant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/adriaant",
"id": 71184,
"login": "adriaant",
"node_id": "MDQ6VXNlcjcxMTg0",
"organizations_url": "https://api.github.com/users/adriaant/orgs",
"received_events_url": "https://api.github.com/users/adriaant/received_events",
"repos_url": "https://api.github.com/users/adriaant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/adriaant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adriaant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/adriaant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-12-18T08:24:44Z
|
2021-09-09T00:28:26Z
|
2013-12-18T09:01:29Z
|
NONE
|
resolved
|
URLs can contain a semicolon as per http://www.ietf.org/rfc/rfc1738.txt
The website http://www.vd.nl/ redirects as follows (using curl):
```
curl -IL 'http://www.vd.nl/'
HTTP/1.1 302 Moved Temporarily
Server: Unknown
Vary: Accept-Encoding
Content-Type: text/plain
Date: Wed, 18 Dec 2013 08:21:31 GMT
Location: http://www.vd.nl/index.jsf?noCookieCheck=true
Transfer-Encoding: chunked
Connection: Keep-Alive
Set-Cookie: JSESSIONID=509416AD306056112AD4457880BC7FBD.mirvd110; Domain=www.vd.nl; Path=/;HttpOnly
Set-Cookie: X-Mapping-egldpcee=00BA6EB61567615AFB15FACB74D67EE1; path=/
HTTP/1.1 302 Moved Temporarily
Server: Unknown
Vary: Accept-Encoding
Content-Type: text/plain
Date: Wed, 18 Dec 2013 08:21:31 GMT
Location: http://www.vd.nl/index.jsf;jsessionid=7B0827ACDBDA874DD62277173C1709B3.mirvd112
Transfer-Encoding: chunked
Connection: Keep-Alive
Set-Cookie: JSESSIONID=7B0827ACDBDA874DD62277173C1709B3.mirvd112; Domain=www.vd.nl; Path=/;HttpOnly
Set-Cookie: X-Mapping-egldpcee=309EF29094DD6794869AC642FE42E744; path=/
HTTP/1.1 200 OK
Server: Unknown
Vary: Accept-Encoding
Content-Type: text/html;charset=UTF-8
Date: Wed, 18 Dec 2013 08:21:31 GMT
Transfer-Encoding: chunked
Connection: Keep-Alive
Set-Cookie: X-Mapping-egldpcee=75FACE26A300BA579BA2B1C5BE336FA3; path=/
X-Powered-By: JSF/1.2
```
Requests 2.1.0, however, strips the url right at the semicolon, making it go into a redirect loop:
```
1: redirect to http://www.vd.nl/index.jsf?noCookieCheck=true [302]
2: redirect to http://www.vd.nl/index.jsf [302]
3: redirect to http://www.vd.nl/ [301]
4: redirect to http://www.vd.nl/index.jsf?noCookieCheck=true [302]
5: redirect to http://www.vd.nl/index.jsf [302]
6: redirect to http://www.vd.nl/ [301]
7: redirect to http://www.vd.nl/index.jsf?noCookieCheck=true [302]
8: redirect to http://www.vd.nl/index.jsf [302]
etc.
```
using sample code:
```
import requests
url = 'http://www.vd.nl/'
headers = {
'Accept-Encoding': 'identity, deflate, compress, gzip',
'Accept': '*/*',
'Connection': 'close',
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_8_5) AppleWebKit/536.30.1 (KHTML, like Gecko) Version/6.0.5 Safari/536.30.1'
}
s = requests.session()
s.keep_alive = False
response = s.get(url, verify=False, timeout=30, headers=headers, stream=True, allow_redirects=False)
responsecode = response.status_code
if responsecode >= 300 and responsecode < 400 and response.headers.get('location'):
redirect_counter = 1
print " %d: redirect to %s [%d]" % (redirect_counter, response.headers.get('location'), responsecode)
for redirect in s.resolve_redirects(response, response.request):
redirect_counter += 1
responsecode = redirect.status_code
new_location = redirect.headers.get('location')
if responsecode >= 300 and responsecode < 400 and new_location:
print " %d: redirect to %s [%d]" % (redirect_counter, new_location, responsecode)
else:
break
response.close()
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1809/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1809/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue! However, I disagree with your assessment: I think this is actually the same issue as #1805, and will be fixed by #1808. Checking out the code in #1808 confirms that diagnosis.\n\nIn the future, there are a couple of ways you could have spotted this. Firstly, it's worth noting that the symptom you must have seen (a `TooManyRedirects` exception) is exactly the same as described in the title of #1805: it wouldn't have hurt to at least look at that issue.\n\nSecond, we don't change the headers on the response at all. Given that your `new_location` is the value of the Location header, not the URL we actually looked up, you can assume that the server genuinely told us to go there. The question is why: the answer is that we weren't persisting cookies properly.\n\nI'm closing this issue in favour of centralising the discussion on #1808. Thanks again for raising this issue!\n",
"Thanks for the quick reply and the explanation!\n"
] |
https://api.github.com/repos/psf/requests/issues/1808
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1808/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1808/comments
|
https://api.github.com/repos/psf/requests/issues/1808/events
|
https://github.com/psf/requests/pull/1808
| 24,469,924 |
MDExOlB1bGxSZXF1ZXN0MTA5MjMzMDY=
| 1,808 |
Fixes #1805
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 2 |
2013-12-18T04:55:25Z
|
2021-09-08T23:07:31Z
|
2013-12-18T15:56:30Z
|
CONTRIBUTOR
|
resolved
|
Sure cookies are persisted to the session, but those new cookies are not added
to the next prepared request. We need to update that new request's CookieJar
with the new cookies.
/cc @teleyinex
This allows me to fetch the URL you specified quite quickly.
TODO:
- [ ] Find a way to reliably test this. Perhaps introduce [Betamax](https://github.com/sigmavirus24/betamax) to make this test reproducible.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1808/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1808/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1808.diff",
"html_url": "https://github.com/psf/requests/pull/1808",
"merged_at": "2013-12-18T15:56:30Z",
"patch_url": "https://github.com/psf/requests/pull/1808.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1808"
}
| true |
[
"Looks reasonable to me. Testing is going to be a pain though.\n",
"Looks good too! Thanks for fixing it so quickly @sigmavirus24 \n"
] |
https://api.github.com/repos/psf/requests/issues/1807
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1807/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1807/comments
|
https://api.github.com/repos/psf/requests/issues/1807/events
|
https://github.com/psf/requests/pull/1807
| 24,466,796 |
MDExOlB1bGxSZXF1ZXN0MTA5MjE4MDU=
| 1,807 |
cookiejar.iteritems() is broken after 2.1.0 update
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/314085?v=4",
"events_url": "https://api.github.com/users/code-of-kpp/events{/privacy}",
"followers_url": "https://api.github.com/users/code-of-kpp/followers",
"following_url": "https://api.github.com/users/code-of-kpp/following{/other_user}",
"gists_url": "https://api.github.com/users/code-of-kpp/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/code-of-kpp",
"id": 314085,
"login": "code-of-kpp",
"node_id": "MDQ6VXNlcjMxNDA4NQ==",
"organizations_url": "https://api.github.com/users/code-of-kpp/orgs",
"received_events_url": "https://api.github.com/users/code-of-kpp/received_events",
"repos_url": "https://api.github.com/users/code-of-kpp/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/code-of-kpp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/code-of-kpp/subscriptions",
"type": "User",
"url": "https://api.github.com/users/code-of-kpp",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-12-18T03:03:21Z
|
2021-09-08T23:06:15Z
|
2013-12-19T20:38:52Z
|
NONE
|
resolved
|
```
Python 2.7.5+ (default, Sep 19 2013, 13:48:49)
[GCC 4.8.1] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> requests.__version__
'2.1.0'
>>> r = requests.get('http://google.com')
>>> r.cookies
<<class 'requests.cookies.RequestsCookieJar'>[Cookie(version=0, name='NID', value='67=WbyIM1gU91bGQ5Y3SjPHQ8_fqtStHiMWGJQ5n2Bt_PwytcmPii7fV7AELJ95cqfSmhAHSJCtFm0bBDZJJoTzQHGGQvk6PT8h78PrwWi96r0K7R9N0KsNCZkkrgJW94dV', port=None, port_specified=False, domain='.google.ru', domain_specified=True, domain_initial_dot=True, path='/', path_specified=True, secure=False, expires=1403146705, discard=False, comment=None, comment_url=None, rest={'HttpOnly': None}, rfc2109=False), Cookie(version=0, name='PREF', value='ID=7c3265d2706e8bea:FF=0:NW=1:TM=1387335505:LM=1387335505:S=LsdqbwGuUdHENT0m', port=None, port_specified=False, domain='.google.ru', domain_specified=True, domain_initial_dot=True, path='/', path_specified=True, secure=False, expires=1450407505, discard=False, comment=None, comment_url=None, rest={}, rfc2109=False)]>
>>> dict(r.cookies)
{'PREF': 'ID=7c3265d2706e8bea:FF=0:NW=1:TM=1387335505:LM=1387335505:S=LsdqbwGuUdHENT0m', 'NID': '67=WbyIM1gU91bGQ5Y3SjPHQ8_fqtStHiMWGJQ5n2Bt_PwytcmPii7fV7AELJ95cqfSmhAHSJCtFm0bBDZJJoTzQHGGQvk6PT8h78PrwWi96r0K7R9N0KsNCZkkrgJW94dV'}
>>> r.cookies.items()
[('NID', '67=WbyIM1gU91bGQ5Y3SjPHQ8_fqtStHiMWGJQ5n2Bt_PwytcmPii7fV7AELJ95cqfSmhAHSJCtFm0bBDZJJoTzQHGGQvk6PT8h78PrwWi96r0K7R9N0KsNCZkkrgJW94dV'), ('PREF', 'ID=7c3265d2706e8bea:FF=0:NW=1:TM=1387335505:LM=1387335505:S=LsdqbwGuUdHENT0m')]
>>> r.cookies.iteritems()
<generator object iteritems at 0x1940dc0>
>>> list(r.cookies.iteritems())
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/kpp/build/trng_social/venv/lib/python2.7/_abcoll.py", line 387, in iteritems
yield (key, self[key])
File "/home/kpp/build/trng_social/venv/local/lib/python2.7/site-packages/requests/cookies.py", line 267, in __getitem__
return self._find_no_duplicates(name)
File "/home/kpp/build/trng_social/venv/local/lib/python2.7/site-packages/requests/cookies.py", line 322, in _find_no_duplicates
raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))
KeyError: "name=Cookie(version=0, name='NID', value='67=WbyIM1gU91bGQ5Y3SjPHQ8_fqtStHiMWGJQ5n2Bt_PwytcmPii7fV7AELJ95cqfSmhAHSJCtFm0bBDZJJoTzQHGGQvk6PT8h78PrwWi96r0K7R9N0KsNCZkkrgJW94dV', port=None, port_specified=False, domain='.google.ru', domain_specified=True, domain_initial_dot=True, path='/', path_specified=True, secure=False, expires=1403146705, discard=False, comment=None, comment_url=None, rest={'HttpOnly': None}, rfc2109=False), domain=None, path=None"
```
Because of this one can't for example build `OrderedDict` from `RequestCookieJar`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1807/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1807/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1807.diff",
"html_url": "https://github.com/psf/requests/pull/1807",
"merged_at": "2013-12-19T20:38:52Z",
"patch_url": "https://github.com/psf/requests/pull/1807.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1807"
}
| true |
[
"If you can reduce the number of changes to what is absolutely necessary I will be :+1: but as it stands you've added a lot of complexity and python version checking which is unnecessary. Much of the functionality you wrote is provided by `collections.MutableMapping`, the issue is to make that work, not duplicate code (otherwise, we wouldn't inherit from that class in the first place).\n",
"The problem is that iteritems() with the default collections.Mapping implementation will call `__getitem__()`.\n",
"I see now why your PR is so large. Your description was awfully miniscule and does not properly describe the problem you're actually solving here. Regardless, I remain :-1: until you fix what are some glaring discrepancies in how you seem to believe some methods behave in Python 3 and how they actually behave. Furthermore, we were already providing a consistent API across both Python 3 and Python 2, so your changes are quite extravagant and overreaching and technically speaking create a backwards-incompatible API break.\n\nPeople will call `items`, `keys`, or `values` expecting lists regardless of Python version and now receive conditional values for which some methods will not work. You cannot call `sort`, for example, on what you're returning from those methods; you'll receive an exception.\n",
"I strongly believe that people using python3 should not expect that `keys()`, `values()` and `items()` of a dict-like object will return a list.\nOn the other hand you're right about views. People may want to store keys in a variable and use this variable more than once.\nThe way to fix the bug and keep the old API is just `list(self.iter...())` in `keys()`, `items()` and `values()`.\nThe simplest way to fix the bug and implement a python3-compatible API is `frozenset(self.iter...())` in `keys()`, `items()` and `values()`.\nAnd finally the most correct way is to use `collections.MappingView` and subclasses. But in this case we should discuss where to put the conditional import.\nWhat do you think?\n",
"I must not have been clear. The existing users of requests expect that behaviour regardless of how other Mapping objects behave. That's the behaviour we intend to preserve because backwards API consistency is sacred and we value the implicit contract that we have with our users.\n",
"But returning `MappingView` subclasses will also break backward-compatibility - you can't call `sort()` on `KeysView`.\n",
"> But returning MappingView subclasses will also break backward-compatibility - you can't call sort() on KeysView.\n\nExactly, and what you have now is perfectly correct. Once you fix up that test I'm :+1: on this PR. Thanks for pulling through on this. :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1806
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1806/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1806/comments
|
https://api.github.com/repos/psf/requests/issues/1806/events
|
https://github.com/psf/requests/issues/1806
| 24,442,724 |
MDU6SXNzdWUyNDQ0MjcyNA==
| 1,806 |
Need a way to disable SSL verification for all requests calls
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/102495?v=4",
"events_url": "https://api.github.com/users/ssbarnea/events{/privacy}",
"followers_url": "https://api.github.com/users/ssbarnea/followers",
"following_url": "https://api.github.com/users/ssbarnea/following{/other_user}",
"gists_url": "https://api.github.com/users/ssbarnea/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssbarnea",
"id": 102495,
"login": "ssbarnea",
"node_id": "MDQ6VXNlcjEwMjQ5NQ==",
"organizations_url": "https://api.github.com/users/ssbarnea/orgs",
"received_events_url": "https://api.github.com/users/ssbarnea/received_events",
"repos_url": "https://api.github.com/users/ssbarnea/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssbarnea/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssbarnea/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssbarnea",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-17T19:06:18Z
|
2021-09-09T00:28:27Z
|
2013-12-18T19:14:40Z
|
CONTRIBUTOR
|
resolved
|
If you are using a 3rd party library or just have lots of requests calls inside your code it's almost impossible to use `verify=False`.
We need a way to disable this for all requests calls, probably an environment variable.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1806/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1806/timeline
| null |
completed
| null | null | false |
[
"We used to have a way to globally configure the library like this in v0.x. We intentionally and viciously removed it in v1.x. We've not added it back and we will not add anything similar back, especially not to disable SSL verification and especially not in an environment variable.\n",
"I'm waiting for @Lukasa to weigh in before closing this.\n",
"Agreed. At this time we have no plans to make it possible to disable SSL verification on a global scope. SSL verification is and should be on by default.\n"
] |
https://api.github.com/repos/psf/requests/issues/1805
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1805/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1805/comments
|
https://api.github.com/repos/psf/requests/issues/1805/events
|
https://github.com/psf/requests/issues/1805
| 24,422,782 |
MDU6SXNzdWUyNDQyMjc4Mg==
| 1,805 |
Getting a public Google Spreadsheet doc gives you the too many redirects error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/131838?v=4",
"events_url": "https://api.github.com/users/teleyinex/events{/privacy}",
"followers_url": "https://api.github.com/users/teleyinex/followers",
"following_url": "https://api.github.com/users/teleyinex/following{/other_user}",
"gists_url": "https://api.github.com/users/teleyinex/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/teleyinex",
"id": 131838,
"login": "teleyinex",
"node_id": "MDQ6VXNlcjEzMTgzOA==",
"organizations_url": "https://api.github.com/users/teleyinex/orgs",
"received_events_url": "https://api.github.com/users/teleyinex/received_events",
"repos_url": "https://api.github.com/users/teleyinex/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/teleyinex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/teleyinex/subscriptions",
"type": "User",
"url": "https://api.github.com/users/teleyinex",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2013-12-17T14:12:55Z
|
2021-09-09T00:10:04Z
|
2013-12-18T15:56:31Z
|
NONE
|
resolved
|
In version 2.0.0 when you try to get a document like this:
``` python
import requests
r = requests.get('https://docs.google.com/spreadsheet/ccc?key=0AsNlt0WgPAHwdEczcWduOXRUb1JUc1VGMmJtc2xXaXc&usp=sharing&output=csv')
print r.text
```
You get the CSV content; with 2.1.0, however, you get the TooManyRedirects error (it tries 30 times), so it is not possible to download the document. Any idea where it could be broken?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1805/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1805/timeline
| null |
completed
| null | null | false |
[
"Ooh, this is weird: I don't even see that request get off my system.\n",
"Which is clearly a problem with my wireshark, since it definitely does. We seem to be spinning in a redirect loop, which it's hard to imagine is the fault of a Requests version.\n",
"But it is, because all is well in 2.0.1.\n",
"Then, why it is working with 2.0.0 on my machine after creating a new virtualenv just for testing this? It works also with version 1.2, so I don't know what could be wrong. I tried this code from [stackoverflow](http://stackoverflow.com/a/16948852/1960596):\n\n``` python\n>>> session = requests.session()\n>>> r = session.get(url, allow_redirects=False)\n>>> r.headers.get('location')\n>>> for redirect in session.resolve_redirects(r, r.request):\n... print redirect.headers.get('location')\n```\n\nAnd you can see, how requests tries all the time to reload almost the same page. Could be something related that Google sets in the first call a cookie, and that cookie is not sent in the redirected request?\n",
"Urgh, this is a cookies problem. Observe:\n\n``` python\n>>> s = requests.Session()\n>>> r = s.get('https://docs.google.com/spreadsheet/ccc?key=0AsNlt0WgPAHwdEczcWduOXRUb1JUc1VGMmJtc2xXaXc&usp=sharing&output=csv')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"requests\\sessions.py\", line 394, in get\n return self.request('GET', url, **kwargs)\n File \"requests\\sessions.py\", line 382, in request\n resp = self.send(prep, **send_kwargs)\n File \"requests\\sessions.py\", line 505, in send\n history = [resp for resp in gen] if allow_redirects else []\n File \"requests\\sessions.py\", line 99, in resolve_redirects\n raise TooManyRedirects('Exceeded %s redirects.' % self.max_redirects)\nrequests.exceptions.TooManyRedirects: Exceeded 30 redirects.\n>>> r = s.get('https://docs.google.com/spreadsheet/ccc?key=0AsNlt0WgPAHwdEczcOXRUb1JUc1VGMmJtc2xXaXc&usp=sharing&output=csv')\n>>> r.status_code\n200\n```\n\n@gazpachoking, @sigmavirus24, I choose you!\n",
"Wow, that was fast ;-) Thanks for finding the bug!\n",
"It would seem like cookies extracted during redirects aren't being stored on the session (I haven't dug into the code yet).\n",
"Possible solution in #1808.\n"
] |
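The redirect-loop guard that appears in the traceback above ("Exceeded 30 redirects.") can be sketched in isolation. This is an illustrative re-creation, not requests' actual implementation: `fetch`, `resolve_redirects`, and the `(status, location)` return shape are all assumptions made for the demo.

```python
class TooManyRedirects(Exception):
    pass

def resolve_redirects(fetch, url, max_redirects=30):
    """Follow Location headers returned by fetch(url) -- a stand-in for a
    real HTTP call that returns (status_code, location) -- until a
    non-redirect response arrives. Raises TooManyRedirects after
    max_redirects hops, mirroring the guard in the traceback above."""
    history = []
    while True:
        status, location = fetch(url)
        if not (300 <= status < 400 and location):
            return url, history
        history.append(url)
        if len(history) >= max_redirects:
            raise TooManyRedirects('Exceeded %s redirects.' % max_redirects)
        url = location

# A chain that terminates resolves normally:
hops = {'/a': (302, '/b'), '/b': (302, '/c'), '/c': (200, None)}
final, history = resolve_redirects(lambda u: hops[u], '/a')
assert final == '/c' and history == ['/a', '/b']
```

In the bug above, cookies set during the redirect chain were not stored on the session, so the server kept issuing the same redirect; this guard is what turned that silent infinite loop into an exception.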
https://api.github.com/repos/psf/requests/issues/1804
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1804/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1804/comments
|
https://api.github.com/repos/psf/requests/issues/1804/events
|
https://github.com/psf/requests/issues/1804
| 24,342,176 |
MDU6SXNzdWUyNDM0MjE3Ng==
| 1,804 |
requests seems to expect more content until it is forcibly closed by the remote host [Errno 10054]
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1075126?v=4",
"events_url": "https://api.github.com/users/ChristopherHackett/events{/privacy}",
"followers_url": "https://api.github.com/users/ChristopherHackett/followers",
"following_url": "https://api.github.com/users/ChristopherHackett/following{/other_user}",
"gists_url": "https://api.github.com/users/ChristopherHackett/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ChristopherHackett",
"id": 1075126,
"login": "ChristopherHackett",
"node_id": "MDQ6VXNlcjEwNzUxMjY=",
"organizations_url": "https://api.github.com/users/ChristopherHackett/orgs",
"received_events_url": "https://api.github.com/users/ChristopherHackett/received_events",
"repos_url": "https://api.github.com/users/ChristopherHackett/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ChristopherHackett/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ChristopherHackett/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ChristopherHackett",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 13 |
2013-12-16T10:55:58Z
|
2021-09-08T20:00:51Z
|
2015-01-19T09:24:12Z
|
NONE
|
resolved
|
I have noticed that requests seems to have some sort of issue with www.sainsburysbank.co.uk
It will block for 2+ mins despite having already received the content. Other tools like curl have no issue.
Below is the console and I have taken a WireShark dump (screenshot attached). If you need the pcapng file then let me know.
```
Python 2.7.3 (default, Apr 10 2012, 23:24:47) [MSC v.1500 64 bit (AMD64)] on win 32
>>>import requests
>>> requests.get('http://www.sainsburysbank.co.uk')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 279, in request
resp = self.send(prep, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 374, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 222, in send
r.content
File "C:\Python27\lib\site-packages\requests\models.py", line 550, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "C:\Python27\lib\site-packages\requests\models.py", line 496, in generate
chunk = self.raw.read(chunk_size)
File "C:\Python27\lib\site-packages\requests\packages\urllib3\response.py", line 148, in read
return self._fp.read(amt)
File "C:\Python27\Lib\httplib.py", line 561, in read
s = self.fp.read(amt)
File "C:\Python27\Lib\socket.py", line 380, in read
data = self._sock.recv(left)
socket.error: [Errno 10054] An existing connection was forcibly closed by the remote host
```

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1804/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1804/timeline
| null |
completed
| null | null | false |
[
"Gah, well. There's all kinds of fun here. First, Sainsbury's Bank is using chunked encoding to return the response to us. Not the end of the world, but a potential source of fun. According to Wireshark they have terminated it properly though, so that's something. Also, their headers are ill-formatted:\n\n```\nCache-Control: private\nDate: Mon, 16 Dec 2013 11:03:51 GMT\nContent-Type: text/html\n: \nSet-Cookie: <..snip..>\nSet-Cookie: <..snip..>\nContent-Encoding: gzip\nVary: Accept-Encoding\nTranfer-Encoding: chunked\n```\n\nThat's not a typo: they have an empty header key and value. Which is fun. I'm amazed the server allows them to send that.\n\nAnyway, the problem appears to be that `httplib` doesn't spot that chunked encoding is being used, presumably because the header parse falls over on that empty header key-value pair. I'm digging into httplib's parsing code now.\n",
"Yup, the parsing breaks. `httplib` calls `HTTPMessage.isheader`, which resolves to `rfc822.Message.isheader()`, which has the following code:\n\n``` python\n\"\"\"Determine whether a given line is a legal header.\n\nThis method should return the header name, suitably canonicalized.\nYou may override this method in order to use Message parsing on tagged\ndata in RFC 2822-like formats with special header formats.\n\"\"\"\ni = line.find(':')\nif i > 0:\n return line[:i].lower()\nreturn None\n```\n\nBecause the colon is the first character, `line.find(':')` returns `0`, which means this function returns `None`, which means `httplib` concludes this isn't a header line. This leads to the header being thrown away _and the header block being assumed to be over_.\n",
"Sainsbury's Bank is at fault here: that header line is not legal. RFC 2616 defines a header line like so:\n\n```\nmessage-header = field-name \":\" [ field-value ]\nfield-name = token\ntoken = 1*<any CHAR except CTLs or separators>\n```\n\nIt's clear that a field-name **must not** be zero length, which is exactly what we've been sent.\n",
"The same behaviour exists in 3.3 as well, and doesn't appear to have been reported on the stdlib bug tracker. I'll do that now.\n",
"Issue now being tracked [here](http://bugs.python.org/issue19996). We'll leave this open until we work out what we're doing in the core library.\n",
"Thanks @Lukasa - quick and detailed follow up. I'll keep an eye on the httplib bug. \n",
"@Lukasa how can I get some beer to you? Seriously, that was incredibly fast and awesome.\n",
"Thanks guys. This was actually quite a fun bug to find. =D\n\n@ChristopherHackett I forgot to mention, if you want to be able to work around this, it seems to be affecting only the `gzip` delivery of the page. It'll work fine if you unset the `Accept-Encoding` header we send, like this:\n\n``` python\nr = requests.get('http://www.sainsburysbank.co.uk/', headers={'Accept-Encoding': None})\n```\n",
"There is a patch against upstream for this, but there seems to be no movement. Regardless, it's CPython's problem now.\n",
"Amazingly, this patch did actually get merged!\n",
"WOAH\n",
"ah neat! i should update hamms to send back that bad header haha\n",
":cake:\n"
] |
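The parsing failure diagnosed in the comments above can be reproduced without httplib. Below is a simplified re-creation of the `rfc822.Message.isheader()` logic quoted in the thread, for illustration only, showing why an empty header field-name made the parser conclude the header block was over:

```python
def isheader(line):
    """Mimic the quoted rfc822.Message.isheader() logic: return the
    canonicalized header name, or None if the line does not look like a
    legal header line."""
    i = line.find(':')
    if i > 0:                  # an empty field-name makes i == 0 ...
        return line[:i].lower()
    return None                # ... so the line is rejected here

# A normal header parses fine:
assert isheader('Content-Type: text/html') == 'content-type'
# The malformed ": " line the server sent is treated as a non-header,
# which is why httplib assumed the header block had ended early:
assert isheader(': ') is None
```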
https://api.github.com/repos/psf/requests/issues/1803
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1803/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1803/comments
|
https://api.github.com/repos/psf/requests/issues/1803/events
|
https://github.com/psf/requests/issues/1803
| 24,341,307 |
MDU6SXNzdWUyNDM0MTMwNw==
| 1,803 |
Timeouts do not occur when stream == True.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/997507?v=4",
"events_url": "https://api.github.com/users/tylercrompton/events{/privacy}",
"followers_url": "https://api.github.com/users/tylercrompton/followers",
"following_url": "https://api.github.com/users/tylercrompton/following{/other_user}",
"gists_url": "https://api.github.com/users/tylercrompton/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tylercrompton",
"id": 997507,
"login": "tylercrompton",
"node_id": "MDQ6VXNlcjk5NzUwNw==",
"organizations_url": "https://api.github.com/users/tylercrompton/orgs",
"received_events_url": "https://api.github.com/users/tylercrompton/received_events",
"repos_url": "https://api.github.com/users/tylercrompton/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tylercrompton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tylercrompton/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tylercrompton",
"user_view_type": "public"
}
|
[
{
"color": "5319e7",
"default": false,
"description": null,
"id": 67760318,
"name": "Fixed",
"node_id": "MDU6TGFiZWw2Nzc2MDMxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Fixed"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] |
{
"closed_at": "2014-12-25T16:11:51Z",
"closed_issues": 1,
"created_at": "2013-12-19T08:24:14Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/22",
"id": 517411,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/22/labels",
"node_id": "MDk6TWlsZXN0b25lNTE3NDEx",
"number": 22,
"open_issues": 0,
"state": "closed",
"title": "2.2",
"updated_at": "2014-12-25T16:11:51Z",
"url": "https://api.github.com/repos/psf/requests/milestones/22"
}
| 22 |
2013-12-16T10:38:27Z
|
2021-09-08T07:00:27Z
|
2014-10-05T17:27:28Z
|
NONE
|
resolved
|
As stated in the [docs](http://requests.readthedocs.org/en/latest/user/quickstart/#timeouts):
> `timeout` is not a time limit on the entire response download; rather, an exception is raised if the server has not issued a response for `timeout` seconds (more precisely, if no bytes have been received on the underlying socket for `timeout` seconds).
With that said, it should still be possible to time out when `stream == True`.
The following command times out as expected.
```
$ python3 -c "import requests; response = requests.request('GET', 'http://httpbin.org/delay/3', timeout=1); print(response)"
```
The following command does not timeout. A timeout is expected.
```
$ python3 -c "import requests; response = requests.request('GET', 'http://httpbin.org/delay/3', stream=True, timeout=1); print(response)"
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1803/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1803/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue! Currently in stream mode we only apply the timeout to the connection attempt, not to the read. This will probably change in future (for instance, #1801 removes this limitation), but there's no strict timescale on this.\n",
"Once again @Lukasa is 100% correct. This isn't an issue or a bug in requests, just in your expectations of how it behaves.\n",
"I had the same issue today, and I try to work around it, but I can't find a clean way.\n\nIf the request is `stream=True` and the server just hangs, it will hang forever.\n\nI also tried with Pyhton 3 futures, it still hangs forever.\n\n``` python\nfrom concurrent.futures import ThreadPoolExecutor\nimport requests\n\n\ndef make_request():\n response = requests.get('http://example.com', stream=True, timeout=5)\n response_text = ''\n chunks = response.iter_content(chunk_size=2048, decode_unicode=True)\n for chunk in chunks:\n response_text += chunk\n return response_text\n\nwith ThreadPoolExecutor(max_workers=1) as executor:\n future = executor.submit(make_request)\n response_text = future.result(timeout=5)\n print(response_text)\n\n```\n\nThe timeout should be applied to every wait of data to be received on the socket, because the server hang can happen at anytime, before sending the headers, during, or after (when it is sending the body).\n\nThe only ways I see that could work are to start another process and kill after a certain time (which comes with IPC just to make an HTTP request), or to use signals, which I find not clean either.\n",
"@antoineleclair You could use multithreading, or the gevent library.\n",
"So, I chatted briefly via email with @sigmavirus24 about this. Actually changing the library so that timeouts apply to streaming data is a trivial change with a negative diff. The problem is that it's a backwards incompatible change.\n\nSo I'm in favour of fixing this, but not right now. Maybe as part of 2.2. I'm going to schedule this issue for that release unless the BDFL or @sigmavirus24 complains.\n",
"Makes sense. Thanks.\n",
"I have no complaint. I just want to be sure that you meant 2.2 and not 3.0.\n",
"I did mean 2.2, not 3.0. I'm open to arguments that it should be pushed back to 3.0, especially if they're well-reasoned and/or provided alongside a beverage/food based bribe.\n",
"Hm, this might explain some mysterious hangs-forever cases in an application I have. Is there something I could do at the application level to cause code like this ...\n\n```\ndef load_enough_content(resp):\n \"\"\"Load no more than 16KB of a response, in 1K chunks.\n Allow this process to take no more than 5 seconds in total.\n These numbers are arbitrarily chosen to defend against\n teergrubes (intentional or not) while still allowing us a\n useful amount of data for anomaly post-mortem.\"\"\"\n body = b\"\"\n start = time.time()\n for chunk in resp.iter_content(chunk_size=1024):\n body += chunk\n if len(body) > 16*1024 or time.time() - start > 5:\n resp.close()\n break\n return body\n```\n\n... to fulfill its contract of returning in no more than 5 seconds _no matter what_ the server does?\n",
"This issue was supposed to be fixed before 2.2. I'm not sure what's the state of it (requests==2.2.1 as of now).\n\nIf it is still not fixed, you could use another process and kill it if it times out.\n",
"#1935 should fix this. =)\n",
"Just wondering if there's any general idea on when this might be released. I just hit this bug in my code and it's happening in a daemon that's attempting to download a large file, so the daemon just sits there forever.\n",
"For anyone else who might be hung up on this, a friend mentioned to me that I might be able to use signals to workaround this problem. Sure enough you can, and it's pretty simple. See the example here: http://docs.python.org/2/library/signal.html#example\n",
"@andynemzek We're not on any release schedule, so it'll be released when we next release requests. =)\n",
"This was fixed and never closed. Woops! :cake: \n",
"OK posting on an old issue but it still applies to master branch. The test does not set stream=True, so I don't think it is actually testing what it should test. Correct?",
"@julienaubert Sorry, can you provide more explanation about what is going on here? The read timeout should still apply.",
"@Lukasa afaic, the test that was meant to test that timeout works when stream=True, is not correct, as it is not setting stream=True and the default value is stream=False. See: https://github.com/kennethreitz/requests/blob/master/tests/test_requests.py#L1943",
"@julienaubert The reason there isn't an explicit `stream` argument is due to the patch that fixed this (and added this test). #1935 removed the use of a different timeout object in `send` depending on the value of `stream`. That means Requests now *always* hands the read timeout so the `stream` flag isn't relevant in this regard.\r\n\r\nAre you experiencing issues with a current version of Requests?",
"My code is as following\r\n```\r\nr = requests.get(url, stream=True, timeout=10)\r\nwith open(filename, 'wb') as f:\r\n for chunk in r.iter_content(chunk_size=1024):\r\n if chunk:\r\n f.write(chunk)\r\n```\r\n\r\nI hope that the code will raise exception if it did not download file over before the timeout. While it did not work. \r\nDid I do anything wrong? The version of Requests is 2.18.4.",
"Timeouts are *per socket operation*, not total-time. If each read occurs faster than 10 seconds, then this will not trigger at timeout. If you want that behaviour you need to add it yourself by checking the clock between each read. =)",
"@Lukasa \r\nOk, I get it. Thanks~"
] |
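Since `timeout` bounds each socket operation rather than total elapsed time, the caller-side cap described in the comments above can be factored into a small helper. A minimal sketch: `read_with_deadline` and its defaults are assumptions, and `chunks` stands in for a real `response.iter_content(...)` iterator.

```python
import time

def read_with_deadline(chunks, max_bytes=16 * 1024, max_seconds=5.0):
    """Accumulate chunks from an iterable (e.g. response.iter_content())
    but stop once either a byte budget or a wall-clock deadline is hit.
    requests' timeout= only bounds individual socket reads, so a cap on
    total download time must be enforced here, by the caller."""
    body = b""
    deadline = time.monotonic() + max_seconds
    for chunk in chunks:
        if chunk:
            body += chunk
        if len(body) >= max_bytes or time.monotonic() >= deadline:
            break
    return body

# With 1 KB chunks and a 2 KB budget, reading stops after two chunks:
assert len(read_with_deadline([b'x' * 1024] * 100, max_bytes=2048)) == 2048
```

This is the same pattern as the `load_enough_content` snippet quoted earlier in the thread, using `time.monotonic()` so the deadline is immune to system clock changes.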
https://api.github.com/repos/psf/requests/issues/1802
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1802/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1802/comments
|
https://api.github.com/repos/psf/requests/issues/1802/events
|
https://github.com/psf/requests/pull/1802
| 24,331,108 |
MDExOlB1bGxSZXF1ZXN0MTA4NDQzOTI=
| 1,802 |
Add as_http() method to print out HTTP response as a string
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-12-16T05:41:53Z
|
2021-09-08T10:01:10Z
|
2013-12-16T14:08:23Z
|
CONTRIBUTOR
|
resolved
|
When an HTTP response doesn't contain the content that I expect, most times I find that the simplest debugging information I want is the actual HTTP content that was sent over the wire. Instead of having to construct all of this manually I thought it would be nice to add this information as a convenience `as_http()` method.
Example:
``` python
>>> r = requests.get('https://api.twilio.com/2010-04-01.json')
>>> print r.as_http()
HTTP/1.1 200 OK
content-length: 109
x-powered-by: AT-5000
connection: close
etag:
date: Mon, 16 Dec 2013 05:39:55 GMT
x-shenanigans: none
access-control-allow-origin: https://www.twilio.com
content-type: application/json
{"name":"2010-04-01","uri":"\/2010-04-01.json","subresource_uris":{"accounts":"\/2010-04-01\/Accounts.json"}}
```
This PR also passes through version information as an attribute on the `Response` object, and adds tests for this method.
I asked about this in IRC last week, didn't get much feedback, figured it wouldn't be too hard to just write it up and submit a PR.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1802/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1802/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1802.diff",
"html_url": "https://github.com/psf/requests/pull/1802",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1802.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1802"
}
| true |
[
"Gah, tests are broken due to a Python3 encoding issue. Let me know if this is a desirable feature for the library and I'll put the work in to fix the tests.\n",
"Thanks for the pull request @kevinburke!\n\nUnfortunately, I'm -1 on this feature. My problems with this feature are as follows:\n1. It's misleading. You say in your original description that \"the simplest debugging information I want is the actual HTTP content that was sent over the wire\". This is almost certainly true, and is exactly what any user of this feature would expect it to give. Unfortunately, that's not what this feature provides. Instead, it provides Requests' view of the response, which is not the same thing. Mostly this is stuff that's probably trivial: all the header keys are lower case, their order isn't preserved, etc. However, the fact of the matter is that we're not showing the actual response, we're reconstructing something that matches our view of the response. This means that there's nothing in the textual output you've provided that isn't in the `Response` object already. Any unusual stuff in the HTTP response won't be shown.\n \n This gets even worse if you consider extending this to serialize HTTP requests (which is almost certainly the next step here, being infinitely more useful in debugging weird failures), since Requests cannot possibly reconstruct the HTTP request: we just don't have all the information.\n2. It extends the API. We're fighting very hard to avoid extending the API if at all possible: as a result, any extension to the API automatically starts in negative territory, and must provide a very strong justification. =)\n\nSorry about this: thanks for doing the work, but I don't think we'll merge it. My advice to people who need this debugging functionality is either to get familiar with Wireshark or to consider using a service like [Runscope](https://www.runscope.com/).\n",
"I agree with 100% of what @Lukasa said. In fact, even curl would give you far more information to debug with. _Furthermore_, there are headers that we compute that we don't even set on the `PreparedRequest` object so adding this method to that would be useless too quite frankly. Unfortunately your best bet is something like `mitmproxy` or `wireshark` (locally) or a service like `Runscope` or `httpbin`.\n\nWith that said, the both of us are in agreement and I'm certain @kennethreitz will agree with both of us, so I'm going to close this.\n",
"Correct on both accounts :)\n\nVery nice implementation, though!\n",
"Fair enough. FWIW, here's the code we use to implement this at Twilio:\n\nhttps://gist.github.com/kevinburke/7990326\n\nIt also lets you translate from a request to curl, because this is pretty much the natural next step for us any time something breaks.\n",
"Requests used to have a \"as_curl\" method before 1.x but it was intentionally removed and the feature has been rejected a couple of times since though. \n\nThanks for documenting that in a gist though!\n"
] |
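The maintainers' objection above is that such a method can only render Requests' *view* of the response, not the actual bytes on the wire. A minimal sketch of what that reconstruction looks like (pure function, hypothetical name `format_response`; not the rejected PR's implementation):

```python
def format_response(version, status_code, reason, headers, body):
    """Render a Requests-style view of a response as an HTTP/1.1 string.

    Caveat from the discussion above: header names are lower-cased and
    their original order is not preserved, so this is a reconstruction,
    not the exact response that was received.
    """
    status_line = "HTTP/%s %d %s" % (version, status_code, reason)
    header_lines = ["%s: %s" % (k, v) for k, v in headers.items()]
    # Blank line separates headers from the body, per the HTTP wire format.
    return "\r\n".join([status_line] + header_lines + ["", body])

print(format_response("1.1", 200, "OK",
                      {"content-type": "application/json"},
                      '{"name":"2010-04-01"}'))
```

For real wire-level debugging the advice in the thread stands: use a proxy or packet capture tool (mitmproxy, Wireshark) rather than reconstructing the exchange from the parsed objects.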
https://api.github.com/repos/psf/requests/issues/1801
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1801/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1801/comments
|
https://api.github.com/repos/psf/requests/issues/1801/events
|
https://github.com/psf/requests/pull/1801
| 24,329,258 |
MDExOlB1bGxSZXF1ZXN0MTA4NDM0NDU=
| 1,801 |
Connect timeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 36 |
2013-12-16T04:10:00Z
|
2021-09-08T11:00:53Z
|
2014-08-23T21:22:02Z
|
CONTRIBUTOR
|
resolved
|
Per discussion with @sigmavirus24 in IRC, this PR kills the Timeout class and adds support for connect timeouts via a simple tuple interface:
``` python
r = requests.get('https://api.twilio.com', timeout=(3.05, 27))
```
Sometimes you try to connect to a dead host (this happens to us all the time, because of AWS) and you would like to fail fairly quickly in this case; the request has no chance of succeeding. However, once the connection succeeds, you'd like to give the server a fairly long time to process the request and return a response.
The attached tests and documentation have more information. Hope this is okay!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1801/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1801/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1801.diff",
"html_url": "https://github.com/psf/requests/pull/1801",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1801.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1801"
}
| true |
[
"I'm generally +1 on the idea and the API. I haven't combed over the diff though and a quick skim makes me think there's more there than needs to be. Also, did I see \"TimeoutSauce\"? Is there any reason to not simply call it \"Timeout\"?\n",
"In principle I'm +0.5 on this. It's a nice extension. However, as memory serves, @kennethreitz considered this approach and ended up not taking it. It might be that now that someone has written the code he'll be happy with it, but equally, it might be that he doesn't like the approach.\n\n@sigmavirus24 Yeah, `TimeoutSauce` is used for the urllib3 `Timeout` object, because we have our own `Timeout` object (an exception).\n",
"@Lukasa As I understood it @kennethreitz was more concerned with the addition (and requirement) of the `Timeout` class from urllib3. And thanks for clearing up the naming, I still think there has to be a better name for it. (I'm shaving a yak, I know)\n",
"@sigmavirus24 That was my understanding from IRC as well.\n",
"@kevinburke I discussed it with you on IRC so the likelihood is that you came to that conclusion through me.\n",
"I'd rather us have our current interface and just have it support the intended use case, personally. You don't have to tweak with a bunch of timeouts when you're using a web browser :)\n",
"The main reason for this is because we have the tuple interface in another spot (e.g. an expansion of the file upload api) and it's my least favorite part of the API. \n",
"I'm not against the class either, I just need to brew about it for a bit, basically.\n",
"I was under the impression that your concern was for the streaming API, not for more standard uses like GET? \n",
"Actually I care more about the connect timeout than the streaming case,\nthough we use requests with both at Twilio and would like both changes to\ngo through.\n\nThe use case is that if I want to wait 30 seconds for a request to complete\n(for a server to complete processing it and return), that shouldn't also\nimply that I need to wait 30 seconds for requests to tell me I can't\nestablish a connection to the host, which is currently true for remote\nhosts that are dead (ones where your machine doesn't immediately send a TCP\nReset, like http://google.com:81 or 10.255.255.1).\n\nThere are other possible interfaces - adding a `connect_timeout` parameter\nfor example, or similar.\n\nOn Wednesday, December 18, 2013, Kenneth Reitz wrote:\n\n> I was under the impression that your concern was for the streaming API,\n> not for more standard uses like GET?\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1801#issuecomment-30853501\n> .\n\n## \n\n---\n\nKevin Burke | 415-723-4116 | www.twilio.com\n",
"Perhaps we can make this an option on the connection adapter itself. That was the intention for this interface.\n",
"That's no unreasonable, the `HTTPAdapter` could happily take these as parameters/properties.\n",
"Yeah the HTTPAdapter seems like a good place for this. To try and avoid more go-rounds, what should the interface look like on the Adapter? so far we've proposed \n- a `Timeout` class - seems the least popular\n- a tuple `(connect, read)` which is also not implemented in very many other places in the library\n- separate parameters - a `timeout` parameter would apply to both, a `connect_timeout` param to the connect and a `read_timeout` param to the read.\n",
"Ping :)\n",
"Hmm. I'd rather not do a `Timeout` class, I'd prefer the optional tuple I think. But hold off until @kennethreitz gets another chance to look at this.\n",
"I found this because my Flask app currently tries to connect to a dead host for which I've set a timeout of 5 seconds but it takes forever to actually time out. What is the status on this one?\n",
"Right now we do apply the timeout to connections, we just don't have it configurable. Are you using `stream=True`?\n",
"Hadn't been using `stream=True`, I guess I should use it if only to get the timeout? Just did a quick test with the following script (the host in the test, http://pillbox.nlm.nih.gov, is still down), and with `stream=True` it does timeout after 5 seconds, without it runs anywhere from 20 to 120 seconds—not what I would expect.\n\n```\nimport requests\n\nurl = 'http://pillbox.nlm.nih.gov/assets/large/abc.jpg'\nrequests.get(url, timeout=5)\n```\n\nUsing requests 2.2.1 with Python 3.3.3\n",
"That's weird, it works fine on Python 2.7. Seems like a Python 3 bug, because I can reproduce your problem in 3.4. @kevinburke, are you aware of any timeout bugs in urllib3?\n",
"I believe connection timeouts would be retried 3 times as they represent a\nfailure to connect to the server, but not at my computer at the moment.\n\nOn Saturday, April 5, 2014, Cory Benfield [email protected] wrote:\n\n> That's weird, it works fine on Python 2.7. Seems like a Python 3 bug,\n> because I can reproduce your problem in 3.4. @kevinburkehttps://github.com/kevinburke,\n> are you aware of any timeout bugs in urllib3?\n> \n> ## \n> \n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1801#issuecomment-39640149\n> .\n\n## \n\n## \n\nKevin Burke | Twilio\nphone: 925.271.7005 | kev.inburke.com\n",
"Weird, I'm not seeing that happen in Python 2.7 then. We'll need to investigate this inconsistency.\n",
"Hm, we definitely need a _sane_ way of configuring both timeouts, and I think that the tuple-approach is nice. `requests` is for humans and humans don't really like all that extra classes and additional keyword arguments mess.\n\nBy the way, speaking about hosts that are down, I just did a test with\n\n```\nimport requests\n\nrequests.get('http://google.com:81/', timeout=5)\n```\n\nand I get 35 seconds both on Python 2.7 and 3.3 (requests 2.2.1). That's not what I would expect from `timeout=5`… And with `timeout=1` I get 7 seconds. I mean, we really need a _sane_ interface…\n",
"@kirelagin I see the same behaviour as you, but I believe this to be a socket-level problem. Dumping Wireshark shows that my OS X box makes five independent attempts to connect. Each of _those_ connection attempts only retransmits for five seconds.\n\nI suspect this behaviour comes down to the fact that `httplib` uses `socket.create_connection`, not `socket.connect()`. Python's socket module documentation [has this to say](https://docs.python.org/2/library/socket.html#socket.create_connection) (emphasis mine):\n\n> This is a higher-level function than socket.connect(): if host is a non-numeric hostname, it will try to resolve it for both AF_INET and AF_INET6, and _then try to connect to all possible addresses in turn_ until a connection succeeds.\n\nCloser examination of Wireshark trace shows that we are definitely hitting different addresses: five of them.\n\nIf we wanted to 'fix' this behaviour (more on those scare quotes in a minute) it would be incredibly complicated: we'd end up needing to either circumvent `httplib`'s connection logic or use signals or some form of interrupt-processing to return control to us after a given timeout.\n\nMore generally, I don't know what 'fix' would mean here. This behaviour is naively surprising (you said we'd time out after 5 seconds but it took 30!), but makes sense once you understand what's happening. I believe that this is an 'expert friendly' interface (thanks to Nick Coghlan for the term), so I'm prepared to believe that we should change it. With that said, if we do change it, how do we give the 'I want to wait 5 seconds for each possible target' behaviour to expert users?\n",
"There's a retries branch of urllib3 that would let you specify exactly this.\n\nOn Saturday, May 10, 2014, Cory Benfield [email protected] wrote:\n\n> @kirelagin https://github.com/kirelagin I see the same behaviour as\n> you, but I believe this to be a socket-level problem. Dumping Wireshark\n> shows that my OS X box makes five independent attempts to connect. Each of\n> _those_ connection attempts only retransmits for five seconds.\n> \n> I suspect this behaviour comes down to the fact that httplib uses\n> socket.create_connection, not socket.connect(). Python's socket module\n> documentation has this to sayhttps://docs.python.org/2/library/socket.html#socket.create_connection(emphasis mine):\n> \n> This is a higher-level function than socket.connect(): if host is a\n> non-numeric hostname, it will try to resolve it for both AF_INET and\n> AF_INET6, and _then try to connect to all possible addresses in turn_until a connection succeeds.\n> \n> Closer examination of Wireshark trace shows that we are definitely hitting\n> different addresses: five of them.\n> \n> If we wanted to 'fix' this behaviour (more on those scare quotes in a\n> minute) it would be incredibly complicated: we'd end up needing to either\n> circumvent httplib's connection logic or use signals or some form of\n> interrupt-processing to return control to us after a given timeout.\n> \n> More generally, I don't know what 'fix' would mean here. This behaviour is\n> naively surprising (you said we'd time out after 5 seconds but it took\n> 30!), but makes sense once you understand what's happening. I believe that\n> this is an 'expert friendly' interface (thanks to Nick Coghlan for the\n> term), so I'm prepared to believe that we should change it. 
With that said,\n> if we do change it, how do we give the 'I want to wait 5 seconds for each\n> possible target' behaviour to expert users?\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1801#issuecomment-42735878\n> .\n\n## \n\n## \n\nKevin Burke | Twilio\nphone: 925.271.7005 | kev.inburke.com\n",
"Uh, is there? Where? I can't see it...\n",
"Ah, yeah, that makes sense.\n\n> five independent attempts\n\n<!-- -->\n\n> different addresses: five of them.\n\nWait, google.com has _six_ A records (plus one AAAA, that's why I see 35=5*7 seconds, I guess).\n\nAnyway, I just checked Firefox and it is trying exactly one IPv6 and exactly one IPv4 address. I believe, that multiple DNS records are mostly used for load balancing, not fault-tolerance, so attempting only the first address by default makes most sense. Having an option to control this is useful, of course.\n",
"Is there need to be able to specify both timeouts independently? When I specify the timeout, I'm thinking of it as \"I don't want this line of code to run longer than x seconds\", I don't care which part of the connection takes how long.\n\nIt seems to me this would be a true \"human\" interpretation and could be implemented without having to rely on urllib3 by an internal timer that kills the request if it hasn't returned within the timeout.\n",
"@lukasa https://github.com/shazow/urllib3/pull/326 ; though now that I read it more carefully, if the OS itself is trying each DNS record in turn then there's not much that can be done. That pull request lets you specify the number of times you would like to retry a connection failure, whether a timeout or an error.\n",
"@p2 Sadly computing the wall clock time for an HTTP request remains incredibly difficult. Mostly, our tools for determining the amount of time used for system calls like DNS resolution, establishing a socket, and sending data (and then passing this information to the necessary places, and subtracting these from a total amount of time) remains difficult.\n\nMy suggestion would be to run your HTTP request in a separate thread, then use a timer to cancel the thread if it takes longer than a stated amount of time to return a value. gevent can be used for this: http://www.gevent.org/gevent.html#timeouts\n\nSorry I can't be more helpful :(\n",
"@kevinburke That is kind of what I do now, I was just wondering if this would make sense as a default approach for _requests_ as well. I personally don't have a need to specify individual timeouts, but that assumption may be too naïve.\n"
] |
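The workaround suggested near the end of the thread — "run your HTTP request in a separate thread, then use a timer to cancel ... if it takes longer than a stated amount of time" — can be sketched with the stdlib instead of gevent. The names `call_with_deadline` and `fake_request` are hypothetical, and a `time.sleep` stands in for the actual network call:

```python
import time
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

def call_with_deadline(fn, deadline, *args, **kwargs):
    """Run fn in a worker thread; raise FutureTimeout past the deadline.

    Caveat: the worker thread keeps running after the timeout fires; this
    only bounds how long the *caller* waits, which is usually the goal.
    """
    pool = ThreadPoolExecutor(max_workers=1)
    future = pool.submit(fn, *args, **kwargs)
    try:
        return future.result(timeout=deadline)
    finally:
        # wait=False so a timed-out caller is not blocked by the worker.
        pool.shutdown(wait=False)

def fake_request(delay):
    """Stand-in for e.g. requests.get(...) — just sleeps and returns."""
    time.sleep(delay)
    return "response"
```

In real use `fn` would be the `requests.get` call itself; this bounds total wall-clock time, unlike the `(connect, read)` tuple discussed above, which bounds each phase separately.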
https://api.github.com/repos/psf/requests/issues/1800
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1800/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1800/comments
|
https://api.github.com/repos/psf/requests/issues/1800/events
|
https://github.com/psf/requests/issues/1800
| 24,304,647 |
MDU6SXNzdWUyNDMwNDY0Nw==
| 1,800 |
Move off of charade to erikrose/chardet
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 1 |
2013-12-15T03:16:53Z
|
2021-09-09T00:28:27Z
|
2013-12-18T15:52:55Z
|
CONTRIBUTOR
|
resolved
|
@dan-blanchard has taken over maintenance of [chardet](https://github.com/erikrose/chardet). We've merged charade into chardet so that chardet will now work on python2 and python3 without needing separate versions. I'd like to get rid of charade since it's been a responsibility that I had neglected and has been more trouble than it has been worth.
We should start using chardet proper again when they have it ready. (In all candor it should already be ready)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1800/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1800/timeline
| null |
completed
| null | null | false |
[
"A new release of chardet is available with all the features we ever needed from charade. I'll throw together a PR to close this issue in a few minutes.\n"
] |
https://api.github.com/repos/psf/requests/issues/1799
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1799/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1799/comments
|
https://api.github.com/repos/psf/requests/issues/1799/events
|
https://github.com/psf/requests/pull/1799
| 24,298,739 |
MDExOlB1bGxSZXF1ZXN0MTA4MzI4ODA=
| 1,799 |
Issue a warning when using apparent_encoding.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/78768?v=4",
"events_url": "https://api.github.com/users/martinblech/events{/privacy}",
"followers_url": "https://api.github.com/users/martinblech/followers",
"following_url": "https://api.github.com/users/martinblech/following{/other_user}",
"gists_url": "https://api.github.com/users/martinblech/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/martinblech",
"id": 78768,
"login": "martinblech",
"node_id": "MDQ6VXNlcjc4NzY4",
"organizations_url": "https://api.github.com/users/martinblech/orgs",
"received_events_url": "https://api.github.com/users/martinblech/received_events",
"repos_url": "https://api.github.com/users/martinblech/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/martinblech/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinblech/subscriptions",
"type": "User",
"url": "https://api.github.com/users/martinblech",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-12-14T20:11:04Z
|
2021-09-08T22:01:06Z
|
2013-12-15T04:20:07Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/78768?v=4",
"events_url": "https://api.github.com/users/martinblech/events{/privacy}",
"followers_url": "https://api.github.com/users/martinblech/followers",
"following_url": "https://api.github.com/users/martinblech/following{/other_user}",
"gists_url": "https://api.github.com/users/martinblech/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/martinblech",
"id": 78768,
"login": "martinblech",
"node_id": "MDQ6VXNlcjc4NzY4",
"organizations_url": "https://api.github.com/users/martinblech/orgs",
"received_events_url": "https://api.github.com/users/martinblech/received_events",
"repos_url": "https://api.github.com/users/martinblech/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/martinblech/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinblech/subscriptions",
"type": "User",
"url": "https://api.github.com/users/martinblech",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1799/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1799/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1799.diff",
"html_url": "https://github.com/psf/requests/pull/1799",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1799.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1799"
}
| true |
[
"I'm something in the region of -0 on this. @sigmavirus24?\n",
"I'm moreso in the half-open interval [-1, -0) on this if only because the warning will likely only be displayed once per detected encoding and not once per response if I remember the way they work correctly.\n\nAn example is like so:\n\n``` python\ndef foo(word):\n warnings.warn(word)\n\nfoo('biz')\nfoo('biz')\nfoo('biz')\nfoo('bar')\nfoo('bar')\nfoo('baz')\nfoo('baz')\nfoo('baz')\n```\n\nWill only produce 3 warnings and will not achieve what you want @martinblech \n",
"You're right @sigmavirus24, that's not what I want. I didn't know warnings worked like that, thanks for teaching me something new!\n\nHow about doing `log.warning()` instead of `warnings.warn()`? Here's how it would look on the REPL:\n\n``` python\n>>> import logging\n>>> logging.basicConfig(level=logging.INFO)\n>>> import requests\n>>> print requests.get('http://lyrics.wikia.com/api.php', params={'artist': 'Javiera Mena', 'song': 'Al Siguiente Nivel', 'fmt': 'xml'}).text\nINFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): lyrics.wikia.com\nWARNING:requests.models:No encoding found in HTTP headers, using apparent \"TIS-620\".\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<LyricsResult>\n <artist>Javiera Mena</artist>\n <song>Al Siguiente Nivel</song>\n <lyrics>Esa visiรณn estรก\nYo la veo venir\nSe me aparece\nNo tengo la nocion\nSi en mi generacion\nSe sentirรก\n\nSi te vas no quiebro fuerza[...]</lyrics>\n <url>http://lyrics.wikia.com/Javiera_Mena:Al_Siguiente_Nivel</url>\n <page_namespace>0</page_namespace>\n <page_id>1626596</page_id>\n <isOnTakedownList>0</isOnTakedownList>\n</LyricsResult>\n```\n",
"First, calling it a warning is far more alarmist than making it an INFO level. Second, requests removed all of its logging statements in v1.0 and has intentionally kept them out since. \n",
"I see. Then both `warnings.warn` and `logger.warning` are out of question.\n"
] |
|
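The deduplication behavior sigmavirus24 describes in the thread above (eight `warnings.warn` calls producing only three visible warnings) can be checked with the standard library alone. This is an illustrative sketch, not requests code; the helper names are invented for the example.

```python
import warnings

def emit(msg):
    # Same call site for every message, so the warnings registry keys
    # each warning on (message text, category, this line number).
    warnings.warn(msg)

def count_emitted(messages):
    """Count how many warnings actually surface for a message sequence."""
    with warnings.catch_warnings(record=True) as caught:
        # "default" action: show each unique (message, category, location)
        # only once -- exactly the behavior described in the thread.
        warnings.simplefilter("default")
        for m in messages:
            emit(m)
    return len(caught)

# 'biz' x3, 'bar' x2, 'baz' x3 -> only 3 distinct warnings surface
print(count_emitted(["biz", "biz", "biz", "bar", "bar", "baz", "baz", "baz"]))
```

This is why per-response warnings would not work as hoped: repeated detections of the same encoding collapse into a single warning.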
https://api.github.com/repos/psf/requests/issues/1798
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1798/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1798/comments
|
https://api.github.com/repos/psf/requests/issues/1798/events
|
https://github.com/psf/requests/issues/1798
| 24,296,455 |
MDU6SXNzdWUyNDI5NjQ1NQ==
| 1,798 |
Add params in custom authentication
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/554079?v=4",
"events_url": "https://api.github.com/users/juanriaza/events{/privacy}",
"followers_url": "https://api.github.com/users/juanriaza/followers",
"following_url": "https://api.github.com/users/juanriaza/following{/other_user}",
"gists_url": "https://api.github.com/users/juanriaza/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/juanriaza",
"id": 554079,
"login": "juanriaza",
"node_id": "MDQ6VXNlcjU1NDA3OQ==",
"organizations_url": "https://api.github.com/users/juanriaza/orgs",
"received_events_url": "https://api.github.com/users/juanriaza/received_events",
"repos_url": "https://api.github.com/users/juanriaza/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/juanriaza/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/juanriaza/subscriptions",
"type": "User",
"url": "https://api.github.com/users/juanriaza",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-12-14T18:05:19Z
|
2021-09-09T00:28:27Z
|
2013-12-14T18:44:06Z
|
CONTRIBUTOR
|
resolved
|
Modifying the request params in a custom authentication implementation raises an `AttributeError: 'PreparedRequest' object has no attribute 'params'`.
Example:
``` python
class CustomAuth(requests.auth.AuthBase):
def __call__(self, r):
r.params['key'] = 'my_api_key'
return r
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1798/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1798/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nThis fails because you aren't modifying a `Request` object, you're modifying a `PreparedRequest` object. That only has a `url` property. You'll have to edit that manually. The API documentation for `PreparedRequest` objects is [here](http://docs.python-requests.org/en/latest/api/#requests.PreparedRequest).\n",
"A quick fix,\n\n``` python\nclass CustomAuth(requests.auth.AuthBase):\n def __call__(self, r):\n r.prepare_url(r.url, {'key': 'my_api_key'})\n return r\n```\n\n`PreparedRequest` should have a params property imho. \n",
"`PreparedRequest` should not have a params property. The `PreparedRequest` object is intended to be directly serialised into HTTP. That means no assembling or encoding URLs.\n",
"Everything @Lukasa said has been correct and as far as I'm concerned this is not a valid issue. Your concerns are duly noted and appreciated. Don't hesitate to open further issues if you think you've found other bugs.\n",
"@Lukasa @sigmavirus24 the method `prepare_url` from `PreparedRequest` [already checks](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L375) if the given url contains previous params. I don't understand why `PreparedRequest` shouldn't have a params property as they are serializable into a HTTP req when the method `prepare` is called.\n",
"The `prepare` method is called on the `PreparedRequest` instance long before the auth handler receives it. We're not going to call `prepare` obsessively because we do not have to. That's why there's no `params` attribute.\n"
] |
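The `prepare_url(r.url, {...})` fix in the thread above works because preparing a URL merges new params into any query string the URL already carries. That merging can be sketched with the standard library alone (no requests dependency); the URL and helper name here are illustrative, not part of the requests API.

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def merge_params(url, params):
    """Merge extra query parameters into a URL, keeping any existing ones.

    Mirrors the behavior of PreparedRequest.prepare_url when handed a URL
    that already has a query string plus a params dict.
    """
    scheme, netloc, path, query, fragment = urlsplit(url)
    pairs = parse_qsl(query, keep_blank_values=True)  # existing params first
    pairs.extend(params.items())                      # then the new ones
    return urlunsplit((scheme, netloc, path, urlencode(pairs), fragment))

print(merge_params("https://api.example.com/v1/items?page=2",
                   {"key": "my_api_key"}))
# -> https://api.example.com/v1/items?page=2&key=my_api_key
```

Because the merge happens at URL-assembly time, a `PreparedRequest` only needs the finished `url` string, which is why it carries no `params` attribute.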
https://api.github.com/repos/psf/requests/issues/1797
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1797/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1797/comments
|
https://api.github.com/repos/psf/requests/issues/1797/events
|
https://github.com/psf/requests/issues/1797
| 24,293,616 |
MDU6SXNzdWUyNDI5MzYxNg==
| 1,797 |
Socket timeout exception thrown instead of RequestException
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1663259?v=4",
"events_url": "https://api.github.com/users/joepie91/events{/privacy}",
"followers_url": "https://api.github.com/users/joepie91/followers",
"following_url": "https://api.github.com/users/joepie91/following{/other_user}",
"gists_url": "https://api.github.com/users/joepie91/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/joepie91",
"id": 1663259,
"login": "joepie91",
"node_id": "MDQ6VXNlcjE2NjMyNTk=",
"organizations_url": "https://api.github.com/users/joepie91/orgs",
"received_events_url": "https://api.github.com/users/joepie91/received_events",
"repos_url": "https://api.github.com/users/joepie91/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/joepie91/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joepie91/subscriptions",
"type": "User",
"url": "https://api.github.com/users/joepie91",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-14T15:13:52Z
|
2021-09-09T00:28:28Z
|
2013-12-14T16:05:48Z
|
NONE
|
resolved
|
When running the following code....
``` python
response = requests.get(result, timeout=10)
```
Where `result` was http://63.224.204.109:8085/browse/category/allbooks, the following exception was thrown after hanging for a few minutes (and apparently disregarding the `timeout` argument?):
```
Traceback (most recent call last):
File "find-calibre.py", line 34, in <module>
response = requests.get(result, timeout=10)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 55, in get
return request('get', url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 357, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 460, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 367, in send
r.content
File "/usr/lib/python2.7/site-packages/requests/models.py", line 633, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/lib/python2.7/site-packages/requests/models.py", line 572, in generate
decode_content=True):
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 225, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 174, in read
data = self._fp.read(amt)
File "/usr/lib64/python2.7/httplib.py", line 561, in read
s = self.fp.read(amt)
File "/usr/lib64/python2.7/socket.py", line 380, in read
data = self._sock.recv(left)
socket.timeout: timed out
```
Shouldn't this be caught by Requests, and passed on as a RequestException instead?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1797/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1797/timeline
| null |
completed
| null | null | false |
[
"Same issue is already opened.\nSee https://github.com/kennethreitz/requests/issues/1787\n",
"As @daftshady points out, please check the issues tracker before opening a new issue.\n",
"Thanks for opening this @joepie91 and thank you @daftshady for taking the time to find the already opened issue.\n"
] |
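The fix the reporter asks for amounts to translating the low-level `socket.timeout` into a library-level exception so callers only need one `except` clause. A minimal standard-library sketch of that wrapping pattern (the exception class and function names are invented for illustration; requests' real exception hierarchy lives in `requests.exceptions`):

```python
import socket

class RequestTimeout(Exception):
    """Illustrative stand-in for a library-level timeout exception."""

def recv_with_wrapped_timeout(sock, nbytes):
    # Translate socket.timeout into the library's own exception type,
    # preserving the original as __cause__ for debugging.
    try:
        return sock.recv(nbytes)
    except socket.timeout as exc:
        raise RequestTimeout("read timed out") from exc

a, b = socket.socketpair()
a.settimeout(0.05)          # nothing will ever arrive on this end
try:
    recv_with_wrapped_timeout(a, 1)
except RequestTimeout as exc:
    print("caught:", exc)
finally:
    a.close()
    b.close()
```

Note also that the `timeout` argument applies per socket operation, not to the whole request, which is why a slow server can hang far longer than the nominal 10 seconds.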
https://api.github.com/repos/psf/requests/issues/1796
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1796/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1796/comments
|
https://api.github.com/repos/psf/requests/issues/1796/events
|
https://github.com/psf/requests/pull/1796
| 24,285,315 |
MDExOlB1bGxSZXF1ZXN0MTA4MjgyMDY=
| 1,796 |
Expose apparent_encoding_confidence.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/78768?v=4",
"events_url": "https://api.github.com/users/martinblech/events{/privacy}",
"followers_url": "https://api.github.com/users/martinblech/followers",
"following_url": "https://api.github.com/users/martinblech/following{/other_user}",
"gists_url": "https://api.github.com/users/martinblech/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/martinblech",
"id": 78768,
"login": "martinblech",
"node_id": "MDQ6VXNlcjc4NzY4",
"organizations_url": "https://api.github.com/users/martinblech/orgs",
"received_events_url": "https://api.github.com/users/martinblech/received_events",
"repos_url": "https://api.github.com/users/martinblech/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/martinblech/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinblech/subscriptions",
"type": "User",
"url": "https://api.github.com/users/martinblech",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-12-14T03:50:35Z
|
2021-09-08T22:01:06Z
|
2013-12-14T20:11:51Z
|
NONE
|
resolved
|
When there's no encoding in the headers and it has to be autodetected from the
contents using charade, the Response object should expose the confidence value
so the user can decide whether the value of Response.text should be trusted or
not.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/78768?v=4",
"events_url": "https://api.github.com/users/martinblech/events{/privacy}",
"followers_url": "https://api.github.com/users/martinblech/followers",
"following_url": "https://api.github.com/users/martinblech/following{/other_user}",
"gists_url": "https://api.github.com/users/martinblech/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/martinblech",
"id": 78768,
"login": "martinblech",
"node_id": "MDQ6VXNlcjc4NzY4",
"organizations_url": "https://api.github.com/users/martinblech/orgs",
"received_events_url": "https://api.github.com/users/martinblech/received_events",
"repos_url": "https://api.github.com/users/martinblech/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/martinblech/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinblech/subscriptions",
"type": "User",
"url": "https://api.github.com/users/martinblech",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1796/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1796/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1796.diff",
"html_url": "https://github.com/psf/requests/pull/1796",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1796.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1796"
}
| true |
[
"Thanks for this @martinblech!\n\nUnfortunately, I'm opposed to this change. Not anything to do with the code itself, but I don't think adding this parameter to the `Response` object adds sufficient value to justify including it. If people _really_ want this information they can get it (by running `charade`/`chardet` themselves), and for a significant majority of people this information is simply unnecessary. Requests got where it was today by hiding complexity from users, and I'm in favour of that trend continuing. @sigmavirus24?\n",
"@Lukasa I understand and I love Requests' simplicity. Still, I think it's not safe for Requests to rely so heavily on charade as default and hide it completely from the user.\n\nHere's some examples of what happens to common German, Spanish, French and Portuguese words after being \"charadized\":\n\n``` python\n>>> def charadize(s):\n... encoded = s.encode('utf8')\n... encoding = charade.detect(encoded)['encoding']\n... return encoded.decode(encoding)\n...\n>>> print charadize(u'Sonatine für Gitarre')\nSonatine fĂźr Gitarre\n>>> print charadize(u'Canción de cuna')\nCanciรณn de cuna\n>>> print charadize(u'Martín Blech')\nMartĂn Blech\n>>> print charadize(u'Chatêau')\nChatĂŞau\n>>> print charadize(u'Licença')\nLicenรงa\n```\n",
"That's not really a very fair test of Charade. Charade works by analyzing byte frequency and byte sequences, to work out what an encoding is likely to be. Using a single word puts the frequency of certain byte sequences way off, which will almost always cause charade to make an error. Consider putting a decent quantity of text through it in different encodings, or in the same encoding in different languages, to get a better idea of its effectiveness.\n",
"It might have looked like I was, and I apologize for that, but I'm not judging the quality of the Charade library itself. If I was, I wouldn't use my previous example as a benchmark, by no means.\n\nI'm just trying to illustrate that charade (or any other charset detection library, for that matter) is perhaps more likely to fail than initially assumed when integrating it to Requests, and that applying it without any warning at all might not be a safe default after all.\n\nAt this moment I am looking at a database with ~100k song lyrics snippets retrieved from the LyricWiki API, and most non-English examples look like they've been affected by this glitch. It's no big deal, we can retrieve them again over the next couple of days, but it certainly would have helped to see a warning in the console that read `requests: no encoding found in HTTP headers, using autodetected <encoding name>`.\n",
"Maybe something like https://github.com/kennethreitz/requests/pull/1799 will do?\n"
] |
https://api.github.com/repos/psf/requests/issues/1795
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1795/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1795/comments
|
https://api.github.com/repos/psf/requests/issues/1795/events
|
https://github.com/psf/requests/issues/1795
| 24,267,957 |
MDU6SXNzdWUyNDI2Nzk1Nw==
| 1,795 |
Incorrect encoding detected for UTF-8 XML response.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/78768?v=4",
"events_url": "https://api.github.com/users/martinblech/events{/privacy}",
"followers_url": "https://api.github.com/users/martinblech/followers",
"following_url": "https://api.github.com/users/martinblech/following{/other_user}",
"gists_url": "https://api.github.com/users/martinblech/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/martinblech",
"id": 78768,
"login": "martinblech",
"node_id": "MDQ6VXNlcjc4NzY4",
"organizations_url": "https://api.github.com/users/martinblech/orgs",
"received_events_url": "https://api.github.com/users/martinblech/received_events",
"repos_url": "https://api.github.com/users/martinblech/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/martinblech/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/martinblech/subscriptions",
"type": "User",
"url": "https://api.github.com/users/martinblech",
"user_view_type": "public"
}
|
[
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 13 |
2013-12-13T19:43:19Z
|
2021-09-09T00:28:23Z
|
2013-12-22T16:01:02Z
|
NONE
|
resolved
|
Requests decodes the UTF-8 response as TIS-620:
``` python
>>> response = requests.get('http://lyrics.wikia.com/api.php', params={'artist': 'Javiera Mena', 'song': 'Al Siguiente Nivel', 'fmt': 'xml'})
>>> print response.text
<?xml version="1.0" encoding="UTF-8"?>
<LyricsResult>
[...]
<lyrics>Esa visiรณn estรก
[...]
</LyricsResult>
```
Forcing UTF-8 yields the correct value, "Esa visión está", instead of "Esa visiรณn estรก":
``` python
>>> print response.content.decode('utf8')
<?xml version="1.0" encoding="UTF-8"?>
[...]
<lyrics>Esa visión está
[...]
</LyricsResult>
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1795/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1795/timeline
| null |
completed
| null | null | false |
[
"Requests is a HTTP library: we only consider HTTP headers when attempting to work out text encodings. We think it sets a dangerous precedent to attempt to parse body content. If you want to do something specific for a given content type, you have to do it yourself.\n",
"@Lukasa Absolutely agree: I wouldn't expect Requests to parse the XML declaration to read the encoding, that completely out of scope for a HTTP library.\n\nI wonder, though, what made Requests autodetect a Thai encoding for an XML document with some Spanish content. It's latin alphabet after all.\n\nIf the autodetection feature is not reliable enough, maybe the documentation should discourage the users from using `response.text` at all and use `response.content` instead. Or made an opt-in feature instead of default?\n",
"Ah, good spot. We have mixed feelings about auto detection. It's hard to do right, and no matter how well we do there will always be edge cases which cause problems.\n\nRealistically, there's not a lot we can do about that beyond warn people. Auto detection _usually_ works well enough for people. I'm interested to see what @sigmavirus24 thinks. \n",
"Could you print `r.headers` and `r.encoding`? Charade is the fallback when the content-type has no charset specified and charade can actually (typically) detect utf-8 very reliably. That said, it seems like charade and python-chardet will be merging soon and have a real maintainer after that (which is more than I can say for charade right now).\n",
"Sure! Here's the output:\n\n``` python\n>>> print response.headers\nCaseInsensitiveDict({'content-length': '319', 'x-served-by': 'cache-s24-SJC, cache-mia1327-MIA', 'x-cache': 'MISS, MISS', 'x-content-type-options': 'nosniff', 'content-encoding': 'gzip', 'set-cookie': 'wikia_beacon_id=7PuM__S5b5; domain=.wikia.com; path=/; expires=Thu, 12 Jun 2014 03:21:31 GMT;, Geo={%22city%22:%22FIXME%22%2C%22country%22:%22AR%22%2C%22continent%22:%22SA%22}; path=/', 'accept-ranges': 'bytes', 'x-timer': 'S1386991290.966581345,VS0,VS38,VE123,VE162', 'vary': 'Accept-Encoding', 'x-cacheable': 'YES', 'server': 'Apache', 'connection': 'keep-alive', 'x-cache-hits': '0, 0', 'cache-control': 'max-age=3600, s-maxage=3600, public', 'date': 'Sat, 14 Dec 2013 03:21:31 GMT', 'x-age': '0', 'content-type': 'application/xml'})\n>>> print response.encoding\nNone\n```\n",
"I think here's the interesting part:\n\n``` python\n>>> charade.detect(response.content)\n{'confidence': 0.99, 'encoding': 'TIS-620'}\n```\n\n`charade` is detecting TIS-620 with a rather high confidence. It's something I've seen before when using `charade` outside of `requests`:\n- https://github.com/sigmavirus24/charade/issues/24\n- https://github.com/sigmavirus24/charade/issues/25\n",
"Does it make sense to at least expose the confidence value for the autodetected encoding, as in https://github.com/kennethreitz/requests/pull/1796?\nIt won't be of much help in this particular case because the confidence value is unusually high, but I think it's a good thing to have as a user of the library.\n",
"I guess charade is in worse shape than I thought and that's mostly my fault. I don't have the interest or the time to maintain it and frankly there's data that's needed that just doesn't exist. This is why I'm hopeful a merge with python-chardet will fix some things.\n\n> Does it make sense to at least expose the confidence value for the autodetected encoding, as in #1796?\n\nI don't think so. _Most_ users don't run into charade frankly. That said, if others are interested, I'd rather we only calculate the encoding once and store the result of `charade.detect` once and just reference that. It's unfortunately computationally expensive.\n",
"To recap:\n- When there's no explicit charset in the response headers, `Response.text` falls back silently to an auto detected charset that in many cases is not the right one.\n- From a user's perspective, AFAIK there's no obvious way to tell whether an explicit or an automatic charset was used to decode the contents.\n- Exposing the charset detection confidence is [not a good idea](https://github.com/kennethreitz/requests/pull/1796#issuecomment-30563831).\n- Issuing or logging a warning is [not either](https://github.com/kennethreitz/requests/pull/1799).\n",
"@martinblech Point 2 isn't correct. If `response.encoding` is `None`, an automatic charset is used.\n\nThe biggest problem here is that Requests can only do so much to protect users. Text encoding is a horrible minefield of pain, and `charade`/`chardet` usually helps us solve _some_ of that pain. However, users who are in a position to know better should take a more active role in choosing text encodings. For instance, on HTML/XML pages, people should look for an `encoding` directive in the content.\n",
"@Lukasa Got it. It wasn't obvious to me but it's [right there](http://requests.readthedocs.org/en/latest/api/?highlight=text#requests.Response.text) in the docs: \"If Response.encoding is None, encoding will be guessed using charade.\" My bad.\n\nMy point is, I'm not sure if falling back to `charade`/`chardet` solves more pain than it creates. It didn't for me, but maybe I just hit an unusual number of edge cases that most users of Requests won't. It's hard to estimate how well my experience extrapolates to the average Requests user.\n\nI would love to see a shorter version of this part of your answer in the docs for `Response.text`, just in case:\n\n> […] users who are in a position to know better should take a more active role in choosing text encodings. For instance, on HTML/XML pages, people should look for an encoding directive in the content.\n",
"I would not be adverse to adding that to the documentation.\n\nThe average user of requests receives responses that either indicate the encoding correctly or vaguely indicates the encoding. For example, we've received complaints that we follow the RFC when we receive a `text/*` Content-Type with no charset provided. In that case we use what the RFC tells us to: `ISO-8859-1`. Encodings and detection thereof are hard and vastly inaccurate. One other thing we might add to the documentation is to tell users to check `response.encoding` if they know what they should be expecting. If they do not receive what they expect, they can override it to ensure they get the results they need. Overriding it, however, should be discouraged.\n",
"Docs update was made as part of #1823.\n"
] |
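The thread's closing advice — users "in a position to know better" should read the encoding directive out of the content themselves — can be sketched in a few lines. This is a deliberately small illustration of that advice, not something requests does internally, and the regex handles only the common single-declaration case:

```python
import re

def sniff_xml_encoding(raw, default="utf-8"):
    """Pull the encoding out of an XML declaration, if one is present."""
    m = re.match(rb'<\?xml[^>]*encoding=["\']([A-Za-z0-9._-]+)["\']', raw)
    return m.group(1).decode("ascii") if m else default

body = ('<?xml version="1.0" encoding="UTF-8"?>\n'
        '<lyrics>Esa visión está</lyrics>').encode("utf-8")

encoding = sniff_xml_encoding(body)
print(encoding)                 # -> UTF-8
print(body.decode(encoding))    # decodes correctly, no charade guessing
```

Setting `response.encoding = sniff_xml_encoding(response.content)` before reading `response.text` sidesteps the TIS-620 misdetection shown in this issue.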
https://api.github.com/repos/psf/requests/issues/1794
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1794/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1794/comments
|
https://api.github.com/repos/psf/requests/issues/1794/events
|
https://github.com/psf/requests/pull/1794
| 24,219,475 |
MDExOlB1bGxSZXF1ZXN0MTA3OTAyODI=
| 1,794 |
Fixed pickle support for requests.adapters.HTTPAdapter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/113129?v=4",
"events_url": "https://api.github.com/users/erikcw/events{/privacy}",
"followers_url": "https://api.github.com/users/erikcw/followers",
"following_url": "https://api.github.com/users/erikcw/following{/other_user}",
"gists_url": "https://api.github.com/users/erikcw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/erikcw",
"id": 113129,
"login": "erikcw",
"node_id": "MDQ6VXNlcjExMzEyOQ==",
"organizations_url": "https://api.github.com/users/erikcw/orgs",
"received_events_url": "https://api.github.com/users/erikcw/received_events",
"repos_url": "https://api.github.com/users/erikcw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/erikcw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erikcw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/erikcw",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-12-13T00:26:27Z
|
2021-09-08T22:01:06Z
|
2013-12-14T04:29:58Z
|
CONTRIBUTOR
|
resolved
|
Fixed issue https://github.com/kennethreitz/requests/issues/1777 by adding unpickling support to the HTTPAdapter.
While unpickling, the class will now set `proxy_manager` to an empty dictionary.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1794/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1794/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1794.diff",
"html_url": "https://github.com/psf/requests/pull/1794",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1794.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1794"
}
| true |
[
"This fix is a subset of the fix in #1793. You and @sigmavirus24 should work out which fix we apply (maybe a merged set of the fixes).\n",
"My fix initializes those attributes ahead of time in the event that they ever get added to the `__attrs__` list. It seems the most fool-proof way at the moment and far less complex than first checking if it has an attribute. (I don't quite like `hasattr` or how frequently we use it.)\n\nThe only issue with both of our pull requests is that the original issue could still come back (sort of) if the for-loop ever initializes `proxy_manager` to `None`. I'm going to pull @erikcw's commit into my branch and work from there.\n",
"In retrospect, I tend to agree with @sigmavirus24 approach of initializing the attributes instead of using `hasattr` in this case. Since it's before the loop, there are no concerns about clobbering pickled values with an empty dictionary -- while future-proofing the ability to move it into `__attrs__`.\n\nMy first approach to solving this was to simply add `proxy_manager` to `__attrs__` -- but that won't work because of the lambdas in `proxy_manager`. My recommendation would be to initialize the attributes before the loop as suggested by @sigmavirus24 , and to incorporate my code comments to save future contributors from stumbling down the `__attrs__` approach.\n",
"I cherry-picked your commits into my PR. Thanks for your work here @erikcw ! Keep the PRs coming!\n"
] |
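The scheme settled on across #1793 and #1794 — serialize only a whitelist of attributes, and re-create the non-picklable `proxy_manager` during unpickling *before* restoring state so a `None` can never clobber it — can be shown with a minimal class. This is an illustrative sketch of the approach, not the actual `HTTPAdapter` code:

```python
import pickle

class Adapter:
    """Minimal sketch of the HTTPAdapter pickling scheme discussed above."""
    __attrs__ = ["max_retries", "config"]

    def __init__(self):
        self.max_retries = 0
        self.config = {}
        self.proxy_manager = {}   # holds unpicklable objects in real life

    def __getstate__(self):
        # Only whitelisted attributes travel through pickle;
        # proxy_manager (with its lambdas) is deliberately excluded.
        return {attr: getattr(self, attr, None) for attr in self.__attrs__}

    def __setstate__(self, state):
        # Initialize non-picklable attributes ahead of the loop, so even
        # if proxy_manager were ever added to __attrs__ it could not end
        # up as None here.
        self.proxy_manager = {}
        for attr, value in state.items():
            setattr(self, attr, value)

adapter = pickle.loads(pickle.dumps(Adapter()))
print(adapter.proxy_manager)   # -> {}
```

Adding `proxy_manager` to `__attrs__` directly would not work, as noted in the thread, because the lambdas it contains are not picklable.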
https://api.github.com/repos/psf/requests/issues/1793
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1793/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1793/comments
|
https://api.github.com/repos/psf/requests/issues/1793/events
|
https://github.com/psf/requests/pull/1793
| 24,200,478 |
MDExOlB1bGxSZXF1ZXN0MTA3Nzg2MjU=
| 1,793 |
Prevent error when using proxies after picking an Adapter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 2 |
2013-12-12T19:04:51Z
|
2021-09-08T08:01:06Z
|
2014-01-08T18:53:42Z
|
CONTRIBUTOR
|
resolved
|
@Lukasa if you'd like to add tests for this you can. I'm not exactly certain if we already had tests for pickling objects and didn't have the time to check right now. I'll check when I get home though (unless you've already added tests).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1793/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1793/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1793.diff",
"html_url": "https://github.com/psf/requests/pull/1793",
"merged_at": "2014-01-08T18:53:42Z",
"patch_url": "https://github.com/psf/requests/pull/1793.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1793"
}
| true |
[
"@Lukasa any further comments?\n",
"Nope, this looks good to me. Thanks guys! :cake:\n"
] |
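One comment in the record above notes that simply adding `proxy_manager` to `__attrs__` "won't work because of the lambdas in `proxy_manager`". A quick demonstration of why: functions are pickled by reference (module plus qualified name), and a lambda's name is `<lambda>`, so the lookup fails and pickling raises an error.

```python
import pickle

# A cache holding lambdas cannot simply be added to __attrs__, because
# pickling any lambda fails: pickle serializes functions by reference,
# and '<lambda>' cannot be looked up again on the defining module.
try:
    pickle.dumps(lambda url: url)
    lambda_is_picklable = True
except (pickle.PicklingError, AttributeError, TypeError):
    lambda_is_picklable = False

assert lambda_is_picklable is False
```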
https://api.github.com/repos/psf/requests/issues/1792
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1792/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1792/comments
|
https://api.github.com/repos/psf/requests/issues/1792/events
|
https://github.com/psf/requests/pull/1792
| 24,189,411 |
MDExOlB1bGxSZXF1ZXN0MTA3NzIyMTc=
| 1,792 |
Fixup changelog with missing breaking change.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2013-12-12T16:21:35Z
|
2021-09-08T23:05:26Z
|
2013-12-12T16:21:38Z
|
MEMBER
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1792/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1792/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1792.diff",
"html_url": "https://github.com/psf/requests/pull/1792",
"merged_at": "2013-12-12T16:21:38Z",
"patch_url": "https://github.com/psf/requests/pull/1792.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1792"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/1791
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1791/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1791/comments
|
https://api.github.com/repos/psf/requests/issues/1791/events
|
https://github.com/psf/requests/issues/1791
| 24,187,641 |
MDU6SXNzdWUyNDE4NzY0MQ==
| 1,791 |
Requests 2.1.0 regression - cookies set on session.get are not remembered anymore for subsequent requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/198355?v=4",
"events_url": "https://api.github.com/users/avallen/events{/privacy}",
"followers_url": "https://api.github.com/users/avallen/followers",
"following_url": "https://api.github.com/users/avallen/following{/other_user}",
"gists_url": "https://api.github.com/users/avallen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/avallen",
"id": 198355,
"login": "avallen",
"node_id": "MDQ6VXNlcjE5ODM1NQ==",
"organizations_url": "https://api.github.com/users/avallen/orgs",
"received_events_url": "https://api.github.com/users/avallen/received_events",
"repos_url": "https://api.github.com/users/avallen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/avallen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avallen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/avallen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-12-12T15:57:23Z
|
2021-09-09T00:28:29Z
|
2013-12-12T16:22:24Z
|
NONE
|
resolved
|
The following line that made it happen has disappeared from Session.request:
```
# Add param cookies to session cookies
self.cookies = cookiejar_from_dict(cookies, cookiejar=self.cookies, overwrite=False)
```
If this was on purpose then some documentation of breaking changes such as this one would be nice to have.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1791/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1791/timeline
| null |
completed
| null | null | false |
[
    "That line is replaced with `self.cookies = merge_cookies(self.cookies, cookies)` in 2.1.0.\nDoes your problem happen when you provide cookies via a `cookieJar`?\n",
"Apologies, this was on purpose. See Issue #1728, which was fixed in #1729 (though actually merged as part of #1776). I'm very sorry that it got missed out of the changelog: that's my fault. I'll update it now.\n",
"Noted in 456c42b00d15c1ffb7fe610e4dad9164ab3eb6f0. Once again, I'm very sorry for that.\n",
    "Ah, I didn't notice that change. I also apologize for my misinformation.\n",
"Thanks Cory for your prompt support and updating the changelog so promptly\nso other people are not caught by it.\n\nOn Thu, Dec 12, 2013 at 5:23 PM, Cory Benfield [email protected]:\n\n> Noted in 456c42bhttps://github.com/kennethreitz/requests/commit/456c42b00d15c1ffb7fe610e4dad9164ab3eb6f0.\n> Once again, I'm very sorry for that.\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1791#issuecomment-30437004\n> .\n"
] |
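The regression above came from replacing the old `cookiejar_from_dict(cookies, cookiejar=self.cookies, overwrite=False)` call. A simplified, dict-based sketch of those old no-overwrite merge semantics — requests actually operates on `CookieJar` objects, so this is only an illustration of the behavior, not the real implementation:

```python
def merge_cookies_without_overwrite(session_cookies, request_cookies):
    # Mirrors cookiejar_from_dict(..., overwrite=False): request-level
    # cookies are added to the session's cookies, but never clobber
    # values the session already holds.
    merged = dict(session_cookies)
    for name, value in request_cookies.items():
        merged.setdefault(name, value)
    return merged


session = {"sessionid": "abc"}
merged = merge_cookies_without_overwrite(session, {"sessionid": "xyz", "lang": "en"})
assert merged == {"sessionid": "abc", "lang": "en"}
```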
https://api.github.com/repos/psf/requests/issues/1790
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1790/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1790/comments
|
https://api.github.com/repos/psf/requests/issues/1790/events
|
https://github.com/psf/requests/issues/1790
| 24,158,307 |
MDU6SXNzdWUyNDE1ODMwNw==
| 1,790 |
Auto convert data into json if Content-type is set to 'application/json'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/311929?v=4",
"events_url": "https://api.github.com/users/kracekumar/events{/privacy}",
"followers_url": "https://api.github.com/users/kracekumar/followers",
"following_url": "https://api.github.com/users/kracekumar/following{/other_user}",
"gists_url": "https://api.github.com/users/kracekumar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kracekumar",
"id": 311929,
"login": "kracekumar",
"node_id": "MDQ6VXNlcjMxMTkyOQ==",
"organizations_url": "https://api.github.com/users/kracekumar/orgs",
"received_events_url": "https://api.github.com/users/kracekumar/received_events",
"repos_url": "https://api.github.com/users/kracekumar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kracekumar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kracekumar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kracekumar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-12-12T05:16:45Z
|
2021-09-09T00:28:30Z
|
2013-12-12T13:18:20Z
|
CONTRIBUTOR
|
resolved
|
```
>>> r = requests.get('https://api.github.com/user', auth=('user', 'pass'))
>>> r.status_code
200
>>> r.headers['content-type']
'application/json; charset=utf8'
>>> r.encoding
'utf-8'
>>> r.text
u'{"type":"User"...'
>>> r.json()
{u'private_gists': 419, u'total_private_repos': 77, ...}
```
If the response is `json`, requests auto converts to `dict`.
```
import requests
data = {'name': 'kracekumar', 'lang': ['python', 'go']}
headers = {'Content-type': 'application/json'}
r = requests.post("http://api.someserver.com/v1/profile/update", headers=headers, data=data)
```
Here I have set `content-type` to `json`, but forgot to encode the data to `json` (made this mistake multiple times), server will complain.
I think auto converting `data` to `json` will bring consistency to requests since `json` is converted to `dict` in response.
### Implementation
If `content-type` is set to `application/json` and data is `dict`, do `json.dumps(data)`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1790/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1790/timeline
| null |
completed
| null | null | false |
[
    "Hey, thanks for this @kracekumar!\n\nI don't think this is a good idea, however. The key reason is that auto-converting outgoing data is upsettingly magic. As much as possible, what you put in one parameter to the request function (e.g. `headers`) should not affect what happens to something that came in on another parameter (e.g. `data`, or `auth`).\n\nIt's also not quite the same as what we do on `Response`s. For a `Response`, we don't automatically convert to JSON unless explicitly asked to by the user: that is, they have to actually call `Response.json()`. That fits the Zen of Python (\"Explicit is better than implicit\"). For that reason as well, I'd rather that we stick to the current behaviour.\n\nLet's leave this open to see if @sigmavirus24 agrees.\n",
"Oh, Now I got the point why `response.json` was made `response.json()` in 1.0.0. \n",
"This has been proposed countless times before and every time it has been shot down. We analyze your request to make sure that it appears as correct as we can possibly ensure without restricting what you can actually send.\n\nAs @Lukasa said, headers have currently no effect on any other parameter we use and they shouldn't have any effect here. `data`+`files` has an effect because it's the simplest API to making a `multipart/form-data` request especially since sending a file is safest when using that multipart type. Beyond the fact that this is a bad design for an API (it is entirely unintuitive), we would then have people insisting that because we accepted this then we should also accept other `Content-Type` headers and parse the other parameters based upon that. For example, someone might insist that we accept a header like `multipart/form-data; boundary=xyz` and dictionaries for `data` and `files` and then parse out the boundary and create a multipart/form-data` request. That is insanity.\n\nSomeone else might want us to turn his dictionaries into XML for him... \n\nYou can see exactly where this rabbit hole goes, just like https://github.com/kennethreitz/requests/pull/1779 led to https://github.com/kennethreitz/requests/pull/1785\n\nFinally, you seem to already understand it, but let me stress:\n\n``` python\nimport requests\n\nr = requests.get('https://www.google.com')\nr.json()\n```\n\nWill raise an exception as well it should. In both your example, and this one though, calling `r.content` **or** `r.text` will get you the body of the response.\n\nI'm :-1: \\* 10 on this (regardless of how convenient it would make developing things like [github3.py](https://github.com/sigmavirus24/github3.py)).\n",
"Sorry, I didn't think people may request similar kind of features for other content-types. Only reason to propose this was `json` is so commonly used. \n\nI am ok to close the issue. \n",
"Alright then, let's close this. Thanks for the suggestion @kracekumar and please keep them coming! We always say 'no' to more things than we say 'yes' to, but we can't say 'yes' to things people never ask us to do. You're doing good work! Keep it up.\n\n:cake:\n"
] |
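The mistake described in the issue above — setting `Content-Type: application/json` while passing an unencoded `dict` as `data` — is avoided by serializing explicitly before sending. (Later requests releases, from 2.4.2 on, also added a `json=` keyword argument that performs this encoding for you.) A stdlib-only sketch; the URL is hypothetical:

```python
import json

data = {"name": "kracekumar", "lang": ["python", "go"]}

# Encode the body yourself instead of expecting the library to infer
# intent from the headers you set.
body = json.dumps(data)
headers = {"Content-Type": "application/json"}

# e.g. requests.post("http://api.someserver.com/v1/profile/update",
#                    headers=headers, data=body)

# The server can now decode exactly what was sent.
assert json.loads(body) == data
```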
https://api.github.com/repos/psf/requests/issues/1789
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1789/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1789/comments
|
https://api.github.com/repos/psf/requests/issues/1789/events
|
https://github.com/psf/requests/issues/1789
| 24,111,608 |
MDU6SXNzdWUyNDExMTYwOA==
| 1,789 |
Requests on PYPY 2.21 is over twice as slow as on CPython
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/456254?v=4",
"events_url": "https://api.github.com/users/handloomweaver/events{/privacy}",
"followers_url": "https://api.github.com/users/handloomweaver/followers",
"following_url": "https://api.github.com/users/handloomweaver/following{/other_user}",
"gists_url": "https://api.github.com/users/handloomweaver/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/handloomweaver",
"id": 456254,
"login": "handloomweaver",
"node_id": "MDQ6VXNlcjQ1NjI1NA==",
"organizations_url": "https://api.github.com/users/handloomweaver/orgs",
"received_events_url": "https://api.github.com/users/handloomweaver/received_events",
"repos_url": "https://api.github.com/users/handloomweaver/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/handloomweaver/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/handloomweaver/subscriptions",
"type": "User",
"url": "https://api.github.com/users/handloomweaver",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2013-12-11T14:55:35Z
|
2021-09-09T00:10:10Z
|
2014-02-03T11:02:44Z
|
NONE
|
resolved
|
A get request between two local servers in same data center that is typically 15ms on CPython is more like 33ms on the same box with latest PYPY.
Why should this be?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1789/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1789/timeline
| null |
completed
| null | null | false |
[
"Before I go digging into this, what's your sample? I would want a fairly conclusive demonstration that this is actually a problem: that means something like doing 1000 requests to something on a local network (or even the same box) in CPython and the same again in PyPy, then checking the _average_ execution time for each request.\n\nWhat exactly did you do to conclude this?\n",
"Ok\nThis hits couchbase on a server for a small slab of json in same data centre from same server running two virtualenvs - 1 cpython 2.74 and one pypy2.21. \n\n``` python\nimport requests\nfrom utils.timefunctions import etime\nfrom profilehooks import profile\n\n@profile\ndef req():\n for i in xrange(2000):\n r = requests.get(\"http://data.atomised.io:8091/couchBase/default/1738568:1739813:29610\")\n\ndef main():\n req()\n\nif __name__ == \"__main__\":\n main()\n```\n\n\n\n",
"It seems to spend most of its time in `_socket.getaddrinfo()` (if I am reading the traces correctly).\nSo it shouldn't be the fault of requests but pypy. Maybe you want to profile that and report it to the pypy people.\n",
"will do\n",
"Yeah, @t-8ch seems to have the right analysis. In CPython, it takes 29milliseconds per call to `requests.get()`, of which 22ms is `getaddrinfo()`. In PyPy, it takes 47 ms per call to `requests.get()`, of which 35ms is `getaddrinfo()` That accounts for 13 ms of the 18ms difference. I imagine the remaining portion of that time difference is probably because the profiler doesn't play well with PyPy (I seem to recall that being a problem, though @alex will surely be able to correct me if it isn't).\n",
"Do you have a benchmark of just `getaddrinfo()`? /cc @fijal\n",
"Hmm, I wanted to test this. Running on Windows, I get garbage output when profiling on PyPy, but I wrote a quick test of `getaddrinfo()`:\n\n``` python\nimport socket\nimport time\n\ndef req():\n for i in xrange(2000):\n res = socket.getaddrinfo('lukasa.co.uk', 'http')\n\nif __name__ == '__main__':\n start = time.time()\n req()\n end = time.time()\n\n print \"Elapsed time: %f\" % (end - start)\n```\n\nOutput:\n\n```\nPyPy:\nElapsed time: 1.203000\n\nCPython:\nElapsed time: 1.011000\n```\n\nThis _is_ slower, but only marginally. I don't think this is PyPy's fault, unless @handloomweaver can demonstrate the problem on their hardware/network.\n",
"Closed due to inactivity.\n"
] |
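The benchmark from the thread above can be reproduced offline by resolving `localhost` instead of a remote host. Absolute numbers vary by platform and resolver, so treat the result only as a relative comparison between interpreters (e.g. CPython vs. PyPy), not as an absolute measurement:

```python
import socket
import time


def bench_getaddrinfo(host, port, n=200):
    # Time n name resolutions; getaddrinfo() dominated the profiles
    # discussed in the issue above.
    start = time.perf_counter()
    for _ in range(n):
        socket.getaddrinfo(host, port)
    return time.perf_counter() - start


elapsed = bench_getaddrinfo("localhost", 80)
print("Elapsed time: %f" % elapsed)
assert elapsed > 0
```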
https://api.github.com/repos/psf/requests/issues/1788
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1788/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1788/comments
|
https://api.github.com/repos/psf/requests/issues/1788/events
|
https://github.com/psf/requests/pull/1788
| 24,086,100 |
MDExOlB1bGxSZXF1ZXN0MTA3MTQ2NDY=
| 1,788 |
fixed https connection to be able to handle SSL_v3 when using it with Op...
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/281777?v=4",
"events_url": "https://api.github.com/users/AtlasPilotPuppy/events{/privacy}",
"followers_url": "https://api.github.com/users/AtlasPilotPuppy/followers",
"following_url": "https://api.github.com/users/AtlasPilotPuppy/following{/other_user}",
"gists_url": "https://api.github.com/users/AtlasPilotPuppy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/AtlasPilotPuppy",
"id": 281777,
"login": "AtlasPilotPuppy",
"node_id": "MDQ6VXNlcjI4MTc3Nw==",
"organizations_url": "https://api.github.com/users/AtlasPilotPuppy/orgs",
"received_events_url": "https://api.github.com/users/AtlasPilotPuppy/received_events",
"repos_url": "https://api.github.com/users/AtlasPilotPuppy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/AtlasPilotPuppy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AtlasPilotPuppy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/AtlasPilotPuppy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-12-11T05:01:57Z
|
2021-09-09T00:01:25Z
|
2013-12-11T06:58:29Z
|
NONE
|
resolved
|
...enSSL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1788/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1788/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1788.diff",
"html_url": "https://github.com/psf/requests/pull/1788",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1788.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1788"
}
| true |
[
"This is `urllib3` change, not requests.\nYou should open PR in `urllib3` not here.\n",
"Good point.\nOn Dec 10, 2013 10:26 PM, \"Park Ilsu\" [email protected] wrote:\n\n> This is urllib3 change, not requests.\n> You should open PR in urllib3 not here.\n> \n> —\n> Reply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/pull/1788#issuecomment-30295114\n> .\n",
"Thanks for this, good luck at urllib3!\n",
"@anantasty next time, please check the README in the `requests/packages` directory which GitHub kindly displays when you visit [that directory](https://github.com/kennethreitz/requests/tree/master/requests/packages).\n"
] |
https://api.github.com/repos/psf/requests/issues/1787
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1787/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1787/comments
|
https://api.github.com/repos/psf/requests/issues/1787/events
|
https://github.com/psf/requests/issues/1787
| 24,066,907 |
MDU6SXNzdWUyNDA2NjkwNw==
| 1,787 |
socket.timeout exception
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/179599?v=4",
"events_url": "https://api.github.com/users/ThiefMaster/events{/privacy}",
"followers_url": "https://api.github.com/users/ThiefMaster/followers",
"following_url": "https://api.github.com/users/ThiefMaster/following{/other_user}",
"gists_url": "https://api.github.com/users/ThiefMaster/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ThiefMaster",
"id": 179599,
"login": "ThiefMaster",
"node_id": "MDQ6VXNlcjE3OTU5OQ==",
"organizations_url": "https://api.github.com/users/ThiefMaster/orgs",
"received_events_url": "https://api.github.com/users/ThiefMaster/received_events",
"repos_url": "https://api.github.com/users/ThiefMaster/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ThiefMaster/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ThiefMaster/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ThiefMaster",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 31 |
2013-12-10T21:27:49Z
|
2021-09-08T23:06:06Z
|
2015-01-19T09:24:40Z
|
CONTRIBUTOR
|
resolved
|
I think that exception should be handled/wrapped internally and be reraised as a `RequestException`-based exception.
```
Traceback (most recent call last):
File "X:\dev\rotaryboard\daemon\testdaemon.py", line 117, in <module>
res = foobar_cmd()
File "X:\dev\rotaryboard\daemon\testdaemon.py", line 53, in foobar_cmd
return requests.get(url, params=params, auth=('foobar', 'meow'), timeout=0.1).json()
File "F:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "F:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "F:\Python27\lib\site-packages\requests\sessions.py", line 382, in request
resp = self.send(prep, **send_kwargs)
File "F:\Python27\lib\site-packages\requests\sessions.py", line 485, in send
r = adapter.send(request, **kwargs)
File "F:\Python27\lib\site-packages\requests\adapters.py", line 388, in send
r.content
File "F:\Python27\lib\site-packages\requests\models.py", line 676, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "F:\Python27\lib\site-packages\requests\models.py", line 615, in generate
decode_content=True):
File "F:\Python27\lib\site-packages\requests\packages\urllib3\response.py", line 236, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "F:\Python27\lib\site-packages\requests\packages\urllib3\response.py", line 183, in read
data = self._fp.read(amt)
File "F:\Python27\lib\httplib.py", line 552, in read
s = self.fp.read(amt)
File "F:\Python27\lib\httplib.py", line 1288, in read
return s + self._file.read(amt - len(s))
File "F:\Python27\lib\socket.py", line 378, in read
data = self._sock.recv(left)
socket.timeout: timed out
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1787/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1787/timeline
| null |
completed
| null | null | false |
[
"Ok, seems to be a duplicate of #1236... still annoying.\n",
"With the latest version this no longer seems to show. The socket exception has been wrapped with a requests.exceptions.TimeoutException . \nI think it is fair to say this can be closed?\n",
"It happened with requests 2.1.0. So if that's the \"latest version\" it might not be wrapped in all cases.\n",
"That's actually a urllib3 bug: `socket.timeout` should be caught in urllib3 and rethrown as a urllib3 exception. Mind opening this issue over there?\n",
"@ThiefMaster , Hi, can you still reproduce this using the requests package compiled from the sources on here?\nI burrowed down into the urllib3 and it seems i can't reproduce this issue.\nCan you provide me with the server code or give me some hints of what it is or description something to help me diagnose the problem better?\nThanks.\n",
"When it happened the server was not running. And even then it just happened sporadically (the script was trying to GET a page every ~0.5s all night and at some point failed with the error above).\n",
"@adaschevici I've been experiencing this bug sporadically too. It's been occurring under similar circumstances to @ThiefMaster. I'm just running a script which makes a bunch of GET requests on 3rd party servers. About 1 of every 100 requests which time out throw the bad exception.\n\nI've been attempting to reproduce it consistently, but haven't had any luck.\n\nedit: This seems to be the exact same issue as: kennethreitz/requests/issues/1236. I'm also using 2.1.0\n",
"Okay, so what I can tell from a cursory look at the code:\n- This socket.timeout exception will only happen when the connection was successful, but it _times out while reading the response_ from the server, ie. the server went away inbetween the original request and finishing the response retrieval. HTTP requests are usually fairly short-lived, thus this would be an uncommon scenario - that'd explain why it's hard to reproduce.\n- The issue manifests itself in urllib3's `response.py`.\n- `httplib` does _not_ wrap this in a custom kind of exception, and just leaves it propagate as `socket.exception`. It is unclear to me whether this is intended behaviour or not.\n- To fix this, urllib3 would have to capture `socket.exception`s on `.read()`/`.recv()` calls of the socket (via httplib), and re-raise them as a urllib3 exception of some sort (I'm unfamiliar with the internals of urllib3).\n\nI will also cross-post this to shazow/urllib3#297.\n",
"@ThiefMaster , Thanks. I will try and reproduce this. Regarding the description provided it seems that it will occur on server overload.\nWhen i tried to overload it in ubuntu what i got was a pipe error. I've been trying it on windows but unfortunately the tests are failing before this one.\nWe might be looking into a platform dependent test?\nOr maybe a patch test to provide the coverage?\n\nCrossRef: shazow/urllib3#297\n",
"Hang on, it looks to me like urllib3 doesn't catch `socket.timeout` in `Response.read()`. That feels clearly and obviously wrong to me. @shazow, (and @kevinburke since you implemented this), am I missing something here?\n",
"Makes sense, if we can get a test that triggers a `socket.timeout` in `Response.read()`.\n",
"@shazow Pfft, as a socket-level test that's easy =D. I'll take a swing at it at somepoint today if someone doesn't beat me to it.\n",
":thumbsup: \n",
"@Lukasa , where do you mean to catch the exception\nIn httpresponse.read(), yes?\nI have been working with it and trying to get the exception thrown from there but i've had no successful attempts so far.\n\nAnyway...i will retrace my steps and see where i went wrong maybe i will come up with the proper test.\n",
"Given further investigation, this seems to no longer show, at least under ubuntu.\nFrom urllib3.\nI built a http server and was returning a file on get request, made the get request run multiple times.\nClosed the server.\nAll i am getting for a stack trace is this:\n File \"/home/hackawaye/PycharmProjects/test_urllib3/test_urllib3.py\", line 15, in <module>\n r = pool.request('GET', '/', timeout=20)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/request.py\", line 75, in request\n *_urlopen_kw)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/request.py\", line 88, in request_encode_url\n return self.urlopen(method, url, *_urlopen_kw)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, *_response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, *_response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, **response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3_virtual/local/lib/python2.7/site-packages/urllib3-dev-py2.7.egg/urllib3/connectionpool.py\", line 528, in urlopen\n raise MaxRetryError(self, url, e)\nurllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=8888): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)\n",
"In requests(Ubuntu) : \n\nTraceback (most recent call last):\n File \"/home/hackawaye/PycharmProjects/test_urllib3/test_urllib3.py\", line 9, in <module>\n r = requests.get('http://localhost', timeout=0.5)\n File \"/home/hackawaye/virtual_envs/requests_virtual/local/lib/python2.7/site-packages/requests-2.1.0-py2.7.egg/requests/api.py\", line 55, in get\n return request('get', url, *_kwargs)\n File \"/home/hackawaye/virtual_envs/requests_virtual/local/lib/python2.7/site-packages/requests-2.1.0-py2.7.egg/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/home/hackawaye/virtual_envs/requests_virtual/local/lib/python2.7/site-packages/requests-2.1.0-py2.7.egg/requests/sessions.py\", line 382, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/home/hackawaye/virtual_envs/requests_virtual/local/lib/python2.7/site-packages/requests-2.1.0-py2.7.egg/requests/sessions.py\", line 485, in send\n r = adapter.send(request, *_kwargs)\n File \"/home/hackawaye/virtual_envs/requests_virtual/local/lib/python2.7/site-packages/requests-2.1.0-py2.7.egg/requests/adapters.py\", line 372, in send\n raise ConnectionError(e)\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=80): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 104] Connection reset by peer)\n\nBoth stack traces were with the code commented from the read method in urllib so i was not attempting to catch the timeout exception at all. I will test it on windows later on to see if it shows.\n",
"The way I'd test this is to use a urllib3 socket-level test, like [this one](https://github.com/shazow/urllib3/blob/master/test/with_dummyserver/test_socketlevel.py#L68-L104), but rewritten to close the socket midway through the data. Something like this:\n\n``` python\n# At the top of the file, some more imports\nfrom urllib3.import Timeout\n\n# Down with the test I linked above\n def test_timeout_midway_through_read(self):\n def socket_handler(listener):\n sock = listener.accept()[0]\n\n buf = b''\n while not buf.endswith(b'\\r\\n\\r\\n'):\n buf = sock.recv(65536)\n\n body = 'Test Data'\n sock.send(('HTTP/1.1 200 OK\\r\\n'\n 'Content-Type: text/plain\\r\\n'\n 'Content-Length: %d\\r\\n'\n '\\r\\n' % len(body)).encode('utf-8'))\n\n # Wait for the read timeout.\n time.sleep(0.002)\n\n sock.send(body.encode('utf-8'))\n sock.close()\n\n self._start_server(socket_handler)\n pool = HTTPConnectionPool(self.host, self.port)\n\n response = pool.urlopen('GET', '/', timeout=Timeout(connect=1, read=0.001))\n response.read() # Should throw our exception.\n```\n\nConfirm that this test is OK: I wrote it in this text box and haven't run it, so there could be all kinds of crazy problems here.\n",
"Looks ok to me. This is the output i am getting from the test. It's similar to what i was getting before, seems the exception has been wrapped properly.\n\nTraceback (most recent call last):\n File \"/home/hackawaye/virtual_envs/urllib3/test/with_dummyserver/test_socketlevel.py\", line 357, in test_timeout_midway_through_read\n response = pool.urlopen('GET', '/', timeout=Timeout(connect=1, read=0.001))\n File \"/home/hackawaye/virtual_envs/urllib3/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, *_response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, *_response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3/urllib3/connectionpool.py\", line 544, in urlopen\n release_conn=release_conn, **response_kw)\n File \"/home/hackawaye/virtual_envs/urllib3/urllib3/connectionpool.py\", line 528, in urlopen\n raise MaxRetryError(self, url, e)\nMaxRetryError: HTTPConnectionPool(host='localhost', port=35642): Max retries exceeded with url: / (Caused by <class 'socket.error'>: [Errno 111] Connection refused)\n-------------------- >> begin captured logging << --------------------\nurllib3.connectionpool: INFO: Starting new HTTP connection (1): localhost\nurllib3.connectionpool: DEBUG: \"GET / HTTP/1.1\" 200 9\nurllib3.connectionpool: WARNING: Retrying (3 attempts remain) after connection broken by 'timeout('timed out',)': /\nurllib3.connectionpool: INFO: Starting new HTTP connection (2): localhost\nurllib3.connectionpool: WARNING: Retrying (2 attempts remain) after connection broken by 'error(104, 'Connection reset by peer')': /\nurllib3.connectionpool: INFO: Starting new HTTP connection (3): localhost\nurllib3.connectionpool: WARNING: Retrying (1 attempts remain) after connection broken by 'error(111, 'Connection refused')': /\nurllib3.connectionpool: INFO: Starting new HTTP connection (4): localhost\n--------------------- >> end captured logging << 
---------------------\n",
"Arg, sorry, you need to turn the retries off. Pass `retries=0` into `urlopen`.\n",
"Yes, did that same result:\nMy guess is that it is being wrapped from the HTTPConnectionPool\nTraceback (most recent call last):\n File \"/home/hackawaye/virtual_envs/urllib3/test/with_dummyserver/test_socketlevel.py\", line 357, in test_timeout_midway_through_read\n response = pool.urlopen('GET', '/', retries=0, timeout=Timeout(connect=1, read=0.001))\n File \"/home/hackawaye/virtual_envs/urllib3/urllib3/connectionpool.py\", line 528, in urlopen\n raise MaxRetryError(self, url, e)\nMaxRetryError: HTTPConnectionPool(host='localhost', port=34819): Max retries exceeded with url: / (Caused by <class 'socket.timeout'>: timed out)\n\nDoing this on ubuntu i will try it on windows tonight.\n",
"See, we're failing in the wrong place here. I'm expecting to throw the exception in `read()` not in `urlopen`. I'll see if I can work out why.\n",
"Uh, I know why. Also pass `preload_content=False` to `urlopen()`.\n",
"good call. Throws it now. Wasn't aware of that preload_content directive.\nI will update the fix and get the test in as well. Thanks for the help.\n",
"I'am gona sneak into the thread, because I've experienced the very same bug.\n\nThe fix isn't live yet, am I right? I'am using the latest version from requests and it still occurs to me.\nIs there a way to fix that these exceptions didn't even occur?\n\nMy code which raise the exception:\n\n```\ntry:\n if session:\n r = session.get(url, stream=True)\n else:\n r = requests.get(url, stream=True)\nexcept requests.exceptions.ConnectionError as e:\n xbmc.log(\"Saving Error: \" + e, xbmc.LOGDEBUG)\n return\nwith open(file_path, \"wb\") as file:\n for chunk in r.iter_content(1024):\n if not chunk:\n break\n file.write(chunk)\n```\n\nThe exception itself:\n\n```\n18:34:29 T:5812 ERROR: EXCEPTION Thrown (PythonToCppException) : -->Python callback/script returned the following error<--\n - NOTE: IGNORING THIS CAN LEAD TO MEMORY LEAKS!\n Error Type: <class 'socket.timeout'>\n Error Contents: timed out\n Traceback (most recent call last):\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\steambmc\\default.py\", line 211, in <module>\n steamuser.getOwnedGames(prog_callback=progress, artupdate=True)\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\steambmc\\steamapi.py\", line 270, in getOwnedGames\n game.scrapePromo(s, artupdate)\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\steambmc\\steamapi.py\", line 165, in scrapePromo\n file_path = self.__downloadFile(url, session, \"promo\")\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\steambmc\\steamapi.py\", line 198, in __downloadFile\n for chunk in r.iter_content():\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\script.module.requests2\\lib\\requests\\models.py\", line 616, in generate\n decode_content=True):\n File \"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\script.module.requests2\\lib\\requests\\packages\\urllib3\\response.py\", line 236, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File 
\"C:\\Users\\Leandros\\AppData\\Roaming\\XBMC\\addons\\script.module.requests2\\lib\\requests\\packages\\urllib3\\response.py\", line 183, in read\n data = self._fp.read(amt)\n File \"C:\\Program Files (x86)\\XBMC\\system\\python\\Lib\\httplib.py\", line 542, in read\n s = self.fp.read(amt)\n File \"C:\\Program Files (x86)\\XBMC\\system\\python\\Lib\\socket.py\", line 377, in read\n data = self._sock.recv(left)\n timeout: timed out\n -->End of Python script error report<--\n```\n",
"Correct, the fix isn't live yet. This exception is raised when a server doesn't respond with data within our timeout. To prevent these, you can try extending the timeout by passing a `timeout` parameter to `requests.get()` or `session.get()`. However, if the server has for some reason stopped sending data, that bug will manifest in this way.\n",
"Ok, I did tested it with a 30 seconds `timeout`, but I'am going to add a bit higher one and try to repeat the request if it fails. \n\nThanks!\n",
"Is this bug fixed yet?\n\nI am getting this same \"socket.timeout: timed out\" exception when using requests with a particular embedded device.\n\nThe exception occurs even when I use Session with max_retries argument: session.mount('http://', HTTPAdapter(max_retries=10). I'm guessing that when the \"socket.timeout\" exception is translated to a requests-based exception, that max_retries will catch it and retry.\n\nIn the meantime, any advice to get around this \"socket.timeout\" exception? I suppose I could catch and retry in a higher level.\n",
"@darshahlu To be clear, are you actually getting a `socket.timeout` exception, or one that has been wrapped? Can I see the full exception text?\n",
"@Lukasa Sure, here is the full exception (starting at requests code):\n\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 395, in get\n return self.request('GET', url, *_kwargs)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 383, in request\n resp = self.send(prep, *_send_kwargs)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\adapters.py\", line 394, in send\n r.content\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\models.py\", line 679, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\models.py\", line 616, in generate\n decode_content=True):\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\packages\\urllib3\\response.py\", line 236, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\packages\\urllib3\\response.py\", line 183, in read\n data = self._fp.read(amt)\n File \"C:\\Python27\\Lib\\httplib.py\", line 567, in read\n s = self.fp.read(amt)\n File \"C:\\Python27\\Lib\\socket.py\", line 380, in read\n data = self._sock.recv(left)\nsocket.timeout: timed out\n\nWhen I merged the fixed response.py (https://raw.githubusercontent.com/adaschevici/urllib3/296-exception-not-properly-wrapped/urllib3/response.py), the exception now changed to (the retry did not occur as I hoped):\n\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 395, in get\n return self.request('GET', url, *_kwargs)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 383, in request\n resp = self.send(prep, *_send_kwargs)\n File 
\"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\sessions.py\", line 486, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\adapters.py\", line 394, in send\n r.content\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\models.py\", line 679, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\models.py\", line 616, in generate\n decode_content=True):\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\packages\\urllib3\\response.py\", line 245, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File \"C:\\dev\\virtualenvs\\py27mft\\lib\\site-packages\\requests\\packages\\urllib3\\response.py\", line 223, in read\n \"Remote connection closed. Read timed out.\")\nurllib3.exceptions.ReadTimeoutError: <requests.packages.urllib3.response.HTTPResponse object at 0x000000000326BDD8>: Remote connection closed. Read timed out.\n\nBy the way: this issue is easily reproduced on this particular embedded device--occurs within seconds. I took a packet trace and I can see a difference between no-issue and issue--but not sure if it is the client (requests) or the server (my embedded device) that is misbehaving.\n\nThanks!\nDarhsan\n",
"Ok, so it looks like there's a fix for the exact issue in the GH issue (unwrapped socket error), so that's something.\n\nI don't think `max_retries` will help you here. As far as I recall, `max_retries` only retries if errors are occurred during connection, which is not happening here. Here, we're timing out our attempt to read the response.\n\nYou say you took a packet trace and can't see the difference: do you mind posting them?\n"
] |
https://api.github.com/repos/psf/requests/issues/1786
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1786/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1786/comments
|
https://api.github.com/repos/psf/requests/issues/1786/events
|
https://github.com/psf/requests/issues/1786
| 23,897,795 |
MDU6SXNzdWUyMzg5Nzc5NQ==
| 1,786 |
Certificate Revocation List (CRL) support with https
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/33690?v=4",
"events_url": "https://api.github.com/users/chutz/events{/privacy}",
"followers_url": "https://api.github.com/users/chutz/followers",
"following_url": "https://api.github.com/users/chutz/following{/other_user}",
"gists_url": "https://api.github.com/users/chutz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chutz",
"id": 33690,
"login": "chutz",
"node_id": "MDQ6VXNlcjMzNjkw",
"organizations_url": "https://api.github.com/users/chutz/orgs",
"received_events_url": "https://api.github.com/users/chutz/received_events",
"repos_url": "https://api.github.com/users/chutz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chutz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chutz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chutz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2013-12-07T02:23:46Z
|
2021-09-03T00:10:42Z
|
2013-12-09T19:12:17Z
|
NONE
|
resolved
|
Currently requests doesn't support checking whether a certificate has been revoked in a CRL when using verify=True; it would be quite useful if this was supported.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1786/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1786/timeline
| null |
completed
| null | null | false |
[
"Have you encountered a certificate that should have been revoked which wasn't? I'm 90% sure that Kenneth curates the list carefully to remove revoked certs.\n",
"We have a new policy thanks to the security-conscious guys handling Requests' integration with `pip`. Each release we update our list of certs, which includes removing any that have been revoked.\n\nNow, obviously this is not perfect: if for any reason you can't update Requests you have to do this yourself, and there's also a delay of a month or so (usually) between releases. So that's a bit lame.\n\nThe much bigger problem is that this is not easy to do in a user-friendly way. To consult a CRL the user would have to tell us which CRL to look at. These CRLs can be provided in a number of ways (IIRC the standard already suggests WebDAV and HTTP, and flat files are possible as well). In practice, we'd either have to hardcode _all_ the possible CRLs, or resort to an API where the user provides the location of the CRL they'd like us to consult. That API is pretty brittle, frankly.\n\nI'd be happy to have this if we could think of a decent API for it, but I just don't think we can. Given that it's also not a widely-requested feature, it's probably going to be best to ask that users who are worried about it curate their own certificate lists.\n",
"On Fri, 06 Dec 2013 23:13:10 -0800\n\n> I'd be happy to have this if we could think of a decent API for it,\n> but I just don't think we can. Given that it's also not a\n> widely-requested feature, it's probably going to be best to ask that\n> users who are worried about it curate their own certificate lists.\n\nI am mostly concerned about using this internally, we have our own\ncertificate management system, and we do publish CRLs for our internal\ncertificates, this will allow us to have requests fully integrate into\nour internal certificate system.\n\nFor my usage, being able to specify a file or set of files to read the\nCRL from would be quite sufficient, maybe along the lines of how a\nparticular client cert is specified.\n",
"So are you providing a custom certfile anyway? If so, it seems like the best thing to do is to handle this outside Requests: obtain and parse the CRL, apply the changes to the certfile before passing it to Requests. I just don't think this feature would be widely-used enough to make it worth the pain.\n",
"I think this is a problem with OpenSSL (or perhaps just Python's implementation of it), not Requests. \n\nI would recommend setting up your own CA bundle and just providing that. with the `CURL_CA_BUNDLE` environment variable, or similar. \n",
"On a slightly tangential point: Is requests still under a feature freeze?\n",
"Yes, but policies are never without exceptions.\n",
"The cert bundle won't work as we have a single CA cert and accept any cert signed by our CA cert and not in the CRL (occasionally with some CN restrictions), so doing processing outside of requests doesn't really work for us (I suspect this is a fairly common use case, I know there are lots of companies that use this internally).\n",
"Thanks for the feedback.\n",
"I am also interested by this feature. When using an internal CA, the CRL as a file is the easiest path. Far easier than using an OCSP endpoint and more reliable than embedding an HTTP URL into the certificate (no way that the CRL can be unavailable due to a temporary outage).\n\n@Lukasa I don't understand when you say this can be done outside requests. There is no way to preprocess the certfile with the CRL. The certfile is usually a root certificate and the CRL contains the serial numbers of revoked certificates. There is no way to apply a CRL to a root certificate.\n\nWith curl, you can specify it with `--crlfile`.\n",
"@vincentbernat Yeah, that's a fair point.\n\nWe're aiming to add support for providing an external SSLContext object, which should make this possible, but until that's done I don't think we can do much else.\n",
"Plus one for this feature request for use with an internal CA."
] |
https://api.github.com/repos/psf/requests/issues/1785
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1785/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1785/comments
|
https://api.github.com/repos/psf/requests/issues/1785/events
|
https://github.com/psf/requests/pull/1785
| 23,880,585 |
MDExOlB1bGxSZXF1ZXN0MTA2MDU5OTU=
| 1,785 |
Made default_user_agent reusable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1580136?v=4",
"events_url": "https://api.github.com/users/smallcode/events{/privacy}",
"followers_url": "https://api.github.com/users/smallcode/followers",
"following_url": "https://api.github.com/users/smallcode/following{/other_user}",
"gists_url": "https://api.github.com/users/smallcode/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/smallcode",
"id": 1580136,
"login": "smallcode",
"node_id": "MDQ6VXNlcjE1ODAxMzY=",
"organizations_url": "https://api.github.com/users/smallcode/orgs",
"received_events_url": "https://api.github.com/users/smallcode/received_events",
"repos_url": "https://api.github.com/users/smallcode/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/smallcode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smallcode/subscriptions",
"type": "User",
"url": "https://api.github.com/users/smallcode",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
}
] |
closed
| true | null |
[] | null | 3 |
2013-12-06T19:46:03Z
|
2021-09-08T22:01:05Z
|
2013-12-07T05:55:44Z
|
NONE
|
resolved
|
Make the version reusable too.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1580136?v=4",
"events_url": "https://api.github.com/users/smallcode/events{/privacy}",
"followers_url": "https://api.github.com/users/smallcode/followers",
"following_url": "https://api.github.com/users/smallcode/following{/other_user}",
"gists_url": "https://api.github.com/users/smallcode/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/smallcode",
"id": 1580136,
"login": "smallcode",
"node_id": "MDQ6VXNlcjE1ODAxMzY=",
"organizations_url": "https://api.github.com/users/smallcode/orgs",
"received_events_url": "https://api.github.com/users/smallcode/received_events",
"repos_url": "https://api.github.com/users/smallcode/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/smallcode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smallcode/subscriptions",
"type": "User",
"url": "https://api.github.com/users/smallcode",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1785/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1785/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1785.diff",
"html_url": "https://github.com/psf/requests/pull/1785",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1785.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1785"
}
| true |
[
"I'm now -1 on this. I was neutral on the previous change, but each additional changeable value takes us one step closer to the logical endpoint of this trend, which is every parameter is editable. I think we should stop changing this, and just let people reimplement the totally trivial logic of this function if they feel they need it.\n",
"The slippery slope was my own concern about the last pull request. Furthermore, there aren't even any tests for this, not that their addition would make this anymore palatable to me.\n",
"It's not a big deal. I just think if you just change the name, without also changing the version, it will cause confusion. For example: 'other-package-name/2.1.0', where the name has changed, but the version is still requests version. Either all allowed to change, either all changes are not allowed.\n"
] |
https://api.github.com/repos/psf/requests/issues/1784
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1784/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1784/comments
|
https://api.github.com/repos/psf/requests/issues/1784/events
|
https://github.com/psf/requests/issues/1784
| 23,845,672 |
MDU6SXNzdWUyMzg0NTY3Mg==
| 1,784 |
Can't upload file larger than 2GB to webdav by requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2759502?v=4",
"events_url": "https://api.github.com/users/wchang/events{/privacy}",
"followers_url": "https://api.github.com/users/wchang/followers",
"following_url": "https://api.github.com/users/wchang/following{/other_user}",
"gists_url": "https://api.github.com/users/wchang/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/wchang",
"id": 2759502,
"login": "wchang",
"node_id": "MDQ6VXNlcjI3NTk1MDI=",
"organizations_url": "https://api.github.com/users/wchang/orgs",
"received_events_url": "https://api.github.com/users/wchang/received_events",
"repos_url": "https://api.github.com/users/wchang/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/wchang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wchang/subscriptions",
"type": "User",
"url": "https://api.github.com/users/wchang",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 25 |
2013-12-06T09:20:32Z
|
2021-09-09T00:28:30Z
|
2013-12-11T12:34:35Z
|
NONE
|
resolved
|
Dear Sir,
I can't upload a file larger than 2GB to a webdav server by requests; the following is the error message. Any idea about how to solve this problem? Thanks a lot for your help.
OS: SL5, SL6
Python version: 2.6.6
requests version: 2.1.0
Traceback (most recent call last):
  File "upload.py", line 5, in <module>
  File "/opt/rucio/.venv/lib64/python2.6/site-packages/requests/api.py", line 99, in put
    return request('put', url, data=data, **kwargs)
  File "/opt/rucio/.venv/lib64/python2.6/site-packages/requests/api.py", line 44, in request
    return session.request(method=method, url=url, **kwargs)
  File "/opt/rucio/.venv/lib64/python2.6/site-packages/requests/sessions.py", line 361, in request
    resp = self.send(prep, **send_kwargs)
  File "/opt/rucio/.venv/lib64/python2.6/site-packages/requests/sessions.py", line 464, in send
    r = adapter.send(request, **kwargs)
  File "/opt/rucio/.venv/lib64/python2.6/site-packages/requests/adapters.py", line 363, in send
    raise SSLError(e)
requests.exceptions.SSLError: [Errno 5] _ssl.c:1217: Some I/O error occurred
Best Regards,
Evan Chang
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1784/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1784/timeline
| null |
completed
| null | null | false |
[
"Hi: can I see the line of code you're using to send the file?\n",
"Hi,\nIt's my simple code:\nimport requests\nfile=open('/home/evan/file_2.5G_01','rb')\ndata=file.read()\nrequests.put('https://my/webdav/server/file_2.5G_01', data=data, verify=False, cert=('/tmp/evan.proxy','/tmp/evan.proxy'))\n\nThanks,\nEvan Chang\n",
"Can you try streaming the upload? Change your code to:\n\n``` python\nimport requests\nfile=open('/home/evan/file_2.5G_01','rb')\nrequests.put('https://my/webdav/server/file_2.5G_01', data=file, verify=False, cert=('/tmp/evan.proxy','/tmp/evan.proxy'))\n```\n",
"Hi,\nNow I can upload my file to webdav server by the code you changed, but I saw the file size is 0\nwhen I use \"du -sh\" to check it.\n\ndu -sh wj_rucio_2.5G_20.10483315.0\n0 wj_rucio_2.5G_20.10483315.0\n\nBest Regards,\nEvan Chang\n",
"@wchang we do not change the size of the file.\n",
"Hi,\nI mean I checked the file size after I upload my file to webdav server, and I found the size is 0,\nwhich means there is something wrong during the file upload. I'm not sure it's requests problem \nor webdav's problem, any comment ? Thanks a lot for your help. \n",
"Oh the size on the server side is 0? That could e from an old upload that failed. Could you delete it and try again?\n",
"Yes, the size on the server side is 0 and I tried more than three times. I can upload files larger than 2GB by curl successfully and I can also upload files < 2GB without problem by requests.put, but I can't upload files > 2GB successfully by requests.put. \n\nThanks. \n",
"By the way, this is the curl command I'm using, maybe it's helpful for debugging \n\ncurl -E /my/certificate --capath /path/for/verifying -L https://hostname/of/my/webdav/server:443/path/file_2.5G_30 -T file_2.5G_30\n\nThanks\n",
"So I reduced that to the simplest form, which is\n\n```\ncurl --upload-file=file -L https://httpbin.org/put\n```\n\nFor those unfamiliar (as I was), `-T/--upload-file` sets the verb to put and streams the file. cURL also **expects** a 100 Continue response. (#713) It receives that and redirects to what you would expect, which is a 200 OK. (To test this out yourself just add the `-v` option to the command above to see the debugging information.\n\nUnfortunately, however, I cannot find very good documentation as to how cURL handles the data. There's no `Content-Type` header set by the request, simply `Content-Length`, which requests also sets for you with the streaming upload.\n\nLooking more closely over the issue, however, your first code snippet includes this:\n\n```\ncert=('/tmp/evan.proxy','/tmp/evan.proxy')\n```\n\nAre you behind a proxy while trying to upload the file? If so, is it possible that is causing the issues you're seeing? It might not seem likely but the fact of the matter is that cURL handles proxies far better than we do.\n\nCan you try putting the file to `http://httpbin.org/put` and let us know what happens. Specifically check what JSON is returned.\n",
"Hi, I got this message when I tried to upload a 2.5GB file. I didn't get such message when I uploaded 1MB file.\n\n/put\nTraceback (most recent call last):\n File \"upload.py\", line 6, in <module>\n requests.put('https://httpbin.org/put', data=file)\n File \"/asgc_ui_home/wjc00/python26/lib/python2.6/site-packages/requests/api.py\", line 99, in put\n return request('put', url, data=data, *_kwargs)\n File \"/asgc_ui_home/wjc00/python26/lib/python2.6/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/asgc_ui_home/wjc00/python26/lib/python2.6/site-packages/requests/sessions.py\", line 382, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/asgc_ui_home/wjc00/python26/lib/python2.6/site-packages/requests/sessions.py\", line 485, in send\n r = adapter.send(request, *_kwargs)\n File \"/asgc_ui_home/wjc00/python26/lib/python2.6/site-packages/requests/adapters.py\", line 372, in send\n raise ConnectionError(e)\nrequests.exceptions.ConnectionError: HTTPSConnectionPool(host='httpbin.org', port=443): Max retries exceeded with url: /put (Caused by <class 'socket.error'>: [Errno 104] Connection reset by peer)\n\nCheers,\nEvan Chang\n",
"And here is a brief code\nimport requests\nfile=open('/path/evan_2.5G_01','rb')\nrequests.put('https://httpbin.org/put', data=file)\n",
"Seeing `Connection reset by peer` is setting off alarm bells for me, especially as it isn't the same exception as you hit originally.\n\n2,5GB is an interesting number, only because it's very slightly larger than the maximum value of a 32-bit signed integer, and also not very far from the maximum memory allowed to a 32-bit process. Does curl have trouble when uploading the same file to httpbin?\n",
"Hi,\nI got the message when I upload file to httpbin by curl:\n$ curl --upload-file evan_2.5G_02 -L https://httpbin.org/put\n<!DOCTYPE html>\n <html>\n <head>\n <style type=\"text/css\">\n html, body, iframe { margin: 0; padding: 0; height: 100%; }\n iframe { display: block; width: 100%; border: none; }\n </style>\n <title>Application Error</title>\n </head>\n <body>\n <iframe src=\"//s3.amazonaws.com/heroku_pages/error.html\">\n <p>Application Error</p>\n </iframe>\n </body>\n\nAnd yes, 2.5GB is really an interesting number. Actually, I tried to upload a 2.1GB file to our webdav server by my original code (data=file.read()) but it failed . And I can upload a 2.0GB file without any problem by my original code. Our cpu is Intel Xeon L5520. \nThanks.\n\nEvan Chang\n",
"Sorry, it's the message I got from uploading file to httpbin by curl\n\n```\n<!DOCTYPE html>\n <html>\n <head>\n <style type=\"text/css\">\n html, body, iframe { margin: 0; padding: 0; height: 100%; }\n iframe { display: block; width: 100%; border: none; }\n </style>\n <title>Application Error</title>\n </head>\n <body>\n <iframe src=\"//s3.amazonaws.com/heroku_pages/error.html\">\n <p>Application Error</p>\n </iframe>\n </body>\n```\n",
"@wchang you ignored my question about whether you're behind a proxy or not. That would not cause the Application Error you were seeing with httpbin, but it might explain the bizarre difference in behaviour between putting a 2.0 GB file and a >2.0 GB file.\n",
"Oh~ Sorry. Yes, I would like to bind a proxy. But how does it make difference in behavior between 2GB and >2GB?\nI suppose the proxy is just for user's validation.\n",
"So I think we can't use httpbin as a target here: it's clearly running out of memory.\n",
"Why it's been closed ? The problem is still there ?\n",
"So now I sum up current questions\n1. I tried to use streaming but the file size shows 0 on webdav server, how can I solve this issue? \n2. The proxy we use is just for user's certificate, does it affect the file transfer as the file size larger than 2GB?\n3. I tried to search the issue on stackoverflow and there are some comments, do you think it's helpful (I tried the \n code but didn't success)\n http://stackoverflow.com/questions/13909900/progress-of-python-requests-post\n\nThanks again for the help.\nCheers\nEvan Chang\n",
"> 1. The proxy we use is just for user's certificate, does it affect the file transfer as the file size larger than 2GB?\n\nThis is a _possibility_ but not a certainty\n\n> 1. I tried to search the issue on stackoverflow and there are some comments, do you think it's helpful\n\nThat's a very old question and answer. I doubt it would be helpful.\n\n> 1. I tried to use streaming but the file size shows 0 on webdav server, how can I solve this issue? \n\nThis is what's most baffling to me. And I wonder if this may be influenced by the proxy.\n",
"Hi\nI tried to upload my file to another webdav server which doesn't need certificate (proxy) and the file size on server side is still 0. Maybe it's not the proxy's problem.\n\nIt's a simple code:\n\n```\nwith open('/my/path/file_2.5G_01') as file:\nrequests.put('http://hostname/path/file_2.5G_01', data=file, verify=False)\n```\n\nThanks a lot.\n",
"Yeah, I'm planning to try this locally and see what happens.\n",
"Yup, so this isn't a Requests bug. I set up this simple Flask app:\n\n``` python\nfrom flask import Flask, request\napp = Flask(__name__)\n\[email protected](\"/\", methods=[\"POST\"])\ndef hello():\n with open('posted.dat', 'wb') as f:\n data = request.stream.read(1024)\n\n while data:\n f.write(data)\n data = request.stream.read(1024)\n\n return \"Success!\"\n\n\nif __name__ == '__main__':\n app.run()\n```\n\nAnd then posted a 4.34GB Fedora ISO to it. No bug was encountered here, everything went off fine, and the MD5 hashes of the two files match.\n\nThis makes me think the proxy is to blame here.\n",
"If not the proxy then the server. But I found it hard to believe this could be a requests issue when it was returning 200 OK and the server didn't have the proper file size.\n"
] |
https://api.github.com/repos/psf/requests/issues/1783
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1783/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1783/comments
|
https://api.github.com/repos/psf/requests/issues/1783/events
|
https://github.com/psf/requests/pull/1783
| 23,822,752 |
MDExOlB1bGxSZXF1ZXN0MTA1NzM3Njg=
| 1,783 |
jenkins test 2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2013-12-05T22:21:05Z
|
2021-09-08T23:05:08Z
|
2013-12-05T22:26:45Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1783/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1783/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1783.diff",
"html_url": "https://github.com/psf/requests/pull/1783",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1783.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1783"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/1782
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1782/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1782/comments
|
https://api.github.com/repos/psf/requests/issues/1782/events
|
https://github.com/psf/requests/pull/1782
| 23,822,484 |
MDExOlB1bGxSZXF1ZXN0MTA1NzM2MTA=
| 1,782 |
Fake PR for testing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2013-12-05T22:16:57Z
|
2021-09-08T23:06:26Z
|
2013-12-05T22:20:43Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1782/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1782/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1782.diff",
"html_url": "https://github.com/psf/requests/pull/1782",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1782.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1782"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/1781
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1781/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1781/comments
|
https://api.github.com/repos/psf/requests/issues/1781/events
|
https://github.com/psf/requests/pull/1781
| 23,817,778 |
MDExOlB1bGxSZXF1ZXN0MTA1NzA5NTY=
| 1,781 |
Clean up #1779.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-12-05T21:07:54Z
|
2021-09-08T23:11:09Z
|
2013-12-05T22:00:59Z
|
MEMBER
|
resolved
|
The unicode literals are a time bomb, let's just be rid of them.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1781/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1781/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1781.diff",
"html_url": "https://github.com/psf/requests/pull/1781",
"merged_at": "2013-12-05T22:00:59Z",
"patch_url": "https://github.com/psf/requests/pull/1781.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1781"
}
| true |
[
"Probably I would have fixed it if a message was left like \"Clean up the unicode, then we are good to merge\". \n",
"@kracekumar I know. =) The only reason I did this was because Kenneth got eager to push out a release last night, and I wanted to make sure we didn't slow him down. =) Your PR was a good one, thanks for providing it!\n",
"@kracekumar it's no big deal at all :)\n",
"@kennethreitz @Lukasa That is great. \n"
] |
https://api.github.com/repos/psf/requests/issues/1780
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1780/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1780/comments
|
https://api.github.com/repos/psf/requests/issues/1780/events
|
https://github.com/psf/requests/issues/1780
| 23,775,370 |
MDU6SXNzdWUyMzc3NTM3MA==
| 1,780 |
Specifying POST body content-type
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/343998?v=4",
"events_url": "https://api.github.com/users/bow/events{/privacy}",
"followers_url": "https://api.github.com/users/bow/followers",
"following_url": "https://api.github.com/users/bow/following{/other_user}",
"gists_url": "https://api.github.com/users/bow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bow",
"id": 343998,
"login": "bow",
"node_id": "MDQ6VXNlcjM0Mzk5OA==",
"organizations_url": "https://api.github.com/users/bow/orgs",
"received_events_url": "https://api.github.com/users/bow/received_events",
"repos_url": "https://api.github.com/users/bow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-12-05T09:29:11Z
|
2021-09-08T09:00:36Z
|
2013-12-09T19:12:38Z
|
NONE
|
resolved
|
I was using requests to submit POST requests to a third-party website when I noticed that the `files` parameter can also accept a three- or four-element tuple that denotes body content-type (and another header?) (https://github.com/kennethreitz/requests/blob/master/requests/models.py#L120) instead of just two as noted in the docs.
Is there a reason this is not yet mentioned in the docs or can I expect this interface may break in future releases?
I'm on Requests version 2.0.1, by the way :).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1780/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1780/timeline
| null |
completed
| null | null | false |
[
"It's not mentioned in the docs primarily out of laziness. We do talk about it elsewhere, such as on Stack Overflow. I think more generally we're not very happy with that interface, so we're always considering changing it, but I think we'd be happy to accept a docs pull request anyway. =)\n",
"Is there any reason you're not happy about it? If not (and if you do plan on keeping it), I could try look into the docs later on :).\n",
"It feels unpleasant to me, that's all. Requests' interface is normally very good at being self-describing, and the interface we're talking about now is phenomenally opaque. This is just not a clear way of providing the relevant information:\n\n``` python\nfiles = {'file': ('filename.xls', open('filename.xls'), 'application/vnd.ms-excel', {})}\n```\n\nIn the past, I've suggested that I'd like to remove all but the most basic multipart file upload functionality from Requests entirely, and provide a 'blessed' third-party module that can do all of the craziness that people often want from their multipart uploads. As of right now there's no plan afoot to do that, so documenting the current solution is OK, but it's possible I'll look into doing it for a 3.0 release.\n",
"There's also a secret Class you can use to do your own streamed uploads exactly as you like, to support any format. I never documented it though :)\n",
"@Lukasa I personally make great use of Requests' file uploading options (very helpful when debugging and auditing web apps), and would like to see it remain in core if possible. It's features like this (being able to specify a filename, a string or file object, and a content type in any multipart upload) that make Requests far more versatile than urllib2.\n\nA potential API change could be to allow each file to be either a tuple as it is now, or a dict, where the dict can specify any number of parameters as in: `{\"name\": \"filename.xls\", \"content\": open(\"filename.xls\"), \"content_type\": \"application/vnd.ms-excel\"}`. This is more verbose, but removes the need to remember where to position each element in the tuple (though unfortunately might cause the need to remember the filename is \"name\", etc..).\n",
"@Anorov I appreciate that the current set of features is highly useful, and I wouldn't dream of stripping it out unless there was a good option for replacing it that had total feature parity. We're a ways away from that right now, so we don't need to worry about it, but your objection is noted. =)\n\nThis isn't the kind of decision I get to make anyway, I only get to petition Kenneth.\n",
"> I only get to petition Kenneth.\n\n[When I was back there in seminary school, there was a man who put for the proposition that you could petition the Lord with prayer!](https://www.youtube.com/watch?v=PbpRlqqzSD4)\n\nSorry it just popped into my head. ;)\n"
] |
https://api.github.com/repos/psf/requests/issues/1779
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1779/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1779/comments
|
https://api.github.com/repos/psf/requests/issues/1779/events
|
https://github.com/psf/requests/pull/1779
| 23,769,423 |
MDExOlB1bGxSZXF1ZXN0MTA1NDM2Mjk=
| 1,779 |
Made default_user_agent reusable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/311929?v=4",
"events_url": "https://api.github.com/users/kracekumar/events{/privacy}",
"followers_url": "https://api.github.com/users/kracekumar/followers",
"following_url": "https://api.github.com/users/kracekumar/following{/other_user}",
"gists_url": "https://api.github.com/users/kracekumar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kracekumar",
"id": 311929,
"login": "kracekumar",
"node_id": "MDQ6VXNlcjMxMTkyOQ==",
"organizations_url": "https://api.github.com/users/kracekumar/orgs",
"received_events_url": "https://api.github.com/users/kracekumar/received_events",
"repos_url": "https://api.github.com/users/kracekumar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kracekumar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kracekumar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kracekumar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2013-12-05T06:41:57Z
|
2021-09-08T23:06:31Z
|
2013-12-05T19:44:06Z
|
CONTRIBUTOR
|
resolved
|
This is not an exciting change, but it will help people who want to change the name of the user agent alone. Also, the name can be unicode.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1779/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1779/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1779.diff",
"html_url": "https://github.com/psf/requests/pull/1779",
"merged_at": "2013-12-05T19:44:06Z",
"patch_url": "https://github.com/psf/requests/pull/1779.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1779"
}
| true |
[
"I'm strongly +0 on this change. I can see how it's useful, but it's hidden away in an undocumented library function that we never planned to expose to the world, so I'm quite happy to let this change go unmerged as well. It's not hard to construct a User-Agent on your own. =)\n\nMore importantly, the code isn't right. This:\n\n``` python\nreturn \" \".join([u'%s/%s' % (name, __version__),\n u'%s/%s' % (_implementation, _implementation_version),\n u'%s/%s' % (p_system, p_release)])\n```\n\nhas multiple bugs in it. Firstly, if `name` is a bytestring on Python 3 then this will cause a bug. If it's a bytestring in Python 2 encoded in anything other than ASCII, this will also cause a bug.\n\nIf we insist on doing this, then `utils.to_native_str()` is the function to use here, and you must desperately avoid passing bytestrings into unicode strings. This set of problems makes me think that this function is far more complex than it's worth.\n",
"You didn't need to close this, it's possible that Kenneth or Ian will want it. =) +0 represents neutrality, not opposition.\n",
"@Lukasa I am wondering why you didn't use `0`. \n",
"I'm surprised that you haven't seen +0 before @kracekumar. We use it excessively to vote on topics and it is used in lots of other projects.\n\n+1 := strongly in favor\n+0 := neutral but leaning positively toward the change\n-0 := neutral but not exactly a fan of the change\n-1 := very much against the change.\n\nI'll respond to the PR itself later today\n",
"@sigmavirus24 I have seen `+0` in mailing lists and other places. Sure. I am glad to fix the bugs pointed out by @Lukasa if needed. Waiting for the comments to improve. \n",
"I'm not really sure. It's a small change but we would then have to expose it to the user and that's what I'm not exactly a fan of. Certainly they can find it on their own now but that's not the same as exposing it to them which would be encouraging its use. How commonly do requests users set their own User-Agent string and want it to include all of that information? Frankly I'm not convinced it is all that frequently. That said if @kennethreitz wants this I'm perfectly okay with it granted that it handles strings correctly (as @Lukasa already mentioned).\n",
"Seems harmless enough!\n",
"Let's just not document it.\n"
] |
https://api.github.com/repos/psf/requests/issues/1778
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1778/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1778/comments
|
https://api.github.com/repos/psf/requests/issues/1778/events
|
https://github.com/psf/requests/issues/1778
| 23,764,500 |
MDU6SXNzdWUyMzc2NDUwMA==
| 1,778 |
Browser compatible redirects don't work for APIs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/416057?v=4",
"events_url": "https://api.github.com/users/jamielennox/events{/privacy}",
"followers_url": "https://api.github.com/users/jamielennox/followers",
"following_url": "https://api.github.com/users/jamielennox/following{/other_user}",
"gists_url": "https://api.github.com/users/jamielennox/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jamielennox",
"id": 416057,
"login": "jamielennox",
"node_id": "MDQ6VXNlcjQxNjA1Nw==",
"organizations_url": "https://api.github.com/users/jamielennox/orgs",
"received_events_url": "https://api.github.com/users/jamielennox/received_events",
"repos_url": "https://api.github.com/users/jamielennox/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jamielennox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamielennox/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jamielennox",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-12-05T03:28:14Z
|
2021-09-09T00:28:31Z
|
2013-12-09T19:12:46Z
|
NONE
|
resolved
|
In the redirection code there are a couple of special rules for handling redirections:
kennethreitz/requests@4bceb312f1b99d36a25f2985b5606e98b6f0d8cd/requests/sessions.py#L127 shows handling of rfc2616 and some browser specific handling for https://en.wikipedia.org/wiki/Post/Redirect/Get
I understand why that is the default, but for writing code that talks to APIs this doesn't work.
Could we introduce a flag or something that we can set to be 'api_compatible' and do more predictable redirect handling?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1778/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1778/timeline
| null |
completed
| null | null | false |
[
"If you want to override Requests' redirect logic, the way to do it is to set `allow_redirects` to false.\n\nIt's also unclear to me why APIs would ignore that set of behaviours. Writing an API that assumes that a 301/302 will keep the verb the same on redirect is willfully blind, especially when two other status codes exist that do not have this ambiguity. If an API writer really wants to ensure that the same verb is used after a redirect, they should be using 307.\n"
] |
https://api.github.com/repos/psf/requests/issues/1777
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1777/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1777/comments
|
https://api.github.com/repos/psf/requests/issues/1777/events
|
https://github.com/psf/requests/issues/1777
| 23,736,831 |
MDU6SXNzdWUyMzczNjgzMQ==
| 1,777 |
AttributeError: 'HTTPAdapter' object has no attribute 'proxy_manager'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/113129?v=4",
"events_url": "https://api.github.com/users/erikcw/events{/privacy}",
"followers_url": "https://api.github.com/users/erikcw/followers",
"following_url": "https://api.github.com/users/erikcw/following{/other_user}",
"gists_url": "https://api.github.com/users/erikcw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/erikcw",
"id": 113129,
"login": "erikcw",
"node_id": "MDQ6VXNlcjExMzEyOQ==",
"organizations_url": "https://api.github.com/users/erikcw/orgs",
"received_events_url": "https://api.github.com/users/erikcw/received_events",
"repos_url": "https://api.github.com/users/erikcw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/erikcw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erikcw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/erikcw",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-12-04T18:48:11Z
|
2021-09-09T00:10:11Z
|
2014-02-03T11:02:16Z
|
CONTRIBUTOR
|
resolved
|
I'm running requests version 2.0.1 on OSX 10.9 and proxying through Charles Proxy.
I've been getting the following traceback when trying to query an API. Looks like the `proxy_manager` is not getting set on the object.
``` python
# requests session is setup like this:
headers = {
'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9) AppleWebKit/537.71 (KHTML, like Gecko) Version/7.0 Safari/537.71',
'Content-type': 'application/json',
}
proxies = {
"http": "http://127.0.0.1:8888",
"https": "http://127.0.0.1:8888",
}
s = requests.session()
s.headers.update(headers)
s.proxies = proxies
s.get(url) # pseudo code
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/IPython/core/interactiveshell.py", line 2827, in run_code
exec code_obj in self.user_global_ns, self.user_ns
File "<ipython-input-6-db63d61cf602>", line 1, in <module>
tasks.process_transparency_report(r.pk)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/celery/local.py", line 165, in <lambda>
__call__ = lambda x, *a, **kw: x._get_current_object()(*a, **kw)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/celery/app/task.py", line 409, in __call__
return self.run(*args, **kwargs)
File "/Users/erik/Dropbox/home/git/proj/anx/tasks.py", line 417, in process_transparency_report
resp = report_job.check_report_job_status()
File "/Users/erik/Dropbox/home/git/proj/portal/models.py", line 4437, in check_report_job_status
resp = anx.report_service.report_job_status(self.report_id)
File "/Users/erik/Dropbox/home/git/proj/anx/api.py", line 784, in report_job_status
resp = self.get(url, params=params)
File "/Users/erik/Dropbox/home/git/proj/anx/api.py", line 124, in get
resp = self.browser.get(url, params=params, **kwargs)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/requests/sessions.py", line 373, in get
return self.request('GET', url, **kwargs)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/requests/sessions.py", line 361, in request
resp = self.send(prep, **send_kwargs)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/requests/sessions.py", line 464, in send
r = adapter.send(request, **kwargs)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/requests/adapters.py", line 296, in send
conn = self.get_connection(request.url, proxies)
File "/Users/erik/.virtualenvs/proj/lib/python2.7/site-packages/requests/adapters.py", line 202, in get_connection
if not proxy in self.proxy_manager:
AttributeError: 'HTTPAdapter' object has no attribute 'proxy_manager'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1777/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1777/timeline
| null |
completed
| null | null | false |
[
"You're going to have to use some real code, I'm afraid, because that simply shouldn't be possible. Initialising the `HTTPAdapter` object also initialises `proxy_manager` to an empty dictionary, so it should be impossible for that `AttributeError` to be raised.\n",
"@erikcw Any updates on this issue? I'm really curious about how that happens in normal use case.\n",
"I haven't had time to factor the code out into a small reproducible sample yet. From my testing, it seems to either be an issue with Charles Proxy (strange since everything else seems to work well with it), or OSX. I don't have this issue when I run the code on our production servers (Linux). \n\nAlso, as an additional data point, I'm using a pickled copy of the session object (to preserve session cookies for the API I'm consuming).\n\nCreating a new session seems to help in avoiding the exception.\n",
"Heh, well mentioning that you're pickling looks like the obvious problem to me. I think `HTTPAdapter.__setstate__()` should probably be setting `HTTPAdapter.proxy_manager` to the empty dictionary. =D\n",
"That did the trick!\n\nI put together a quick pull request with the fix. This commit also fixes one of the failing unit tests in the existing suite... (`test_session_pickling`)\n",
"@erikcw There already is related PR :)\nhttps://github.com/kennethreitz/requests/pull/1793\n",
"Fixed by #1793 (ages ago). =3\n"
] |
https://api.github.com/repos/psf/requests/issues/1776
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1776/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1776/comments
|
https://api.github.com/repos/psf/requests/issues/1776/events
|
https://github.com/psf/requests/pull/1776
| 23,713,024 |
MDExOlB1bGxSZXF1ZXN0MTA1MTEzNzI=
| 1,776 |
Fix 1728 (Fixed up from #1729)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 1 |
2013-12-04T12:46:50Z
|
2021-09-08T23:06:15Z
|
2013-12-05T22:26:58Z
|
CONTRIBUTOR
|
resolved
|
This is just @gazpachoking's PR #1729 in a mergable state and with the PR feedback that @Lukasa left on it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1776/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1776/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1776.diff",
"html_url": "https://github.com/psf/requests/pull/1776",
"merged_at": "2013-12-05T22:26:58Z",
"patch_url": "https://github.com/psf/requests/pull/1776.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1776"
}
| true |
[
"LGTM. =) :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1775
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1775/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1775/comments
|
https://api.github.com/repos/psf/requests/issues/1775/events
|
https://github.com/psf/requests/pull/1775
| 23,649,105 |
MDExOlB1bGxSZXF1ZXN0MTA0NzQ5OTQ=
| 1,775 |
Add README to requests/packages
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 15 |
2013-12-03T15:19:22Z
|
2021-09-08T23:10:59Z
|
2013-12-04T01:02:25Z
|
CONTRIBUTOR
|
resolved
|
- Hopefully people will read this before making changes to vendored libraries
Hopefully this will prevent PRs like: #1773
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1775/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1775/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1775.diff",
"html_url": "https://github.com/psf/requests/pull/1775",
"merged_at": "2013-12-04T01:02:25Z",
"patch_url": "https://github.com/psf/requests/pull/1775.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1775"
}
| true |
[
"I approve this message. I don't approve the fact that Travis CI tested it instead of Jenkins.\n",
"@kennethreitz can you turn off the Travis integration hook altogether? This is annoying. I'd be willing to guess that they both ran this but Travis ran it last and got precendence over the other.\n",
"uh, i thought i did. grumble.\n",
"I'm not sure what's going on. I disabled the WebHook and no longer have a Travis CI account.\n",
"> I'm not sure what's going on. I disabled the WebHook and no longer have a Travis CI account.\n\nGo to Settings (upper right corner) > Applications and disable the one for Travis. That will deny them the privilege to write to any of your repos. I'm not entirely certain applications can request GitHub delete the authorization tokens they were granted by the user.\n",
"Ah, good call! Now, to find out why our own tests aren't being reported for some reason.\n",
"I develop and maintain an API wrapper, I _should_ know these things ;)\n\nAs for why Jenkins isn't getting reported, check the webhooks for the post-receive that sends the hook data to it, then check to make sure you have an authorization in the same section for it. If necessary, delete your existing one and make a new one. I don't know what else could have happened with that though. Let me also take a look at the commit status API returns for these PRs to see if Jenkins is getting reported and not just overtaken by Travis.\n",
"I enabled two-factor auth for work recently. Might be related.\n\nThe tests are being run, just not being reported back properly. I suspect your theory may be correct.\n",
"Two-Factor Auth might invalidate old tokens but really _shouldn't_. That also only effects it if Jenkins is using Basic Authentication to access the API. If it is, then that is most certainly the issue. If you made a token however, then that's really bizarre and should be mentioned to the API team via the contact form. We could also just ping @izuzak here ;)\n",
"I suspect travis interference :)\n",
"Well there are absolutely _no statuses_ on that commit according to the API... that's just weird.\n",
"FWIW:\n\n``` python\nimport github3\n\nrequests = github3.repository('kennethreitz', 'requests')\npull = requests.pull_request(1775)\ncommit = next(pull.iter_commits())\nstatuses = list(requests.iter_statuses(commit.sha)) # => []\n```\n",
"I know why Travis ran on this PR. I had set up Travis to run on my branch too so I could check stuff _before_ submitting a PR. Somehow that must be bleeding over here especially since I have commit privileges. That's extraordinarily bizarre but it make sense since it only seemed to happen on my PR and not on others (unless I missed them). \n\nTurning off now.\n",
"@sigmavirus24 you're more than welcome to work in the main repo in side-branches :)\n",
"> We could also just ping @izuzak here ;)\n\nPong! Just wanted to drop a note about those statuses. Notice the difference when you request statuses for the commit in the context of sigmavirus24/requests:\n\nhttps://api.github.com/repos/sigmavirus24/requests/statuses/617bc8cc1c1b34b11954cf9e4caec3a25a07167b\n\nand in the context of kennethreitz/requests\n\nhttps://api.github.com/repos/kennethreitz/requests/statuses/617bc8cc1c1b34b11954cf9e4caec3a25a07167b\n\nSo, statuses were set by Travis, but just for sigmavirus24/requests (and not for kennethreitz/requests), which was expected, right?\n\nAlso, as far as I know, turning on 2FA should not invalidate existing OAuth tokens. [Let us know](https://github.com/contact) if you suspect that indeed did happen.\n"
] |
https://api.github.com/repos/psf/requests/issues/1774
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1774/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1774/comments
|
https://api.github.com/repos/psf/requests/issues/1774/events
|
https://github.com/psf/requests/issues/1774
| 23,643,925 |
MDU6SXNzdWUyMzY0MzkyNQ==
| 1,774 |
Response should not return 'ISO-8859-1' as default encoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/494412?v=4",
"events_url": "https://api.github.com/users/weiqiyiji/events{/privacy}",
"followers_url": "https://api.github.com/users/weiqiyiji/followers",
"following_url": "https://api.github.com/users/weiqiyiji/following{/other_user}",
"gists_url": "https://api.github.com/users/weiqiyiji/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/weiqiyiji",
"id": 494412,
"login": "weiqiyiji",
"node_id": "MDQ6VXNlcjQ5NDQxMg==",
"organizations_url": "https://api.github.com/users/weiqiyiji/orgs",
"received_events_url": "https://api.github.com/users/weiqiyiji/received_events",
"repos_url": "https://api.github.com/users/weiqiyiji/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/weiqiyiji/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/weiqiyiji/subscriptions",
"type": "User",
"url": "https://api.github.com/users/weiqiyiji",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-12-03T13:59:38Z
|
2021-09-08T23:10:44Z
|
2013-12-03T14:07:09Z
|
NONE
|
resolved
|
Hi, the code that gets the encoding returns the default encoding 'ISO-8859-1' when fetching http://lianxu.me/blog/2012/11/14/10-cocoa-objc-newbie-problems/ (the page's content-type is `text/html`, not `text/html; charset=utf-8`)
```
def get_encoding_from_headers(headers):
"""Returns encodings from given HTTP Header Dict.
:param headers: dictionary to extract encoding from.
"""
content_type = headers.get('content-type')
if not content_type:
return None
content_type, params = cgi.parse_header(content_type)
if 'charset' in params:
return params['charset'].strip("'\"")
if 'text' in content_type:
return 'ISO-8859-1'
```
And then, encoding is 'ISO-8859-1', so the text will call unicode(content, 'ISO-8859-1'), but the content is already utf-8 encoded, so this will return an invalid unicode string that I cannot call `unicode.decode('utf-8')` on.
```
@property
def text(self):
"""Content of the response, in unicode.
if Response.encoding is None and chardet module is available, encoding
will be guessed.
"""
# Try charset from content-type
content = None
encoding = self.encoding
if not self.content:
return str('')
# Fallback to auto-detected encoding.
if self.encoding is None:
encoding = self.apparent_encoding
# Decode unicode from given encoding.
try:
content = str(self.content, encoding, errors='replace')
except (LookupError, TypeError):
# A LookupError is raised if the encoding was not found which could
# indicate a misspelling or similar mistake.
#
# A TypeError can be raised if encoding is None
#
# So we try blindly encoding.
content = str(self.content, errors='replace')
return content
```
Here is the code:
```
resp = requests.get('http://lianxu.me/blog/2012/11/14/10-cocoa-objc-newbie-problems/')
resp.apparent_encoding # == 'utf-8'
resp.encoding # == 'ISO-8859-1'
resp.content # is byte array encoded in 'utf-8'
resp.text # is a unicode string that wrap content in 'ISO-8859-1'
```
I think requests should return None when no encoding is found; otherwise this leads to wrong text that the user cannot decode
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1774/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1774/timeline
| null |
completed
| null | null | false |
[
"I'm 90% sure we just received a similar issue. Let me find it before I give you what I think I remember as the conclusion\n",
"And [the issue I was thinking of](https://github.com/kennethreitz/requests/issues/1737) is still open. I'm closing this to centralize discussion over there with the added request that you please look at the open issues before you open a new one. Look closely and maybe even read some of the issue bodies because we're keeping a bunch of \"discussion\" issues open as well as already fixed issues.\n",
"This issue has been raised many times in the past (please see #1737, #1604, #1589, #1588, #1546. There are others, but this list should be sufficient). The issue @sigmavirus24 is looking for is #1604.\n\nRFC 2616 is very clear here: if no encoding is declared in the Content-Type header, the encoding for text/html is assumed to be ISO-8859-1. If you know better, you are encouraged to either decode `Response.content` yourself or to set `Response.encoding` to the relevant encoding.\n",
"As usual @Lukasa is 100% correct.\n",
"@Lukasa thanks for your explanation! I think not every user knows the details defined in RFC2616, so could you add a comment on `Response.text`?\n",
"Adding to the documentation never hurts. It also doesn't hurt to check \nstandards surrounding what web clients are supposed to do before making a \nfeature request. The standards are well defined and well documented.\n"
] |
https://api.github.com/repos/psf/requests/issues/1773
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1773/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1773/comments
|
https://api.github.com/repos/psf/requests/issues/1773/events
|
https://github.com/psf/requests/pull/1773
| 23,639,960 |
MDExOlB1bGxSZXF1ZXN0MTA0Njk5NTg=
| 1,773 |
Ensure sockets are properly closed on object destruction
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3353832?v=4",
"events_url": "https://api.github.com/users/vincentxb/events{/privacy}",
"followers_url": "https://api.github.com/users/vincentxb/followers",
"following_url": "https://api.github.com/users/vincentxb/following{/other_user}",
"gists_url": "https://api.github.com/users/vincentxb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vincentxb",
"id": 3353832,
"login": "vincentxb",
"node_id": "MDQ6VXNlcjMzNTM4MzI=",
"organizations_url": "https://api.github.com/users/vincentxb/orgs",
"received_events_url": "https://api.github.com/users/vincentxb/received_events",
"repos_url": "https://api.github.com/users/vincentxb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vincentxb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vincentxb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vincentxb",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-12-03T12:44:55Z
|
2021-09-08T23:10:54Z
|
2013-12-03T14:10:19Z
|
NONE
|
resolved
|
Fix issue @tardyp reported on #239
We have not been able to reproduce this on a simple test case. We can only make it happen in our complex environment involving buildbot, twisted, and txrequests.
With this fix the issue no longer occurs. Please give us feedback on whether this solution is correct.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1773/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1773/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1773.diff",
"html_url": "https://github.com/psf/requests/pull/1773",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1773.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1773"
}
| true |
[
"It's certainly possible that the fixes are correct, but all your changes are in urllib3. This is actually a third party library that is maintained separately from Requests, and we simply include a direct copy of their code. Can I suggest you raise the issue at [shazow/urllib3](https://github.com/shazow/urllib3)?\n"
] |
https://api.github.com/repos/psf/requests/issues/1772
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1772/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1772/comments
|
https://api.github.com/repos/psf/requests/issues/1772/events
|
https://github.com/psf/requests/pull/1772
| 23,618,218 |
MDExOlB1bGxSZXF1ZXN0MTA0NTgyNzI=
| 1,772 |
Make morsel_to_cookie more pythonic
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1069047?v=4",
"events_url": "https://api.github.com/users/mdbecker/events{/privacy}",
"followers_url": "https://api.github.com/users/mdbecker/followers",
"following_url": "https://api.github.com/users/mdbecker/following{/other_user}",
"gists_url": "https://api.github.com/users/mdbecker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mdbecker",
"id": 1069047,
"login": "mdbecker",
"node_id": "MDQ6VXNlcjEwNjkwNDc=",
"organizations_url": "https://api.github.com/users/mdbecker/orgs",
"received_events_url": "https://api.github.com/users/mdbecker/received_events",
"repos_url": "https://api.github.com/users/mdbecker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mdbecker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mdbecker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mdbecker",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2013-12-03T02:56:53Z
|
2021-09-08T23:07:16Z
|
2013-12-06T15:38:12Z
|
CONTRIBUTOR
|
resolved
|
Fix a pep8 error. Prior to fixing this, I implemented some tests to cover the code I was fixing. This should help prevent regression issues.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1772/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1772/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1772.diff",
"html_url": "https://github.com/psf/requests/pull/1772",
"merged_at": "2013-12-06T15:38:12Z",
"patch_url": "https://github.com/psf/requests/pull/1772.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1772"
}
| true |
[
"@sigmavirus24 - Pushed the changes we discussed. Let me know if you have anything else!\n\nThanks!\n",
"With the exception that the diff is a bit noisy due to @mdbecker alphabetizing portions (imports and one functions params), this is good to merge in my opinion.\n\nFor those who don't care to read :+1: :shipit: \n",
"Breaks the builds!\n\nhttp://ci.kennethreitz.org/job/requests-pr/123/PYTHON=2.7/console\n",
"Thanks @kennethreitz. Looks like there is a bug here (either in my test or in the code.) The issue is that `time.mktime` returns a time in localtime which I wasn't accounting for in my test. There are 2 ways to solve this problem:\n1. Modify `morsel_to_cookie` to subtract `time.timezone` so that the resulting value represents UTC time. This will result in the value of `expires` being the same on all systems regardless of timezone.\n2. Modify `TestMorselToCookieExpires.test_expires_valid_str` to subtract `time.timezone` so that the test always passes.\n\nThe former solution seems correct to me, and I can't find anything in the code or RFCs to contradict this. Unfortunately this change would definitely impact users of this module so I'd like to hear others opinions on this.\n\nThanks!\n",
"I've pushed a new copy of the branch with the first solution mentioned above. This should fix the build.\n",
"Nifty CI system! Let me know if you want any more changes!\n",
"@mdbecker the PR currently doesn't merge cleanly, could you rebase off of master?\n",
"But, the tests do pass! :D\n",
"Sure, let me rebase!\n",
"All set!\n",
":metal:\n",
"Took one more quick look over it and I'm once again :+1: :shipit: \n",
"Ah, before we do this, can we try to keep the tests in `py.test` style? This means removing all those calls to `self.assert*` and replacing them with plain `assert` statements. =)\n",
"@Lukasa re-pushed\n",
":sparkles: :cake: :sparkles:\n",
"Awesome, this is great work! Thanks @mdbecker! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1771
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1771/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1771/comments
|
https://api.github.com/repos/psf/requests/issues/1771/events
|
https://github.com/psf/requests/issues/1771
| 23,618,167 |
MDU6SXNzdWUyMzYxODE2Nw==
| 1,771 |
Object type comparison in morsel_to_cookie is unpythonic
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1069047?v=4",
"events_url": "https://api.github.com/users/mdbecker/events{/privacy}",
"followers_url": "https://api.github.com/users/mdbecker/followers",
"following_url": "https://api.github.com/users/mdbecker/following{/other_user}",
"gists_url": "https://api.github.com/users/mdbecker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mdbecker",
"id": 1069047,
"login": "mdbecker",
"node_id": "MDQ6VXNlcjEwNjkwNDc=",
"organizations_url": "https://api.github.com/users/mdbecker/orgs",
"received_events_url": "https://api.github.com/users/mdbecker/received_events",
"repos_url": "https://api.github.com/users/mdbecker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mdbecker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mdbecker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mdbecker",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-12-03T02:55:18Z
|
2021-09-09T00:34:14Z
|
2013-12-03T03:14:22Z
|
CONTRIBUTOR
|
resolved
|
Found this one via the pep8 tool:
requests/cookies.py:386:26: E721 do not compare types, use 'isinstance()'
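What E721 flags is a direct `type(x) == type(y)` comparison; the preferred idiom is `isinstance()`. A minimal sketch (the helper name is hypothetical, not the actual `morsel_to_cookie` code):

```python
def expires_is_int(expires):
    """Preferred isinstance() check, rather than `type(expires) == type(1)`.

    Hypothetical helper illustrating the E721 fix; isinstance() also
    correctly handles subclasses, which a direct type comparison does not.
    """
    return isinstance(expires, int)
```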
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1771/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1771/timeline
| null |
completed
| null | null | false |
[
"Closing to centralize discussion on #1772\n"
] |
https://api.github.com/repos/psf/requests/issues/1770
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1770/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1770/comments
|
https://api.github.com/repos/psf/requests/issues/1770/events
|
https://github.com/psf/requests/pull/1770
| 23,604,803 |
MDExOlB1bGxSZXF1ZXN0MTA0NTA1Njc=
| 1,770 |
Implementation of IP address ranges for no_proxy environment variable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3167253?v=4",
"events_url": "https://api.github.com/users/kmadac/events{/privacy}",
"followers_url": "https://api.github.com/users/kmadac/followers",
"following_url": "https://api.github.com/users/kmadac/following{/other_user}",
"gists_url": "https://api.github.com/users/kmadac/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kmadac",
"id": 3167253,
"login": "kmadac",
"node_id": "MDQ6VXNlcjMxNjcyNTM=",
"organizations_url": "https://api.github.com/users/kmadac/orgs",
"received_events_url": "https://api.github.com/users/kmadac/received_events",
"repos_url": "https://api.github.com/users/kmadac/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kmadac/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kmadac/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kmadac",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2013-12-02T21:55:18Z
|
2021-09-08T23:07:17Z
|
2013-12-05T22:40:44Z
|
CONTRIBUTOR
|
resolved
|
Hello,
During an OpenStack implementation we found that it is not possible to use IP ranges in the no_proxy environment variable. We needed to exclude a whole range of IP addresses (172.28.1.0/24), not only domain names or individual IP addresses.
Now it is possible to specify no_proxy like
```
no_proxy=127.0.0.1,192.168.0.0/24
```
and request to http://192.168.0.20:5000/ won't go through http proxy.
I added tests for no_proxy function as well.
Kamil
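For reviewers: the CIDR matching this PR adds can be approximated with the standard library alone. This is an illustrative sketch (using the `ipaddress` module, available since Python 3.3), not the exact code in the diff:

```python
from ipaddress import ip_address, ip_network

def bypass_proxy(host, no_proxy):
    """Return True if `host` matches any entry in a no_proxy string.

    Entries may be plain hostnames/IPs or CIDR ranges such as
    192.168.0.0/24. Sketch only; the PR's actual implementation differs.
    """
    for entry in no_proxy.split(','):
        entry = entry.strip()
        if not entry:
            continue
        try:
            # CIDR or plain-IP entry: check membership in the network
            if ip_address(host) in ip_network(entry, strict=False):
                return True
        except ValueError:
            # Not an IP/CIDR pair; fall back to suffix matching on names
            if host == entry or host.endswith('.' + entry.lstrip('.')):
                return True
    return False
```

With this sketch, `bypass_proxy('192.168.0.20', '127.0.0.1,192.168.0.0/24')` is true, so a request to http://192.168.0.20:5000/ would skip the proxy.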
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1770/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1770/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1770.diff",
"html_url": "https://github.com/psf/requests/pull/1770",
"merged_at": "2013-12-05T22:40:44Z",
"patch_url": "https://github.com/psf/requests/pull/1770.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1770"
}
| true |
[
"Hi Kamil, thanks for this work! Unfortunately, there are a number of problems that would need to be addressed before we could merge this pull request.\n1. You've checked in an Intellij IDEA file, `.idea/dictionaries/kmadac.xml`. This needs to be removed.\n2. `netaddr` is a third-party module that isn't part of the standard library. Requests tries very hard not to have external dependencies other than those we vendor in, and vendoring in `netaddr` for this task is excessive. You'll need an implementation that doesn't rely on it.\n\nIf you can resolve these, I see no reason for us not to merge this.\n",
"I second everything @Lukasa said. If you can, please rebase out everything except f9a48e0\n",
"Hi Guys,\n\nAll objections should be fixed now. There is no dependency on non-stdlib modules, tests are added and unnecessary files are removed too. \n",
"I'm :+1:\n",
"I'm so sorry, I have further feedback. =(\n",
"@Lukasa don't apologize. Those were good catches that I totally missed because I skimmed over them.\n\nI'm :+1: when @Lukasa's comments are addressed.\n",
"@Lukasa what feedback do you have? Is it still standing?\n",
"All satisfied, I think I'm happy now. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/1769
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1769/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1769/comments
|
https://api.github.com/repos/psf/requests/issues/1769/events
|
https://github.com/psf/requests/issues/1769
| 23,563,843 |
MDU6SXNzdWUyMzU2Mzg0Mw==
| 1,769 |
Requests failing over integer in cookie
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2722815?v=4",
"events_url": "https://api.github.com/users/CarstVaartjes/events{/privacy}",
"followers_url": "https://api.github.com/users/CarstVaartjes/followers",
"following_url": "https://api.github.com/users/CarstVaartjes/following{/other_user}",
"gists_url": "https://api.github.com/users/CarstVaartjes/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CarstVaartjes",
"id": 2722815,
"login": "CarstVaartjes",
"node_id": "MDQ6VXNlcjI3MjI4MTU=",
"organizations_url": "https://api.github.com/users/CarstVaartjes/orgs",
"received_events_url": "https://api.github.com/users/CarstVaartjes/received_events",
"repos_url": "https://api.github.com/users/CarstVaartjes/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CarstVaartjes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CarstVaartjes/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CarstVaartjes",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-12-02T10:35:37Z
|
2021-09-09T00:34:14Z
|
2013-12-02T10:46:54Z
|
NONE
|
resolved
|
Hi,
I described it here:
http://stackoverflow.com/questions/20325571/python-requests-failing-over-integer-in-cookie
But basically this creates an error:
```
import requests
r = requests.get('http://www.c1000.nl/kies-uw-winkel.aspx')
...
File "/srv/www/li/venv/local/lib/python2.7/site-packages/requests/cookies.py", line 270, in set_cookie
if cookie.value.startswith('"') and cookie.value.endswith('"'):
AttributeError: 'int' object has no attribute 'startswith'
```
Is this a known issue? Is there any workaround? (i.e. do not parse cookies?)
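Until the fix lands in a release, one workaround is to defensively coerce non-string cookie values before they reach the jar. A minimal sketch of the idea (illustrative only; the real fix is in requests' master, not this helper):

```python
def coerce_cookie_value(value):
    """Turn non-string cookie values (e.g. ints sent by a misbehaving
    server) into strings so `.startswith('"')` checks do not raise
    AttributeError. Hypothetical workaround helper, not requests code.
    """
    if value is not None and not isinstance(value, str):
        return str(value)
    return value
```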
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1769/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1769/timeline
| null |
completed
| null | null | false |
[
"Hi there!\n\nFirst, the answer. Yes, this is a known issue, and a fix is already present in master.\n\nNow, some notes. Asking this question on Stack Overflow was exactly the right thing to do. When you do that, please wait more than 30 minutes before moving to this issue tracker. Both I and @sigmavirus24 regularly look at the Python-Requests tag on Stack Overflow, so you'd have got our attention before long. Asking the question both there and here is unnecessary duplication.\n\nSecondly, the question 'is this a known issue' is one you should be able to answer just as well as us. It is a known issue: issue #1745 tracked it. A [very simple GitHub search](https://github.com/kennethreitz/requests/search?o=desc&q=cookie+AttributeError&s=created&type=Issues) could have shown you this issue quite quickly. I appreciate that it's not always obvious, but if everyone took five minutes to browse some old related issues I'd save hours of my life over the long term.\n",
"Hi,\n\nFirst of all: sorry, as I of course do not want to waste your time and also for double posting both on SO and here. \nI did search for it on both SO and on github actually, but I searched for \"cookie\" and \"integer\" which didn't come up with any hits. So I hope you understand that I wasn't consciously/lazily wasting your time at the very least...\n",
"I do. =) Just wanted to make sure. Thanks for taking the effort! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1768
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1768/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1768/comments
|
https://api.github.com/repos/psf/requests/issues/1768/events
|
https://github.com/psf/requests/pull/1768
| 23,535,179 |
MDExOlB1bGxSZXF1ZXN0MTA0MTUwMDg=
| 1,768 |
Unencode usernames and passwords taken from URLs.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 2 |
2013-12-01T11:34:25Z
|
2021-09-08T23:08:18Z
|
2013-12-04T01:36:00Z
|
MEMBER
|
resolved
|
Fixes #1767.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1768/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1768/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1768.diff",
"html_url": "https://github.com/psf/requests/pull/1768",
"merged_at": "2013-12-04T01:36:00Z",
"patch_url": "https://github.com/psf/requests/pull/1768.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1768"
}
| true |
[
":+1:\n",
"good call!\n"
] |
https://api.github.com/repos/psf/requests/issues/1767
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1767/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1767/comments
|
https://api.github.com/repos/psf/requests/issues/1767/events
|
https://github.com/psf/requests/issues/1767
| 23,533,930 |
MDU6SXNzdWUyMzUzMzkzMA==
| 1,767 |
HTTP Basic Auth credentials extracted from URL are %-encoded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/120810?v=4",
"events_url": "https://api.github.com/users/lw/events{/privacy}",
"followers_url": "https://api.github.com/users/lw/followers",
"following_url": "https://api.github.com/users/lw/following{/other_user}",
"gists_url": "https://api.github.com/users/lw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lw",
"id": 120810,
"login": "lw",
"node_id": "MDQ6VXNlcjEyMDgxMA==",
"organizations_url": "https://api.github.com/users/lw/orgs",
"received_events_url": "https://api.github.com/users/lw/received_events",
"repos_url": "https://api.github.com/users/lw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lw",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "5319e7",
"default": false,
"description": null,
"id": 67760318,
"name": "Fixed",
"node_id": "MDU6TGFiZWw2Nzc2MDMxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Fixed"
}
] |
closed
| true | null |
[] | null | 2 |
2013-12-01T09:22:15Z
|
2021-09-09T00:34:14Z
|
2013-12-04T01:36:00Z
|
NONE
|
resolved
|
I was relying on credential auto-extraction from the hostname [1] to perform HTTP Basic Auth but I was getting "401 Unauthorized": the spaces in the password were substituted by `%20`. Manually extracting them with `urlsplit` and passing them to the `auth` parameter as a tuple works.
[1] http://docs.python-requests.org/en/latest/user/authentication/#netrc-authentication (second paragraph)
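The eventual fix (#1768) amounts to percent-decoding the userinfo parsed out of the URL. A sketch of the intended behaviour using the standard library (illustrative, not the actual patch):

```python
from urllib.parse import unquote, urlsplit

def auth_from_url(url):
    """Extract (username, password) from a URL, percent-decoding both.

    A password containing spaces must come back with real spaces,
    not '%20', so it can be used directly for HTTP Basic Auth.
    """
    parts = urlsplit(url)
    return (unquote(parts.username or ''), unquote(parts.password or ''))
```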
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1767/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1767/timeline
| null |
completed
| null | null | false |
[
"Hmm, that behaviour feels wrong to me. Probably should fix this. =) Thanks! :cake:\n",
"Thanks for reporting this @lerks \n"
] |
https://api.github.com/repos/psf/requests/issues/1766
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1766/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1766/comments
|
https://api.github.com/repos/psf/requests/issues/1766/events
|
https://github.com/psf/requests/pull/1766
| 23,480,226 |
MDExOlB1bGxSZXF1ZXN0MTAzODg0ODk=
| 1,766 |
Quote qop values in digest auth.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 4 |
2013-11-29T08:38:09Z
|
2021-09-09T00:01:16Z
|
2013-12-04T01:33:59Z
|
MEMBER
|
resolved
|
Resolves #1765.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1766/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1766/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1766.diff",
"html_url": "https://github.com/psf/requests/pull/1766",
"merged_at": "2013-12-04T01:33:59Z",
"patch_url": "https://github.com/psf/requests/pull/1766.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1766"
}
| true |
[
"#1765 was already a PR, why did you open this one? We could have added tests after the fact. Can you rebase this to work off of #1765 so that @bicycle1885 gets some credit?\n",
"Uh, yeah, that's because I'm an idiot and didn't spot it. Like a moron.\n",
"> Uh, yeah, that's because I'm an idiot and didn't spot it. Like a moron.\n\nNo need to be so hard on yourself.\n",
"Also, :+1: will label with \"Minion Seal of Approval\"\n"
] |
https://api.github.com/repos/psf/requests/issues/1765
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1765/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1765/comments
|
https://api.github.com/repos/psf/requests/issues/1765/events
|
https://github.com/psf/requests/pull/1765
| 23,477,378 |
MDExOlB1bGxSZXF1ZXN0MTAzODY5MDM=
| 1,765 |
quote qop options in Digest Auth
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/905683?v=4",
"events_url": "https://api.github.com/users/bicycle1885/events{/privacy}",
"followers_url": "https://api.github.com/users/bicycle1885/followers",
"following_url": "https://api.github.com/users/bicycle1885/following{/other_user}",
"gists_url": "https://api.github.com/users/bicycle1885/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bicycle1885",
"id": 905683,
"login": "bicycle1885",
"node_id": "MDQ6VXNlcjkwNTY4Mw==",
"organizations_url": "https://api.github.com/users/bicycle1885/orgs",
"received_events_url": "https://api.github.com/users/bicycle1885/received_events",
"repos_url": "https://api.github.com/users/bicycle1885/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bicycle1885/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bicycle1885/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bicycle1885",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2013-11-29T06:45:00Z
|
2021-09-09T00:01:15Z
|
2013-11-29T16:47:12Z
|
CONTRIBUTOR
|
resolved
|
Based on RFC2617 (http://tools.ietf.org/html/rfc2617), the value of
'qop-options' directive should be quoted with double quotes:
```
qop-options
This directive is optional, but is made so only for backward
compatibility with RFC 2069 [6]; it SHOULD be used by all
implementations compliant with this version of the Digest
scheme. If present, it is a quoted string of one or more
tokens indicating the "quality of protection" values supported by
the server. The value "auth" indicates authentication; the
value "auth-int" indicates authentication with
integrity protection; see the
```
The curl command-line tool also appends these quotes. You can see this
by `curl -v --digest --user user:passwd http://example.com/digest-auth`.
Unfortunately, some minor server-side implementations seem to be sensitive
to this difference.
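A minimal sketch of the fix being proposed (the helper `build_digest_header` is hypothetical, not the actual requests code): most Digest directives are quoted strings, and the `qop` value should be quoted too, while `nc` stays a bare hex counter.

```python
# Sketch only: assemble a Digest Authorization header with qop quoted,
# matching the RFC 2617 grammar and curl's behaviour shown above.
def build_digest_header(fields, qop="auth"):
    # Most directives are quoted strings; nc is a hex counter and stays bare.
    parts = ['%s="%s"' % (k, v) for k, v in sorted(fields.items())]
    parts.append('qop="%s"' % qop)  # quoted, as the RFC requires
    parts.append("nc=00000001")
    return "Digest " + ", ".join(parts)

header = build_digest_header({"username": "admin", "realm": "/setup/"})
```
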
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1765/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1765/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1765.diff",
"html_url": "https://github.com/psf/requests/pull/1765",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1765.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1765"
}
| true |
[
"These are packet dumps from wireshark. Please note the difference of quotes around the value of qop options.\n\ncurl:\n\n```\n0050 69 3f 6d 6f 64 65 3d 31 20 48 54 54 50 2f 31 2e i?mode=1 HTTP/1.\n0060 31 0d 0a 41 75 74 68 6f 72 69 7a 61 74 69 6f 6e 1..Authorization\n0070 3a 20 44 69 67 65 73 74 20 75 73 65 72 6e 61 6d : Digest usernam\n0080 65 3d 22 61 64 6d 69 6e 22 2c 20 72 65 61 6c 6d e=\"admin\", realm\n0090 3d 22 2f 73 65 74 75 70 2f 22 2c 20 6e 6f 6e 63 =\"/setup/\", nonc\n00a0 65 3d 22 63 36 33 38 30 35 63 39 30 36 36 65 31 e=\"c63805c9066e1\n00b0 65 63 35 63 30 66 66 39 32 61 32 33 35 38 34 62 ec5c0ff92a23584b\n00c0 63 38 66 22 2c 20 75 72 69 3d 22 2f 73 65 74 75 c8f\", uri=\"/setu\n00d0 70 2f 66 69 6c 65 64 6f 77 6e 6c 6f 61 64 2e 63 p/filedownload.c\n00e0 67 69 3f 6d 6f 64 65 3d 31 22 2c 20 63 6e 6f 6e gi?mode=1\", cnon\n00f0 63 65 3d 22 4d 54 4d 34 4e 54 59 7a 22 2c 20 6e ce=\"MTM4NTYz\", n\n0100 63 3d 30 30 30 30 30 30 30 31 2c 20 71 6f 70 3d c=00000001, qop=\n0110 22 61 75 74 68 22 2c 20 72 65 73 70 6f 6e 73 65 \"auth\", response\n0120 3d 22 64 32 33 31 62 32 33 32 31 63 30 39 37 35 =\"d231b2321c0975\n0130 37 34 39 30 31 33 32 30 37 38 30 63 33 63 39 61 74901320780c3c9a\n0140 32 66 22 2c 20 61 6c 67 6f 72 69 74 68 6d 3d 22 2f\", algorithm=\"\n0150 4d 44 35 22 0d 0a 55 73 65 72 2d 41 67 65 6e 74 MD5\"..User-Agent\n0160 3a 20 63 75 72 6c 2f 37 2e 32 31 2e 34 20 28 75 : curl/7.21.4 (u\n0170 6e 69 76 65 72 73 61 6c 2d 61 70 70 6c 65 2d 64 niversal-apple-d\n0180 61 72 77 69 6e 31 31 2e 30 29 20 6c 69 62 63 75 arwin11.0) libcu\n0190 72 6c 2f 37 2e 32 31 2e 34 20 4f 70 65 6e 53 53 rl/7.21.4 OpenSS\n01a0 4c 2f 30 2e 39 2e 38 79 20 7a 6c 69 62 2f 31 2e L/0.9.8y zlib/1.\n01b0 32 2e 35 0d 0a 48 6f 73 74 3a 20 74 65 73 74 30 2.5..Host: test0\n```\n\nrequests:\n\n```\n0080 6a 70 0d 0a 41 75 74 68 6f 72 69 7a 61 74 69 6f jp..Authorizatio\n0090 6e 3a 20 44 69 67 65 73 74 20 75 73 65 72 6e 61 n: Digest userna\n00a0 6d 65 3d 22 61 64 6d 69 6e 22 2c 20 72 65 61 6c 
me=\"admin\", real\n00b0 6d 3d 22 2f 73 65 74 75 70 2f 22 2c 20 6e 6f 6e m=\"/setup/\", non\n00c0 63 65 3d 22 39 33 36 61 62 62 36 31 32 64 66 34 ce=\"936abb612df4\n00d0 35 38 62 38 66 32 65 63 30 61 33 38 66 64 64 32 58b8f2ec0a38fdd2\n00e0 34 38 34 65 22 2c 20 75 72 69 3d 22 2f 73 65 74 484e\", uri=\"/set\n00f0 75 70 2f 66 69 6c 65 64 6f 77 6e 6c 6f 61 64 2e up/filedownload.\n0100 63 67 69 3f 6d 6f 64 65 3d 31 22 2c 20 72 65 73 cgi?mode=1\", res\n0110 70 6f 6e 73 65 3d 22 66 64 31 30 66 31 39 64 36 ponse=\"fd10f19d6\n0120 63 37 35 36 32 34 36 34 32 35 33 63 62 64 31 66 c7562464253cbd1f\n0130 33 34 37 39 36 62 33 22 2c 20 61 6c 67 6f 72 69 34796b3\", algori\n0140 74 68 6d 3d 22 4d 44 35 22 2c 20 71 6f 70 3d 61 thm=\"MD5\", qop=a\n0150 75 74 68 2c 20 6e 63 3d 30 30 30 30 30 30 30 31 uth, nc=00000001\n0160 2c 20 63 6e 6f 6e 63 65 3d 22 33 35 33 63 62 31 , cnonce=\"353cb1\n0170 39 61 30 39 30 30 39 36 62 33 22 0d 0a 41 63 63 9a090096b3\"..Acc\n0180 65 70 74 2d 45 6e 63 6f 64 69 6e 67 3a 20 67 7a ept-Encoding: gz\n0190 69 70 2c 20 64 65 66 6c 61 74 65 2c 20 63 6f 6d ip, deflate, com\n01a0 70 72 65 73 73 0d 0a 41 63 63 65 70 74 3a 20 2a press..Accept: *\n01b0 2f 2a 0d 0a 55 73 65 72 2d 41 67 65 6e 74 3a 20 /*..User-Agent: \n01c0 70 79 74 68 6f 6e 2d 72 65 71 75 65 73 74 73 2f python-requests/\n01d0 32 2e 30 2e 31 20 43 50 79 74 68 6f 6e 2f 32 2e 2.0.1 CPython/2.\n01e0 37 2e 35 20 44 61 72 77 69 6e 2f 31 31 2e 34 2e 7.5 Darwin/11.4.\n01f0 32 0d 0a 0d 0a 2....\n```\n",
"Yeah, the RFC is consistent:\n\n```\n qop-options = \"qop\" \"=\" <\"> 1#qop-value <\">\n qop-value = \"auth\" | \"auth-int\" | token\n```\n\nWe should have literal quotes there. Good spot!\n",
"Fix prepared at #1766. Thanks for reporting this! :cake:\n",
"I hope the next release includes this fix, because I'm avoiding this auth problem with my ad-hoc patch :)\n",
"@bicycle1885 Sorry, I didn't spot this was a Pull Request! Do you want to add the test I wrote in #1766 to this PR, and I'll close the other one?\n",
"@Lukasa if you rebase your branch off of his, you can give him credit for the fix, while adding the test yourself. You kill two birds with one stone. You both get credit for the work you did.\n",
"Ignore that, I merged your changes into #1766 so you still get the credit for your work. =)\n",
"Let's close this and track on #1766 so we don't have too many PRs for this issue.\n",
":+1: Thanks for taking care of that @Lukasa \n",
"Thanks for your attention @sigmavirus24.\nAnd I saved myself the cost of writing a test thanks to @Lukasa's work!\n"
] |
https://api.github.com/repos/psf/requests/issues/1764
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1764/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1764/comments
|
https://api.github.com/repos/psf/requests/issues/1764/events
|
https://github.com/psf/requests/pull/1764
| 23,463,487 |
MDExOlB1bGxSZXF1ZXN0MTAzODAxMTg=
| 1,764 |
Prepare new release (2.1.0)?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-11-28T18:36:29Z
|
2021-09-08T23:07:33Z
|
2013-12-04T01:35:02Z
|
MEMBER
|
resolved
|
There are a few nasty bugs that we've fixed since the last release, including the broken SNI support. It'd be good to get another release out. I think a few things have changed dramatically enough that this is probably a minor release (e.g. takes us to 2.1.0). Thoughts? /cc @kennethreitz @sigmavirus24
Still to do:
- [ ] Merge outstanding PRs (definitely ~~#1713, #1657, #1768, and #1766~~; maybe #1729, #1628 and #1743 depending on how @kennethreitz feels).
- [ ] Update changelog to match those PRs.
If nothing else Runscope could do with this release, so it'd be nice to be in good shape.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1764/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1764/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1764.diff",
"html_url": "https://github.com/psf/requests/pull/1764",
"merged_at": "2013-12-04T01:35:02Z",
"patch_url": "https://github.com/psf/requests/pull/1764.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1764"
}
| true |
[
"2.1.0 is a little overdue so I'm :+1:. I'll review the PRs you listed and provide feedback again just to be safe (in case I missed any of them).\n",
"I would say \"definitely\" #1766. That shouldn't be a \"maybe\"\n",
"Looking over the PRs listed above now:\n- #1713 :+1: merge it\n- #1657 :+1: merge it\n- #1729 :-1: do not merge it (I don't feel it is ready yet. @Lukasa and I left feedback that hasn't been addressed)\n- #1628 :-1: do not merge it (neither @kennethreitz nor I are very fond of this)\n- #1766 :+1: merge it\n- #1743 +0 I don't see anything awful about allowing for separate timeouts, it is just the API under question. I've proposed a different way of handling the same feature. I'm not sure this _has_ to be in 2.1 though \n",
"Updated to include #1768 which is a :+1: merge it.\n",
"I don't want to be pushy, but I'm very excited for 2.1 :smile: The pending 2.x SNI stuff is huge for us. If I can help with the release at all, let me know!\n",
"@kennethreitz this PR wasn't finished. There's still stuff missing from the Changelog\n",
"@sigmavirus24 d'oh, sorry guys. I got ahead of myself :) Hoping to do this release tonight. \n"
] |
https://api.github.com/repos/psf/requests/issues/1763
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1763/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1763/comments
|
https://api.github.com/repos/psf/requests/issues/1763/events
|
https://github.com/psf/requests/issues/1763
| 23,445,810 |
MDU6SXNzdWUyMzQ0NTgxMA==
| 1,763 |
Requests to a domain name defined in /etc/hosts fails.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/229453?v=4",
"events_url": "https://api.github.com/users/Natim/events{/privacy}",
"followers_url": "https://api.github.com/users/Natim/followers",
"following_url": "https://api.github.com/users/Natim/following{/other_user}",
"gists_url": "https://api.github.com/users/Natim/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Natim",
"id": 229453,
"login": "Natim",
"node_id": "MDQ6VXNlcjIyOTQ1Mw==",
"organizations_url": "https://api.github.com/users/Natim/orgs",
"received_events_url": "https://api.github.com/users/Natim/received_events",
"repos_url": "https://api.github.com/users/Natim/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Natim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Natim/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Natim",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-11-28T11:58:29Z
|
2021-09-09T00:34:15Z
|
2013-11-28T12:05:57Z
|
NONE
|
resolved
|
```
~/workspace/venv/local/lib/python2.7/site-packages/requests/api.py line 55 in get
return request('get', url, **kwargs)
~/workspace/venv/local/lib/python2.7/site-packages/requests/api.py line 44 in request
return session.request(method=method, url=url, **kwargs)
~/workspace/venv/local/lib/python2.7/site-packages/requests/sessions.py line 335 in request
resp = self.send(prep, **send_kwargs)
~/workspace/venv/local/lib/python2.7/site-packages/requests/sessions.py line 438 in send
r = adapter.send(request, **kwargs)
~/workspace/venv/local/lib/python2.7/site-packages/requests/adapters.py line 327 in send
raise ConnectionError(e)
ConnectionError: HTTPConnectionPool(host='myserver.dev', port=80): Max retries exceeded with url: /accounts/ (Caused by <class 'socket.gaierror'>: [Errno -2] Name or service not known)
```
`myserver.dev` is defined in `/etc/hosts`
**Edit:** Rewrite bug request.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1763/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1763/timeline
| null |
completed
| null | null | false |
[
"Why is this a requests bug?\n",
"Ok let me explain it differently. I had the problem before and the fix was to use gevent master. Since gevent has been released, the bug came back.\n",
"Your references were useful, so I'm going to reproduce them here: surfly/gevent#271 and surfly/gevent#21.\n\nBoth of those refer to bugs in gevent. The bug you're seeing now is that when you use requests with gevent, those bugs appear.\n\nThis is not a bug in requests. If there is a bug in an underlying library, it should be reported and fixed there, not here.\n",
"Ok\n"
] |
https://api.github.com/repos/psf/requests/issues/1762
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1762/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1762/comments
|
https://api.github.com/repos/psf/requests/issues/1762/events
|
https://github.com/psf/requests/pull/1762
| 23,363,879 |
MDExOlB1bGxSZXF1ZXN0MTAzMjQ3OTA=
| 1,762 |
Update urllib3 to 929f1586
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/145979?v=4",
"events_url": "https://api.github.com/users/dstufft/events{/privacy}",
"followers_url": "https://api.github.com/users/dstufft/followers",
"following_url": "https://api.github.com/users/dstufft/following{/other_user}",
"gists_url": "https://api.github.com/users/dstufft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dstufft",
"id": 145979,
"login": "dstufft",
"node_id": "MDQ6VXNlcjE0NTk3OQ==",
"organizations_url": "https://api.github.com/users/dstufft/orgs",
"received_events_url": "https://api.github.com/users/dstufft/received_events",
"repos_url": "https://api.github.com/users/dstufft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dstufft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dstufft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dstufft",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 1 |
2013-11-27T02:42:29Z
|
2021-09-08T23:05:16Z
|
2013-11-27T17:23:40Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1762/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1762/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1762.diff",
"html_url": "https://github.com/psf/requests/pull/1762",
"merged_at": "2013-11-27T17:23:40Z",
"patch_url": "https://github.com/psf/requests/pull/1762.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1762"
}
| true |
[
"Always good! :cake:\n"
] |
|
https://api.github.com/repos/psf/requests/issues/1761
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1761/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1761/comments
|
https://api.github.com/repos/psf/requests/issues/1761/events
|
https://github.com/psf/requests/issues/1761
| 23,306,845 |
MDU6SXNzdWUyMzMwNjg0NQ==
| 1,761 |
binary file upload fails with python 2.7 under windows
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6039820?v=4",
"events_url": "https://api.github.com/users/kloodf/events{/privacy}",
"followers_url": "https://api.github.com/users/kloodf/followers",
"following_url": "https://api.github.com/users/kloodf/following{/other_user}",
"gists_url": "https://api.github.com/users/kloodf/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kloodf",
"id": 6039820,
"login": "kloodf",
"node_id": "MDQ6VXNlcjYwMzk4MjA=",
"organizations_url": "https://api.github.com/users/kloodf/orgs",
"received_events_url": "https://api.github.com/users/kloodf/received_events",
"repos_url": "https://api.github.com/users/kloodf/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kloodf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kloodf/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kloodf",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-11-26T10:57:13Z
|
2021-09-09T00:34:16Z
|
2013-11-26T14:17:42Z
|
NONE
|
resolved
|
The following lines cause a decoding error under Windows;
somehow something seems to attempt decoding the binary file contents.
The same code works fine on Linux ( with a real url of course ;-) )
```
#!/usr/bin/env python
import requests
files = { 'file' : ( 'raw.dat', '\xff' ) }
requests.post('http://example.dies.before.url.is.used', files=files)
```
```
c:\tmp>python reqbug.py
Traceback (most recent call last):
File "reqbug.py", line 4, in <module>
requests.post(my_url, files=files)
File "C:\Python27\lib\site-packages\requests\api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 361, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 464, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 321, in send
timeout=timeout
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 471, in urlopen
body=body, headers=headers)
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 285, in _make_request
conn.request(method, url, **httplib_request_kw)
File "C:\Python27\lib\httplib.py", line 946, in request
self._send_request(method, url, body, headers)
File "C:\Python27\lib\httplib.py", line 987, in _send_request
self.endheaders(body)
File "C:\Python27\lib\httplib.py", line 940, in endheaders
self._send_output(message_body)
File "C:\Python27\lib\httplib.py", line 801, in _send_output
msg += message_body
UnicodeDecodeError: 'ascii' codec can't decode byte 0xff in position 103: ordinal not in range(128)
```
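The failing step in the traceback is `msg += message_body` in httplib: on Python 2, concatenating non-ASCII bytes onto a unicode header block forces an implicit ASCII decode. A sketch of that decode step, made explicit so it also runs on Python 3:

```python
# The implicit coercion Python 2's httplib performs when it concatenates a
# unicode header string with a bytes body: the bytes are decoded as ASCII,
# which fails for any byte >= 0x80 (here, the 0xff from the example above).
body = b"\xff"
try:
    body.decode("ascii")
    decoded_ok = True
except UnicodeDecodeError:
    decoded_ok = False
```
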
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1761/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1761/timeline
| null |
completed
| null | null | false |
[
"Not for me it doesn't: Windows 7, Python 2.7.4, Requests 2.0.1. Can you provide the versions of all of your software as well?\n",
"By the way, the overwhelming cause of this is having a unicode string as either the method or in the headers. Make sure that you don't do either of those things. =)\n",
"d:\\work\\mhcomm\\mg_app>python\nPython 2.7 (r27:82525, Jul 4 2010, 09:01:59) [MSC v.1500 32 bit (Intel)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n\nd:\\work\\mhcomm\\mg_app>python -c \"import requests ; print requests.__version__\"\n2.0.1\n\nTried on another windows machine with 2.7.4 and it seems to work.\nWill update that machine and report back.\nDidn't have any other issues with this machine so far.\n",
"So it looks like your version of Python is massively out of date. For comparison, I'm running `Python 2.7.4 (default, Apr 6 2013, 19:55:15) [MSC v.1500 64 bit (AMD64)] on win32`. Try running in a more recent version of Python 2.7 than that.\n",
"Thanks all. \nProblem disappeared with updating 2.7 to 2.7.4\nI'm now a happy requests user :-) again even on this machine\n",
"Glad to hear it!\n"
] |
https://api.github.com/repos/psf/requests/issues/1760
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1760/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1760/comments
|
https://api.github.com/repos/psf/requests/issues/1760/events
|
https://github.com/psf/requests/pull/1760
| 23,245,156 |
MDExOlB1bGxSZXF1ZXN0MTAyNjEwMDI=
| 1,760 |
Fix hangs on streaming uploads with HTTPDigestAuth
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/53781?v=4",
"events_url": "https://api.github.com/users/akitada/events{/privacy}",
"followers_url": "https://api.github.com/users/akitada/followers",
"following_url": "https://api.github.com/users/akitada/following{/other_user}",
"gists_url": "https://api.github.com/users/akitada/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akitada",
"id": 53781,
"login": "akitada",
"node_id": "MDQ6VXNlcjUzNzgx",
"organizations_url": "https://api.github.com/users/akitada/orgs",
"received_events_url": "https://api.github.com/users/akitada/received_events",
"repos_url": "https://api.github.com/users/akitada/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akitada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akitada/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akitada",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 2 |
2013-11-25T14:26:36Z
|
2021-09-08T23:05:27Z
|
2013-11-25T19:40:10Z
|
CONTRIBUTOR
|
resolved
|
When using Digest Authentication, the client resends the same request
after the server responds with the 401 "Unauthorized". However, when
doing streaming uploads, it gets stuck because the body data (a
file-like object) is already consumed at the initial request.
The patch fixes this by rewinding the file-like object before
resending the request.
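The rewinding step described above can be sketched as follows (a simplified illustration, not the actual patch; `rewind_body` is a hypothetical helper):

```python
import io

# Sketch: before resending the request after a 401 Digest challenge,
# rewind the file-like body so the retry can stream the same data again.
def rewind_body(body):
    if hasattr(body, "seek"):
        body.seek(0)

body = io.BytesIO(b"streamed payload")
first_attempt = body.read()    # consumed by the initial (401-rejected) request
rewind_body(body)              # rewind before the authenticated retry
second_attempt = body.read()
```
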
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1760/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1760/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1760.diff",
"html_url": "https://github.com/psf/requests/pull/1760",
"merged_at": "2013-11-25T19:40:10Z",
"patch_url": "https://github.com/psf/requests/pull/1760.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1760"
}
| true |
[
"This fixes #1755\n",
"LGTM. Thanks for this! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1759
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1759/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1759/comments
|
https://api.github.com/repos/psf/requests/issues/1759/events
|
https://github.com/psf/requests/issues/1759
| 23,235,742 |
MDU6SXNzdWUyMzIzNTc0Mg==
| 1,759 |
builtin_str in python2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/747262?v=4",
"events_url": "https://api.github.com/users/nahuelange/events{/privacy}",
"followers_url": "https://api.github.com/users/nahuelange/followers",
"following_url": "https://api.github.com/users/nahuelange/following{/other_user}",
"gists_url": "https://api.github.com/users/nahuelange/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nahuelange",
"id": 747262,
"login": "nahuelange",
"node_id": "MDQ6VXNlcjc0NzI2Mg==",
"organizations_url": "https://api.github.com/users/nahuelange/orgs",
"received_events_url": "https://api.github.com/users/nahuelange/received_events",
"repos_url": "https://api.github.com/users/nahuelange/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nahuelange/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nahuelange/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nahuelange",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-11-25T11:18:58Z
|
2021-09-09T00:34:16Z
|
2013-11-25T14:51:00Z
|
NONE
|
resolved
|
builtin_str should be defined as unicode for the py2 version, OR models.py (line 107) should not re-encode data.
When passing files & data, the data is re-encoded from unicode to str, raising a UnicodeEncodeError.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/747262?v=4",
"events_url": "https://api.github.com/users/nahuelange/events{/privacy}",
"followers_url": "https://api.github.com/users/nahuelange/followers",
"following_url": "https://api.github.com/users/nahuelange/following{/other_user}",
"gists_url": "https://api.github.com/users/nahuelange/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nahuelange",
"id": 747262,
"login": "nahuelange",
"node_id": "MDQ6VXNlcjc0NzI2Mg==",
"organizations_url": "https://api.github.com/users/nahuelange/orgs",
"received_events_url": "https://api.github.com/users/nahuelange/received_events",
"repos_url": "https://api.github.com/users/nahuelange/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nahuelange/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nahuelange/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nahuelange",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1759/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1759/timeline
| null |
completed
| null | null | false |
[
"`builtin_str` means literally natural `str` type of python. So, there is no reason that `builtin_str` should be defined as `unicode` in python2.\nCan you explain in detail about how you passed data to `requests`?\n",
"It's not clear to me what your problem is here. You've provided fixes, but not indicated what the problem is. Let me address your fixes.\n\nFirstly, `builtin_str` should not be `unicode` in Python 2. It's purpose is to be a shorthand for 'whatever kind of string is used by default'. To be clear, `isinstance(\"string literal\", builtin_str)` should evaluate to `True` on every platform. In Python 2, that means `bytes`.\n\nSecondly, we don't re-encode data. If your data value is a bytestring, it does not get re-encoded. If you care about what encoding we use, encode your data before passing it to us.\n",
"Sorry, I close this issue, It came from a weird installation of the module.\nThanks for your answers.\n"
] |
https://api.github.com/repos/psf/requests/issues/1757
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1757/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1757/comments
|
https://api.github.com/repos/psf/requests/issues/1757/events
|
https://github.com/psf/requests/issues/1757
| 23,213,540 |
MDU6SXNzdWUyMzIxMzU0MA==
| 1,757 |
partial download using range header not working
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-11-24T21:42:18Z
|
2021-09-09T00:34:15Z
|
2013-11-27T06:45:07Z
|
NONE
|
resolved
|
This is supposed to download a partial chunk of the URL (bytes 1000 to 2000)
```
target = 'http://10.0.0.10/words.txt'
headers = {'Range':'bytes=1000-2000'}
r = requests.get(target, headers=headers)
print r.text
```
It's not working and I get the whole file.
It does however work with the standard library's urllib2. I believe the problem is in the way requests is sending the request, since the server's logs show a difference in the response code.
The full code and some server configuration information here (I've tried the commented out things). I've also shown the code I use for urllib2.
http://bpaste.net/show/dWrjTbxBZKRxjSGZEma7/
I am using 2.0.1 of requests.
```
>>> import requests
>>> requests.__version__
'2.0.1'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1757/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1757/timeline
| null |
completed
| null | null | false |
[
"I suspect the server is at fault here.\n\n``` python\n>>> import requests\n>>> url = 'http://mirrors.manchester.m247.com/ubuntu-releases//precise/ubuntu-12.04.3-desktop-amd64.iso'\n>>> r = requests.get(url, headers={'Range': 'bytes=1000-2000'}, stream=True)\n>>> r.status_code\n206\n```\n\nIf you believe the difference is in the request being sent, you should use a tool like tcpdump or wireshark to compare the actual requests on the wire.\n",
"Here is tcpdump for requests: http://bpaste.net/show/152701/\n\nHere is tcpdump for urllib2: http://bpaste.net/show/152703/\n\nCommand lines and code are included.\n\nThe full headers being sent are indeed different. As you say it could be a combination of that and the server, or a server setting: however it _does_ work with the standard library's urllib2 and one would expect requests to emulate this behaviour?\n\nI could probably get it working from seeing the difference in the headers I guess. Not really sure what the extra repeated bits at the end of the requests are. I'd have to read up on the protocol.\n",
"Update: Looking at tcpdump's output, I determined that adding \"Accept-Encoding:Identity\" to the headers send by requests made the code work for this server (and a few other random ones on the net such as BBC News, for which it was not working before). The test server does have compression (Apache mod_deflate) installed. Something somewhere is not following some standard I guess, and I don't know what it is (yet).\n",
"Ugh, that server is being stupid. From the [relevant spec](http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html):\n\n> The \"identity\" content-coding is always acceptable, unless specifically refused because the Accept-Encoding field includes \"identity;q=0\", or because the field includes \"*;q=0\" and does not explicitly include the \"identity\" content-coding. If the Accept-Encoding field-value is empty, then only the \"identity\" encoding is acceptable.\n\nIf Accept-Encoding is not present, it's still acceptable, because we haven't specifically excluded it. I'm still blaming the server. =)\n",
"I'm with @Lukasa \n\nIf there's a header we send by default which causes this, you can override it with `headers={'Header-Name': None}` to get rid of it before we send the request.\n\nI'm in favor of closing this as it is not a real bug.\n"
] |
https://api.github.com/repos/psf/requests/issues/1756
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1756/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1756/comments
|
https://api.github.com/repos/psf/requests/issues/1756/events
|
https://github.com/psf/requests/issues/1756
| 23,209,412 |
MDU6SXNzdWUyMzIwOTQxMg==
| 1,756 |
import error if ndg-httpsclient and pyasn1 are installed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/662249?v=4",
"events_url": "https://api.github.com/users/dimaqq/events{/privacy}",
"followers_url": "https://api.github.com/users/dimaqq/followers",
"following_url": "https://api.github.com/users/dimaqq/following{/other_user}",
"gists_url": "https://api.github.com/users/dimaqq/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dimaqq",
"id": 662249,
"login": "dimaqq",
"node_id": "MDQ6VXNlcjY2MjI0OQ==",
"organizations_url": "https://api.github.com/users/dimaqq/orgs",
"received_events_url": "https://api.github.com/users/dimaqq/received_events",
"repos_url": "https://api.github.com/users/dimaqq/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dimaqq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dimaqq/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dimaqq",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-11-24T17:50:02Z
|
2021-09-09T00:34:17Z
|
2013-11-24T17:51:54Z
|
NONE
|
resolved
|
Supposedly these dependencies are needed for SNI support in python2.
However requests doesn't even import when these dependencies are installed.
virtualenv-2.7 testX; . testX/bin/activate
pip install ipython requests ndg-httpsclient pyasn1
```
In [1]: import requests
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-1-686486c241c8> in <module>()
----> 1 import requests
/Users/dima/testr/testX/lib/python2.7/site-packages/requests/__init__.py in <module>()
51 # Attempt to enable urllib3's SNI support, if possible
52 try:
---> 53 from .packages.urllib3.contrib import pyopenssl
54 pyopenssl.inject_into_urllib3()
55 except ImportError:
/Users/dima/testr/testX/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py in <module>()
53
54 orig_util_HAS_SNI = util.HAS_SNI
---> 55 orig_connectionpool_ssl_wrap_socket = connectionpool.ssl_wrap_socket
56
57
AttributeError: 'module' object has no attribute 'ssl_wrap_socket'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1756/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1756/timeline
| null |
completed
| null | null | false |
[
"Please check the open issues before you raise one. This is being tracked in #1732.\n",
"thanks for catching this. glad to see it already got some attention!\n"
] |
https://api.github.com/repos/psf/requests/issues/1755
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1755/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1755/comments
|
https://api.github.com/repos/psf/requests/issues/1755/events
|
https://github.com/psf/requests/issues/1755
| 23,204,417 |
MDU6SXNzdWUyMzIwNDQxNw==
| 1,755 |
Streaming uploads with HTTPBasicAuth causes stuck on resending the request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/53781?v=4",
"events_url": "https://api.github.com/users/akitada/events{/privacy}",
"followers_url": "https://api.github.com/users/akitada/followers",
"following_url": "https://api.github.com/users/akitada/following{/other_user}",
"gists_url": "https://api.github.com/users/akitada/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akitada",
"id": 53781,
"login": "akitada",
"node_id": "MDQ6VXNlcjUzNzgx",
"organizations_url": "https://api.github.com/users/akitada/orgs",
"received_events_url": "https://api.github.com/users/akitada/received_events",
"repos_url": "https://api.github.com/users/akitada/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akitada/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akitada/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akitada",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "5319e7",
"default": false,
"description": null,
"id": 67760318,
"name": "Fixed",
"node_id": "MDU6TGFiZWw2Nzc2MDMxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Fixed"
}
] |
closed
| true | null |
[] | null | 5 |
2013-11-24T12:30:06Z
|
2021-09-09T00:28:34Z
|
2013-12-05T23:04:56Z
|
CONTRIBUTOR
|
resolved
|
`HTTPBasicAuth` seems to have the same issue described in requests/requests-kerberos#23.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1755/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1755/timeline
| null |
completed
| null | null | false |
[
"Resolved by #1760\n",
"@akitada please don't close your own issues. We'll close them when we are ready to.\n",
"Oh, sorry! I'm not familiar with the process.\n",
"No need to apologize @akitada \n",
"Fixed in 2.1.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/1754
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1754/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1754/comments
|
https://api.github.com/repos/psf/requests/issues/1754/events
|
https://github.com/psf/requests/pull/1754
| 23,203,552 |
MDExOlB1bGxSZXF1ZXN0MTAyNDEzNjU=
| 1,754 |
Match the HTTPbis on HTTP 301.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 1 |
2013-11-24T11:20:20Z
|
2021-09-08T23:05:26Z
|
2013-11-25T19:40:00Z
|
MEMBER
|
resolved
|
See the discussion on issue #1704.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1754/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1754/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1754.diff",
"html_url": "https://github.com/psf/requests/pull/1754",
"merged_at": "2013-11-25T19:40:00Z",
"patch_url": "https://github.com/psf/requests/pull/1754.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1754"
}
| true |
[
":+1:\n"
] |
https://api.github.com/repos/psf/requests/issues/1753
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1753/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1753/comments
|
https://api.github.com/repos/psf/requests/issues/1753/events
|
https://github.com/psf/requests/issues/1753
| 23,193,409 |
MDU6SXNzdWUyMzE5MzQwOQ==
| 1,753 |
Support method to gzip or otherwise compress HTTP request body?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/77864?v=4",
"events_url": "https://api.github.com/users/dkador/events{/privacy}",
"followers_url": "https://api.github.com/users/dkador/followers",
"following_url": "https://api.github.com/users/dkador/following{/other_user}",
"gists_url": "https://api.github.com/users/dkador/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dkador",
"id": 77864,
"login": "dkador",
"node_id": "MDQ6VXNlcjc3ODY0",
"organizations_url": "https://api.github.com/users/dkador/orgs",
"received_events_url": "https://api.github.com/users/dkador/received_events",
"repos_url": "https://api.github.com/users/dkador/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dkador/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dkador/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dkador",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2013-11-23T20:22:39Z
|
2018-08-31T22:42:22Z
|
2013-11-27T02:24:26Z
|
NONE
|
resolved
|
It doesn't appear that requests currently supports any built-in mechanism for automatically compressing a given HTTP body payload (e.g. JSON). Correct me if I missed something terribly obvious here, and apologies if I did.
I could certainly just implement the compression part myself, but I was wondering if the requests community thinks that a feature like this would be useful in general or is too specific to each integration's problem domain and so should remain there?
If people think this sounds interesting I'd be happy to take a stab at implementing it. Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/1753/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1753/timeline
| null |
completed
| null | null | false |
[
"I think it is a good idea to at least have methods for doing this documented (if not packaged in a separate library) but I'm pretty sure it doesn't belong in requests proper. \n",
"Agreed. The fact of the matter is that it isn't very hard to compress a request body yourself, and it's hard to see how we'd have a nice API for it. I'm fairly sure we don't need a separate library for it because of how trivial it is, but I'm not averse to documenting it (maybe informally):\n\n``` python\nimport zlib\nimport json\n\ndata = {'some': {'json': 'data'}}\npostdata = zlib.compress(json.dumps(data))\nr = requests.post('http://posturl.com/', data=postdata, headers={'Content-Encoding': 'gzip'})\n```\n",
"It might be worthwhile to include a GzipAdapter:\n\n``` python\nimport gzip\nimport zlib\nimport json\n\nimport requests\n\nfrom requests.adapters import HTTPAdapter\n\nfrom pprint import pprint\n\n\nclass GzipAdapter(HTTPAdapter):\n\n def add_headers(self, request, **kw):\n request.headers['Content-Encoding'] = 'gzip'\n\n def send(self, request, stream=False, **kw):\n if stream is True:\n request.body = gzip.GzipFile(filename=request.url,\n fileobj=request.body)\n else:\n request.body = zlib.compress(request.body)\n\n return super(GzipAdapter, self).send(request, stream, **kw)\n\n\nif __name__ == '__main__':\n sess = requests.Session()\n sess.mount('http://', GzipAdapter())\n\n msg = {\n 'msg': 'hello world',\n }\n\n resp = sess.put('http://httpbin.org/put', data=json.dumps(msg))\n pprint(resp.json())\n```\n\nI haven't tested this very extensively, but if it seems like a helpful tool, an adapter seems like a nice place to put it. I'd be happy to take a more in depth stab at it if people are interested. \n\nJust throwing it out there. \n",
"You should absolutely write one, but Requests core won't include it. =)\n",
"> You should absolutely write one, but Requests core won't include it. =)\n\nThis.\n",
"I landed here looking for a way to gzip a POST payload using requests.\r\nSince requests is HTTP for human being, I think that it should include a parameter to handle gzip automatically",
"Thank you for your opinion @tapionx but given the nearly 5 years from the first suggestion o fthis until now, I don't think this is actually worth the surface area to the core library. Further, adding a parameter implies an expansion of a Frozen API - one which has been frozen (successfully) for years. The library that tends to include these patterns outside of the core, [requests-toolbelt](/requests/toolbelt) is happy to include this if someone wants to write it.",
"I would advise more an adapter like this:\r\n\r\n```python\r\nCOMPRESSION_SCHEMES = [\r\n \"http://\",\r\n \"https://\",\r\n]\r\n\r\n\r\nclass CompressionAdapter(HTTPAdapter):\r\n\r\n \"\"\"Adapter used with `requests` library for sending compressed data.\"\"\"\r\n\r\n CONTENT_LENGTH = \"Content-Length\"\r\n\r\n def add_headers(self, request, **kwargs):\r\n \"\"\"Tell the server that we support compression.\"\"\"\r\n super(CompressionAdapter, self).add_headers(request, **kwargs)\r\n\r\n body = request.body\r\n if isinstance(body, bytes):\r\n content_length = len(body)\r\n else:\r\n content_length = body.seek(0, 2)\r\n body.seek(0, 0)\r\n\r\n headers = {\r\n Compression.CONTENT_ENCODING: Compression.GZIP,\r\n Compression.ACCEPT_ENCODING: Compression.GZIP,\r\n self.CONTENT_LENGTH: content_length,\r\n }\r\n request.headers.update(headers)\r\n\r\n def send(self, request, stream=False, **kwargs):\r\n \"\"\"Compress data before sending.\"\"\"\r\n if stream:\r\n # Having a file-like object, therefore we need to stream the\r\n # content into a new one through the compressor.\r\n compressed_body = io.BytesIO()\r\n compressed_body.name = request.url\r\n compressor = gzip.open(compressed_body, mode=\"wb\")\r\n # Read, write and compress the content at the same time.\r\n compressor.write(request.body.read())\r\n compressor.flush()\r\n compressor.close()\r\n # Make available the new compressed file-like object as the new\r\n # request body.\r\n compressed_body.seek(0, 0) # make it readable\r\n request.body = compressed_body\r\n else:\r\n # We're dealing with a plain bytes content, so compress it.\r\n request.body = gzip.compress(request.body)\r\n\r\n return super(CompressionAdapter, self).send(\r\n request,\r\n stream=stream,\r\n **kwargs\r\n )\r\n\r\n\r\ndef get_session():\r\n \"\"\"Get a requests session supporting compression.\"\"\"\r\n # Prepare the adapter and the session.\r\n compression_adapter = CompressionAdapter()\r\n session = requests.Session()\r\n\r\n # Mount the 
adapter to all affected schemes.\r\n for scheme in COMPRESSION_SCHEMES:\r\n session.mount(scheme, compression_adapter)\r\n\r\n # Use this sessions for CRUD-ing.\r\n return session\r\n```"
] |
https://api.github.com/repos/psf/requests/issues/1752
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1752/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1752/comments
|
https://api.github.com/repos/psf/requests/issues/1752/events
|
https://github.com/psf/requests/issues/1752
| 23,122,581 |
MDU6SXNzdWUyMzEyMjU4MQ==
| 1,752 |
Fails to make some https connections with PyPy 2.2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-11-22T07:51:22Z
|
2021-09-09T00:28:30Z
|
2013-12-09T19:12:53Z
|
CONTRIBUTOR
|
resolved
|
This is affecting Pip pretty aggressively.
Also being tracked by PyPy: https://bugs.pypy.org/issue1645
/cc @Lukasa @sigmavirus24
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1752/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1752/timeline
| null |
completed
| null | null | false |
[
"Are both communities seeing the same bug? Because it looks like Alex fixed it [here](https://bitbucket.org/pypy/pypy/commits/963c6d6d7d6c8cb32d1e338ef620e9edd9e479fa). He's certainly marked it as fixed in the PyPy bug tracker.\n"
] |
https://api.github.com/repos/psf/requests/issues/1751
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1751/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1751/comments
|
https://api.github.com/repos/psf/requests/issues/1751/events
|
https://github.com/psf/requests/issues/1751
| 23,112,906 |
MDU6SXNzdWUyMzExMjkwNg==
| 1,751 |
Add option to limit response downloads by size.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23648?v=4",
"events_url": "https://api.github.com/users/skorokithakis/events{/privacy}",
"followers_url": "https://api.github.com/users/skorokithakis/followers",
"following_url": "https://api.github.com/users/skorokithakis/following{/other_user}",
"gists_url": "https://api.github.com/users/skorokithakis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/skorokithakis",
"id": 23648,
"login": "skorokithakis",
"node_id": "MDQ6VXNlcjIzNjQ4",
"organizations_url": "https://api.github.com/users/skorokithakis/orgs",
"received_events_url": "https://api.github.com/users/skorokithakis/received_events",
"repos_url": "https://api.github.com/users/skorokithakis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/skorokithakis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skorokithakis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/skorokithakis",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 19 |
2013-11-22T01:55:19Z
|
2021-09-09T00:10:00Z
|
2013-11-22T08:29:38Z
|
NONE
|
resolved
|
Would it be possible to add a "limit" argument that will limit downloads to a certain size? I want my users to be able to make requests to URLs, but I don't want them to download gigabytes through them, I want the request to end after, say, 1 KB.
This will complement the "timeout" argument.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23648?v=4",
"events_url": "https://api.github.com/users/skorokithakis/events{/privacy}",
"followers_url": "https://api.github.com/users/skorokithakis/followers",
"following_url": "https://api.github.com/users/skorokithakis/following{/other_user}",
"gists_url": "https://api.github.com/users/skorokithakis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/skorokithakis",
"id": 23648,
"login": "skorokithakis",
"node_id": "MDQ6VXNlcjIzNjQ4",
"organizations_url": "https://api.github.com/users/skorokithakis/orgs",
"received_events_url": "https://api.github.com/users/skorokithakis/received_events",
"repos_url": "https://api.github.com/users/skorokithakis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/skorokithakis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skorokithakis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/skorokithakis",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/1751/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1751/timeline
| null |
completed
| null | null | false |
[
"The API is frozen so the short answer is no.\n\nA longer answer can probably be found on other discussions of this exact same feature. If you want to do this for your users you can use the `stream` parameter and measure what you've downloaded thus far yourself. After you've hit your limit, you can close the connection yourself.\n\nSomething along those lines would be:\n\n``` python\n\nr = requests.get(url, stream=True)\n\nchunk = r.raw.read(1024)\nr.close()\n```\n",
"Does it matter if the API is frozen? Keyword parameters (maybe together with putting this argument last) make it backwards-compatible.\n\nWill the solution you describe still enable processing of the response? I.e. If the response is less than 1 KB, will response.json() still decode the JSON?\n",
"@skorokithakis it does matter. Frozen does not mean that backwards compatible changes are allowable. Generally speaking a frozen API means that no new features can be made. On the other hand, given a frozen API and semantic versioning a backwards compatible change can be made assuming it doesn't add anything to the API.\n\nThe solution is just for the cases you mentioned in your issue report. You can always pass the `stream=True` parameter and still use `r.json()`.\n",
"I agree that using streaming is the correct approach here. Perhaps we should add some reference documentation around this?\n",
"Alright, thanks, I'll close this then.\n",
"What you proposed doesn't work, unfortunately:\n\n```\nIn [35]: requests.get(\"http://pastebin.com/raw.php?i=jJywkuB5\").json()\nOut[35]: {u'response': u'Yep all ok'}\n\nIn [36]: r = requests.get(\"http://pastebin.com/raw.php?i=jJywkuB5\", stream=True)\n\nIn [37]: r.raw.read(1024)\nOut[37]: '\\x1f\\x8b\\x08\\x00\\x00\\x00\\x00\\x00\\x00\\x03\\xabV*J-.\\xc8\\xcf+NU\\xb2RP\\x8aL-PH\\xcc\\xc9Q\\xc8\\xcfV\\xaa\\x05\\x006\\x16xH\\x1a\\x00\\x00\\x00'\n\nIn [38]: r.close()\n\nIn [39]: r.json()\n---------------------------------------------------------------------------\nJSONDecodeError Traceback (most recent call last)\n<ipython-input-39-9c78734cbfe1> in <module>()\n----> 1 r.json()\n\n...snip...\n\nJSONDecodeError: No JSON object could be decoded: line 1 column 0 (char 0)\n\nIn [41]: r.content\nOut[41]: ''\n\nIn [42]: r.text\nOut[42]: u''\n```\n",
"If you read directly from the raw response object, this will necessarily fail.\n\nWhat you're doing is not the right way to approach this. If you're downloading JSON that is more than 1kB in size and you truncate it at 1kB, that's just not going to be valid JSON. No JSON decoder will accept it.\n\nYou have some contradictory goals. What exactly are you attempting to do?\n",
"As you can see above, the JSON is around 30 bytes, so reading a KB shouldn't fail (since the returned data isn't even close to 1 KB). I want to limit downloads to 1 KB, and close the socket and return without processing anything if it's over 1 KB.\n",
"Then the correct behaviour is this:\n\n``` python\nr = requests.get('http://www.url.com/', stream=True)\n\nif int(r.headers['Content-Length']) > 1024:\n return None\nelse:\n return r.json()\n```\n",
"Thanks, that's better, but there are two issues:\n1. Will requests stop downloading when it gets that that many bytes? I can see a malicious server saying it will send me 10 bytes and sending 10 GB.\n2. This:\n\n```\nIn [7]: requests.get(\"http://pastebin.com/raw.php?i=jJywkuB5\", stream=True).headers\nOut[7]: \n{'cf-ray': 'd15713f2f5606e8',\n 'connection': 'keep-alive',\n 'content-encoding': 'gzip',\n 'content-type': 'text/plain; charset=utf-8',\n 'date': 'Fri, 22 Nov 2013 11:57:29 GMT',\n 'server': 'cloudflare-nginx',\n 'set-cookie': '__cfduid=db28dc4309546661b06adc9f4138883611385121448827; expires=Mon, 23-Dec-2019 23:50:00 GMT; path=/; domain=.pastebin.com; HttpOnly',\n 'transfer-encoding': 'chunked',\n 'vary': 'Accept-Encoding',\n 'x-powered-by': 'PHP/5.5.5'}\n```\n",
"If you're determined to force it, you have to abandon some of the convenience of requests. You can try this:\n\n``` python\nr = requests.get('http://url.com/', stream=True)\ndata = next(r.iter_content(1024))\nr.close()\n\n# Work with the data.\n```\n",
"Right, since all I need is to extract some short JSON from it and fail otherwise, that will be good enough for my purposes, thank you.\n",
"Is there any chance to reopen this request? No being able to limit the received size is a significant issue for any user that processes user provided URLs.\n\nI see the stream option mentioned here, but it's a bit inconvenient to use (perhaps it's just a lack of documentaion on how it works). But it also doesn't appear to consider the headers, streaming only the data. A misbhaving server could still send me infinite header data.\n",
"@mortoray The way the `stream` option is intended to work is to have the user (in this case yourself) take control of how to handle data downloads. For example, you've got a use case where you want to limit the size of the downloaded body. You can do something like this:\n\n``` python\ndef download_limited(url, max_size):\n r = requests.get(url, stream=True)\n\n if int(r.headers.get('Content-Length', 0)) > max_size:\n return None\n\n data = []\n length = 0\n\n for chunk in r.iter_content(1024):\n data.append(chunk)\n length += len(chunk)\n if length > max_size:\n return None\n\n return b''.join(data)\n```\n\nIf you're concerned about being sent infinite header data you have a very separate problem, which is that Requests uses urllib3 which uses httplib for its header parsing. There's an [open Python issue](http://bugs.python.org/issue16037) which tracks the fact that `httplib` has no maximum limit on the number of headers downloaded. This has been fixed in 2.6 and 3.3, but I don't see a fix on my copy of 2.7.6. I'm going to check out the codebase and confirm, but I suspect this fix hasn't been merged. There's nothing Requests can do for you in that situation.\n",
"Yeah, it doesn't look like it's in my copy of Mercurial, oddly.\n",
"I'll see if reviewing that patch provides any impetus to get that merged into 2.7.\n",
"Is there any way I can get that streamed data back into the response? It'd be convenient if I can still use the rest of the response interface, like json and data. That is, I'm only streaming it to check for the size limit, not because I actually want the stream.\n",
"I added the below issue about timeouts and streaming. It limits this method of checking the maximum size.\nhttps://github.com/kennethreitz/requests/issues/1948\n",
"Sure: once you've read all the chunks, just do `r._content = b''.join(chunks)`.\n"
] |
https://api.github.com/repos/psf/requests/issues/1750
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1750/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1750/comments
|
https://api.github.com/repos/psf/requests/issues/1750/events
|
https://github.com/psf/requests/issues/1750
| 23,111,514 |
MDU6SXNzdWUyMzExMTUxNA==
| 1,750 |
sessions.request not processing cookies properly
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/704592?v=4",
"events_url": "https://api.github.com/users/glennbach/events{/privacy}",
"followers_url": "https://api.github.com/users/glennbach/followers",
"following_url": "https://api.github.com/users/glennbach/following{/other_user}",
"gists_url": "https://api.github.com/users/glennbach/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/glennbach",
"id": 704592,
"login": "glennbach",
"node_id": "MDQ6VXNlcjcwNDU5Mg==",
"organizations_url": "https://api.github.com/users/glennbach/orgs",
"received_events_url": "https://api.github.com/users/glennbach/received_events",
"repos_url": "https://api.github.com/users/glennbach/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/glennbach/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/glennbach/subscriptions",
"type": "User",
"url": "https://api.github.com/users/glennbach",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-11-22T01:17:26Z
|
2021-09-09T00:34:18Z
|
2013-11-22T01:34:12Z
|
NONE
|
resolved
|
we do a post, passing the cookies from the previous response as the cookies:
requests.post(...,cookies=resp.cookies...)
the request function of the Session class appears to be presuming the cookies are always a dict, even though the comments indicate it could also be a cookiejar.
self.cookies = cookiejar_from_dict(cookies, cookiejar=self.cookies, overwrite=False)
the only way we can get it to work is if we use:
requests.post(...,cookies=resp.cookies.get_dict...)
Thanks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1750/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1750/timeline
| null |
completed
| null | null | false |
[
"This is a duplicate of #1711.\n",
"Thanks for reporting it!\n",
"Excellent. Can I get an updated version from git? It doesn’t appear to be in master. I’m seeing other cookie problems, and I’d like to try the latest code.\n\nThanks!\n\nGlenn\n\nOn Nov 21, 2013, at 5:34 PM, Ian Cordasco [email protected] wrote:\n\n> Thanks for reporting it!\n> \n> —\n> Reply to this email directly or view it on GitHub.\n",
"@glennbach Linked from that issue is the relevant pull request, which is #1713. It hasn't yet been accepted, but feel free to try it.\n"
] |
https://api.github.com/repos/psf/requests/issues/1749
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1749/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1749/comments
|
https://api.github.com/repos/psf/requests/issues/1749/events
|
https://github.com/psf/requests/issues/1749
| 23,094,666 |
MDU6SXNzdWUyMzA5NDY2Ng==
| 1,749 |
The method resolve_redirects should also prepare_auth again so OAuth1 would work
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/372914?v=4",
"events_url": "https://api.github.com/users/metatoaster/events{/privacy}",
"followers_url": "https://api.github.com/users/metatoaster/followers",
"following_url": "https://api.github.com/users/metatoaster/following{/other_user}",
"gists_url": "https://api.github.com/users/metatoaster/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/metatoaster",
"id": 372914,
"login": "metatoaster",
"node_id": "MDQ6VXNlcjM3MjkxNA==",
"organizations_url": "https://api.github.com/users/metatoaster/orgs",
"received_events_url": "https://api.github.com/users/metatoaster/received_events",
"repos_url": "https://api.github.com/users/metatoaster/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/metatoaster/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/metatoaster/subscriptions",
"type": "User",
"url": "https://api.github.com/users/metatoaster",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-11-21T20:22:21Z
|
2021-09-09T00:10:11Z
|
2014-02-03T11:01:44Z
|
NONE
|
resolved
|
Whenever a request made with an OAuth1 auth object results in a redirect to a new location, the Authorization header becomes invalid (wrong signature), making the functionality provided by `allow_redirects` ultimately useless in the context of an OAuth1 session. A potential fix is to call `prepare_auth` again after `prepare_cookies` in the `resolve_redirects` loop, but I have no idea what the potential side effects are, and I don't know how to begin writing a test for that gigantic method within a loop, so no pull requests from me for now.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1749/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1749/timeline
| null |
completed
| null | null | false |
[
"Also this is reported under requests/requests-oauthlib#94.\n",
"So, the initial evaluation is that `prepare_auth` can't be placed there. We throw the authentication object away, we aren't carrying it around. The question is: do we think we should re-evaluate authentication for each new request? If we do, how should we do it?\n\nOne option is to extend the Requests authentication handlers to allow them to be placed on `Session`s. How's that for an idea?\n",
"We now strip all authentication from redirects, so this issue is now invalid. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/1748
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1748/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1748/comments
|
https://api.github.com/repos/psf/requests/issues/1748/events
|
https://github.com/psf/requests/issues/1748
| 23,054,631 |
MDU6SXNzdWUyMzA1NDYzMQ==
| 1,748 |
Make charade library optional
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/414336?v=4",
"events_url": "https://api.github.com/users/guettli/events{/privacy}",
"followers_url": "https://api.github.com/users/guettli/followers",
"following_url": "https://api.github.com/users/guettli/following{/other_user}",
"gists_url": "https://api.github.com/users/guettli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/guettli",
"id": 414336,
"login": "guettli",
"node_id": "MDQ6VXNlcjQxNDMzNg==",
"organizations_url": "https://api.github.com/users/guettli/orgs",
"received_events_url": "https://api.github.com/users/guettli/received_events",
"repos_url": "https://api.github.com/users/guettli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/guettli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guettli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/guettli",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-11-21T09:44:57Z
|
2021-09-09T00:34:18Z
|
2013-11-21T11:27:46Z
|
NONE
|
resolved
|
The charade library in the packages directory is quite big (500kByte)
It is only needed if you don't give the response an encoding. I think omitting the encoding is an error in the code which uses the requests library.
Zen of Python "In the face of ambiguity, refuse the temptation to guess."
I think the library should be optional, and not in this repository.
I would prefer an exception if I don't give an encoding, since I would get this exception during development.
With the current implementation the guess could be good during development, and wrong in the live system.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1748/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1748/timeline
| null |
completed
| null | null | false |
[
"Charade _is_ optional. As you've spotted, if you set the encoding yourself, we won't use it.\n\nAs for making the charade download itself optional, that doesn't fit Requests' policy of making the common use case easy. We're strongly disinclined to force people to go out and download charade to make encoding guessing work.\n\nIf you really can't spare the 500kB, it's not very difficult to strip out charade yourself and use that packaged version of the library. But it's hard to imagine that you really can't spare 500kB. =D\n",
"I just wanted to state my concerns. I am happy with your decision. \n",
"@guettli Thanks for providing your feedback! It's always appreciated. :cookie:\n"
] |
https://api.github.com/repos/psf/requests/issues/1747
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1747/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1747/comments
|
https://api.github.com/repos/psf/requests/issues/1747/events
|
https://github.com/psf/requests/pull/1747
| 22,903,571 |
MDExOlB1bGxSZXF1ZXN0MTAwODQyMDg=
| 1,747 |
#1746 don't lowercase the whole url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3482660?v=4",
"events_url": "https://api.github.com/users/daftshady/events{/privacy}",
"followers_url": "https://api.github.com/users/daftshady/followers",
"following_url": "https://api.github.com/users/daftshady/following{/other_user}",
"gists_url": "https://api.github.com/users/daftshady/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/daftshady",
"id": 3482660,
"login": "daftshady",
"node_id": "MDQ6VXNlcjM0ODI2NjA=",
"organizations_url": "https://api.github.com/users/daftshady/orgs",
"received_events_url": "https://api.github.com/users/daftshady/received_events",
"repos_url": "https://api.github.com/users/daftshady/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/daftshady/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/daftshady/subscriptions",
"type": "User",
"url": "https://api.github.com/users/daftshady",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 3 |
2013-11-19T11:01:25Z
|
2021-09-08T23:06:26Z
|
2013-11-20T09:05:15Z
|
CONTRIBUTOR
|
resolved
|
#1746
Modified adapters not to lowercase the whole url.
Because `urlparse` automatically makes the scheme lowercase, `scheme.lower()` is unnecessary.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1747/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1747/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1747.diff",
"html_url": "https://github.com/psf/requests/pull/1747",
"merged_at": "2013-11-20T09:05:15Z",
"patch_url": "https://github.com/psf/requests/pull/1747.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1747"
}
| true |
[
"Heh, so it does. Good spot!\n\nI'm happy with this, :+1:.\n",
"Not sure if this is useful, but in the course of investigating urlparse/ursplit for my own purposes, I thought an explicit comment here might be good for the long term -- the standard library docs don't seem to mention at all that urlparse/urlsplit lower-case/normalize the scheme, so it seems to me that doing it that way would depend potentially on a side-effect that could change.\n\nHowever, the docs do say that `geturl()` invoked on a structured parse result \"may differ from the original URL in that the scheme may be normalized to lower case and empty components may be dropped\" (http://docs.python.org/3.3/library/urllib.parse.html#structured-parse-results), so presumably if that's the effect you want, you should get it by using `SplitResult.geturl()` on the result from an urlparse/urlsplit.\n\nThus +daftshady's decision to use geturl to do the scheme lowering, instead of parsing and then unparsing, only?\n\nApologies if this level of explicit comment isn't considered good form.\n",
"@ViktorHaag it's considered the most excellent of form! Thanks for the fantastic observations.\n"
] |
https://api.github.com/repos/psf/requests/issues/1746
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1746/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1746/comments
|
https://api.github.com/repos/psf/requests/issues/1746/events
|
https://github.com/psf/requests/issues/1746
| 22,895,987 |
MDU6SXNzdWUyMjg5NTk4Nw==
| 1,746 |
Don't lowercase the whole URL when opening a connection
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "5319e7",
"default": false,
"description": null,
"id": 67760318,
"name": "Fixed",
"node_id": "MDU6TGFiZWw2Nzc2MDMxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Fixed"
}
] |
closed
| true | null |
[] | null | 8 |
2013-11-19T08:53:32Z
|
2021-09-09T00:28:34Z
|
2013-12-05T23:05:08Z
|
MEMBER
|
resolved
|
As mentioned [here](https://github.com/kennethreitz/requests/commit/01993d21dc7d6fa8f0acf657425d2fead9b2ddea).
The specific line of code in question is [here](https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L209). We're lowercasing the whole URL. This _should_ be fine, but in practice there are clearly servers that will expect the `Host` header to match a specific case (because they're stupid).
We should fix this, especially as fixing it is so easy (we only want to lowercase the scheme) and doesn't hurt us at all.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1746/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1746/timeline
| null |
completed
| null | null | false |
[
"I agree; it probably should have just been a URL with a lowercased scheme, leaving the rest intact. My apologies for this.\n",
"@ViktorHaag we will never forgive you. You've broken requests in such an irreparable way that you are banned for life. (I'm kidding of course. It isn't a big deal at all. We all make mistakes. Thank you for contributing what you did.)\n",
"On a separate, more serious, note since #1747 was merged, should we leave this open until the next release of requests or close it now? It might be useful to have open in case others want to file the same issue but then again, we've seen that people sometimes don't even look at open issues before filing their own. Opinions @kennethreitz @Lukasa ?\n",
"I've established a trend with #1732 of leaving issues open until the next release. I think we should continue to do that. =) I'm sick of people repeatedly re-raising fixed issues. The other option is to maintain a list of issues that have been fixed in trunk in somewhere like a GitHub wiki.\n",
"@Lukasa It might be worthwhile to tag them as \"fixed\" in order to help people get a better perspective on what is still open. Of course, I'm being selfish here as I was just browsing to see if there was any low hanging fruit and read this thread ;)\n",
"@ionrock Not a bad idea actually, would help me keep track of which ones have to be closed at the point of release.\n",
"I've gone through and started adding that label to some of them\n",
"Fixed in 2.1.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/1745
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1745/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1745/comments
|
https://api.github.com/repos/psf/requests/issues/1745/events
|
https://github.com/psf/requests/pull/1745
| 22,869,125 |
MDExOlB1bGxSZXF1ZXN0MTAwNjY2MDM=
| 1,745 |
Small fix broken cookie parse
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5973312?v=4",
"events_url": "https://api.github.com/users/nonm/events{/privacy}",
"followers_url": "https://api.github.com/users/nonm/followers",
"following_url": "https://api.github.com/users/nonm/following{/other_user}",
"gists_url": "https://api.github.com/users/nonm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nonm",
"id": 5973312,
"login": "nonm",
"node_id": "MDQ6VXNlcjU5NzMzMTI=",
"organizations_url": "https://api.github.com/users/nonm/orgs",
"received_events_url": "https://api.github.com/users/nonm/received_events",
"repos_url": "https://api.github.com/users/nonm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nonm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nonm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nonm",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2013-11-18T21:35:38Z
|
2021-09-08T23:06:19Z
|
2013-11-20T08:58:21Z
|
CONTRIBUTOR
|
resolved
|
Code:
``` python
import requests
resp = requests.get('http://www.servanegaxotte.com')
# or khvigra.ru, nastersrl.ru, spcentre2000.ru etc.
```
Raised exception:
```
File "C:\Python27\lib\cookielib.py", line 1645, in extract_cookies
self.set_cookie(cookie)
File "C:\Python27\lib\site-packages\requests\cookies.py", line 270, in set_cookie
if cookie.value.startswith('"') and cookie.value.endswith('"'):
AttributeError: 'NoneType' object has no attribute 'startswith'
```
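The comments on this pull request settle on a `hasattr` guard because it short-circuits when `cookie.value` is `None`. A minimal standalone sketch of that guard; the `Cookie` namedtuple and `unquote_cookie_value` here are illustrative stand-ins, not the real `cookielib`/requests objects:

``` python
from collections import namedtuple

# Stand-in for the real cookielib Cookie; only the fields we need.
Cookie = namedtuple('Cookie', ['name', 'value'])

def unquote_cookie_value(cookie):
    # hasattr(None, 'startswith') is False, so the chain short-circuits
    # for value-less cookies instead of raising AttributeError.
    if hasattr(cookie.value, 'startswith') and \
            cookie.value.startswith('"') and cookie.value.endswith('"'):
        return cookie.value[1:-1]
    return cookie.value

print(unquote_cookie_value(Cookie('q', '"abc"')))  # abc
print(unquote_cookie_value(Cookie('q', None)))     # None
```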
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1745/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1745/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1745.diff",
"html_url": "https://github.com/psf/requests/pull/1745",
"merged_at": "2013-11-20T08:58:21Z",
"patch_url": "https://github.com/psf/requests/pull/1745.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1745"
}
| true |
[
"IMHO, is it better to check whether `cookie.value` is not None or catching AttributeError instead of using `hasattr`?\n",
  "Ordinarily I'd insist on catching `AttributeError`. However, because of short-circuit logical evaluation, in this case I think `hasattr` is the better way to go.\n",
" :sparkles: :cookie: :sparkles: \n"
] |
https://api.github.com/repos/psf/requests/issues/1744
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1744/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1744/comments
|
https://api.github.com/repos/psf/requests/issues/1744/events
|
https://github.com/psf/requests/issues/1744
| 22,831,484 |
MDU6SXNzdWUyMjgzMTQ4NA==
| 1,744 |
Upgrade to 2.0.1 breaks existing code wherever cookies are passed as keyword arguments.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3863128?v=4",
"events_url": "https://api.github.com/users/auworkshare/events{/privacy}",
"followers_url": "https://api.github.com/users/auworkshare/followers",
"following_url": "https://api.github.com/users/auworkshare/following{/other_user}",
"gists_url": "https://api.github.com/users/auworkshare/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/auworkshare",
"id": 3863128,
"login": "auworkshare",
"node_id": "MDQ6VXNlcjM4NjMxMjg=",
"organizations_url": "https://api.github.com/users/auworkshare/orgs",
"received_events_url": "https://api.github.com/users/auworkshare/received_events",
"repos_url": "https://api.github.com/users/auworkshare/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/auworkshare/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/auworkshare/subscriptions",
"type": "User",
"url": "https://api.github.com/users/auworkshare",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2013-11-18T12:29:48Z
|
2021-09-09T00:34:19Z
|
2013-11-18T15:46:30Z
|
NONE
|
resolved
|
We use requests to run regression tests on RESTful services.
After upgrading to 2.0.1, I get a KeyError wherever cookies are passed as keyword arguments.
Simple stacktrace:
self.result = requests.get(url, params=params, headers={'content-type':'application/json'}, cookies=self.cookies)
KeyError("name=Cookie(version=0, name='qa_session_id', value='e32393932842fba755d57f359ac1b0ae', port=None, port_specified=False, domain='.workshare.com', domain_specified=True, domain_initial_dot=True, path='/', path_specified=True, secure=True, expires=None, discard=True, comment=None, comment_url=None, rest={'HttpOnly': None}, rfc2109=False), domain=None, path=None",)
Error in C:\Users\automation\FeatureTest\libraries\ApiAutomation.py on line 177
Error in C:\Users\automation\FeatureTest\libraries\RequestsSupport.py on line 44
Error in C:\PYTHON27\lib\site-packages\requests\api.py on line 55
Error in C:\PYTHON27\lib\site-packages\requests\api.py on line 44
Error in C:\PYTHON27\lib\site-packages\requests\sessions.py on line 327
Error in C:\PYTHON27\lib\site-packages\requests\cookies.py on line 410
Error in C:\PYTHON27\lib\site-packages\requests\cookies.py on line 256
Error in C:\PYTHON27\lib\site-packages\requests\cookies.py on line 311
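A workaround for code that cannot yet move to `Session`: flatten the response's `CookieJar` into a plain dict before passing it back as the `cookies` keyword. requests ships `requests.utils.dict_from_cookiejar` for this; the sketch below reproduces the idea with only the standard library (`jar_to_dict` is a hypothetical helper name, and the cookie values are made up):

``` python
from http.cookiejar import Cookie, CookieJar

def jar_to_dict(jar):
    # Mirrors requests.utils.dict_from_cookiejar: plain name/value pairs.
    return {c.name: c.value for c in jar}

jar = CookieJar()
jar.set_cookie(Cookie(
    version=0, name='qa_session_id', value='deadbeef',
    port=None, port_specified=False,
    domain='.example.com', domain_specified=True, domain_initial_dot=True,
    path='/', path_specified=True, secure=True, expires=None,
    discard=True, comment=None, comment_url=None, rest={'HttpOnly': None},
    rfc2109=False,
))

print(jar_to_dict(jar))  # {'qa_session_id': 'deadbeef'}
```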
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1744/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1744/timeline
| null |
completed
| null | null | false |
[
"Tell me, is `self.cookies` in your example a `CookieJar`?\n",
"The cookie is obtained from a successful login which is done via requests post .. so whatever object that returns\nBTW I have reverted to version 1.2.3 and everything is working fine again\n\nFrom: Cory Benfield [mailto:[email protected]]\nSent: 18 November 2013 13:18\nTo: kennethreitz/requests\nCc: Adrian Upton\nSubject: Re: [requests] Upgrade to 2.0.1 breaks existing code wherever cookies are passed as keyword arguments. (#1744)\n\nTell me, is self.cookies in your example a CookieJar?\n\n—\nReply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1744#issuecomment-28696779.\n",
"I feel fairly confident this is exactly the same issue as https://github.com/kennethreitz/requests/issues/1711 and that @auworkshare did not look at open issues before filing this one.\n",
"@auworkshare If you're using cookies you got from a successful login, you should really just be using a `Session` to persist your state for you. I'm assuming you're just taking the cookies from the `Response` object, which are in a `CookieJar`. This means that @sigmavirus24 is correct, this is a duplicate of #1711.\n",
"When will the fix be available ?\n\nFrom: Cory Benfield [mailto:[email protected]]\nSent: 18 November 2013 15:47\nTo: kennethreitz/requests\nCc: Adrian Upton\nSubject: Re: [requests] Upgrade to 2.0.1 breaks existing code wherever cookies are passed as keyword arguments. (#1744)\n\n@auworksharehttps://github.com/auworkshare If you're using cookies you got from a successful login, you should really just be using a Session to persist your state for you. I'm assuming you're just taking the cookies from the Response object, which are in a CookieJar. This means that @sigmavirus24https://github.com/sigmavirus24 is correct, this is a duplicate of #1711https://github.com/kennethreitz/requests/issues/1711.\n\n—\nReply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1744#issuecomment-28708654.\n",
"In the next version of Requests. We don't commit to a timeline more specific than that. =)\n",
"OK\nThank for the advice on the Sessions, It will take us a lot of recoding to implement\nWhich is why we would prefer to wait for the fix.\nCheers.\n\nFrom: Cory Benfield [mailto:[email protected]]\nSent: 18 November 2013 16:33\nTo: kennethreitz/requests\nCc: Adrian Upton\nSubject: Re: [requests] Upgrade to 2.0.1 breaks existing code wherever cookies are passed as keyword arguments. (#1744)\n\nIn the next version of Requests. We don't commit to a timeline more specific than that. =)\n\n—\nReply to this email directly or view it on GitHubhttps://github.com/kennethreitz/requests/issues/1744#issuecomment-28713407.\n"
] |
https://api.github.com/repos/psf/requests/issues/1743
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1743/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1743/comments
|
https://api.github.com/repos/psf/requests/issues/1743/events
|
https://github.com/psf/requests/pull/1743
| 22,821,719 |
MDExOlB1bGxSZXF1ZXN0MTAwNDE3MTU=
| 1,743 |
Separate connect timeouts
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
}
] |
closed
| true | null |
[] | null | 11 |
2013-11-18T08:54:49Z
|
2021-09-08T23:06:28Z
|
2013-12-05T22:29:56Z
|
CONTRIBUTOR
|
resolved
|
Support separate connect timeouts in Requests by allowing you to write something like
``` python
import requests
from requests.structures import Timeout

t = Timeout(connect=3, read=7)
requests.get('https://api.twilio.com', timeout=t)
```
Adds tests, docs and some new exceptions for this functionality.
Please note, this kills the current `stream` based branching of timeouts. The `timeout` value applies only to the amount of time between bytes sent by the server. So, if a read timeout is set to 20 seconds and the server streams one byte every 15 seconds, the read timeout will never get hit. This means long streaming downloads are ok and won't hit the read timeout even if it's specified.
It is perfectly valid to set a read timeout on a streaming request - if the request doesn't start streaming in 15 seconds, then raise an error. With the current master, this is impossible.
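One reviewer below suggests a plain tuple instead of a `Timeout` class, which is the convention requests eventually adopted (`timeout=(connect, read)`). A small sketch of that destructuring, with `split_timeout` as a hypothetical helper name:

``` python
def split_timeout(timeout):
    """Normalize a timeout argument into (connect, read) seconds.

    Assumes the tuple convention floated in the review thread; a bare
    number applies the same limit to both phases.
    """
    if isinstance(timeout, tuple):
        connect, read = timeout
    else:
        connect = read = timeout
    return connect, read

print(split_timeout(9))       # (9, 9)
print(split_timeout((3, 7)))  # (3, 7)
```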
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1743/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1743/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1743.diff",
"html_url": "https://github.com/psf/requests/pull/1743",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1743.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1743"
}
| true |
[
"I'm 100% sitting on the fence here. This is an API change, so we need to see if @kennethreitz likes it or not. =)\n",
"I don't think this is needed, personally. I appreciate the contribution, though.\n\nPerhaps we should automatically enforce the timeout for the first bytes being sent for the stream. Would this solve your initial problem?\n",
"> I don't think this is needed, personally.\n\nAre you referring to separate connect timeouts in general, or the ability to set a timeout on a streaming request? I'm more or less indifferent about the latter, and it's easy to revert.\n",
"Well, I've done a lot of work to try to prevent there from being a Timeout object :)\n",
"That's fine... I guess I am indifferent about the interface, but would appreciate the ability to specify different connect and read timeouts via python-requests.\n",
"@kevinburke i'll try to chat with you on IRC about this this week. :)\n",
"Sorry if it seems like I'm being OCD about mundane details... that's kinda the point of the project though :)\n",
"This is pretty essential for any use of a streaming API, since without a way to set a timeout there's no good way to detect a stalled request. Twitter, for instance, sends a newline every 30 seconds, and advises you consider the connection stalled if 90 seconds go by without one.\n\nI naturally assumed that the timeout parameter applied to all reads as documented, and wasn't special-cased for streaming requests - why not just revert to that behaviour?\n",
"That's basically what I'm thinking of doing :)\n",
"Why not use a tuple to represent the different kinds of timeout when people want to differentiate. It avoids adding a `Timeout` class and we can destructure it to the above usage?\n",
"Hi Kenneth,\nWould be nice to chat a bit more about your reasoning for closing this PR - I've tried reaching out in IRC, not sure of the best way to get in touch.\n\nBest,\nKevin\n"
] |
https://api.github.com/repos/psf/requests/issues/1742
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1742/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1742/comments
|
https://api.github.com/repos/psf/requests/issues/1742/events
|
https://github.com/psf/requests/pull/1742
| 22,812,245 |
MDExOlB1bGxSZXF1ZXN0MTAwMzcwMzA=
| 1,742 |
Add make target to build the documentation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 2 |
2013-11-18T02:39:07Z
|
2021-09-08T11:00:52Z
|
2013-11-20T09:02:38Z
|
CONTRIBUTOR
|
resolved
|
Hooray makefiles :)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1742/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1742/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1742.diff",
"html_url": "https://github.com/psf/requests/pull/1742",
"merged_at": "2013-11-20T09:02:38Z",
"patch_url": "https://github.com/psf/requests/pull/1742.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1742"
}
| true |
[
"Sure, why not? :+1:\n",
"this is super cool! thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/1741
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1741/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1741/comments
|
https://api.github.com/repos/psf/requests/issues/1741/events
|
https://github.com/psf/requests/pull/1741
| 22,785,659 |
MDExOlB1bGxSZXF1ZXN0MTAwMjc0MDc=
| 1,741 |
Significantly reduce library footprint
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5554910?v=4",
"events_url": "https://api.github.com/users/lordjabez/events{/privacy}",
"followers_url": "https://api.github.com/users/lordjabez/followers",
"following_url": "https://api.github.com/users/lordjabez/following{/other_user}",
"gists_url": "https://api.github.com/users/lordjabez/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lordjabez",
"id": 5554910,
"login": "lordjabez",
"node_id": "MDQ6VXNlcjU1NTQ5MTA=",
"organizations_url": "https://api.github.com/users/lordjabez/orgs",
"received_events_url": "https://api.github.com/users/lordjabez/received_events",
"repos_url": "https://api.github.com/users/lordjabez/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lordjabez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lordjabez/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lordjabez",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2013-11-16T21:50:05Z
|
2021-09-08T23:05:07Z
|
2013-11-16T21:52:44Z
|
NONE
|
resolved
|
Closes #1. Closes #2.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5554910?v=4",
"events_url": "https://api.github.com/users/lordjabez/events{/privacy}",
"followers_url": "https://api.github.com/users/lordjabez/followers",
"following_url": "https://api.github.com/users/lordjabez/following{/other_user}",
"gists_url": "https://api.github.com/users/lordjabez/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lordjabez",
"id": 5554910,
"login": "lordjabez",
"node_id": "MDQ6VXNlcjU1NTQ5MTA=",
"organizations_url": "https://api.github.com/users/lordjabez/orgs",
"received_events_url": "https://api.github.com/users/lordjabez/received_events",
"repos_url": "https://api.github.com/users/lordjabez/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lordjabez/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lordjabez/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lordjabez",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1741/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1741/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1741.diff",
"html_url": "https://github.com/psf/requests/pull/1741",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1741.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1741"
}
| true |
[
"Oops, this was supposed to be a local pull request to my own fork. Sorry!\n",
"Err, what's going on?\n\nEdit: nvm :D \n"
] |
https://api.github.com/repos/psf/requests/issues/1740
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1740/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1740/comments
|
https://api.github.com/repos/psf/requests/issues/1740/events
|
https://github.com/psf/requests/pull/1740
| 22,745,473 |
MDExOlB1bGxSZXF1ZXN0MTAwMDY5MjI=
| 1,740 |
Added multipart param, to be used in post request (harmless otherwise)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/657989?v=4",
"events_url": "https://api.github.com/users/bafio/events{/privacy}",
"followers_url": "https://api.github.com/users/bafio/followers",
"following_url": "https://api.github.com/users/bafio/following{/other_user}",
"gists_url": "https://api.github.com/users/bafio/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bafio",
"id": 657989,
"login": "bafio",
"node_id": "MDQ6VXNlcjY1Nzk4OQ==",
"organizations_url": "https://api.github.com/users/bafio/orgs",
"received_events_url": "https://api.github.com/users/bafio/received_events",
"repos_url": "https://api.github.com/users/bafio/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bafio/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bafio/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bafio",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2013-11-15T17:12:17Z
|
2021-09-08T22:01:05Z
|
2013-11-15T17:16:45Z
|
NONE
|
resolved
|
Basic usage:
r = requests.post(url, data=data, multipart=True)
behaves the same as
r = requests.post(url, data=data, files={'filename': data})
but without putting `; filename="filename"` in the Content-Disposition of the multipart body.
The former is sometimes rejected:
Content-Disposition: form-data; name="data"; filename="filename"
while this one is accepted:
Content-Disposition: form-data; name="data"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1740/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1740/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1740.diff",
"html_url": "https://github.com/psf/requests/pull/1740",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1740.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1740"
}
| true |
[
"Thanks for this! However, we are almost never going to accept new parameters in the Requests functional API, especially not ones that are simply True/False. =)\n\nThe expected method for doing what you're trying to do [can be found here](http://stackoverflow.com/a/19974452/1401686). Basically what you want is:\n\n``` python\nr = requests.post(url, files={\"data\": (\"\", data, '', {})})\n```\n\nAgain, thanks for putting in the time! Let me know if you have any further problems.\n"
] |
https://api.github.com/repos/psf/requests/issues/1739
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1739/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1739/comments
|
https://api.github.com/repos/psf/requests/issues/1739/events
|
https://github.com/psf/requests/pull/1739
| 22,734,687 |
MDExOlB1bGxSZXF1ZXN0MTAwMDA3Mjc=
| 1,739 |
add coverage make target
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1533151?v=4",
"events_url": "https://api.github.com/users/coldfire-x/events{/privacy}",
"followers_url": "https://api.github.com/users/coldfire-x/followers",
"following_url": "https://api.github.com/users/coldfire-x/following{/other_user}",
"gists_url": "https://api.github.com/users/coldfire-x/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/coldfire-x",
"id": 1533151,
"login": "coldfire-x",
"node_id": "MDQ6VXNlcjE1MzMxNTE=",
"organizations_url": "https://api.github.com/users/coldfire-x/orgs",
"received_events_url": "https://api.github.com/users/coldfire-x/received_events",
"repos_url": "https://api.github.com/users/coldfire-x/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/coldfire-x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coldfire-x/subscriptions",
"type": "User",
"url": "https://api.github.com/users/coldfire-x",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 3 |
2013-11-15T14:22:03Z
|
2021-09-08T22:01:04Z
|
2013-11-15T18:51:47Z
|
CONTRIBUTOR
|
resolved
|
Sets up code coverage for [issue 1545](https://github.com/kennethreitz/requests/issues/1545).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1739/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1739/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1739.diff",
"html_url": "https://github.com/psf/requests/pull/1739",
"merged_at": "2013-11-15T18:51:47Z",
"patch_url": "https://github.com/psf/requests/pull/1739.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1739"
}
| true |
[
"Sweet, this looks reasonable to me. Thanks so much for this! :+1:\n",
"i love py.test :)\n",
"i also love you, @pengfei-xue :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/1738
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1738/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1738/comments
|
https://api.github.com/repos/psf/requests/issues/1738/events
|
https://github.com/psf/requests/pull/1738
| 22,723,903 |
MDExOlB1bGxSZXF1ZXN0OTk5NDQ3MQ==
| 1,738 |
add coverage make target
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1533151?v=4",
"events_url": "https://api.github.com/users/coldfire-x/events{/privacy}",
"followers_url": "https://api.github.com/users/coldfire-x/followers",
"following_url": "https://api.github.com/users/coldfire-x/following{/other_user}",
"gists_url": "https://api.github.com/users/coldfire-x/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/coldfire-x",
"id": 1533151,
"login": "coldfire-x",
"node_id": "MDQ6VXNlcjE1MzMxNTE=",
"organizations_url": "https://api.github.com/users/coldfire-x/orgs",
"received_events_url": "https://api.github.com/users/coldfire-x/received_events",
"repos_url": "https://api.github.com/users/coldfire-x/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/coldfire-x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coldfire-x/subscriptions",
"type": "User",
"url": "https://api.github.com/users/coldfire-x",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
}
] |
closed
| true | null |
[] | null | 2 |
2013-11-15T10:16:41Z
|
2021-09-08T23:05:11Z
|
2013-11-15T14:24:16Z
|
CONTRIBUTOR
|
resolved
|
Per [issue 1545](https://github.com/kennethreitz/requests/issues/1545), set up code coverage.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1533151?v=4",
"events_url": "https://api.github.com/users/coldfire-x/events{/privacy}",
"followers_url": "https://api.github.com/users/coldfire-x/followers",
"following_url": "https://api.github.com/users/coldfire-x/following{/other_user}",
"gists_url": "https://api.github.com/users/coldfire-x/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/coldfire-x",
"id": 1533151,
"login": "coldfire-x",
"node_id": "MDQ6VXNlcjE1MzMxNTE=",
"organizations_url": "https://api.github.com/users/coldfire-x/orgs",
"received_events_url": "https://api.github.com/users/coldfire-x/received_events",
"repos_url": "https://api.github.com/users/coldfire-x/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/coldfire-x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coldfire-x/subscriptions",
"type": "User",
"url": "https://api.github.com/users/coldfire-x",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1738/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1738/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1738.diff",
"html_url": "https://github.com/psf/requests/pull/1738",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1738.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1738"
}
| true |
[
"Can you rebase against master? Some pull requests got accepted this morning that mean your `requirements.txt` is out of date. =)\n",
  "@Lukasa i created another pull request, please close this out thanks.\n"
] |
https://api.github.com/repos/psf/requests/issues/1737
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1737/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1737/comments
|
https://api.github.com/repos/psf/requests/issues/1737/events
|
https://github.com/psf/requests/issues/1737
| 22,655,515 |
MDU6SXNzdWUyMjY1NTUxNQ==
| 1,737 |
[Suggestion] Simplify charset handling
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/142512?v=4",
"events_url": "https://api.github.com/users/itsadok/events{/privacy}",
"followers_url": "https://api.github.com/users/itsadok/followers",
"following_url": "https://api.github.com/users/itsadok/following{/other_user}",
"gists_url": "https://api.github.com/users/itsadok/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/itsadok",
"id": 142512,
"login": "itsadok",
"node_id": "MDQ6VXNlcjE0MjUxMg==",
"organizations_url": "https://api.github.com/users/itsadok/orgs",
"received_events_url": "https://api.github.com/users/itsadok/received_events",
"repos_url": "https://api.github.com/users/itsadok/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/itsadok/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itsadok/subscriptions",
"type": "User",
"url": "https://api.github.com/users/itsadok",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 6 |
2013-11-14T10:56:19Z
|
2014-10-05T17:26:54Z
| null |
CONTRIBUTOR
| null |
To cut to the chase, here are my suggestions:
- Remove automatic character set detection (charade) from the library
- Have response.encoding represent the charset from the Content-Type header
- Mention the caveat in the documentation
### Long version
There seems to be a lot of confusion regarding how the `.text` property works. After getting into some trouble with it myself, I searched the issues list, and found a dozen or so issues, all boiling down to the same mismatch between users' expectations and the intent of the library designers.
#147 - bytecode string returned when page has charset=UTF-8
#156 - get_unicode_from_response does not check charsets from meta tags
#592 - Internal encoding detection doesn't match external chardet call
#654 - requests.get() ignores charset=UTF-8 and BOM
#765 - Chardet sometimes fails and force the wrong encoding
#861 - parsing encoding utf-8 page doesn't as expected
#1087 - Encodings from content
#1150 - On some pages requests detect encoding incorrectly
#1546 - use a default encoding in Response's text property
#1589 - Make sure content is checked when setting encoding
#1604 - Response.text returns improperly decoded text
#1683 - models.text Behaviour (encoding choice)
(It must be tiring to have the same conversation over and over again. I hope I'm being helpful here and not just piling on).
The argument seems to be:
- As an HTTP library, requests should not know or care about HTML and META attributes
- RFC 2616 states that if no charset is defined, "text/*" media types should be regarded as ISO-8859-1
I accept both these arguments. However, the documentation seems a bit coy, saying that "Requests makes an educated guess about the encoding", implying chardet/charade. In practice, for any content with a "text" media subtype, charade will not be used unless the user explicitly sets the `response.encoding` to None before reading the `.text` property.
Additionally, while ISO-8859-1 can be used as a default, won't it make more sense to handle that default in `.text` and not in the `get_encoding_from_headers` method? This way, the `encoding` property will be None if indeed no encoding is specified, allowing the user to make the decision on how to proceed.
If you're going to keep the `.text` property, I think it should do a simple decoding if the charset is specified in the headers, and throw an exception otherwise. This way is much less confusing than the state of affairs now. Additionally, the documentation should contain a warning not to use it for arbitrary web pages, and perhaps a code sample showing the proper way to do it.
``` python
import re
import charade
import requests
def get_encodings_from_content(content):
charset_re = re.compile(r'<meta.*?charset=["\']*(.+?)["\'>]', flags=re.I)
pragma_re = re.compile(r'<meta.*?content=["\']*;?charset=(.+?)["\'>]', flags=re.I)
xml_re = re.compile(r'^<\?xml.*?encoding=["\']*(.+?)["\'>]')
# FIXME: Does not work in python 3
return (charset_re.findall(content) +
pragma_re.findall(content) +
xml_re.findall(content))
r = requests.get('https://example.com/page.html')
if "charset" not in r.headers.get("content-type", ""):
encodings = get_encodings_from_content(r.content)
if encodings:
r.encoding = encodings[0]
else:
r.encoding = charade.detect(r.content)['encoding']
html = r.text
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/142512?v=4",
"events_url": "https://api.github.com/users/itsadok/events{/privacy}",
"followers_url": "https://api.github.com/users/itsadok/followers",
"following_url": "https://api.github.com/users/itsadok/following{/other_user}",
"gists_url": "https://api.github.com/users/itsadok/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/itsadok",
"id": 142512,
"login": "itsadok",
"node_id": "MDQ6VXNlcjE0MjUxMg==",
"organizations_url": "https://api.github.com/users/itsadok/orgs",
"received_events_url": "https://api.github.com/users/itsadok/received_events",
"repos_url": "https://api.github.com/users/itsadok/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/itsadok/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itsadok/subscriptions",
"type": "User",
"url": "https://api.github.com/users/itsadok",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1737/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1737/timeline
| null | null | null | null | false |
[
"You've got lots of good suggestions here. I'll respond to some of them. In no particular order:\n1. `Response.text` should never ever ever ever throw an exception if we can possibly avoid it. Properties throwing exceptions is bad. If we were going to start throwing exceptions from `Response.text` I'd want it to become `Response.text()`.\n2. Your observation that `charade` will never get called for any content with a `text/*` MIME type is totally correct, and by design. RFC 2616 is incredibly clear on this point.\n3. I am open to moving the ISO-8859-1 default to `Response.text` and out of `get_encoding_from_headers`.\n4. I am not open to removing flexibility from `Response.text`.\n\nStripping all the functionality from `Response.text`, as you suggest in your last point, seems silly to me. If we're going that far, we should remove `Response.text` altogether.\n\n`Response.text` has _carte blanche_ to do whatever it can to correctly decode the output. There are points of finesse here, but that will always be the case. No matter what `Response.text` does, someone will disapprove.\n",
"@Lukasa you were tricked into saying this:\n\n> Stripping all the functionality from Response.text, as you suggest in your last point, seems silly to me. If we're going that far, we should remove Response.text altogether.\n\nThis is clearly the agenda of this issue as you can tell by:\n\n> If you're going to keep the .text property\n\n@itsadok clearly wants the `.text` property to disappear because issues have been filed regarding it in the past.\n\nLet me address one other thing that @Lukasa didn't before I add my opinion.\n\n> Additionally, the documentation should contain a warning not to use it for arbitrary web pages, and perhaps a code sample showing the proper way to do it.\n1. charade works fairly well for well established codecs. There are new ones that subsume old ones which it doesn't support yet. Why? There aren't publicly available statistics for those encodings and that's what charade relies on. If you disagree with how something is being detected, why not file a bug report on charade?\n2. That code sample is **not** the proper way to do it. Using regular expressions on HTML is insanity and is **never** the correct answer.\n\nWith that addressed, let me address one more theme of this issue: Because _some negligible percentage_ of all issues have been filed about _x_, _x_ should be (changed|removed).\n\nOne thing to note is that all the issues with numbers lower than 1000 were filed before requests 1.0 which is when the API was finalized. If there were legitimate bugs in this attribute prior to that, I would be far from surprised. Also some of those issues are instead about the choice that chardet/charade made. Those are not bugs in requests or `.text` but instead in the underlying support.\n\nFinally, after the release of 1.0 we had a lot of issues about the change from `json` being a property on a Response object to becoming a method. We didn't remove it or change it back for a good reason. 
It was a deliberate design decision.\n\nThe `.text` attribute is quite crucial to this library, especially to the `json` method, and it will likely never be removed. Can it be improved? Almost certainly. You provided a couple of good suggestions, but the overall tone this issue is meant to convince the reader that it should be removed and that will not happen. Without a reasonable guess at the encoding of the text, we cannot provide the user with the `json` method which also will not go away. Simply, the user is not the sole consumer of `.text`.\n",
"Sorry about the closing and reopening, that was a mis-click.\n\n@sigmavirus24 I'm sorry if it seems like I have an agenda. I'm honestly just trying to help. I read through the discussions in **all** of the issues I posted. They all really seem to revolve around the same basic confusion, with people expecting Response.text to be an all-encompassing solution where in reality it is not. \n\nLet me just clarify some misunderstandings in what I wrote.\n\n> > Additionally, the documentation should contain a warning not to use it for arbitrary web pages, and perhaps a code sample showing the proper way to do it.\n> > charade works fairly well for well established codecs. There are new ones that subsume old ones which it doesn't support yet. Why? There aren't publicly available statistics for those encodings and that's what charade relies on. If you disagree with how something is being detected, why not file a bug report on charade?\n\nI merely meant that it should be noted that `Response.text` should not be used willy-nilly on arbitrary web page, precisely because it avoids using charade in many places where it can be used.\n\n> That code sample is not the proper way to do it. Using regular expressions on HTML is insanity and is never the correct answer.\n\nThe `get_encodings_from_content` function is fully copied from `requests.utils`, with the added comment by me that it doesn't really work in Python 3. In any case the point was that this needs to be clarified, not my specific solution.\n\n> The .text attribute is quite crucial to this library, especially to the json method, and it will likely never be removed. Can it be improved? Almost certainly. You provided a couple of good suggestions, but the overall tone this issue is meant to convince the reader that it should be removed and that will not happen. Without a reasonable guess at the encoding of the text, we cannot provide the user with the json method which also will not go away. 
Simply, the user is not the sole consumer of .text.\n\nThis is a valid point, and perhaps serves to explain the weird nature of .text. Perhaps all that is needed is a note in the documentation.\n",
"OK, mea culpa. I missed the part where `Response.json` relies almost entirely on `Response.text` for charset issues. That explains some of the design, and I would have phrased my post differently if I had noticed.\n\nIt seems like I hit a nerve, which was really not my intention. I'm going to close the issue, but there is one pun that I just have to make.\n\n> 1. Your observation that charade will never get called for any content with a text/\\* MIME type is totally correct, and by design. RFC 2616 is incredibly clear on this point.\n\nThe thing is, it seems that in most cases where you'd even want to access the `.text` property, you would also have a text/\\* MIME type (application/json being the exception). That means that the cases where automatic charset detection is actually used are pretty rare, and unpredictable for the user. _So why keep the charade?_\n",
"> I merely meant that it should be noted that Response.text should not be used willy-nilly on arbitrary web page, precisely because it avoids using charade in many places where it can be used.\n\nJust because something _can_ be used somewhere does not mean it should be. `charade` is slow as a result of its accuracy and painstaking meticulousness. Using it when it _can_ be used as opposed to when it _must_ be used makes the performance difference in the user's eyes.\n\n> The get_encodings_from_content function is fully copied from requests.utils\n\nBut we never use it. It is cruft and _should_ be removed. It is the wrong way to do things.\n\n> That means that the cases where automatic charset detection is actually used are pretty rare, and unpredictable for the user. So why keep the charade?\n\nYou're assuming everything behaves the same way and that RFCs are followed by servers. They're not. Charade is occasionally used. We keep it because it is essentially part of the API. Response's couldn't have an `apparent_encoding` attribute if we discarded charade. We can break the API if we ever release 3.0 but until then, the API is frozen except for backwards compatible changes. Also, that's an excellent pun.\n\n> It seems like I hit a nerve, which was really not my intention. I'm going to close the issue\n\nI'm going to re-open it. You made valid points as @Lukasa pointed out. It just needs to be clear that this is not any agreement about removing the property.\n",
"Heh, @sigmavirus24 both have the same reactions. Neither of us thinks this issue should be closed: I reopened it just before he could!\n\nNeither of us is angry, or unhappy about having this feedback. We're both delighted. You just can't tell because of the limitations of textual communication! =D\n\nThe reality is that text encoding is hard, everyone is doing the wrong thing from time to time, and we cannot possibly please anyone. For that reason, we do the best we can, and then we expose the `Response.encoding` property for people who don't like the way we do it.\n\nYou've identified good issues with `Response.text`, and I plan to fix them (unless someone else does so first). They're not all backward compatible, so we'll need to sit on them for a bit, but they're good.\n"
] |
https://api.github.com/repos/psf/requests/issues/1736
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1736/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1736/comments
|
https://api.github.com/repos/psf/requests/issues/1736/events
|
https://github.com/psf/requests/issues/1736
| 22,645,627 |
MDU6SXNzdWUyMjY0NTYyNw==
| 1,736 |
Accept per-field headers in multipart POST requests (and create multipart/mixed requests)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/141436?v=4",
"events_url": "https://api.github.com/users/JanJakes/events{/privacy}",
"followers_url": "https://api.github.com/users/JanJakes/followers",
"following_url": "https://api.github.com/users/JanJakes/following{/other_user}",
"gists_url": "https://api.github.com/users/JanJakes/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JanJakes",
"id": 141436,
"login": "JanJakes",
"node_id": "MDQ6VXNlcjE0MTQzNg==",
"organizations_url": "https://api.github.com/users/JanJakes/orgs",
"received_events_url": "https://api.github.com/users/JanJakes/received_events",
"repos_url": "https://api.github.com/users/JanJakes/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JanJakes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JanJakes/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JanJakes",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2013-11-14T06:39:24Z
|
2021-09-09T00:34:17Z
|
2013-11-24T10:53:11Z
|
NONE
|
resolved
|
Currently there is no way of doing such requests:
``` bash
POST /api/ HTTP/1.1
Host: 127.0.0.1:5000
Content-Type: multipart/form-data; boundary=495b38322a354377913185cb452518fd
--495b38322a354377913185cb452518fd
Content-Disposition: form-data; name="data"
Content-Type: application/json; charset=UTF-8
{"key2": "value2", "key1": "value1"}
--495b38322a354377913185cb452518fd
Content-Disposition: form-data; name="file"; filename="image.jpg"
...image data...
--495b38322a354377913185cb452518fd--
```
Sending a file along with JSON-encoded metadata is a very typical task for API implementations.
What is even weirder is that we can do this for files but not for other fields that are sent with the files.
Also, is there any way to compose a multipart POST request with `multipart/mixed` header? (With keeping the possibility to choose the fields and their headers.)
Thanks, Jan
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1736/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1736/timeline
| null |
completed
| null | null | false |
[
"Yes there is. Take a look at the [changelog](https://github.com/kennethreitz/requests/blob/master/HISTORY.rst#201-2013-10-24).\n\nThe syntax for doing this and including things that aren't files (like some JSON) is a bit of a pain, but here it is:\n\n``` python\nimport json\nfrom cStringIO import StringIO\n\nmetadata = {'key1': 'value1', 'key2': 'value2'}\njson = StringIO(json.dumps(metadata))\n\nfiles = {\"data\": (\"\", json, \"application/json; charset=UTF-8\", {}), \"file\": (\"image.jpg\", imagefile, \"\", {})}\nr = requests.post('http://www.api.com/, files=files)\n```\n\nThe reason this API is totally nasty is because the `multipart/form-data` content type is very flexible. This makes it difficult for us to provide a clean API that covers all use cases. I'm strongly averse to the current method of doing it. I think we should be providing a contrib library (something like `requests-multipart`) that is capable of doing all of this craziness, rather than leaving it in Requests.\n\nFinally, yes, you can do a `multipart/mixed` header by breaking out the request preparation [as you can see in the documentation](http://docs.python-requests.org/en/latest/user/advanced/#prepared-requests). An example:\n\n``` python\nfrom requests import Request, Session\n\n# Insert all that files nonsense here.\n\ns = Session()\nreq = Request('POST', url, files=files)\nprepped = s.prepare_request(req)\nprepped.headers['Content-Type'] = 'multipart/mixed'\nresp = s.send(prepped)\n```\n",
"Thanks a lot for such a great answer! Yes, the API for this isn't straightforward but at least it's possible without touching `urllib3` and stuff.\n\nI also don't like the current method because it kind of covers the most typical use cases but some more special cases (yet still valid ones) are maybe possible but only with those \"hacks\". The `multipart` request is so flexible that the best thing in my opinion would be to have something like a `MultipartRequest` that would be a composition of any `Request`s (or `RequestField`s or whatever).\n",
"Anything we do to improve our flexibility here should be done outside of `requests` core. It's just complexity we don't need. If anything, I'd like to yank most of this out of Requests core, allow people only to do the most basic multipart uploads, and then provide an external module that does the rest. Not sure Kenneth would agree with me though.\n",
"@Lukasa there was a reason I started https://github.com/sigmavirus24/requests-data-schemes. It was a bad name but the intent was to provide a better way of doing complicated multipart uploads (specifically when no files should be sent, which is an **extremely** common use case).\n",
"Related: shazow/urllib3#215\n"
] |
https://api.github.com/repos/psf/requests/issues/1735
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1735/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1735/comments
|
https://api.github.com/repos/psf/requests/issues/1735/events
|
https://github.com/psf/requests/pull/1735
| 22,599,396 |
MDExOlB1bGxSZXF1ZXN0OTkzMzg0MQ==
| 1,735 |
Wheel
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 8 |
2013-11-13T15:39:09Z
|
2021-09-08T22:01:18Z
|
2013-11-15T09:33:38Z
|
MEMBER
|
resolved
|
Let's get ourselves green on [this list](http://pythonwheels.com/). We're pure python, so this should be totally trivial for us. I'll try to provide a PR tonight.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1735/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1735/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1735.diff",
"html_url": "https://github.com/psf/requests/pull/1735",
"merged_at": "2013-11-15T09:33:38Z",
"patch_url": "https://github.com/psf/requests/pull/1735.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1735"
}
| true |
[
"Alright, I've extended the Makefile so that it should be possible to just start doing this transparently. =) Let me know what we think!\n",
":+1:\n\nCan we add \"wheel\" to the (dev-)?requirements file? This means that any virtualenv will be fully equipped to deploy it. Then again not everyone needs it.\n",
"Sure, why not?\n",
"I guess this makes me doubly :+1:?\n",
"I was so afraid of this\n",
":wheelchair: \n",
"Why were you afraid @kennethreitz ?\n",
"i just thought it'd be a lot harder than that. :P\n"
] |
https://api.github.com/repos/psf/requests/issues/1734
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1734/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1734/comments
|
https://api.github.com/repos/psf/requests/issues/1734/events
|
https://github.com/psf/requests/issues/1734
| 22,397,098 |
MDU6SXNzdWUyMjM5NzA5OA==
| 1,734 |
UnicodeDecodeError on a multipart file upload
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1396111?v=4",
"events_url": "https://api.github.com/users/gcq/events{/privacy}",
"followers_url": "https://api.github.com/users/gcq/followers",
"following_url": "https://api.github.com/users/gcq/following{/other_user}",
"gists_url": "https://api.github.com/users/gcq/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gcq",
"id": 1396111,
"login": "gcq",
"node_id": "MDQ6VXNlcjEzOTYxMTE=",
"organizations_url": "https://api.github.com/users/gcq/orgs",
"received_events_url": "https://api.github.com/users/gcq/received_events",
"repos_url": "https://api.github.com/users/gcq/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gcq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gcq/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gcq",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2013-11-10T00:54:47Z
|
2021-09-09T00:34:23Z
|
2013-11-10T07:38:25Z
|
NONE
|
resolved
|
Since im in python3.3 and requests takes a file-like object as the file argument on a post request, why is there an Unicode error? What should i do to be able to upload a multimedia file using requests?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1734/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1734/timeline
| null |
completed
| null | null | false |
[
"Questions belong on [StackOverflow](http://stackoverflow.com/questions/tagged/python-requests). In general though, multimedia files will not be Unicode compatible because it will attempt to read sequences of bytes in the file object as Unicode multi-byte sequences instead. Ostensibly there's inherent string addition which is causing this. Regardless, without code to reproduce this with, I'm leaning towards closing this.\n",
"Until we get code and a stack trace, I'm closing this.\n",
"Oh, how in time! I have the same here. \nPython 2.7, requests latest from Github.\n\n``` Bash\nException in thread Thread-55:\nTraceback (most recent call last):\n File \"C:\\Program Files\\python27\\lib\\threading.py\", line 551, in __bootstrap_inner\n self.run()\n File \"C:\\Program Files\\python27\\lib\\threading.py\", line 504, in run\n self.__target(*self.__args, **self.__kwargs)\n File \"D:\\gitrepos\\TheQube\\src\\transfer_dialogs\\transfer_dialogs.py\", line 63, in perform_transfer\n r = requests.post(self.url, files = file)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\api.py\", line 88, in post\n return request('post', url, data=data, **kwargs)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\sessions.py\", line 327, in request\n prep = self.prepare_request(req)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\sessions.py\", line 265, in prepare_request\n hooks=merge_setting(request.hooks, self.hooks),\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\models.py\", line 286, in prepare\n self.prepare_body(data, files)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\models.py\", line 417, in prepare_body\n (body, content_type) = self._encode_files(files, data)\n File \"C:\\Program Files\\python27\\lib\\site-packages\\requests-2.0.1-py2.7.egg\\requests\\models.py\", line 139, in _encode_files\n rf = RequestField(name=k, data=fp.read(),\n File \"C:\\Program Files\\python27\\lib\\codecs.py\", line 671, in read\n return self.reader.read(size)\n File \"C:\\Program Files\\python27\\lib\\codecs.py\", line 477, in read\n newchars, decodedbytes = self.decode(data, self.errors)\nUnicodeDecodeError: 'utf8' codec can't decode byte 0xc5 in position 21: invalid continuation byte\n```\n",
"@Oire can you give more details than that? What kind of file are you trying to post? Any other details? \n",
"I'm trying to post MP3 files with either Cyrillic names (like _Я сюда ещё вернусь.mp3_) or with extended latin names (like _Mànran — Oran na cloiche.mp3_). \nWhen I'm debugging, I see these names as \\u-something sequences, like this one:\n\n``` Bash\nSun Nov 10, 2013 22:27:11 transfer_dialogs DEBUG: File: {'file': <open file u'D:\\\\playlists\\\\Collection\\\\Russian\\\\\\u0425\\u0443\\u0443\\u043d \\u0425\\u0443\\u0443\\u0440 \\u0422\\u0443 - \\u0415\\u0441\\u043b\\u0438 \\u0431\\u044b \\u044f \\u0431\\u044b\\u043b \\u0440\\u043e\\u0436\\u0434\\u0435\\u043d \\u043e\\u0440\\u043b\\u043e\\u043c.mp3', mode 'rb' at 0x0A597D30>}\n```\n\nI'm using not just `open()` but `codecs.open()` because plain `open()` didn't allow me to send without such errors. The other side gave me _Use HTTP post and provide only a file_.\n",
"I don't understand what this means:\n\n> The other side gave me _Use HTTP post and provide only a file._\n\nBut I can only assume that you're using the `files` parameter when you should not be using `multipart/form-data` requests, but instead should just be sending the plain data via the `data` parameter. That's not a bug and further difficulties you have should be posed to [StackOverflow](http://stackoverflow.com/questions/tagged/python-requests)\n"
] |
https://api.github.com/repos/psf/requests/issues/1733
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1733/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1733/comments
|
https://api.github.com/repos/psf/requests/issues/1733/events
|
https://github.com/psf/requests/pull/1733
| 22,395,404 |
MDExOlB1bGxSZXF1ZXN0OTgyNDE3Nw==
| 1,733 |
Response and Request objects are pickleable.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/509830?v=4",
"events_url": "https://api.github.com/users/ionrock/events{/privacy}",
"followers_url": "https://api.github.com/users/ionrock/followers",
"following_url": "https://api.github.com/users/ionrock/following{/other_user}",
"gists_url": "https://api.github.com/users/ionrock/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ionrock",
"id": 509830,
"login": "ionrock",
"node_id": "MDQ6VXNlcjUwOTgzMA==",
"organizations_url": "https://api.github.com/users/ionrock/orgs",
"received_events_url": "https://api.github.com/users/ionrock/received_events",
"repos_url": "https://api.github.com/users/ionrock/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ionrock/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ionrock/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ionrock",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 3 |
2013-11-09T22:49:23Z
|
2021-09-08T23:11:06Z
|
2013-11-20T09:01:26Z
|
CONTRIBUTOR
|
resolved
|
Includes a basic test. More could be add to confirm known attributes
that could cause problems.
This should fix #1367.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/1733/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1733/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1733.diff",
"html_url": "https://github.com/psf/requests/pull/1733",
"merged_at": "2013-11-20T09:01:26Z",
"patch_url": "https://github.com/psf/requests/pull/1733.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1733"
}
| true |
[
"Looks fine in principle to me, but I'd want @sigmavirus24 to take a look. =)\n",
"If you feel you want to add more tests @ionrock then go ahead but I'm happy with this as it is. :+1:\n",
":pickle: (this should be a thing)\n\nThanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/1732
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1732/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1732/comments
|
https://api.github.com/repos/psf/requests/issues/1732/events
|
https://github.com/psf/requests/issues/1732
| 22,382,764 |
MDU6SXNzdWUyMjM4Mjc2NA==
| 1,732 |
SNI support broken on Python 2.X in 2.0.1.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "5319e7",
"default": false,
"description": null,
"id": 67760318,
"name": "Fixed",
"node_id": "MDU6TGFiZWw2Nzc2MDMxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Fixed"
}
] |
closed
| true | null |
[] |
{
"closed_at": "2014-12-25T16:11:49Z",
"closed_issues": 1,
"created_at": "2013-11-09T08:57:36Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/19",
"id": 479442,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/19/labels",
"node_id": "MDk6TWlsZXN0b25lNDc5NDQy",
"number": 19,
"open_issues": 0,
"state": "closed",
"title": "2.0.2",
"updated_at": "2014-12-25T16:11:49Z",
"url": "https://api.github.com/repos/psf/requests/milestones/19"
}
| 3 |
2013-11-09T08:57:36Z
|
2021-09-08T23:08:00Z
|
2013-12-05T23:05:18Z
|
MEMBER
|
resolved
|
`urllib3` had a bug, we accidentally pulled it in, we didn't catch it because we don't test with SNI support (should we have that as a separate build?). Happily the awesome @t-8ch has already provided us with a patch, which is already merged, so we don't necessarily have an action on this: it'll be quietly fixed in the next release. Just raising this issue so that people who look on the issue tracker see it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1732/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1732/timeline
| null |
completed
| null | null | false |
[
"I bumped into this today, adding a little background to help some people who might have the same issue:\n\nIt's not just that SNI support is broken ; if you have the relevant dependencies installed, it will be impossible to install requests 2.0.1. To reproduce, do the following in a new virtualenv:\n- pip install pyOpenSSL ndg-httpsclient pyasn1\n- pip install requests==2.0.1\n\nHere is what I get:\n\n```\nDownloading/unpacking requests==2.0.1\n Downloading requests-2.0.1.tar.gz (412kB): 412kB downloaded\n Running setup.py egg_info for package requests\n Traceback (most recent call last):\n File \"<string>\", line 16, in <module>\n File \"/home/mat/.virtualenvs/testenv/build/requests/setup.py\", line 6, in <module>\n import requests\n File \"requests/__init__.py\", line 53, in <module>\n from .packages.urllib3.contrib import pyopenssl\n File \"requests/packages/urllib3/contrib/pyopenssl.py\", line 55, in <module>\n orig_connectionpool_ssl_wrap_socket = connectionpool.ssl_wrap_socket\n AttributeError: 'module' object has no attribute 'ssl_wrap_socket'\n Complete output from command python setup.py egg_info:\n Traceback (most recent call last):\n\n File \"<string>\", line 16, in <module>\n\n File \"/home/mat/.virtualenvs/testenv/build/requests/setup.py\", line 6, in <module>\n\n import requests\n\n File \"requests/__init__.py\", line 53, in <module>\n\n from .packages.urllib3.contrib import pyopenssl\n\n File \"requests/packages/urllib3/contrib/pyopenssl.py\", line 55, in <module>\n\n orig_connectionpool_ssl_wrap_socket = connectionpool.ssl_wrap_socket\n\nAttributeError: 'module' object has no attribute 'ssl_wrap_socket'\n```\n\nCleaning up the broken build and installing any other version works.\n",
"Good spot, I only bumped into this the other way around (requests installed, then subsequently install the dependencies for SNI).\n",
"Fixed in 2.1.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/1731
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1731/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1731/comments
|
https://api.github.com/repos/psf/requests/issues/1731/events
|
https://github.com/psf/requests/issues/1731
| 22,370,370 |
MDU6SXNzdWUyMjM3MDM3MA==
| 1,731 |
SSL certificate verify failed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1396111?v=4",
"events_url": "https://api.github.com/users/gcq/events{/privacy}",
"followers_url": "https://api.github.com/users/gcq/followers",
"following_url": "https://api.github.com/users/gcq/following{/other_user}",
"gists_url": "https://api.github.com/users/gcq/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gcq",
"id": 1396111,
"login": "gcq",
"node_id": "MDQ6VXNlcjEzOTYxMTE=",
"organizations_url": "https://api.github.com/users/gcq/orgs",
"received_events_url": "https://api.github.com/users/gcq/received_events",
"repos_url": "https://api.github.com/users/gcq/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gcq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gcq/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gcq",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2013-11-08T23:07:02Z
|
2021-09-09T00:28:13Z
|
2013-11-09T12:06:20Z
|
NONE
|
resolved
|
When using requests on https://cloudconvert.org/ i get this Exception, but Firefox says the certificate is alright.
I alredy notified cloudconvert, and wil update this accordingly, but as far as i know this can be a requests issue.
```
Traceback (most recent call last):
File "C:\Users\Gcq\Documents\GitHub\python-cloudconvert\CloudConvert.py", line 75, in <module>
print requests.get("https://cloudconvert.org/contact").text
File "C:\Python27\lib\site-packages\requests\api.py", line 55, in get
return request('get', url, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 335, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 438, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 331, in send
raise SSLError(e)
SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1396111?v=4",
"events_url": "https://api.github.com/users/gcq/events{/privacy}",
"followers_url": "https://api.github.com/users/gcq/followers",
"following_url": "https://api.github.com/users/gcq/following{/other_user}",
"gists_url": "https://api.github.com/users/gcq/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gcq",
"id": 1396111,
"login": "gcq",
"node_id": "MDQ6VXNlcjEzOTYxMTE=",
"organizations_url": "https://api.github.com/users/gcq/orgs",
"received_events_url": "https://api.github.com/users/gcq/received_events",
"repos_url": "https://api.github.com/users/gcq/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gcq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gcq/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gcq",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1731/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1731/timeline
| null |
completed
| null | null | false |
[
"Firstly, what version of Requests are you using? Requests 2.0.1 was released recently that contains a number of updates to our baked-in certificates.\n\nUsing 2.0.1, I get different behaviour to you. In Python 3, everything works perfectly out of the box. In Python 2, you'll need to install the requirements for SNI support. It's worth noting that our SNI support is actually bugged in 2.0.1: this will be fixed in the next release.\n",
"I had an outdated version indeed. I updated to 2.0.1 and still getting the exception.\n\nWhat are the requeriments for SNI? I looked trough some related issues, ans installed urllib3, but that doesn't fix it.\n",
"Are you getting that exact exception? Because I get one that's clearly related to SNI.\n\nThe requirements for SNI in Python 2.X can be found [here](http://stackoverflow.com/questions/18578439/using-requests-with-tls-doesnt-give-sni-support/18579484#18579484).\n",
"The Exception is exactly the same. No SNI related. Should I install the requeriments and see if it works or am i missing something?\n",
"No, 2.0.1 has bugged SNI support so it won't work. That's very strange. Can you try in Python 3.3?\n",
"Installing Python3. How can i get the commands to work without messing up my Python27 installation?\n",
"Python 3 should cleanly install alongside Python 2.7. You can install pip in Python 3.3, and then use the Python 3 pip executable to install Requests.\n",
"OK. It works. But sadly this means i'll have to change to Py3k. A change i would have to do sonner than later, but really sucks.\n\nAnyway :) thanks. It works.\n",
"I expect the SNI issue should get fixed in the main release fairly soon, which will mean you can go back to Python 2.7.\n",
"Alternatively, you can install the version of Requests from the GitHub repository, which already contains the SNI fix.\n",
"Thats easier! Thank you.\n"
] |
https://api.github.com/repos/psf/requests/issues/1730
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/1730/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/1730/comments
|
https://api.github.com/repos/psf/requests/issues/1730/events
|
https://github.com/psf/requests/pull/1730
| 22,324,227 |
MDExOlB1bGxSZXF1ZXN0OTc4ODA0MQ==
| 1,730 |
lowercase cookie names when check
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/542246?v=4",
"events_url": "https://api.github.com/users/solos/events{/privacy}",
"followers_url": "https://api.github.com/users/solos/followers",
"following_url": "https://api.github.com/users/solos/following{/other_user}",
"gists_url": "https://api.github.com/users/solos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/solos",
"id": 542246,
"login": "solos",
"node_id": "MDQ6VXNlcjU0MjI0Ng==",
"organizations_url": "https://api.github.com/users/solos/orgs",
"received_events_url": "https://api.github.com/users/solos/received_events",
"repos_url": "https://api.github.com/users/solos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/solos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/solos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/solos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2013-11-08T08:47:55Z
|
2021-09-08T22:01:03Z
|
2013-11-08T10:08:26Z
|
NONE
|
resolved
|
lowercase cookie names when check badargs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/542246?v=4",
"events_url": "https://api.github.com/users/solos/events{/privacy}",
"followers_url": "https://api.github.com/users/solos/followers",
"following_url": "https://api.github.com/users/solos/following{/other_user}",
"gists_url": "https://api.github.com/users/solos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/solos",
"id": 542246,
"login": "solos",
"node_id": "MDQ6VXNlcjU0MjI0Ng==",
"organizations_url": "https://api.github.com/users/solos/orgs",
"received_events_url": "https://api.github.com/users/solos/received_events",
"repos_url": "https://api.github.com/users/solos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/solos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/solos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/solos",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/1730/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/1730/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/1730.diff",
"html_url": "https://github.com/psf/requests/pull/1730",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/1730.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/1730"
}
| true |
[
"I'm +1 on the replacing of `type` with `isinstance`. It's less clear to me that we need to lowercase all the keyword arguments to `create_cookie`. Why is this a problem?\n",
"I want to pass the badargs check and set a cookie which contains upper-case letter. I just lowercase cookie names when checking badargs.\n\n```\ncreate_cookie(name, value, Domain='example.com')\n\nd = {'Domain': 'example.com', 'Path': '/'}\ncreate_cookie(name, value, **d)\n```\n\nor I should lowercase them before passing?\n",
"Hmm. Is it necessary to have the `Domain` key be uppercase?\n\nOrdinarily, if you are using keyword arguments (which we are), you would have to match the case of the argument.\n",
"Ok. I will process them before passing.\n"
] |