Dataset schema (one line per column: name, type, observed value range):

- url: string (length 50–53)
- repository_url: string (1 class)
- labels_url: string (64–67)
- comments_url: string (59–62)
- events_url: string (57–60)
- html_url: string (38–43)
- id: int64 (597k–2.65B)
- node_id: string (18–32)
- number: int64 (1–6.83k)
- title: string (1–296)
- user: dict
- labels: list (length 0–5)
- state: string (2 classes)
- locked: bool (2 classes)
- assignee: dict
- assignees: list (length 0–4)
- milestone: dict
- comments: int64 (0–211)
- created_at: string (20)
- updated_at: string (20)
- closed_at: string (20, nullable ⌀)
- author_association: string (3 classes)
- active_lock_reason: string (4 classes)
- body: string (0–65.6k, nullable ⌀)
- closed_by: dict
- reactions: dict
- timeline_url: string (59–62)
- performed_via_github_app: null
- state_reason: string (3 classes)
- draft: bool (2 classes)
- pull_request: dict
- is_pull_request: bool (2 classes)
- issue_comments: list (length 0–30)
https://api.github.com/repos/psf/requests/issues/2731
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2731/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2731/comments
|
https://api.github.com/repos/psf/requests/issues/2731/events
|
https://github.com/psf/requests/issues/2731
| 101,679,066 |
MDU6SXNzdWUxMDE2NzkwNjY=
| 2,731 |
content-length header value not available after get
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2753206?v=4",
"events_url": "https://api.github.com/users/medington/events{/privacy}",
"followers_url": "https://api.github.com/users/medington/followers",
"following_url": "https://api.github.com/users/medington/following{/other_user}",
"gists_url": "https://api.github.com/users/medington/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/medington",
"id": 2753206,
"login": "medington",
"node_id": "MDQ6VXNlcjI3NTMyMDY=",
"organizations_url": "https://api.github.com/users/medington/orgs",
"received_events_url": "https://api.github.com/users/medington/received_events",
"repos_url": "https://api.github.com/users/medington/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/medington/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/medington/subscriptions",
"type": "User",
"url": "https://api.github.com/users/medington",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-08-18T15:09:45Z
|
2021-09-08T22:00:58Z
|
2015-08-18T15:22:07Z
|
NONE
|
resolved
|
I'm unable to get content-length reliably with the requests library whereas urllib2 seems to retrieve the data consistently. Not sure if additional code is required or if the URL I'm hitting is problematic.
Run this script and compare the results. I consistently get **None** from requests. Using the other test_url I consistently get a result from both. What am I missing?
``` python
# Test which demonstrates issues with requests get of content-length header
# This URL seems problematic (it's a several minute download)
test_url = "http://standards-oui.ieee.org/oui.txt"
# test_url = 'https://github.com/kennethreitz/requests/tarball/master'
def requests_getlen(url):
import requests
r = requests.get(url, stream=True)
return str(r.headers.get('content-length'))
def urllib_getlen(url):
import urllib2
req = urllib2.urlopen(url)
return str(req.info().getheader('content-length'))
print 'requests content-length: ' + requests_getlen(test_url)
print 'urllib2 content-length: ' + urllib_getlen(test_url)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2731/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2731/timeline
| null |
completed
| null | null | false |
[
"Questions belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n\nThat said, have you checked to see if you're getting a chunked transfer-encoding?\n",
"Sorry, my bad. I thought it might have been a legitimate issue since the results were inconsistent between runs even using the alternate test_url value pointing at the tarball file on GitHub (that one would **sometimes** have content-length included, but not always). I'll figure out the appropriate question(s) and post on StackOverflow.\n\nIn the meantime, after some poking around with Charles, I determined that the difference stems from the fact that the Requests library specifies `gzip, deflate` as acceptable encodings by default and urllib2 specifies `identity` by default.\n\nHere is code which worked for this URL with requests to obtain content-length header:\n\n```\nr = requests.head(url, headers={'Accept-Encoding': 'identity'}, stream=True)\nreturn str(r.headers.get('content-length'))\n```\n",
"> the results were inconsistent between runs even using the alternate test_url value pointing at the tarball file on GitHub (that one would sometimes have content-length included, but not always)\n\nIf you look at the raw data that we're sending, we're sending the exact same thing (some of the header order may be different but that's to be expected). This would seem to be a problem with GitHub, but as long as they're also sending `Transfer-Encoding: chunked` then they aren't required to send a content-length. Not everything on the internet has a content-length header associated with it.\n"
] |
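The workaround arrived at in the comments above (forcing `Accept-Encoding: identity`) can be sketched as follows. The helper names are hypothetical, not from the thread; the header logic — a chunked response is not required to carry a Content-Length — is pure dict work and runs without a network connection:

```python
def effective_content_length(headers):
    """Return Content-Length as an int, or None when the response is
    chunked or the header is simply absent (servers sending
    Transfer-Encoding: chunked need not send a length)."""
    if headers.get("Transfer-Encoding", "").lower() == "chunked":
        return None
    value = headers.get("Content-Length")
    return int(value) if value is not None else None


def head_content_length(url):
    """Hypothetical usage with requests: ask for the identity encoding,
    so the server is more likely to report the exact byte length.
    (requests' headers object is case-insensitive, so the exact-case
    lookups in the helper above work there too.)"""
    import requests  # third-party; the pure helper above doesn't need it
    r = requests.head(url, headers={"Accept-Encoding": "identity"})
    return effective_content_length(r.headers)
```

With the default `gzip, deflate` encodings, servers that compress on the fly often switch to chunked transfer and omit the length, which matches the inconsistency the reporter observed.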
https://api.github.com/repos/psf/requests/issues/2730
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2730/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2730/comments
|
https://api.github.com/repos/psf/requests/issues/2730/events
|
https://github.com/psf/requests/pull/2730
| 101,205,225 |
MDExOlB1bGxSZXF1ZXN0NDI1MjQ5MDQ=
| 2,730 |
Fixed minor rendering issue in documentation code blocks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6345?v=4",
"events_url": "https://api.github.com/users/bmispelon/events{/privacy}",
"followers_url": "https://api.github.com/users/bmispelon/followers",
"following_url": "https://api.github.com/users/bmispelon/following{/other_user}",
"gists_url": "https://api.github.com/users/bmispelon/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bmispelon",
"id": 6345,
"login": "bmispelon",
"node_id": "MDQ6VXNlcjYzNDU=",
"organizations_url": "https://api.github.com/users/bmispelon/orgs",
"received_events_url": "https://api.github.com/users/bmispelon/received_events",
"repos_url": "https://api.github.com/users/bmispelon/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bmispelon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bmispelon/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bmispelon",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-15T19:59:00Z
|
2021-09-08T07:00:45Z
|
2015-08-15T20:02:06Z
|
CONTRIBUTOR
|
resolved
|
Some of the code blocks on the quickstart page had an issue where they were rendered as two separate `<pre>` blocks, making them inconsistent with the others.
Before:

After:

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2730/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2730/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2730.diff",
"html_url": "https://github.com/psf/requests/pull/2730",
"merged_at": "2015-08-15T20:02:06Z",
"patch_url": "https://github.com/psf/requests/pull/2730.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2730"
}
| true |
[
":sparkles: :cake: :sparkles: Thanks @bmispelon!\n",
":rabbit2: :racehorse: :tada: \n"
] |
https://api.github.com/repos/psf/requests/issues/2729
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2729/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2729/comments
|
https://api.github.com/repos/psf/requests/issues/2729/events
|
https://github.com/psf/requests/pull/2729
| 101,195,999 |
MDExOlB1bGxSZXF1ZXN0NDI1MjM0MzU=
| 2,729 |
Add own name to "AUTHORS" after PR #2724
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/354181?v=4",
"events_url": "https://api.github.com/users/smiley/events{/privacy}",
"followers_url": "https://api.github.com/users/smiley/followers",
"following_url": "https://api.github.com/users/smiley/following{/other_user}",
"gists_url": "https://api.github.com/users/smiley/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/smiley",
"id": 354181,
"login": "smiley",
"node_id": "MDQ6VXNlcjM1NDE4MQ==",
"organizations_url": "https://api.github.com/users/smiley/orgs",
"received_events_url": "https://api.github.com/users/smiley/received_events",
"repos_url": "https://api.github.com/users/smiley/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/smiley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smiley/subscriptions",
"type": "User",
"url": "https://api.github.com/users/smiley",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-08-15T18:20:47Z
|
2021-09-08T07:00:45Z
|
2015-08-15T18:42:21Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2729/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2729/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2729.diff",
"html_url": "https://github.com/psf/requests/pull/2729",
"merged_at": "2015-08-15T18:42:21Z",
"patch_url": "https://github.com/psf/requests/pull/2729.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2729"
}
| true |
[
"\\o/\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2728
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2728/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2728/comments
|
https://api.github.com/repos/psf/requests/issues/2728/events
|
https://github.com/psf/requests/pull/2728
| 101,157,320 |
MDExOlB1bGxSZXF1ZXN0NDI1MTcxODY=
| 2,728 |
Docs: Fix links to `timeouts` section by using :ref:
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/405124?v=4",
"events_url": "https://api.github.com/users/lukasgraf/events{/privacy}",
"followers_url": "https://api.github.com/users/lukasgraf/followers",
"following_url": "https://api.github.com/users/lukasgraf/following{/other_user}",
"gists_url": "https://api.github.com/users/lukasgraf/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lukasgraf",
"id": 405124,
"login": "lukasgraf",
"node_id": "MDQ6VXNlcjQwNTEyNA==",
"organizations_url": "https://api.github.com/users/lukasgraf/orgs",
"received_events_url": "https://api.github.com/users/lukasgraf/received_events",
"repos_url": "https://api.github.com/users/lukasgraf/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lukasgraf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lukasgraf/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lukasgraf",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-08-15T10:11:50Z
|
2021-09-08T07:00:45Z
|
2015-08-15T13:24:20Z
|
CONTRIBUTOR
|
resolved
|
This fixes three links (from docstrings) to the [<`timeouts`>](http://docs.python-requests.org/en/latest/user/advanced/#timeouts) section by using `:ref:` instead of linking to `.html` files (which are only available when building docs locally, but not in the published docs on RTD).
Fixes #2698
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2728/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2728/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2728.diff",
"html_url": "https://github.com/psf/requests/pull/2728",
"merged_at": "2015-08-15T13:24:20Z",
"patch_url": "https://github.com/psf/requests/pull/2728.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2728"
}
| true |
[
"LGTM, thanks! :cake: :sparkles: :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2727
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2727/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2727/comments
|
https://api.github.com/repos/psf/requests/issues/2727/events
|
https://github.com/psf/requests/issues/2727
| 101,117,848 |
MDU6SXNzdWUxMDExMTc4NDg=
| 2,727 |
Add bitcoin donation info to README.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1767769?v=4",
"events_url": "https://api.github.com/users/DoWhileGeek/events{/privacy}",
"followers_url": "https://api.github.com/users/DoWhileGeek/followers",
"following_url": "https://api.github.com/users/DoWhileGeek/following{/other_user}",
"gists_url": "https://api.github.com/users/DoWhileGeek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DoWhileGeek",
"id": 1767769,
"login": "DoWhileGeek",
"node_id": "MDQ6VXNlcjE3Njc3Njk=",
"organizations_url": "https://api.github.com/users/DoWhileGeek/orgs",
"received_events_url": "https://api.github.com/users/DoWhileGeek/received_events",
"repos_url": "https://api.github.com/users/DoWhileGeek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DoWhileGeek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DoWhileGeek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DoWhileGeek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-08-14T22:48:58Z
|
2021-09-08T21:00:50Z
|
2015-11-05T10:33:52Z
|
NONE
|
resolved
|
I would like to donate on a monthly basis, and I'm sure others would too. I'm not a particular fan of the one-time donations which you've recently added to the docs landing page.
I would however love to make small, monthly, bitcoin donations.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2727/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2727/timeline
| null |
completed
| null | null | false |
[
"Looks like Requests already accepts (micro)donations [through Flattr](https://flattr.com/thing/442264/Requests), so you could use that for monthly donations. (But I'm not sure if you can manually set the amount you donate _specifically to requests_...)\n",
"This is a structural problem we need to work out in a way that is going to be most fair. I think the team need to sit down and discuss this a little bit at some point.\n",
"`1Fov3RQFSDTxxGApR19g1GgQazzoutiGCC`\n"
] |
https://api.github.com/repos/psf/requests/issues/2726
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2726/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2726/comments
|
https://api.github.com/repos/psf/requests/issues/2726/events
|
https://github.com/psf/requests/issues/2726
| 101,091,540 |
MDU6SXNzdWUxMDEwOTE1NDA=
| 2,726 |
Connection reset by peer on get requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1319093?v=4",
"events_url": "https://api.github.com/users/mkates/events{/privacy}",
"followers_url": "https://api.github.com/users/mkates/followers",
"following_url": "https://api.github.com/users/mkates/following{/other_user}",
"gists_url": "https://api.github.com/users/mkates/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mkates",
"id": 1319093,
"login": "mkates",
"node_id": "MDQ6VXNlcjEzMTkwOTM=",
"organizations_url": "https://api.github.com/users/mkates/orgs",
"received_events_url": "https://api.github.com/users/mkates/received_events",
"repos_url": "https://api.github.com/users/mkates/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mkates/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mkates/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mkates",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-14T19:57:46Z
|
2021-09-08T22:00:59Z
|
2015-08-15T15:00:30Z
|
NONE
|
resolved
|
In making a request to a third party resource, https://www.example.com, I am getting the following traceback:
``` python
import requests; requests.get('http://www.example.com')
Traceback (most recent call last):
(......traceback.....)
File "/Users/python3.4/site-packages/requests/adapters.py", line 415, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetError(54, 'Connection reset by peer'))
```
For reference, I am on Python 3.4.3, using requests 2.7.0, and running on Mac OS X 10.9. What is very strange about this issue in particular is that the same call works if I run `curl -i https://www.example.com` or if I access the URL using a browser. I believe it has something to do with the SSL handshake.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1319093?v=4",
"events_url": "https://api.github.com/users/mkates/events{/privacy}",
"followers_url": "https://api.github.com/users/mkates/followers",
"following_url": "https://api.github.com/users/mkates/following{/other_user}",
"gists_url": "https://api.github.com/users/mkates/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mkates",
"id": 1319093,
"login": "mkates",
"node_id": "MDQ6VXNlcjEzMTkwOTM=",
"organizations_url": "https://api.github.com/users/mkates/orgs",
"received_events_url": "https://api.github.com/users/mkates/received_events",
"repos_url": "https://api.github.com/users/mkates/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mkates/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mkates/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mkates",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2726/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2726/timeline
| null |
completed
| null | null | false |
[
"Yeah, connection reset by peer means the remote peer doesn't like the TLS handshake much. It's difficult to know without tcpdumping what the problem is here. I don't see the problem on my end though: do you have PyOpenSSL installed?\n",
"It was an issue with the SSL certificate on their site, which has since been resolved. Thanks!\n"
] |
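As the maintainer's comment notes, a reset during the TLS handshake is server-side and often transient, so a common client-side mitigation is retrying with backoff. A minimal sketch, assuming a plain `requests.get` (the delay schedule is pure and testable offline; function names are illustrative):

```python
import time


def backoff_delays(retries, base=0.5):
    """Exponential backoff schedule in seconds: base, 2*base, 4*base, ..."""
    return [base * (2 ** i) for i in range(retries)]


def get_with_retries(url, retries=3):
    """Retry a GET on connection resets; re-raise after the last attempt."""
    import requests  # third-party
    last_exc = None
    for delay in backoff_delays(retries):
        try:
            return requests.get(url, timeout=10)
        except requests.exceptions.ConnectionError as exc:
            last_exc = exc
            time.sleep(delay)  # give the remote end time to recover
    raise last_exc
```

Retries only paper over transient resets; a persistent handshake failure (as in this issue, a bad certificate on the server) will still fail on every attempt.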
https://api.github.com/repos/psf/requests/issues/2725
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2725/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2725/comments
|
https://api.github.com/repos/psf/requests/issues/2725/events
|
https://github.com/psf/requests/issues/2725
| 101,046,018 |
MDU6SXNzdWUxMDEwNDYwMTg=
| 2,725 |
Requests.put blocking on Windows (Tika-Python)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/395887?v=4",
"events_url": "https://api.github.com/users/chrismattmann/events{/privacy}",
"followers_url": "https://api.github.com/users/chrismattmann/followers",
"following_url": "https://api.github.com/users/chrismattmann/following{/other_user}",
"gists_url": "https://api.github.com/users/chrismattmann/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chrismattmann",
"id": 395887,
"login": "chrismattmann",
"node_id": "MDQ6VXNlcjM5NTg4Nw==",
"organizations_url": "https://api.github.com/users/chrismattmann/orgs",
"received_events_url": "https://api.github.com/users/chrismattmann/received_events",
"repos_url": "https://api.github.com/users/chrismattmann/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chrismattmann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chrismattmann/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chrismattmann",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 52 |
2015-08-14T16:22:14Z
|
2021-09-08T21:00:56Z
|
2015-10-24T14:37:16Z
|
NONE
|
resolved
|
Over in the [Apache Tika Python Port](http://github.com/chrismattmann/tika-python/) I'm noticing in [tika-python#44](http://github.com/chrismattmann/tika-python/issues/44) and in [tika-python#58](http://github.com/chrismattmann/tika-python/issues/58) some odd behavior with requests on Python 2.7.9. For whatever reason, when using a file handle and putting it with requests.put, it blocks and blocks until it finally (correctly) gets a BadStatusLine back after a timeout. Anyone else seen this?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2725/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2725/timeline
| null |
completed
| null | null | false |
[
"Any requests call that uploads a request body will send the entire body before attempting to read the response. That means that, if the remote end does not close that connection abruptly (throwing an Exception on our end), we'll block until the response has been entirely sent.\n\nSadly, we don't support the 100-continue flow at this time (because httplib has no way of letting us see what's going on there), so it's difficult for us to do anything else.\n",
"Hi @Lukasa - what's weird though is that this works fine on *Nix and Mac. Tika Python is a Python library that uses (at its lowest level) requests to talk to the [Tika JAX RS Server](http://wiki.apache.org/tika/TikaJAXRS). It posts to the /rmeta endpoint. On Linux, calls like `parser.from_file()` work fine - on Windows, they block and block and then finally timeout. I'm not sure how the answer above has to do with that behavior but perhaps I didn't explain it well enough. Any ideas?\n",
"Here's what I'm seeing on the Tika server end:\n\n```\nAug 14, 2015 7:32:38 PM org.apache.tika.server.resource.TikaResource parse\nWARNING: rmeta: Text extraction failed\norg.apache.tika.io.TaggedIOException: timeout\n at org.apache.tika.io.TaggedInputStream.handleIOException(TaggedInputStream.java:133)\n at org.apache.tika.io.ProxyInputStream.read(ProxyInputStream.java:103)\n at org.apache.tika.io.TikaInputStream.peek(TikaInputStream.java:489)\n at org.apache.tika.parser.pkg.ZipContainerDetector.detect(ZipContainerDetector.java:83)\n at org.apache.tika.detect.CompositeDetector.detect(CompositeDetector.java:61)\n at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:112)\n at org.apache.tika.parser.RecursiveParserWrapper.parse(RecursiveParserWrapper.java:159)\n at org.apache.tika.server.resource.TikaResource.parse(TikaResource.java:244)\n at org.apache.tika.server.resource.RecursiveMetadataResource.parseMetadata(RecursiveMetadataResource.java:86)\n at org.apache.tika.server.resource.RecursiveMetadataResource.getMetadata(RecursiveMetadataResource.java:68)\n at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)\n at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)\n at java.lang.reflect.Method.invoke(Unknown Source)\n at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:181)\n at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:97)\n at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:200)\n at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:99)\n at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59)\n at org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:96)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at 
org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)\n at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:251)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:261)\n at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:70)\n at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1088)\n at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1024)\n at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)\n at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)\n at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)\n at org.eclipse.jetty.server.Server.handle(Server.java:370)\n at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)\n at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)\n at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)\n at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)\n at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)\n at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)\n at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)\n at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)\n at java.lang.Thread.run(Unknown Source)\nCaused by: org.eclipse.jetty.io.EofException: timeout\n at org.eclipse.jetty.http.HttpParser.blockForContent(HttpParser.java:1200)\n at 
org.eclipse.jetty.server.HttpInput.read(HttpInput.java:61)\n at java.io.FilterInputStream.read(Unknown Source)\n at java.io.BufferedInputStream.fill(Unknown Source)\n at java.io.BufferedInputStream.read1(Unknown Source)\n at java.io.BufferedInputStream.read(Unknown Source)\n at org.apache.tika.io.ProxyInputStream.read(ProxyInputStream.java:99)\n ... 40 more\n\nAug 14, 2015 7:32:38 PM org.apache.cxf.phase.PhaseInterceptorChain doDefaultLogging\nWARNING: Interceptor for {http://resource.server.tika.apache.org/}MetadataResource has thrown exception, unwinding now\norg.apache.cxf.interceptor.Fault: Could not send Message.\n at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:64)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at org.apache.cxf.interceptor.OutgoingChainInterceptor.handleMessage(OutgoingChainInterceptor.java:83)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)\n at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:251)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:261)\n at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:70)\n at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1088)\n at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1024)\n at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)\n at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)\n at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)\n at org.eclipse.jetty.server.Server.handle(Server.java:370)\n at 
org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)\n at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)\n at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)\n at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)\n at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)\n at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)\n at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)\n at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)\n at java.lang.Thread.run(Unknown Source)\nCaused by: org.eclipse.jetty.io.EofException\n at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:914)\n at org.eclipse.jetty.http.HttpGenerator.complete(HttpGenerator.java:798)\n at org.eclipse.jetty.server.AbstractHttpConnection.commitResponse(AbstractHttpConnection.java:650)\n at org.eclipse.jetty.server.AbstractHttpConnection$Output.close(AbstractHttpConnection.java:1106)\n at org.apache.cxf.transport.http.AbstractHTTPDestination.flushHeaders(AbstractHTTPDestination.java:626)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.flushHeaders(JettyHTTPDestination.java:286)\n at org.apache.cxf.transport.http.AbstractHTTPDestination$WrappedOutputStream.close(AbstractHTTPDestination.java:784)\n at org.apache.cxf.transport.AbstractConduit.close(AbstractConduit.java:56)\n at org.apache.cxf.transport.http.AbstractHTTPDestination$BackChannelConduit.close(AbstractHTTPDestination.java:720)\n at org.apache.cxf.interceptor.MessageSenderInterceptor$MessageSenderEndingInterceptor.handleMessage(MessageSenderInterceptor.java:62)\n ... 
24 more\nCaused by: java.nio.channels.ClosedChannelException\n at sun.nio.ch.SocketChannelImpl.ensureWriteOpen(Unknown Source)\n at sun.nio.ch.SocketChannelImpl.write(Unknown Source)\n at org.eclipse.jetty.io.nio.ChannelEndPoint.flush(ChannelEndPoint.java:293)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint.flush(SelectChannelEndPoint.java:404)\n at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:844)\n ... 33 more\n\nAug 14, 2015 7:32:38 PM org.apache.cxf.phase.PhaseInterceptorChain doDefaultLogging\nWARNING: Interceptor for {http://resource.server.tika.apache.org/}MetadataResource has thrown exception, unwinding now\norg.apache.cxf.interceptor.Fault: XML_WRITE_EXC\n at org.apache.cxf.jaxrs.interceptor.JAXRSDefaultFaultOutInterceptor.handleMessage(JAXRSDefaultFaultOutInterceptor.java:102)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at org.apache.cxf.interceptor.AbstractFaultChainInitiatorObserver.onMessage(AbstractFaultChainInitiatorObserver.java:113)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:371)\n at org.apache.cxf.interceptor.OutgoingChainInterceptor.handleMessage(OutgoingChainInterceptor.java:83)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)\n at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:251)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:261)\n at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:70)\n at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1088)\n at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1024)\n at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)\n at 
org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)\n at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)\n at org.eclipse.jetty.server.Server.handle(Server.java:370)\n at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)\n at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)\n at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)\n at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)\n at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)\n at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)\n at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)\n at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)\n at java.lang.Thread.run(Unknown Source)\nCaused by: com.ctc.wstx.exc.WstxIOException: null\n at com.ctc.wstx.sw.BaseStreamWriter.flush(BaseStreamWriter.java:255)\n at org.apache.cxf.jaxrs.interceptor.JAXRSDefaultFaultOutInterceptor.handleMessage(JAXRSDefaultFaultOutInterceptor.java:100)\n ... 
26 more\nCaused by: org.eclipse.jetty.io.EofException\n at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:142)\n at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:107)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination$JettyOutputStream.write(JettyHTTPDestination.java:325)\n at org.apache.cxf.io.AbstractWrappedOutputStream.write(AbstractWrappedOutputStream.java:51)\n at com.ctc.wstx.sw.EncodingXmlWriter.flushBuffer(EncodingXmlWriter.java:705)\n at com.ctc.wstx.sw.EncodingXmlWriter.flush(EncodingXmlWriter.java:174)\n at com.ctc.wstx.sw.BaseStreamWriter.flush(BaseStreamWriter.java:253)\n ... 27 more\n```\n\nAnd then on the requests end:\n\n<img width=\"525\" alt=\"screen shot 2015-08-14 at 8 09 23 pm\" src=\"https://cloud.githubusercontent.com/assets/395887/9287250/3f2268ec-42c0-11e5-8e48-bffbf9799986.png\">\n",
"note this is all occurring using localhost. System env info:\n\nTika Server - Tried with JDK 1.8, and also with JDK 1.7.\nTika Python 1.9.10\nRequests 2.7.0\nPython version 2.7.10\n",
"Another interesting note. requests.put works fine on windows as long as I read the whole file into memory first, e.g., if I pass open(filename, 'r').read() into the data parameter for the PUT request. If I pass in just open(filename, 'r') as the data parameter, it hangs forever until the BadStatusLine.\n",
"@chrismattmann so there's (as you could guess) a significant difference in behaviour between those two test cases.\n\nWhen you just give us the raw data as a string, we write it all at once. When you give us an open file descriptor, we pass it down to `httplib`. What httplib then does with it is read 8192 bytes (yes 8 KB) at a time and write it to the socket to stream it. **Note** this is not the same as a chunked upload.\n\nI suspect that Tika server doesn't like getting such small amounts over a period of time.\n",
"If you want to test sending larger chunks (by forcing httplib to send what it reads) you could tinker with https://github.com/sigmavirus24/requests-toolbelt/pull/84/files to wrap your file to see if uploading more at once will aid this.\n",
"OK, here is some more info. If I use something like:\n\n```\nrequests.put(serverUrl, files={'filename' : open(filename, 'r')}, headers=headers)\n```\n\nWorks as expected (however it makes Tika server get a 415 b/c for whatever reason Content-Type isn't passed in the HTTP request). However:\n\n```\nrequests.put(serverUrl, data=open(filename, 'r'), headers=headers)\n```\n\nstill fails.\n",
"@sigmavirus24 thanks for the insight. I'll take a look at https://github.com/sigmavirus24/requests-toolbelt/pull/84\n",
"One thing too @sigmavirus24 that is kind of odd though - this same behavior with e.g., data=open(filename, 'r') works fine on Linux, e.g., with Tika-server running on Linux and Tika Python running on Linux. It only seems to fail on Windows.\n",
"Ahh, quick insight too, the 415 error request on Tika server is caused by the following content type passed: (multipart/form-data;boundary=d764925698424d719281a56941edb9b7). So it seems that files={'filename', open(filename, 'r')} causes a multi-part request using PUT; however, data=open(filename, 'r') doesn't.\n\nFull Tika Server stack trace:\n\n```\nAug 14, 2015 7:55:44 PM org.apache.tika.server.resource.TikaResource logRequest\nINFO: rmeta (multipart/form-data;boundary=d764925698424d719281a56941edb9b7)\nAug 14, 2015 7:55:44 PM org.apache.tika.server.resource.TikaResource parse\nWARNING: rmeta: Text extraction failed\norg.apache.tika.exception.TikaException: Unexpected RuntimeException from org.apache.tika.server.resource.TikaResource$1@8a9107e\n at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:283)\n at org.apache.tika.parser.AutoDetectParser.parse(AutoDetectParser.java:120)\n at org.apache.tika.parser.RecursiveParserWrapper.parse(RecursiveParserWrapper.java:159)\n at org.apache.tika.server.resource.TikaResource.parse(TikaResource.java:244)\n at org.apache.tika.server.resource.RecursiveMetadataResource.parseMetadata(RecursiveMetadataResource.java:86)\n at org.apache.tika.server.resource.RecursiveMetadataResource.getMetadata(RecursiveMetadataResource.java:68)\n at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)\n at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)\n at java.lang.reflect.Method.invoke(Unknown Source)\n at org.apache.cxf.service.invoker.AbstractInvoker.performInvocation(AbstractInvoker.java:181)\n at org.apache.cxf.service.invoker.AbstractInvoker.invoke(AbstractInvoker.java:97)\n at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:200)\n at org.apache.cxf.jaxrs.JAXRSInvoker.invoke(JAXRSInvoker.java:99)\n at org.apache.cxf.interceptor.ServiceInvokerInterceptor$1.run(ServiceInvokerInterceptor.java:59)\n at 
org.apache.cxf.interceptor.ServiceInvokerInterceptor.handleMessage(ServiceInvokerInterceptor.java:96)\n at org.apache.cxf.phase.PhaseInterceptorChain.doIntercept(PhaseInterceptorChain.java:307)\n at org.apache.cxf.transport.ChainInitiationObserver.onMessage(ChainInitiationObserver.java:121)\n at org.apache.cxf.transport.http.AbstractHTTPDestination.invoke(AbstractHTTPDestination.java:251)\n at org.apache.cxf.transport.http_jetty.JettyHTTPDestination.doService(JettyHTTPDestination.java:261)\n at org.apache.cxf.transport.http_jetty.JettyHTTPHandler.handle(JettyHTTPHandler.java:70)\n at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1088)\n at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1024)\n at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:135)\n at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:255)\n at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:116)\n at org.eclipse.jetty.server.Server.handle(Server.java:370)\n at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:494)\n at org.eclipse.jetty.server.AbstractHttpConnection.content(AbstractHttpConnection.java:982)\n at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:1043)\n at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:865)\n at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:240)\n at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:82)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:696)\n at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:53)\n at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:608)\n at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:543)\n at 
java.lang.Thread.run(Unknown Source)\nCaused by: javax.ws.rs.WebApplicationException: HTTP 415 Unsupported Media Type\n at org.apache.tika.server.resource.TikaResource$1.parse(TikaResource.java:111)\n at org.apache.tika.parser.CompositeParser.parse(CompositeParser.java:281)\n ... 36 more\n```\n",
"@chrismattmann please be aware that while you debug this in the comments on a closed issue, you're emailing potentially over 700 people subscribed to issues (creation, comments, etc.)\n",
"sorry about that. I just thought providing more information would help. I'll stop spamming now while I continue to try and debug this.\n",
"As an FYI, I don't consider this issue fixed/closed. It's unclear to me where the precise error here is - seems to be either in the way that requests generates PUT HTTP requests when passing data=open('filename', 'r') (which doesn't work on Windows when contacting Jetty from Tika Server on localhost) compared to the way it generates PUT HTTP requests when doing files={'filename' : open('filename', 'r')} (which works on Windows when contacting Jetty from Tika Server on localhost). I've seen things debugging around that Jetty by default uses Memory mapped IO which locks files on Windows. Not sure if that's the root cause of the issue here or not. Anyways wanted to share that.\n",
"To be clear, the root cause of the issue is likely to be in `httplib`, or possibly even lower (e.g. WinSock). Requests simply doesn't have a separate code path here.\n",
"So I've lost my touch with reading Java stacktraces to be honest but I see\n\n```\norg.apache.tika.io.TaggedIOException: timeout\n```\n\nAnd I wonder if Tika has a timeout such that if it doesn't receive the full amount of data in that period, it times out the request and closes the socket. It could be attempting to protect against bad actors who do slow uploads to DDoS the server.\n",
"thanks @Lukasa and @sigmavirus24 . The odd thing is this is an extremely small file (default win.ini) and the behavior is that i make the post - then it waits for like 2-3 minutes, then times out. Just odd.\n",
"@chrismattmann That was not clear to me previously, and makes the whole thing substantially more interesting.\n\nI think I've seen something like this on hyper, and couldn't work out what was going on: I was getting situations where I was failing to read data I knew was in the socket buffer. See also: lukasa/hyper#142.\n",
"@chrismattmann that would have been helpful to know at the beginning of this thread. That description seems a bit more consistent with\n\n```\n[snip]\nCaused by: org.eclipse.jetty.io.EofException: timeout\n at org.eclipse.jetty.http.HttpParser.blockForContent(HttpParser.java:1200)\n[snip]\nCaused by: org.eclipse.jetty.io.EofException\n at org.eclipse.jetty.http.HttpGenerator.flushBuffer(HttpGenerator.java:914)\n[snip]\nCaused by: org.eclipse.jetty.io.EofException\n at org.eclipse.jetty.server.HttpOutput.write(HttpOutput.java:142)\n```\n\nNote that those are from what I believe are three separate tracebacks, but my Java debugging is several years old and quite rusty.\n\nCould you share _exactly_ the file that you used? Further, what happens if you open the file as\n\n``` py\nwith open(file, 'rb') as fd:\n ...\n```\n\n(or the equivalent in your code, the important part is opening the file with the `b` flag).\n\nLet's also try some other permutations:\n1. Tika Server running on a non-Windows operating system, with Tika-Python running on Windows\n2. Tika Server running on Windows, with Tika-Python running on a non-Windows OS.\n\nI'm fairly confident that this is a bug very far down the stack from us (which is why I'm not reopening this), but I'm interested in helping you find out enough information to report it to the appropriate place.\n",
"thanks @Lukasa and @sigmavirus24 . Here is the file I was using (C:\\Windows\\win.ini)\n\n```\n; for 16-bit app support\n[fonts]\n[extensions]\n[mci extensions]\n[files]\n[Mail]\nMAPI=1\n```\n\nI tried with the 'b' flag as well, and there was no difference. \n\nAs for:\n\n> 1. Tika Server running on a non-Windows operating system, with Tika-Python running on Windows\n> 2. Tika Server running on Windows, with Tika-Python running on a non-Windows OS.\n\nGot it. I will try both and report back. Thanks for helping.\n",
"OK have an answer for this one:\n\n> 1. Tika Server running on a non-Windows operating system, with Tika-Python running on Windows\n\nSteps to reproduce:\n1. CentOS centos7-x86_64-generic-cloud image, Openstack, Python 2.7.10 (default, Aug 12 2015, 04:31:43) , JDK \"1.7.0_85\" OpenJDK Runtime Environment (rhel-2.6.1.2.el7_1-x86_64 u85-b01) OpenJDK 64-Bit Server VM (build 24.85-b03, mixed mode) - running Tika-Server\n2. Windows 7 Ultimate - JDK 1.8, Python 2.7.10 - running Tika-Python (client)\n3. Following code on Windows 7 Ultimate (with a modified version of Tika-Python, diff provided last, note localhost:55000 is a proxy to the remote host):\n\n``` python\n from tika import parser\n parser.from_file('/Windows/win.ini', 'http://localhost:55000')\n```\n\n produces (Windows side):\n\n```\n Traceback (most recent call last):\n File \"<pyshell#2>\", line 1, in <module>\n parser.from_file('/Windows/win.ini', 'http://localhost:55000')\n File \"C:\\Python27\\lib\\site-packages\\tika\\parser.py\", line 24, in from_file\n jsonOutput = parse1('all', filename, serverEndpoint)\n File \"C:\\Python27\\lib\\site-packages\\tika\\tika.py\", line 151, in parse1\n verbose, tikaServerJar)\n File \"C:\\Python27\\lib\\site-packages\\tika\\tika.py\", line 256, in callServer\n resp = verbFn(serviceUrl, encodedData, headers=headers)\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 122, in put\n return request('put', url, data=data, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 415, in send\n raise ConnectionError(err, 
request=request)\nConnectionError: ('Connection aborted.', BadStatusLine(\"''\",))\n```\n\nAnd on Linux/CentOS server side produces no log output on the tika-server.log file.\n\nI verified that Windows 7 Ultimate could see the Linux machine (via Telnet, e.g., `telnet linux-machine 9998`).\n\nHere's the diff:\n\n``` diff\ndiff --git a/tika/tika.py b/tika/tika.py\nindex da659ad..6a04f8a 100644\n--- a/tika/tika.py\n+++ b/tika/tika.py\n@@ -247,8 +247,8 @@ def callServer(verb, serverEndpoint, service, data, headers, verbose=Verbose, ti\n die('Tika Server call must be one of %s' % str(httpVerbs.keys()))\n verbFn = httpVerbs[verb]\n\n- if Windows and type(data) is file:\n- data = data.read() \n+ #if Windows and type(data) is file:\n+ # data = data.read() \n\n encodedData = data\n if type(data) is unicode:\n```\n\nSo in case it wasn't clear, the first scenario produces, the exact same result. I'll try the second one now.\n",
"OK here is part 2.\n\n> 1. Tika Server running on Windows, with Tika-Python running on a non-Windows OS.\n\nSteps to reproduce:\n1. MacOS X 10.9.5, Python 2.7.8 (default, Sep 27 2014, 11:46:04)\n2. Windows 7 Ultimate - JDK 1.8, Python 2.7.10 - running Tika-Python (server), Java version \"1.7.0_60\" Java(TM) SE Runtime Environment (build 1.7.0_60-b19)\n\nFollowing code on Mac OS X (with a modified version of Tika-Python, diff provided last note 192.X is the VMWare guest OS ip that I am accessing from my Mac):\n\n``` python\nfrom tika import parser\nparser.from_file('/Users/mattmann/.bashrc', 'http://192.168.151.143:9998')\n```\n\nwhich produces:\n\n``` python\n{'content': u'\\n\\n\\n\\n\\n\\n\\n\\n\\nalias tika=\"java -jar /usr/local/tika/tika-app-1.8.jar\"\\nalias ls=\"ls -FHG\"\\n\\n', 'metadata': {u'resourceName': u'.bashrc', u'X-Parsed-By': [u'org.apache.tika.parser.DefaultParser', u'org.apache.tika.parser.txt.TXTParser'], u'Content-Type': u'text/plain; charset=ISO-8859-1', u'X-TIKA:parse_time_millis': u'4', u'Content-Encoding': u'ISO-8859-1'}}\n```\n\nNo error logs from Windows as request was successful. Note on the Windows side, I had to restart the tika-server.jar file with the argument --host 0.0.0.0 in order to get it to bind to all local interfaces, so that I could communicate with it outside of the Windows box.\n\nHere's the diff:\n\n``` diff\ndiff --git a/tika/tika.py b/tika/tika.py\nindex da659ad..6a04f8a 100644\n--- a/tika/tika.py\n+++ b/tika/tika.py\n@@ -247,8 +247,8 @@ def callServer(verb, serverEndpoint, service, data, headers, verbose=Verbose, ti\n die('Tika Server call must be one of %s' % str(httpVerbs.keys()))\n verbFn = httpVerbs[verb]\n\n- if Windows and type(data) is file:\n- data = data.read() \n+ #if Windows and type(data) is file:\n+ # data = data.read() \n\n encodedData = data\n if type(data) is unicode:\n```\n",
"any comments here @sigmavirus24 @Lukasa ?\n",
"Right now it might be helpful if I could also see some packet capture of the failing scenario. Are you familiar with tcpdump?\n",
"hi @Lukasa nope I haven't used it. I did try and find some programs that would do this after I found out Wireshark won't capture localhost on Windows (or it will if you hack some networking routes, etc., but I tried and none of the approaches worked). Any advice here would be appreciated.\n",
"Ah, yeah, capturing localhost on Windows is tricky. I've used [RawCap](http://www.netresec.com/?page=RawCap) in the past to some effect.\n",
"hey @Lukasa yep I tried to use RawCap unfortunately the frickin' page download redirects to an non existent URL and I can't download the tool :( I also couldn't find it on the internet, sadly.\n",
"=( That's unhelpful. Is it going to be easier to temporarily bring a non-Windows box onto your LAN for the packet capture?\n",
"Hi @Lukasa well see that's the thing - this error only manifests on Windows. There are not any issues from Linux to Linux; from Linux to Windows; but there are errors Windows to Windows.\n",
"Oh, both the server and the client have to be Windows? Can we spin this up on multiple Windows boxes in AWS?\n"
] |
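Editor's note: the record above hinges on the behaviour @sigmavirus24 describes — a string body is written in one go, while a file-like body is streamed by `httplib` in 8192-byte reads. The sketch below only illustrates that difference; `HTTPLIB_BLOCKSIZE` and `body_writes` are illustrative names, not Requests or `httplib` internals.

```python
import io

HTTPLIB_BLOCKSIZE = 8192  # httplib reads file-like bodies 8 KB at a time


def body_writes(data):
    """Roughly model the sequence of socket writes for a request body."""
    if isinstance(data, (str, bytes)):
        return [data]  # whole-string body: a single write
    writes = []
    chunk = data.read(HTTPLIB_BLOCKSIZE)  # file-like body: streamed in 8 KB reads
    while chunk:
        writes.append(chunk)
        chunk = data.read(HTTPLIB_BLOCKSIZE)
    return writes


payload = b"x" * 20000
print(len(body_writes(payload)))              # string body: 1 write
print(len(body_writes(io.BytesIO(payload))))  # file body: 3 writes (8192 + 8192 + 3616)
```

This is why the thread's workaround of passing `open(filename, 'r').read()` instead of the file object behaves differently on the wire: it collapses the upload into a single write.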
https://api.github.com/repos/psf/requests/issues/2724
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2724/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2724/comments
|
https://api.github.com/repos/psf/requests/issues/2724/events
|
https://github.com/psf/requests/pull/2724
| 101,023,200 |
MDExOlB1bGxSZXF1ZXN0NDI0NjIxMDk=
| 2,724 |
Add documentation in "More complicated POST requests" (Quickstart) to resolve #2686
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/354181?v=4",
"events_url": "https://api.github.com/users/smiley/events{/privacy}",
"followers_url": "https://api.github.com/users/smiley/followers",
"following_url": "https://api.github.com/users/smiley/following{/other_user}",
"gists_url": "https://api.github.com/users/smiley/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/smiley",
"id": 354181,
"login": "smiley",
"node_id": "MDQ6VXNlcjM1NDE4MQ==",
"organizations_url": "https://api.github.com/users/smiley/orgs",
"received_events_url": "https://api.github.com/users/smiley/received_events",
"repos_url": "https://api.github.com/users/smiley/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/smiley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/smiley/subscriptions",
"type": "User",
"url": "https://api.github.com/users/smiley",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-14T14:37:04Z
|
2021-09-08T07:00:46Z
|
2015-08-14T19:30:00Z
|
CONTRIBUTOR
|
resolved
|
As suggested in #2686, I added a snippet about `requests.post(json=...)` to [Quickstart#More complicated POST requests](http://docs.python-requests.org/en/latest/user/quickstart/#more-complicated-post-requests).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2724/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2724/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2724.diff",
"html_url": "https://github.com/psf/requests/pull/2724",
"merged_at": "2015-08-14T19:30:00Z",
"patch_url": "https://github.com/psf/requests/pull/2724.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2724"
}
| true |
[
"\\o/ Beautiful, thankyou so much! :cake: :sparkles: \n",
"Feel free to open a PR that adds you to AUTHORS as well. =)\n"
] |
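Editor's note: PR 2724 above documents the `requests.post(json=...)` shortcut. The stdlib sketch below approximates what that kwarg does — serialize the payload and set the JSON content type; `prepare_json_body` is an illustrative helper, not the library's actual code path.

```python
import json


def prepare_json_body(payload):
    # Roughly what Requests does for the `json=` kwarg: dump the object
    # to a JSON string and set Content-Type: application/json.
    body = json.dumps(payload)
    headers = {'Content-Type': 'application/json'}
    return body, headers


body, headers = prepare_json_body({'some': 'data'})
print(body)                       # {"some": "data"}
print(headers['Content-Type'])    # application/json
```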
https://api.github.com/repos/psf/requests/issues/2722
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2722/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2722/comments
|
https://api.github.com/repos/psf/requests/issues/2722/events
|
https://github.com/psf/requests/issues/2722
| 100,882,226 |
MDU6SXNzdWUxMDA4ODIyMjY=
| 2,722 |
Proxies that depend on the URL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/192614?v=4",
"events_url": "https://api.github.com/users/jasongrout/events{/privacy}",
"followers_url": "https://api.github.com/users/jasongrout/followers",
"following_url": "https://api.github.com/users/jasongrout/following{/other_user}",
"gists_url": "https://api.github.com/users/jasongrout/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jasongrout",
"id": 192614,
"login": "jasongrout",
"node_id": "MDQ6VXNlcjE5MjYxNA==",
"organizations_url": "https://api.github.com/users/jasongrout/orgs",
"received_events_url": "https://api.github.com/users/jasongrout/received_events",
"repos_url": "https://api.github.com/users/jasongrout/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jasongrout/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jasongrout/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jasongrout",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-08-13T22:19:04Z
|
2021-09-08T22:00:51Z
|
2015-09-06T03:02:41Z
|
CONTRIBUTOR
|
resolved
|
Currently the proxies dictionary supports only schemes as keys. In networks with multiple proxies, this may not be sufficient for picking the correct proxy. I propose that we allow scheme+hostname, hostname, and scheme (in that order) as keys, searched in that order. Changing this line, https://github.com/kennethreitz/requests/blob/2440b6f089f4c53e96cd2ece80e92774b72ed243/requests/adapters.py#L242, could search first for the scheme+hostname, then the hostname, then the scheme.
I'm not sure of all the ramifications of this in other parts of the codebase, but this would greatly help when we have a more complicated proxy setup than one just relying on schemes.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2722/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2722/timeline
| null |
completed
| null | null | false |
[
"This might be a good idea, but I wonder how it plays out with the other proxy variables. This needs to be carefully considered. \n\nHow is this handled by other tools? `apt-get`, for instance?\n",
"Good idea to look at apt. According to http://manpages.ubuntu.com/manpages/utopic/man5/apt.conf.5.html, apt supports both per-host proxies and a way to call out to an arbitrary external program to get the proxy. The calling out to an external function is like the PAC configuration that is pretty widespread. Relevant parts of that manpage are:\n\n```\n http::Proxy sets the default proxy to use for HTTP URIs. It is in\n the standard form of http://[[user][:pass]@]host[:port]/. Per host\n proxies can also be specified by using the form http::Proxy::<host>\n with the special keyword DIRECT meaning to use no proxies. If no\n one of the above settings is specified, http_proxy environment\n variable will be used.\n```\n\nand \n\n```\n Acquire::http::Proxy-Auto-Detect can be used to specify an external\n command to discover the http proxy to use. Apt expects the command\n to output the proxy on stdout in the style http://proxy:port/. This\n will override the generic Acquire::http::Proxy but not any specific\n host proxy configuration set via Acquire::http::Proxy::$HOST. See\n the squid-deb-proxy-client(1) package for an example implementation\n that uses avahi. This option takes precedence over the legacy\n option name ProxyAutoDetect.\n```\n\nWe could support calling out to an external program (obviously the most flexible), but that would be a much bigger change. The change I'm proposing would be sufficient for my needs.\n\nWhat can we do to continue to move the discussion forward?\n",
"One downside is that if you have a host named 'http', then the proxy dictionary gets confused between whether that is a host or a scheme. I'd be happy also if we just resolved scheme+'://'+hostname first, then scheme. That is forward compatible with the original proposal, and avoids the ambiguity when a hostname is the same as a scheme.\n\nSo, in summary, I'm proposing that the proxies dictionary be something of the form: `{'<scheme>://<hostname>': proxy, '<scheme>': proxy}`, and that a host-specific proxy takes precedence over a scheme proxy.\n",
"Ok, I think that approach sounds reasonable to me: I think this feature request requires relatively little code and enables a fairly useful use-case. Thoughts from the other maintainers? /cc @kennethreitz @sigmavirus24\n",
"Ping about this. Is there anything I can do to help the discussion along?\n",
"Sadly @jasongrout the project has a low communication bandwidth at the moment: I'm moving flat and so am without fixed-line internet, and the other two maintainers are taking well-needed rests. This hasn't been lost, I promise!\n",
"Thanks! I'll continue with my patched version for now, and ping again in a bit. Good luck on your move, and thanks again for all of your efforts!\n",
"I opened a PR implementing this solution at https://github.com/kennethreitz/requests/pull/2741\n"
] |
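The comment thread above proposes a proxies dictionary of the form `{'<scheme>://<hostname>': proxy, '<scheme>': proxy}`, with the host-specific key taking precedence. A minimal sketch of that resolution order (the `select_proxy` helper below is illustrative, not requests' actual implementation):

```python
from urllib.parse import urlparse

def select_proxy(url, proxies):
    """Pick a proxy for *url*, preferring a '<scheme>://<host>' key
    over a bare '<scheme>' key, as proposed in this thread."""
    parsed = urlparse(url)
    # Most specific key first: scheme + hostname, then scheme alone.
    for key in ("%s://%s" % (parsed.scheme, parsed.hostname), parsed.scheme):
        if key in proxies:
            return proxies[key]
    return None

proxies = {
    "http://example.com": "http://proxy-a:3128",  # host-specific entry wins
    "http": "http://proxy-b:3128",                # scheme-wide fallback
}
```

This ordering is forward compatible with scheme-only dictionaries and avoids the ambiguity noted in the thread between a hostname and a scheme name.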
https://api.github.com/repos/psf/requests/issues/2721
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2721/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2721/comments
|
https://api.github.com/repos/psf/requests/issues/2721/events
|
https://github.com/psf/requests/pull/2721
| 100,845,057 |
MDExOlB1bGxSZXF1ZXN0NDIzODY2NDA=
| 2,721 |
Ignore empty fields in no_proxy env var
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] |
{
"closed_at": "2015-10-12T10:32:06Z",
"closed_issues": 7,
"created_at": "2015-04-29T13:03:39Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/25",
"id": 1089203,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/25/labels",
"node_id": "MDk6TWlsZXN0b25lMTA4OTIwMw==",
"number": 25,
"open_issues": 0,
"state": "closed",
"title": "2.8.0",
"updated_at": "2015-10-12T10:32:06Z",
"url": "https://api.github.com/repos/psf/requests/milestones/25"
}
| 3 |
2015-08-13T19:05:23Z
|
2021-09-08T06:00:56Z
|
2015-10-05T14:28:49Z
|
MEMBER
|
resolved
|
Resolves #2720.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2721/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2721/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2721.diff",
"html_url": "https://github.com/psf/requests/pull/2721",
"merged_at": "2015-10-05T14:28:49Z",
"patch_url": "https://github.com/psf/requests/pull/2721.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2721"
}
| true |
[
":+1: looks good to me.\n",
"@tlc would you like to test this before we merge it?\n",
"I think this should go in to 2.8.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/2720
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2720/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2720/comments
|
https://api.github.com/repos/psf/requests/issues/2720/events
|
https://github.com/psf/requests/issues/2720
| 100,831,759 |
MDU6SXNzdWUxMDA4MzE3NTk=
| 2,720 |
Extra commas in NO_PROXY or no_proxy env var break all proxy requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/19436?v=4",
"events_url": "https://api.github.com/users/tlc/events{/privacy}",
"followers_url": "https://api.github.com/users/tlc/followers",
"following_url": "https://api.github.com/users/tlc/following{/other_user}",
"gists_url": "https://api.github.com/users/tlc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tlc",
"id": 19436,
"login": "tlc",
"node_id": "MDQ6VXNlcjE5NDM2",
"organizations_url": "https://api.github.com/users/tlc/orgs",
"received_events_url": "https://api.github.com/users/tlc/received_events",
"repos_url": "https://api.github.com/users/tlc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tlc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tlc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tlc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-08-13T18:04:41Z
|
2018-06-22T12:03:13Z
|
2015-10-05T14:28:49Z
|
NONE
|
off-topic
|
Using requests 2.7.0 via TwitterAPI, I successfully proxy requests by setting https_proxy in my environment. If either `no_proxy` or `NO_PROXY` is set to `,` or `foo.com,,bar.com` or `foo.com,`, I get:
```
WARNING:root:<class 'requests.exceptions.ConnectTimeout'> HTTPSConnectionPool(host='api.twitter.com', port=443): Max retries exceeded with url: /1.1/statuses/update.json (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f031cdf7908>, 'Connection to api.twitter.com timed out. (connect timeout=5)'))
ERROR:__main__:Tweet exception
Traceback (most recent call last):
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 134, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 88, in create_connection
raise err
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 78, in create_connection
sock.connect(sa)
socket.timeout: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 544, in urlopen
body=body, headers=headers)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 341, in _make_request
self._validate_conn(conn)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 761, in _validate_conn
conn.connect()
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 204, in connect
conn = self._new_conn()
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 139, in _new_conn
(self.host, self.timeout))
requests.packages.urllib3.exceptions.ConnectTimeoutError: (<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f031cdf7908>, 'Connection to api.twitter.com timed out. (connect timeout=5)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/adapters.py", line 370, in send
timeout=timeout
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 597, in urlopen
_stacktrace=sys.exc_info()[2])
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/packages/urllib3/util/retry.py", line 271, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='api.twitter.com', port=443): Max retries exceeded with url: /1.1/statuses/update.json (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f031cdf7908>, 'Connection to api.twitter.com timed out. (connect timeout=5)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/troy/BUDDIES/lib/python3.4/site-packages/TwitterAPI/TwitterAPI.py", line 121, in request
proxies=self.proxies)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/home/troy/BUDDIES/lib/python3.4/site-packages/requests/adapters.py", line 419, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='api.twitter.com', port=443): Max retries exceeded with url: /1.1/statuses/update.json (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f031cdf7908>, 'Connection to api.twitter.com timed out. (connect timeout=5)'))
```
which I believe indicates it was no longer trying to use the proxy.
So I theorize an empty `no_proxy` environment variable _segment_ defaults to `*`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2720/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2720/timeline
| null |
completed
| null | null | false |
[
"Yup, this looks like a legitimate bug: our parsing of the `no_proxy` environment variable does not tolerate spaces.\n\nI think we can fix this up pretty easily.\n",
"This should be fixed by #2721.\n",
"Hey Guys, Here's quick solution for this PIP bug. simply Take HTTP_PROXY and HTTPS_PROXY env. vars and delete those from your system. Whenever you connect to proxy network, use the backed up HTTP_PROXY,HTTPS_PROXY vars.",
"@seema-kote how to use no_proxy bypass every url"
] |
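The fix discussed above (PR #2721) amounts to discarding empty segments when splitting the `no_proxy` environment variable on commas. A sketch of that tolerant parsing (the `parse_no_proxy` function is illustrative, not the library's code):

```python
def parse_no_proxy(value):
    """Split a no_proxy-style string into host entries, dropping the
    empty segments produced by stray or trailing commas."""
    return [host.strip() for host in value.split(",") if host.strip()]
```

With this filter, values like `foo.com,,bar.com` or a bare `,` no longer produce an empty entry that could be mistaken for "match everything".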
https://api.github.com/repos/psf/requests/issues/2719
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2719/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2719/comments
|
https://api.github.com/repos/psf/requests/issues/2719/events
|
https://github.com/psf/requests/issues/2719
| 100,798,164 |
MDU6SXNzdWUxMDA3OTgxNjQ=
| 2,719 |
Wrong encoding detection for text/* content types (ISO-8859-1)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/515021?v=4",
"events_url": "https://api.github.com/users/theyoprst/events{/privacy}",
"followers_url": "https://api.github.com/users/theyoprst/followers",
"following_url": "https://api.github.com/users/theyoprst/following{/other_user}",
"gists_url": "https://api.github.com/users/theyoprst/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/theyoprst",
"id": 515021,
"login": "theyoprst",
"node_id": "MDQ6VXNlcjUxNTAyMQ==",
"organizations_url": "https://api.github.com/users/theyoprst/orgs",
"received_events_url": "https://api.github.com/users/theyoprst/received_events",
"repos_url": "https://api.github.com/users/theyoprst/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/theyoprst/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/theyoprst/subscriptions",
"type": "User",
"url": "https://api.github.com/users/theyoprst",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-13T15:24:29Z
|
2021-09-08T23:00:41Z
|
2015-08-13T18:57:50Z
|
NONE
|
resolved
|
Hi.
I've read the docs which refer to RFC 2616:
> The "charset" parameter is used with some media types to define the character set (section 3.4) of the data. When no explicit charset parameter is provided by the sender, media subtypes of the "text" type are defined to have a default charset value of "ISO-8859-1" when received via HTTP. Data in character sets other than "ISO-8859-1" or its subsets MUST be labeled with an appropriate charset value. See section 3.4.1 for compatibility problems.
So, it's no longer relevant.
Why?
Firstly, I can read facebook (which returns content type 'text/html' and sets `<meta charset="utf-8">`, see https://www.facebook.com) in Russian (utf-8, indeed) in any browser. It proves only that all browsers violate this RFC or understand it other way.
Secondly, [it's said that](http://www.w3.org/Protocols/rfc2616/rfc2616.html) "in 2014, RFC2616 was replaced by multiple RFCs (7230-7237)". And there is no mention of default charsets in them.
I also found [RFC 6657](https://tools.ietf.org/html/rfc6657):
> all new "text/*" registrations
> MUST clearly specify how the charset is determined; relying on the
> default defined in Section 4.1.2 of [RFC2046] is no longer permitted
Look at the [registration for text/html](http://www.w3.org/TR/html5/iana.html#text/html):
> Optional parameters:
> charset
> The charset parameter may be provided to definitively specify the document's character encoding, overriding any character encoding declarations in the document. The parameter's value must be one of the labels of the character encoding used to serialize the file. [ENCODING]
So now there is no default charset specified for text/html.
Finally, even in HTML 4.0 specification they [suggested to ignore this rule](http://www.w3.org/TR/html4/charset.html#h-5.2.2) about default charset:
> The HTTP protocol ([RFC2616], section 3.7.1) mentions ISO-8859-1 as a default character encoding when the "charset" parameter is absent from the "Content-Type" header field. In practice, this recommendation has proved useless because some servers don't allow a "charset" parameter to be sent, and others may not be configured to send the parameter. Therefore, user agents must not assume any default value for the "charset" parameter.
My suggestion: use the default charset ("US-ASCII") only for text/plain, as stated in [RFC 6657](https://tools.ietf.org/html/rfc6657#page-4).
> The default "charset" parameter value for "text/plain" is unchanged
> from [RFC2046] and remains as "US-ASCII".
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2719/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2719/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for this issue, you've done some exhaustive research!\n\nThis is a known issue: see #2086. The upshot of this is that we agree that the current behaviour is wrong. It should be noted that browsers aren't ignoring the heuristic. The heuristic explicitly points out that information from the body (e.g. the `<meta>` tag) can provide more specific information than the header. However, Requests explicitly _never_ parses HTML, so we cannot take advantage of that approach. Similarly, we do not follow HTML specifications because we are not a HTML implementation.\n\nIf you are parsing HTML we recommend using `response.content`: a good HTML parser should be able to do the decoding appropriately.\n\nI'm closing this to center the discussion on #2086. Thanks again!\n",
"Got it. Thanks.\n"
] |
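The RFC 2616 heuristic debated above can be summarized in a few lines: honour an explicit `charset` parameter in the `Content-Type` header, and otherwise default `text/*` media types to ISO-8859-1. A simplified sketch of that logic (illustrative only; it is not requests' actual header parser):

```python
def default_encoding(content_type):
    """Apply the RFC 2616 heuristic: explicit charset wins,
    else text/* defaults to ISO-8859-1, else no guess."""
    parts = [p.strip() for p in content_type.split(";")]
    for p in parts[1:]:
        if p.lower().startswith("charset="):
            return p.split("=", 1)[1].strip('"\'')
    if parts[0].startswith("text/"):
        return "ISO-8859-1"
    return None
```

As the maintainers note, when parsing HTML it is better to pass `response.content` (raw bytes) to an HTML parser, which can read `<meta charset>` declarations that this header-only heuristic cannot see.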
https://api.github.com/repos/psf/requests/issues/2718
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2718/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2718/comments
|
https://api.github.com/repos/psf/requests/issues/2718/events
|
https://github.com/psf/requests/issues/2718
| 100,718,932 |
MDU6SXNzdWUxMDA3MTg5MzI=
| 2,718 |
Passing a list with size one to request.get transforms list into single element
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9353824?v=4",
"events_url": "https://api.github.com/users/hlnlwj/events{/privacy}",
"followers_url": "https://api.github.com/users/hlnlwj/followers",
"following_url": "https://api.github.com/users/hlnlwj/following{/other_user}",
"gists_url": "https://api.github.com/users/hlnlwj/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hlnlwj",
"id": 9353824,
"login": "hlnlwj",
"node_id": "MDQ6VXNlcjkzNTM4MjQ=",
"organizations_url": "https://api.github.com/users/hlnlwj/orgs",
"received_events_url": "https://api.github.com/users/hlnlwj/received_events",
"repos_url": "https://api.github.com/users/hlnlwj/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hlnlwj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hlnlwj/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hlnlwj",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-08-13T08:59:23Z
|
2021-09-08T23:00:41Z
|
2015-08-13T13:55:20Z
|
NONE
|
resolved
|
Hello,
I'm using Python 2.7.6 and requests (2.6.2).
When I try to send a GET request with this payload :
``` python
{
'event_parameter_2': datetime(2014, 3, 9, 6, 11, 35),
'event_parameter_3': [u'eligendi'],
'event_parameter_4': u'sc',
}
```
The `event_parameter_3` field isn't a List type anymore, but a String type when my API gets the request.
This is how I send the GET request :
``` python
r = requests.get('http://' + host + ':' + port + '/event/log', params=generate_event())
print r.url
```
Which gets me the following output :
```
http://127.0.0.1:3000?event_parameter_2=2014-03-09+06%3A11%3A35&event_parameter_3=eligendi&event_parameter_4=sc
```
However, when having a list with multiple elements, I have no problem and get the following output :
```
http://127.0.0.1:3000?event_parameter_2=2012-10-12+01%3A29%3A37&event_parameter_3=occaecati&event_parameter_3=possimus&event_parameter_4=gy
```
I think that maybe when sending lists in url requests, formatting the url as such might be better ?
```
http://127.0.0.1:3000?event_parameter_3[0]=occaecati&event_parameter_3[1]=possimus
```
Thank you :smile:
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2718/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2718/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report!\n\nYour summary is not quite accurate. The output that you are unhappy with in the first case is strictly formatted exactly the same as the output you _are_ happy with in the second case. The second case has multiple entries with the same name: the same number of entries as elements in the list. The first case also does: it just happens that there's only one element in the list.\n\nYou can see that if I break the sections out. Here is the two element case:\n\n```\nevent_parameter_3=occaecati\n&\nevent_parameter_3=possimus\n```\n\nHere's the one element case:\n\n```\nevent_parameter_3=eligendi\n```\n\nThis is exactly the same representation. =) We're being very consistent.\n\nWhen sending lists in the URL there is no standard representation. The _most common_ representation is to have empty square brackets (`[]`), which you can do by doing `'event_parameter_3[]': [u'eligendi']`, but the reality is that there is no one right way to do this. Therefore, in the face of this ambiguity, we refuse to guess. You'll need to make sure the data is correct for whatever format your web service would like to receive.\n\nSorry we can't be more helpful!\n",
"Thank you for your answer ! :)\n\nWhat I actually meant was that when passing a single sized list:\n\n```\nevent_parameter_3=eligendi\n```\n\nThere is no way to know whether it was an list or any other type when receiving the request, a string being represented the same way:\n\n```\nevent_parameter_4=sc\n```\n",
"Indeed. This is the problem with query strings: they aren't really fully-structured data. They're just a list of key-value pairs. There are some _conventions_ around representing 'lists' and other kinds of data structures in query strings, but they are only conventions, there are no rules.\n\nAs I said, the closest there is to a 'standard' convention is to use square brackets after the key name: `event_parameter_3[]`.\n",
"Oh, I didn't understand it this way !\nThank you very much ! :blush: \n"
] |
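The ambiguity discussed above is easy to reproduce with the standard library: a one-element list serialises to exactly the same query string as a plain string value, so the receiver cannot distinguish them. The bracket convention mentioned by the maintainer must be applied by the caller in the key name:

```python
from urllib.parse import urlencode

params = {"event_parameter_3": ["eligendi"], "event_parameter_4": "sc"}
# doseq=True expands list values into repeated key=value pairs,
# matching how requests encodes list params.
single = urlencode(params, doseq=True)

# The (conventional, non-standard) square-bracket style: the caller
# renames the key; the brackets are percent-encoded in the result.
bracketed = urlencode({"event_parameter_3[]": ["eligendi"]}, doseq=True)
```

There is no standard for lists in query strings, so requests refuses to guess; whichever convention the web service expects has to be encoded into the keys explicitly.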
https://api.github.com/repos/psf/requests/issues/2717
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2717/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2717/comments
|
https://api.github.com/repos/psf/requests/issues/2717/events
|
https://github.com/psf/requests/issues/2717
| 100,507,160 |
MDU6SXNzdWUxMDA1MDcxNjA=
| 2,717 |
"OverflowError: string longer than 2147483647 bytes" when trying requests.put
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12417391?v=4",
"events_url": "https://api.github.com/users/EB123/events{/privacy}",
"followers_url": "https://api.github.com/users/EB123/followers",
"following_url": "https://api.github.com/users/EB123/following{/other_user}",
"gists_url": "https://api.github.com/users/EB123/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/EB123",
"id": 12417391,
"login": "EB123",
"node_id": "MDQ6VXNlcjEyNDE3Mzkx",
"organizations_url": "https://api.github.com/users/EB123/orgs",
"received_events_url": "https://api.github.com/users/EB123/received_events",
"repos_url": "https://api.github.com/users/EB123/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/EB123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EB123/subscriptions",
"type": "User",
"url": "https://api.github.com/users/EB123",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 33 |
2015-08-12T09:49:47Z
|
2020-11-10T14:53:32Z
|
2015-08-12T15:31:22Z
|
NONE
|
resolved
|
Hi,
I'm trying to upload a file that weighs about 3 GB and I'm getting the following error:
"OverflowError: string longer than 2147483647 bytes"
If I understand correctly it seems like there's a 2 GB limit? I didn't manage to find any reference to such a limitation or how to bypass it (if possible).
The code i'm using is:
``` python
datafile = 'someHugeFile'
with open(datafile, 'rb') as myfile:
args = myfile.read()
resp = requests.put(url, data=args, verify=False)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/api.py", line 99, in put
return request('put', url, data=data, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/adapters.py", line 327, in send
timeout=timeout
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 493, in urlopen
body=body, headers=headers)
File "/usr/local/lib/python2.7/site-packages/requests-2.3.0-py2.7.egg/requests/packages/urllib3/connectionpool.py", line 291, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/local/lib/python2.7/httplib.py", line 995, in request
self._send_request(method, url, body, headers)
File "/usr/local/lib/python2.7/httplib.py", line 1029, in _send_request
self.endheaders(body)
File "/usr/local/lib/python2.7/httplib.py", line 991, in endheaders
self._send_output(message_body)
File "/usr/local/lib/python2.7/httplib.py", line 844, in _send_output
self.send(msg)
File "/usr/local/lib/python2.7/httplib.py", line 820, in send
self.sock.sendall(data)
File "/usr/local/lib/python2.7/ssl.py", line 234, in sendall
v = self.send(data[count:])
File "/usr/local/lib/python2.7/ssl.py", line 203, in send
v = self._sslobj.write(data)
OverflowError: string longer than 2147483647 bytes
```
For smaller files this code works fine for me.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2717/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2717/timeline
| null |
completed
| null | null | false |
[
"Rather than reading the entire file and sending it across in a single request, would it be possible for you to use chunked transfer encoding? http://docs.python-requests.org/en/latest/user/advanced/#chunk-encoded-requests\n",
"This limitation is in `httplib`. You can easily avoid it by slightly changing your code:\n\n``` python\ndatafile = 'someHugeFile'\nwith open(datafile, 'rb') as myfile:\n resp = requests.put(url, data=myfile, verify=False)\n```\n",
"@Lukasa that's inaccurate. The traceback comes from an SSL wrapped socket. This has nothing to do with httplib from what I can see. `OverflowErrors` are raised when a value is larger than the underlying C integer size allowed. This can be seen if you call `len(something_larger_than_four_gb)` on a 32 bit system.\n",
"Unfortunately it looks like it cannot be avoided when you do a POST request with several headers. Then the file (or the files) is always read completely.\n\nIt would be great when this could be avoided in requests since I often have to send files which are longer than the available main memory on the system.\n",
"I'm just gonna chime in.\n\nIf you're trying to send files via the Web that are larger than your\nsystems memory can handle, HTTP probably isn't the best protocol to do this\nvia.\n\nOn Tue, 11 Oct 2016, 5:54 AM Erik Tews [email protected] wrote:\n\n> Unfortunately it looks like it cannot be avoided when you do a POST\n> request with several headers. Then the file (or the files) is always read\n> completely.\n> \n> It would be great when this could be avoided in requests since I often\n> have to send files which are longer than the available main memory on the\n> system.\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2717#issuecomment-252729478,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AGk_7XIpx0UDRPXDuoopI7jDSfAlurgLks5qypf7gaJpZM4FqFGb\n> .\n",
"That's true, but I don't get to decide on the protocol and endpoint.\n\nDoing the request with curl works fine, and as a workaround, I'm currently printing a curl command to STDOUT so that the user can launch it.\n",
"@eriktews can you share how you're doing the upload? There are ways to stream uploads (like [Lukasa's comment](https://github.com/kennethreitz/requests/issues/2717#issuecomment-130343655) shows). Is there a reason you cannot do that (if you are not already trying that)? Also, can you provide your actual traceback?\n",
"So the comment from Lukasa seems to work when you are uploading a single file, then you can do a streaming upload. But I have to do a normal post request with several variables in the data part and the file as a part of a multipart upload.\n\nThere is an API documentation at https://canvas.instructure.com/doc/api/file.file_uploads.html which shows a curl command in the \"Step 2\" section. Basically I wanna replicate that call with the requests package in streaming mode.\n\nI don't have the traceback at the moment, but when I get it, I will post it here.\n",
"Have you tried using the `MultipartEncoder` from the [requests-toolbelt](https://toolbelt.readthedocs.io)? That allows you to stream multipart/form-data uploads like doing an upload of a single file. It was written specifically for this use case and was added to our docs.\n",
"No, but that looks like right what I need. I will give it a try and see whether this works. I didn't know about the toolbelt package at all, so maybe you should reference it in the normal requests package documentation.\n",
"> so maybe you should reference it in the normal requests package documentation.\n\nIt is :)\n",
"@Lukasa 's method does not work - even with httplib off the signing still happens for the transport itself. In my case I have a 2GB+ POST request (not a file, just POST data). This is for an elasticsearch bulk update. The endpoint only has HTTPS so I'm working through other solutions now.\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/var/virtualenvs/centr/lib/python3.5/site-packages/elasticsearch/connection/http_requests.py\", line 75, in perform_request\r\n timeout=timeout or self.timeout, verify=False)\r\n File \"/var/virtualenvs/centr/lib/python3.5/site-packages/requests/sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/var/virtualenvs/centr/lib/python3.5/site-packages/requests/adapters.py\", line 423, in send\r\n timeout=timeout\r\n File \"/var/virtualenvs/centr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 600, in urlopen\r\n chunked=chunked)\r\n File \"/var/virtualenvs/centr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 356, in _make_request\r\n conn.request(method, url, **httplib_request_kw)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1106, in request\r\n self._send_request(method, url, body, headers)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1151, in _send_request\r\n self.endheaders(body)\r\n File \"/usr/lib/python3.5/http/client.py\", line 1102, in endheaders\r\n self._send_output(message_body)\r\n File \"/usr/lib/python3.5/http/client.py\", line 936, in _send_output\r\n self.send(message_body)\r\n File \"/usr/lib/python3.5/http/client.py\", line 908, in send\r\n self.sock.sendall(data)\r\n File \"/usr/lib/python3.5/ssl.py\", line 891, in sendall\r\n v = self.send(data[count:])\r\n File \"/usr/lib/python3.5/ssl.py\", line 861, in send\r\n return self._sslobj.write(data)\r\n File \"/usr/lib/python3.5/ssl.py\", line 586, in write\r\n return self._sslobj.write(data)\r\nOverflowError: string longer than 2147483647 bytes\r\n```",
"Sorry, can you demonstrate your code please?",
"This should throw the error:\r\n```\r\ndatafile = 'someHugeFile'\r\nwith open(datafile, 'rb') as myfile:\r\n r = requests.post(endpoint, data={'key': myfile.read()}, verify=False)\r\n```\r\n\r\nIf `endpoint` is https then `ssl` will have to process the payload. I wonder if `requests` or `requests-toolbelt` could have an option to do the signature with some other library that doesn't die when signing a 2GB string. Of course, I would say that people shouldn't be signing such large things but it's definitely a real crash that's happening in the real world.",
"@adamn That was not my proposed solution. My proposed solution was to not read the file in manually at all. You are bumping into the same error as before, which is that we are sending a single gigantic string to httplib.\r\n\r\nThis is a behaviour we can fix: if we spot someone uploading a *gigantic* single string via Python then we can resolve it. But at this point I *strongly* recommend you use an intermediary file object: either one on disk, or by doing the urlencoding yourself and wrapping the result in a `BytesIO`.",
"I've already come up with a workaround so won't be able to dig deeper into this unfortunately. I still suspect that the SSL payload needs to be signed/encrypted so the same thing will happen regardless of whether there is a file object or not since the exception is raised by `ssl.write` itself and I presume that method needs the entire payload. Chunking the POST seems like the only real option. Anyway, thanks for the help.",
"@adamn No, that's not necessary. TLS uses stream encryption, it does not need the entire payload at once.\r\n\r\nWhat you're missing is that when given a file object, requests will automatically stream it in smaller chunks (specifically, 8192 kb chunks). Those cause no problem.",
"Sorry to comment on an old issue, but this looks _similar_ to an issue we've run into and I'm trying to decide whether it's worth opening a new issue for it.\r\n\r\nAgain, `requests.put` where data is a _huge_ string doesn't work, but we don't get an error. requests just hangs sending; a packet capture shows that no more data is being sent.\r\n\r\nThis behaviour is _worse_ that an exception being raised.",
"Has the remote peer been ACKing at the TCP level? Is it still reading from the receive buffer? Has it TCP FIN'd?",
"Yes, the remote end is sending ACKs appropriately, no FIN or anything like that.\r\n\r\nIn fact, if you have a large file `f`, we see the problem if you do `requests.put(url, data=f.read())` but not if you do `requests.put(url, data=f)`. Obviously if we have a file handle, we wouldn't bother to call read on it, but the point is that both calls are supposed generate the same request, and a packet capture shows that they do up until the point at which one stops sending packets.",
"Hrm. Is it possible for you to put together a small repro scenario? Do you see the same effect with other hosts?",
"As luck would have it, I have already done so.\r\n\r\nGithub doesn't seem to want to let me attach files, so:\r\n\r\n```\r\n#!/usr/bin/env python\r\n\r\nimport requests\r\n\r\n\r\nMB = 1024 ** 2\r\nGB = MB * 1024\r\n\r\n\r\nif __name__ == '__main__':\r\n data = 'x' * 4 * GB\r\n resp = requests.put('http://localhost:8000', data=data)\r\n print resp\r\n```\r\n\r\nAnd a server to run it against:\r\n\r\n```\r\n#!/usr/bin/env python\r\n\r\nimport BaseHTTPServer\r\nimport logging\r\n\r\n\r\nREAD_CHUNK = 1024 ** 2\r\n\r\n\r\nclass Handler(BaseHTTPServer.BaseHTTPRequestHandler):\r\n def do_PUT(self):\r\n logging.info(\"PUT request recieved\")\r\n for header, value in self.headers.items():\r\n logging.info(\"header: %s = %s\", header, value)\r\n\r\n length = int(self.headers['Content-Length'])\r\n\r\n logging.info(\"Content length %s, getting content...\", length)\r\n\r\n while length:\r\n to_read = min(length, READ_CHUNK)\r\n logging.info(\"reading %s bytes...\", to_read)\r\n self.rfile.read(to_read)\r\n length -= to_read\r\n\r\n logging.info(\"Recieved content\")\r\n\r\n self.send_response(200)\r\n\r\n\r\ndef run(server_class=BaseHTTPServer.HTTPServer):\r\n server_address = ('', 8000)\r\n httpd = server_class(server_address, Handler)\r\n httpd.serve_forever()\r\n\r\n\r\nif __name__ == '__main__':\r\n logging.basicConfig(\r\n level=logging.DEBUG,\r\n format=\"%(asctime)s %(levelname)s: %(message)s\",\r\n )\r\n logging.debug(\"Starting server\")\r\n run()\r\n```\r\n\r\nObviously this isn't the server we were running against when we first encountered this problem :)",
"Well as a first logical note I should point out that this is necessarily not the same problem as originally reported on this issue, as the original report affected TLS only, as discussed above. :wink: Regardless, let's dig into this a bit.",
"Ah, sorry. Is it worth me opening a new issue then, or should I just leave it here, since you're already looking at it?",
"Let's leave it here for now. =)",
"Huh. That behaves...very oddly. On my machine, over the loopback, I don't see any data sent at all: it's like Requests just gave up on sending it. Further debugging seems to show this is happening at the level of `socket.sendall`, which for some absurd reason is just not sending the complete response. By \"not sending the complete response\" I mean `socket.sendall` is *returning* early, but demonstrably is not sending all the data.\r\n\r\nNaturally, the reason this happens is the same as the reason Python does lots of other stupid crap: `socket.sendall` is written in C. The *very first* thing that `socket.sendall` does is get the length of the data that was sent into it and shoves it into a C `int`. Now, this is wrong to begin with: `Py_buffer.len` is a `Py_ssize_t`, and `sizeof(ssize_t)` is *frequently* larger than `sizeof(int)`. So that's bonkers stupid, and probably the source of this bug.\r\n\r\nIn fact, it definitely is, since the current Python master has a changed `sendall` that uses the correct size. This seems to have been cleaned up around Python 3 time as a general \"64-bit issue\" (see python/cpython@19467d27ff14ebe31978438078ed5a661ffd29fb) in the socket module.\r\n\r\nThat makes this ultimately a duplicate of [CPython issue #18100](https://bugs.python.org/issue18100). This has been open a long time in need of patch review, and given that Python 2.7 is now only getting security fixes I doubt the CPython developers will fix it at this point.\r\n\r\nThis is a difficult issue for Requests to sensibly police. We can tell when people will *definitely* hit it (e.g. because the input is a string which a length greater than 2GB), but there are many situations where people will hit it but we can't tell (e.g. because the string plus the headers is greater than 2GB in size, or because there is a different type in use that CPython will treat as \"stringish\" that is larger than 2GB).\r\n\r\nSo my initial inclination is, given that this is an issue that can be solved by moving to a newer version of Python, and that it can be worked around by not reading gigantic strings into memory (which is a best-practice anyway), and that if we ever move off httplib we'll fix it automatically anyway, I'm inclined to suggest that we probably don't have a huge pressure to resolve the issue? For my part, I think this is getting pretty close to \"Dr, it hurts when I do this.\" \"So don't do that then!\" territory.\r\n\r\nHowever, I'm willing to be disagreed with here.",
"The workaround is really simple - just wrap it in StringIO/BytesIO, but when you run into it, it's difficult to diagnose, so any help that requests could give, even if it's just a warning in the documentation, would be appreciated.",
"I can get behind the idea of a PR that adds a note in the documentation.",
"Running into the same problem,I dont this thread has an accepted solution,anyone has a solution for this ?",
"This also results in an OverflowError in `self._sslobj.write(data)`:\r\n\r\n files = {'file': open(tar_file_path, 'rb')}\r\n headers = {'key': 'abc123'}\r\n r = requests.post(url, files=files, headers=headers)\r\n\r\nThe file is 3GB in size."
] |
https://api.github.com/repos/psf/requests/issues/2716
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2716/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2716/comments
|
https://api.github.com/repos/psf/requests/issues/2716/events
|
https://github.com/psf/requests/issues/2716
| 99,878,720 |
MDU6SXNzdWU5OTg3ODcyMA==
| 2,716 |
Strange behavior when setting cookie value to None in the method-level parameter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10273733?v=4",
"events_url": "https://api.github.com/users/mcpwlk/events{/privacy}",
"followers_url": "https://api.github.com/users/mcpwlk/followers",
"following_url": "https://api.github.com/users/mcpwlk/following{/other_user}",
"gists_url": "https://api.github.com/users/mcpwlk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mcpwlk",
"id": 10273733,
"login": "mcpwlk",
"node_id": "MDQ6VXNlcjEwMjczNzMz",
"organizations_url": "https://api.github.com/users/mcpwlk/orgs",
"received_events_url": "https://api.github.com/users/mcpwlk/received_events",
"repos_url": "https://api.github.com/users/mcpwlk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mcpwlk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mcpwlk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mcpwlk",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
open
| false | null |
[] | null | 4 |
2015-08-09T09:27:56Z
|
2015-08-28T17:36:50Z
| null |
NONE
| null |
After running the following code
```
s = requests.Session()
s.cookies.update({'from-my': 'browser'})
r = s.get('http://httpbin.org/cookies', cookies={'another': 'cookie', 'from-my': None})
print r.text
```
the output is
```
{
"cookies": {
"from-my; another": "cookie"
}
}
```
I used Python 2.7.6 and requests 2.7.0.
For more information, please see http://stackoverflow.com/questions/31902510/strange-requests-behavior-when-setting-cookie-value-to-none-in-the-method-level
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2716/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2716/timeline
| null | null | null | null | false |
[
"Because I recently updated that section of the docs, I examined this behavior in order to make sure I didn't add incorrect/misleading documentation. \n\nI believe this is because the pattern of _\"Remove a Value From a Dict Parameter by setting it to `None`\"_ simply isn't being considered for cookies when [merging request cookies with session cookies](https://github.com/kennethreitz/requests/blob/408d75d47aecc3724270217cf47793c114670f38/requests/sessions.py#L367-L369).\n\n[`merge_cookies()`](https://github.com/kennethreitz/requests/blob/408d75d47aecc3724270217cf47793c114670f38/requests/cookies.py#L463-L480) basically just updates the target jar, but does not remove any items.\n\nThis means cookies with a `value` of `None` will make it into the `RequestsCookieJar`, and will be serialized to a `Cookie: from-my; another=cookie` header.\n\nThis is where the parsing the `Cookie:` header by [httpbin.org/cookies](http://httpbin.org/cookies) also plays a role: Depending on whether the empty cookie appears first or last in the header, it will either combine it into the key of the next cookie, or drop it:\n\n```\nCookie: from-my; another=cookie ----> \"cookies\": {\"from-my; another\": \"cookie\"}\nCookie: another=cookie; from-my ----> \"cookies\": {\"another\": \"cookie\"}\n```\n\nSo this is why key order seems to matter when testing against [httpbin.org/cookies](http://httpbin.org/cookies).\n\nNow the question is, should a client even be sending empty cookies like that? [RFC 6265 | 5.4. The Cookie Header](http://tools.ietf.org/html/rfc6265#section-5.4) seems to say no:\n\n> ```\n> 1. Output the cookie's name, the %x3D (\"=\") character, and the\n> cookie's value.\n> ```\n\nIt doesn't seem to address empty cookie values specifically, but it does require a `=` delimiter for `cookie-pair`'s. That's also what the [grammar for `cookie-string`](http://tools.ietf.org/html/rfc6265#section-4.2.1) states.\n\nPython's [`cookielib`](https://hg.python.org/cpython/file/2.7/Lib/cookielib.py#l1307) / [`http.cookiejar`](https://hg.python.org/cpython/file/3.4/Lib/http/cookiejar.py#l1310) seems to think differently: \n\n``` python\n if cookie.value is None:\n attrs.append(cookie.name)\n else:\n attrs.append(\"%s=%s\" % (cookie.name, value))\n```\n\nSo when `None` would be used as a sentinel value to mean _\"Omit this cookie for this request\"_ like for other dict parameters, this would obviously prevent a user from intentionally sending a cookie with an empty value, which currently is possible.\n",
"Hmm, interesting. I think `cookielib` just doesn't expect to receive a `None` value here.\n\nI think we should do two things: firstly, we should ensure that we unset cookies with `None` values from the method level argument if we can. Secondly, we should consider whether this is a bug report that should be raised upstream. I searched upstream and didn't find any associated bug report, so we might have been the first to hit this.\n",
"So this does seem to be a bug in the standard library. [RFC 2965](https://tools.ietf.org/html/rfc2965#page-11) even shows the `=VALUE` as being required (even if it didn't use proper ABNF =P). `cookielib` was implemented at a time when 2965 was the actual standard.\n",
"@lukasgraf RFC 6265 addresses the issue of empty cookie values in both the `Set-Cookie` grammar in [section 4.1.1](http://tools.ietf.org/html/rfc6265#section-4.1.1) and its interpretation by the client in [section 5.2](http://tools.ietf.org/html/rfc6265#section-5.2) wherein nil values are permitted. Therefore a user agent should return an empty value, subject to constraints imposed when the cookie was set (lifetime, path, etc.), in future requests.\nAs you remark, an equals symbol is required in the relevant headers, which `cookielib` fails to include.\n\nAs a historical note, the grammar in [section 3.2.2 of RFC 2965](https://tools.ietf.org/html/rfc2965#section-3.2) (and [RFC 2901](https://tools.ietf.org/html/rfc2109#section-4.2.2) before it) forbade nil values by specifying cookie values to be tokens (see [section 2.2 of RFC 2616](https://tools.ietf.org/html/rfc2616#section-2.2)) or quoted strings, though the latter could be empty. The original [Netscape proposal](http://curl.haxx.se/rfc/cookie_spec.html) was silent on the matter.\n"
] |
https://api.github.com/repos/psf/requests/issues/2715
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2715/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2715/comments
|
https://api.github.com/repos/psf/requests/issues/2715/events
|
https://github.com/psf/requests/issues/2715
| 99,818,636 |
MDU6SXNzdWU5OTgxODYzNg==
| 2,715 |
Capitalize response reasons.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/717735?v=4",
"events_url": "https://api.github.com/users/betolink/events{/privacy}",
"followers_url": "https://api.github.com/users/betolink/followers",
"following_url": "https://api.github.com/users/betolink/following{/other_user}",
"gists_url": "https://api.github.com/users/betolink/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/betolink",
"id": 717735,
"login": "betolink",
"node_id": "MDQ6VXNlcjcxNzczNQ==",
"organizations_url": "https://api.github.com/users/betolink/orgs",
"received_events_url": "https://api.github.com/users/betolink/received_events",
"repos_url": "https://api.github.com/users/betolink/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/betolink/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/betolink/subscriptions",
"type": "User",
"url": "https://api.github.com/users/betolink",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-08-08T17:21:10Z
|
2021-09-08T23:00:42Z
|
2015-08-08T17:25:33Z
|
NONE
|
resolved
|
This is a somewhat trivial thing, I ran into a "bug" while grouping responses by reason instead of status codes. This is of course a very particular case but just to be consistent it may be a good idea to have response.reason always be "NOT FOUND" instead of "Not Found", "Not found", "not found" etc.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2715/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2715/timeline
| null |
completed
| null | null | false |
[
"Generally speaking, reason phrases are not to be trusted when provided by the server. There is no requirement that they be the same as the status code: nothing's really wrong with sending `200 NO PROBLEM BOSS`, for example, aside from the fact that the spec says you shouldn't.\n\nFor that reason, you should not rely on the reason phrase when making decisions unless the exact binary form of that phrase is something you care about. I don't think requests should canonicalise in any way here.\n\nThanks for the suggestion though!\n",
"+100 to what @Lukasa said. If you want normalized responses, you can use a mapping of status codes to whatever you'd like them to represent.\n",
"Note that I did not mention let's change `200 NO PROBLEM BOSS` to `200 OK` or vice-versa, the only proposed change is to capitalize whatever the reason coming from the server is just to avoid having to do it in an extra map().\n"
] |
https://api.github.com/repos/psf/requests/issues/2714
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2714/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2714/comments
|
https://api.github.com/repos/psf/requests/issues/2714/events
|
https://github.com/psf/requests/pull/2714
| 99,764,744 |
MDExOlB1bGxSZXF1ZXN0NDE5NDc2MDM=
| 2,714 |
[WIP] Add a custom requests cookie policy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 60669570,
"name": "Please Review",
"node_id": "MDU6TGFiZWw2MDY2OTU3MA==",
"url": "https://api.github.com/repos/psf/requests/labels/Please%20Review"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 14 |
2015-08-08T03:16:16Z
|
2021-09-07T00:06:35Z
|
2017-02-10T17:21:45Z
|
CONTRIBUTOR
|
resolved
|
This ensures that we follow RFC 6265 Section 4.1.2.3 appropriately. If a
cookie is returned without a domain attribute, we do not want to send it
to subdomains.
Closes #2576
---
Needs:
- [ ] Tests
- [ ] Documenting this breaking change
- [ ] Porting to requests-toolbelt for early adopters
- [ ] Backport to master for a 2.x release
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2714/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2714/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2714.diff",
"html_url": "https://github.com/psf/requests/pull/2714",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2714.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2714"
}
| true |
[
"Actually, this isn't right. We probably need to just override `return_ok_domain`. This is too soon, since we only care about this for subdomains.\n",
"Heh, I was just about to leave a code review comment to that effect until I saw your comment. This is definitely WIP, we should tag it as such. =)\n",
"Yep. After spending too much time in the weeds in cookielib, I used the solution that worked at first and then realized it was wrong and went to bed.\n",
"So I can easily add tests around this at this point. I don't think we need to test that the stdlib handles options appropriately. I did a quick test with the \"server\" provided in the bug report. That said, I think we can provide the function to create our new default policy without using it by default in 2.x. I don't think we need to use the toolbelt for this.\n\nWe can document that by default we will be using a strict domain policy in 3.x and that we will be setting the policy by default. People should start using the new policy as soon as possible. Making it available in requests now makes it a `__future__`-like feature.\n\nI guess if we want people with versions older than 2.x.0 (where x is the version in which we include this function) we can put it in the toolbelt too. \n",
"Yeah @sigmavirus24, I'd like to do that too.\n",
"@Lukasa so this breaks some of our tests around cookies. Particularly because HTTPbin doesn't set the `domain` attribute. So we at least know this works as intended. I'll see how difficult it would be to fix this in httpbin\n",
"While I was falling asleep I realized that this _shouldn't_ be breaking our tests since we're not using a subdomain. That makes me think I chose the wrong strict domain option. We may still need to rewrite that logic regardless. This may also be a bug in the standard library.\n",
"So this isn't breaking tests the way I thought it was. When I looked at which tests are failing, I noticed what was happening. The tests in question all do roughly the following:\n\n``` py\nrequests.get('https://httpbin.org/get', cookies={'foo': 'bar'})\n```\n\nIn other words, we're expecting a `Cookie` header to be sent that looks like `Cookie: foo=bar` to that URL. This fails because those cookies are naively added to a Cookie Jar and that cookie has no domain associated with it which causes it to not match the request host.\n\nThis makes me ask some questions (since I rarely use cookies like this):\n- Do users actually use cookies like this?\n- Can we safely assume that cookies like this are always meant for the host, e.g., unless the cookie is parsed to have a domain, we forcibly set it to our request host for that request?\n\nFurther this also affects users who do something like:\n\n``` py\ns = requests.Session()\ns.cookies['foo'] = 'bar'\ns.get('https://httpbin.org/get')\n```\n\nThis, however, is far more nebulous. There is no good way to know what domains to send that cookie for. Previously we did something that was arguably really really awful (send it for all domains, I suspect). I think this begs, then, for a helper to create a cookie to be used here. Thoughts @Lukasa?\n",
"Ugh. Cookies are terrible. All is forgiven, PHK. All is forgiven.\n\nYeah, the only _consistent_ behaviour while we allow that API is to send those cookies to all hosts. I have no idea how we're going to make that work with cookielib as it stands.\n",
"Okay, so @Lukasa and I discussed this in [IRC](https://botbot.me/freenode/python-requests/2015-08-15/?msg=47298372&page=1). The result was the following:\n1. Continue with this as it is\n2. Update `create_cookie` helper to **require** a domain attribute\n3. Disallow string-ish types (str, unicode, bytes, etc.) when assigning to a CookieJar (e.g., the case where you update a session's cookies by doing `s.cookies['foo'] = 'bar'`). This will mean only accepting Cookie objects as the value. (We may also want to validate that said cookie has a domain attribute, or at least issue a warning if it does not.)\n4. When passing `cookies=` to a request method, **assume** that if a domain isn't present, it is explicitly for the request domain.\n\nWhat does this mean for the backport to the `requests-toolbelt` and 2.x:\n1. The toolbelt will start to carry the re-implementation of the CookieJar from this request when we move this functionality there.\n2. The toolbelt will also carry the implementation of `create_cookie` that **requires** a domain\n3. requests 2.x will start issuing a `DeprecationWarning` when doing `s.cookies['foo'] = 'bar'` to warn people of the change coming in 3.0\n4. requests 2.x will carry this policy function that's already written so people can start using it and seeing how their code may break but only if they opt in by creating a Policy and a new Cookie Jar with that policy.\n",
"Ugh so this also affects `cookiejar_from_dict`.\n",
"@Lukasa @sigmavirus24 what is the status of this? I was working a bit in the `RequestsCookieJar` code for #3028, and I'd be willing to work on this a bit if it's still something you think should move forward.\n",
"@davidsoncasey this is not a simple change. I have a lot of work locally that I'm not yet ready to push.\n",
"Closing due to inactivity (not sure if that's our fault or not, sorry!). Please re-submit if you're still interested in contributing this code :)"
] |
https://api.github.com/repos/psf/requests/issues/2713
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2713/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2713/comments
|
https://api.github.com/repos/psf/requests/issues/2713/events
|
https://github.com/psf/requests/pull/2713
| 99,744,420 |
MDExOlB1bGxSZXF1ZXN0NDE5NDEwMjQ=
| 2,713 |
Docs: Clarify that method-level parameters are not persisted in sessions.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/405124?v=4",
"events_url": "https://api.github.com/users/lukasgraf/events{/privacy}",
"followers_url": "https://api.github.com/users/lukasgraf/followers",
"following_url": "https://api.github.com/users/lukasgraf/following{/other_user}",
"gists_url": "https://api.github.com/users/lukasgraf/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lukasgraf",
"id": 405124,
"login": "lukasgraf",
"node_id": "MDQ6VXNlcjQwNTEyNA==",
"organizations_url": "https://api.github.com/users/lukasgraf/orgs",
"received_events_url": "https://api.github.com/users/lukasgraf/received_events",
"repos_url": "https://api.github.com/users/lukasgraf/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lukasgraf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lukasgraf/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lukasgraf",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-08-07T23:01:27Z
|
2021-09-08T07:00:46Z
|
2015-08-08T01:18:21Z
|
CONTRIBUTOR
|
resolved
|
This adds a paragraph to the **session docs** that clarifies the fact that **method-level parameters are not persisted across requests**, even when a session is being used (fixes #2488).
As an example I used cookies, and included a pointer to the [Cookie utility functions](http://docs.python-requests.org/en/latest/api/#cookies). In order to be able to link to that section I added some section labels in `docs/api.rst` (prefixed with `api-`, because otherwise there would be label collisions).
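A minimal sketch of the behaviour being documented, with no network needed (the cookie names here are illustrative):

```python
import requests

s = requests.Session()
s.cookies.set("persisted", "yes")  # session-level: survives across requests

# A cookie passed at the method/request level is used for that single
# request only and is never merged back into the session's cookie jar.
req = requests.Request("GET", "http://example.com/", cookies={"one_off": "1"})
prepared = s.prepare_request(req)
```

After preparing the request, the session's jar still contains only the session-level cookie.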
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2713/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2713/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2713.diff",
"html_url": "https://github.com/psf/requests/pull/2713",
"merged_at": "2015-08-08T01:18:21Z",
"patch_url": "https://github.com/psf/requests/pull/2713.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2713"
}
| true |
[
"Eh, I'd rather not add a bunch of labels to one file just to be able to use _1_ in the docs.\n",
"@sigmavirus24 got it, I'll remove all the ones except the one for the cookies section. Should I stick with the `api-` prefix or drop that as well?\n",
"The api- prefix makes sense. Thanks.\n",
"@sigmavirus24 updated\n",
"Thanks @lukasgraf! :sparkles: :cake: :sparkles: \n"
] |
https://api.github.com/repos/psf/requests/issues/2712
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2712/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2712/comments
|
https://api.github.com/repos/psf/requests/issues/2712/events
|
https://github.com/psf/requests/pull/2712
| 99,709,441 |
MDExOlB1bGxSZXF1ZXN0NDE5MjM0MDE=
| 2,712 |
Document use of sessions as context managers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/405124?v=4",
"events_url": "https://api.github.com/users/lukasgraf/events{/privacy}",
"followers_url": "https://api.github.com/users/lukasgraf/followers",
"following_url": "https://api.github.com/users/lukasgraf/following{/other_user}",
"gists_url": "https://api.github.com/users/lukasgraf/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lukasgraf",
"id": 405124,
"login": "lukasgraf",
"node_id": "MDQ6VXNlcjQwNTEyNA==",
"organizations_url": "https://api.github.com/users/lukasgraf/orgs",
"received_events_url": "https://api.github.com/users/lukasgraf/received_events",
"repos_url": "https://api.github.com/users/lukasgraf/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lukasgraf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lukasgraf/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lukasgraf",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-08-07T19:07:15Z
|
2021-09-08T07:00:47Z
|
2015-08-07T20:41:21Z
|
CONTRIBUTOR
|
resolved
|
This adds some documentation on how to use sessions as **context managers** (fixes #2580). I also added a brief mention of **connection pooling** and its performance benefits.
I still got one question though: Is instantiating [`Session`](https://github.com/kennethreitz/requests/blob/9b067db19e20226dcb3aa407605d30942d085050/requests/sessions.py#L267) directly the proper API to document? I noticed that there's also a [`session()` factory function](https://github.com/kennethreitz/requests/blob/9b067db19e20226dcb3aa407605d30942d085050/requests/sessions.py#L674-L677) that explicitly mentions context management in its docstring. But I then saw this in the [API changes section](http://www.python-requests.org/en/latest/api/?highlight=backwards%20compatibility#api-changes):
> The `Session` API has changed. Sessions objects no longer take parameters. `Session` is also now capitalized, but it can still be instantiated with a lowercase `session` for backwards compatibility.
I therefore used `Session()` in the examples, and I'm assuming that factory function is only there for backwards compatibility. Should I update its docstring with something like `Alias for backwards compatibility`?
Let me know how you'd like it documented, and I'll update my PR accordingly.
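For reference, the pattern this PR documents looks like the following sketch (the header name is a placeholder; no requests are actually sent here):

```python
import requests

# Using the session as a context manager guarantees the underlying
# connection pool is released when the block exits, even on error.
with requests.Session() as s:
    s.headers.update({"X-Example": "demo"})  # session-level settings persist
    # calls like s.get(...) inside this block reuse pooled TCP connections
```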
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2712/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2712/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2712.diff",
"html_url": "https://github.com/psf/requests/pull/2712",
"merged_at": "2015-08-07T20:41:21Z",
"patch_url": "https://github.com/psf/requests/pull/2712.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2712"
}
| true |
[
"This is great, thanks!\n\n:sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2711
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2711/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2711/comments
|
https://api.github.com/repos/psf/requests/issues/2711/events
|
https://github.com/psf/requests/issues/2711
| 99,545,848 |
MDU6SXNzdWU5OTU0NTg0OA==
| 2,711 |
FR: Ability to enable raise_for_status for all requests on a Session object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1811813?v=4",
"events_url": "https://api.github.com/users/fgimian/events{/privacy}",
"followers_url": "https://api.github.com/users/fgimian/followers",
"following_url": "https://api.github.com/users/fgimian/following{/other_user}",
"gists_url": "https://api.github.com/users/fgimian/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fgimian",
"id": 1811813,
"login": "fgimian",
"node_id": "MDQ6VXNlcjE4MTE4MTM=",
"organizations_url": "https://api.github.com/users/fgimian/orgs",
"received_events_url": "https://api.github.com/users/fgimian/received_events",
"repos_url": "https://api.github.com/users/fgimian/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fgimian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fgimian/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fgimian",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-06T23:37:50Z
|
2021-09-08T22:00:54Z
|
2015-08-31T06:52:47Z
|
NONE
|
resolved
|
Hey there,
Firstly, I absolutely adore requests; it's the reference HTTP library for Python, and I'd like to thank you greatly for creating it.
Today I found the little utility function raise_for_status, which I think is very useful, but it's a little tedious having to call it for each request if that's the way you prefer to work.
Would you please consider a way to allow for this to be run automatically each time a request is made? I was thinking you could have a kwarg to Session which enables it for all requests, and possibly a kwarg for requests.get and so forth?
Thanks again! :smile:
Fotis
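For readers with the same need: one way to approximate this behaviour without library changes is a session-level response hook. A hedged sketch (`raise_on_error` is a hypothetical helper, not part of requests):

```python
import requests

def raise_on_error(response, *args, **kwargs):
    # Called for every response the session produces; raises
    # requests.HTTPError for 4xx/5xx status codes.
    response.raise_for_status()

session = requests.Session()
session.hooks["response"].append(raise_on_error)
# Every session.get()/session.post()/... now raises on error statuses.
```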
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2711/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2711/timeline
| null |
completed
| null | null | false |
[
"Sorry @fgimian, it looks like this got lost!\n\nGenerally I don't believe automating a single function call is a particularly good idea. The cost of doing that is that it makes it hard to tell from reading code exactly what exceptions a given call into Requests might raise, because they depend on the `Session` object elsewhere in the code.\n\nGiven that the only advantage is to save you a single function call, I think this is probably not worthwhile. Sorry!\n",
"Thanks for your reply @Lukasa, I understand where you're coming from. Really appreciate you considering my idea :smile: \n"
] |
https://api.github.com/repos/psf/requests/issues/2710
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2710/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2710/comments
|
https://api.github.com/repos/psf/requests/issues/2710/events
|
https://github.com/psf/requests/issues/2710
| 99,530,104 |
MDU6SXNzdWU5OTUzMDEwNA==
| 2,710 |
guess_filename(obj) doesn't recognize werkzeug.datastructures.FileStorage
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10097486?v=4",
"events_url": "https://api.github.com/users/shahzzzam/events{/privacy}",
"followers_url": "https://api.github.com/users/shahzzzam/followers",
"following_url": "https://api.github.com/users/shahzzzam/following{/other_user}",
"gists_url": "https://api.github.com/users/shahzzzam/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shahzzzam",
"id": 10097486,
"login": "shahzzzam",
"node_id": "MDQ6VXNlcjEwMDk3NDg2",
"organizations_url": "https://api.github.com/users/shahzzzam/orgs",
"received_events_url": "https://api.github.com/users/shahzzzam/received_events",
"repos_url": "https://api.github.com/users/shahzzzam/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shahzzzam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shahzzzam/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shahzzzam",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-08-06T21:37:14Z
|
2021-09-08T23:00:42Z
|
2015-08-10T22:28:22Z
|
NONE
|
resolved
|
This is my _first_ issue ever on any repo, so kindly forgive me if I'm asking a naive question or haven't provided enough information.
I am currently working on a client-side API in **Flask-Restful** and also have to consume the server-side API using the **Requests** library.
- So I was adding an image, along with some other form components like text, and sending it to the server-side API using Requests' POST call.
- The data structure Flask-Restful uses for images/files is werkzeug.datastructures.FileStorage, which I think is similar to a MultiDict.
``` Python
def post():
    payload_data = {}   # some data
    payload_files = {}  # abc.png
    url = "http://xyz/api/"
    response_obj = requests.post(url, data=payload_data, files=payload_files)
```
- Now, inside **models.py**, the line at https://github.com/kennethreitz/requests/blob/master/requests/models.py#L117 gives me the value of **files** as
``` python
[('form_image', <FileStorage: 'abc.png' ('image/png')>)]
# form_image is just the name of the image in form.
```
The key-value structure of a normal FileStorage object is something like this (from debugger)
```
* {FileStorage} <FileStorage: 'abc.png' ('image/png')>
* _parsed_content_type = {tuple} ('image/png', {})
* content_length = {int} 0
* content_type = {str} 'image/png'
* filename = {str} 'abc.png'
* headers = {Headers} Content-Disposition: form-data; name="form_image";
* filename="abc.png"\r\nContent-Type: image/png\r\n\r\n
* mimetype = {str} 'image/png'
* mimetype_params = {dict} {}
* name = {str} 'form_image'
* stream = {BufferedRandom} <_io.BufferedRandom name=9>
```
Note that the filename is stored in the `filename` attribute, not `name`.
Then it calls `guess_filename` at
``` python
fn = guess_filename(v) or k
```
In `guess_filename(obj)`, the value of `obj` is still
``` python
<FileStorage: 'abc.png' ('image/png')>
```
**This line** reads the object's `name` attribute instead of **`filename`**:
``` python
name = getattr(obj, 'name', None)
```
- After this, the value of `name` is now **"form_image"** _instead of_ **"abc.png"**
- As a result, the server side receives the file with the name `form_image`, instead of the real file name.
- Is there a solution to this? Please let me know.
- If not, it seems that `FileStorage` works differently than `guess_filename` anticipated. So does there have to be some line that does this:
``` python
from werkzeug.datastructures import FileStorage

if isinstance(obj, FileStorage):
name = getattr(obj, 'filename', None)
else:
name = getattr(obj, 'name', None)
```
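Rather than special-casing werkzeug inside requests, the filename can be supplied explicitly through the extended `files` tuple form. A sketch (`build_files` is a hypothetical helper; `storage` stands in for a werkzeug `FileStorage`):

```python
def build_files(storage, field_name="form_image"):
    # (filename, file-like object, content type): supplying the filename
    # explicitly overrides whatever guess_filename() would pick from
    # the object's .name attribute.
    return {field_name: (storage.filename, storage.stream, storage.content_type)}
```

The resulting dict can then be passed as `files=` to `requests.post`.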
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10097486?v=4",
"events_url": "https://api.github.com/users/shahzzzam/events{/privacy}",
"followers_url": "https://api.github.com/users/shahzzzam/followers",
"following_url": "https://api.github.com/users/shahzzzam/following{/other_user}",
"gists_url": "https://api.github.com/users/shahzzzam/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shahzzzam",
"id": 10097486,
"login": "shahzzzam",
"node_id": "MDQ6VXNlcjEwMDk3NDg2",
"organizations_url": "https://api.github.com/users/shahzzzam/orgs",
"received_events_url": "https://api.github.com/users/shahzzzam/received_events",
"repos_url": "https://api.github.com/users/shahzzzam/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shahzzzam/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shahzzzam/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shahzzzam",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2710/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2710/timeline
| null |
completed
| null | null | false |
[
"> This is my first issue ever on any repo. So kindly forgive me if I'm asking a naive question or don't have enough information provided.\n\nWelcome! Thanks for getting started with us! :tada: :cake: :balloon: \n\nSo to your core question, the `guess_filename` function is a bit tricky. It basically does its best to handle things that might reasonably be passed in that come from the standard library. Sadly, writing special-case code to handle all non-stdlib objects is probably outside the scope of this library.\n\nThe \"correct\" way to do this is to use the extended form of the `files` parameter. Rather than pass just the `FileStorage` object as the value of the dict, try passing a tuple of the form `(payload_file.filename, payload_file)`. The first element will be used as the filename, overriding the choice of `guess_filename`. That's what you should use whenever our choice is bad.\n",
"Oh yeah. I should have read [this](http://www.python-requests.org/en/latest/api/#main-interface) properly. It solved the **filename** problem. :+1: Thank you so much.\n\nAlthough, it opened up a problem incomprehensible to me,\n\nWhile feeding the file from Client-side API to Server-side API through `Requests`, the `content-type` gets set to `None`.\n\nEdit: **Not only that, the file itself is None (= empty). Why is that?**\n- This is when I POST to the `Requests` API, inside the `sessions.py`, the value of `files` is this:\n\n<img width=\"892\" alt=\"before_req\" src=\"https://cloud.githubusercontent.com/assets/10097486/9141625/1f96a104-3d08-11e5-88b9-5e940fb484e7.png\">\n- But, when it comes out of it (I tried really hard to debug, but didn't understand what's happening behind the scenes), this is the format of `files` in the entry point of Server-side API.\n\n<img width=\"695\" alt=\"after_req\" src=\"https://cloud.githubusercontent.com/assets/10097486/9141634/30df18ce-3d08-11e5-98ad-9e932518bdc4.png\">\n- Do you think I am doing something wrong here? Can you suggest something?\n- Also, why can't I do this:\n\n``` Python\ndef post():\n payload_json = {} # data in json format\n payload_files = {} # abc.png\n url = http://xyz/api/\n response_obj = requests.post(url, json=payload_data, files=payload_file)\n```\n\n(If I do this, the `files` data gets sent and the `json` and `data` header are `None`.)\n\nInstead of this:\n\n``` Python\ndef post():\n payload_data = {} # some data\n payload_files = {} # abc.png\n url = http://xyz/api/\n response_obj = requests.post(url, data=payload_data, files=payload_file)\n```\n",
"Probably the server side takes the content type from a specific header. Try using the even more extended form of the `files` parameter, by passing a tuple of the form `(payload_file.filename, payload_file, payload_file.content_type)`.\n\nAs to why you can't use JSON with files, the main reason is that it's not clear how the JSON will fit into the multipart upload. It's generally better to let users do this themselves, as we will just get it wrong.\n"
] |
https://api.github.com/repos/psf/requests/issues/2709
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2709/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2709/comments
|
https://api.github.com/repos/psf/requests/issues/2709/events
|
https://github.com/psf/requests/issues/2709
| 99,188,937 |
MDU6SXNzdWU5OTE4ODkzNw==
| 2,709 |
Problem sending POST data to a login form with the character "ñ" (With Curl Works)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3330998?v=4",
"events_url": "https://api.github.com/users/nguaman/events{/privacy}",
"followers_url": "https://api.github.com/users/nguaman/followers",
"following_url": "https://api.github.com/users/nguaman/following{/other_user}",
"gists_url": "https://api.github.com/users/nguaman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nguaman",
"id": 3330998,
"login": "nguaman",
"node_id": "MDQ6VXNlcjMzMzA5OTg=",
"organizations_url": "https://api.github.com/users/nguaman/orgs",
"received_events_url": "https://api.github.com/users/nguaman/received_events",
"repos_url": "https://api.github.com/users/nguaman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nguaman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nguaman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nguaman",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-08-05T12:03:20Z
|
2021-09-08T23:00:43Z
|
2015-08-05T12:52:58Z
|
NONE
|
resolved
|
With curl it works perfectly and I can log in.
curl 'https://server.com/server.jsp?login' --data 'username=usernamex&PASSWORD=contrase%F1a'
contrase%F1a15 means "contraseña15"
but when I try to send the same content with requests, I receive "the password is incorrect".
payload = 'username=usernamex&PASSWORD=contrase%F1a'
login = "https://server.com/server.jsp?login"
r = requests.post(url=login, data=payload)
print r.text
How can I fix that? I tried several options, like payload = {'password': 'contraseña15', ...} and using json.dumps, but nothing works; I always receive the same result.
Please Help!
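The discrepancy here comes down to character encoding: `%F1` is "ñ" in Latin-1, while requests form-encodes text as UTF-8 (`%C3%B1`), as shown in the resolution below. The two encodings can be reproduced with the standard library alone:

```python
from urllib.parse import urlencode

# requests' default form encoding is UTF-8, so "ñ" becomes %C3%B1.
utf8 = urlencode({"PASSWORD": "contraseña15"})
# The curl command in this report sends the Latin-1 byte %F1 instead.
latin1 = urlencode({"PASSWORD": "contraseña15"}, encoding="latin-1")
print(utf8)    # PASSWORD=contrase%C3%B1a15
print(latin1)  # PASSWORD=contrase%F1a15
```

If the server only accepts the Latin-1 form, passing a pre-encoded string like `latin1` as `data=` sidesteps requests' UTF-8 default.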
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2709/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2709/timeline
| null |
completed
| null | null | false |
[
"I also tried this\n\nimport urllib2, urllib\nurl = 'https://server.com/server.jsp?login'\ndata = urllib.urlencode({'USUARIO':'username','PASSWORD':'contrase%F1a15'})\nreq = urllib2.Request(url, data)\nresponse = urllib2.urlopen(req)\nd = response.read()\nprint d\n\nAnd nothing. It only works when I send the request from curl.\n",
"Something extra:\n\nWith this curl I receive the same result as with requests:\ncurl 'https://server.com/server.jsp?login' --data 'USUARIO=username&PASSWORD=contraseña15'\n\nAnd this works perfectly:\ncurl 'https://server.com/server.jsp?login' --data 'USUARIO=username&PASSWORD=contrase%F1a15' \n",
"All questions and people looking for help using requests should ask on StackOverflow. This is a place for known defects not support requests.\n",
"@sigmavirus24 I understand that,\nbut why does it work perfectly with pycurl?\n\nimport pycurl\nc = pycurl.Curl()\nc.setopt(c.URL, 'https://server.com/server.jsp?login')\npostfields = 'USUARIO=username&PASSWORD=contrase%F1a15'\nc.setopt(c.VERBOSE, True)\nc.setopt(c.POSTFIELDS, postfields)\nc.perform()\nc.close()\n\nI think maybe there is some bug with the encoding,\n\nbecause if I try the same with requests, it doesn't work.\n",
"In detail:\n\nimport requests\nimport pycurl\n\nlogin_url = \"https://server.com/server.jsp?login\"\npostfields = 'username=usernamex&PASSWORD=contrase%F1a'\n\nr = requests.post(login_url, postfields)\n# This returns an HTML page indicating a login error\nprint r.text\n\n# And this works perfectly.\nc = pycurl.Curl()\nc.setopt(c.URL, login_url)\nc.setopt(c.VERBOSE, True)\nc.setopt(c.POSTFIELDS, postfields)\nc.perform()\nc.close()\n",
"This was solved [here](https://stackoverflow.com/questions/31832015/issue-sending-post-data-to-a-login-form-with-the-character-%c3%b1-with-curl-works) as it should have been.\n\n``` py\n>>> req = requests.Request(method='POST', url='http://example.com', data={'username': 'usernamex', 'PASSWORD': 'contraseña15'})\n>>> req\n<Request [POST]>\n>>> req.data\n{'username': 'usernamex', 'PASSWORD': 'contrase\\xc3\\xb1a15'}\n>>> req.prepare()\n<PreparedRequest [POST]>\n>>> p = _\n>>> p.body\n'username=usernamex&PASSWORD=contrase%C3%B1a15'\n>>> req2 = requests.Request(method='POST', url='http://example.com', data='username=usernamex&PASSWORD=contrase%F1a15')\n>>> req2.prepare().body\n'username=usernamex&PASSWORD=contrase%F1a15'\n>>> req3 = requests.Request(method='POST', url='http://example.com', data={'username': 'usernamex', 'PASSWORD': u'contraseña15'})\n>>> req3.data\n{'username': 'usernamex', 'PASSWORD': u'contrase\\xf1a15'}\n>>> req3.prepare().body\n'username=usernamex&PASSWORD=contrase%C3%B1a15'\n```\n\nSo the encoding there seems unnecessary on Python 2. The following is my output from a Python 3 interpreter:\n\n``` py\n>>> req = requests.Request(method='POST', url='http://example.com', data={'username': 'usernamex', 'PASSW\nORD': 'contraseña15'})\n>>> req.prepare().body\n'PASSWORD=contrase%C3%B1a15&username=usernamex'\n>>> req2 = requests.Request(method='POST', url='http://example.com', data='username=usernamex&PASSWORD=co\nntrase%F1a15')\n>>> req2.prepare().body\n'username=usernamex&PASSWORD=contrase%F1a15'\n>>> req3 = requests.Request(method='POST', url='http://example.com', data={'username': 'usernamex', 'PASSWORD': u'contraseña15'})\n>>> req3.prepare().body\n'PASSWORD=contrase%C3%B1a15&username=usernamex'\n```\n\nAgain, I don't see a bug here.\n",
"Thanks for the reply, I hope this could help to another people with the same Issue.\nThanks @sigmavirus24 \n"
] |
https://api.github.com/repos/psf/requests/issues/2708
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2708/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2708/comments
|
https://api.github.com/repos/psf/requests/issues/2708/events
|
https://github.com/psf/requests/issues/2708
| 99,162,722 |
MDU6SXNzdWU5OTE2MjcyMg==
| 2,708 |
user-agent not sent in CONNECT makes the request fail (proxy is used)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1769773?v=4",
"events_url": "https://api.github.com/users/sim0nx/events{/privacy}",
"followers_url": "https://api.github.com/users/sim0nx/followers",
"following_url": "https://api.github.com/users/sim0nx/following{/other_user}",
"gists_url": "https://api.github.com/users/sim0nx/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sim0nx",
"id": 1769773,
"login": "sim0nx",
"node_id": "MDQ6VXNlcjE3Njk3NzM=",
"organizations_url": "https://api.github.com/users/sim0nx/orgs",
"received_events_url": "https://api.github.com/users/sim0nx/received_events",
"repos_url": "https://api.github.com/users/sim0nx/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sim0nx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sim0nx/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sim0nx",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 15 |
2015-08-05T09:29:13Z
|
2024-05-20T14:36:16Z
|
2024-05-20T14:36:16Z
|
NONE
| null |
Hi,
I am trying a simple get request to a https-url which goes through a proxy (http proxy).
Now the issue is that this proxy blocks connection attempts which have no user-agent set, and for whatever reason requests does not set one in the CONNECT request and I cannot seem to force it to do so.
Going through the same proxy using e.g. curl or wget works as expected as they do set one (same for browsers).
I couldn't find this having been reported before but couldn't find a solution to it either.
python: 2.7.9
requests: 2.4.3
python-ndg-httpsclient: 0.3.2
python-openssl: 0.14
python-pyasn1: 0.1.7
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 3,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 3,
"url": "https://api.github.com/repos/psf/requests/issues/2708/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2708/timeline
| null |
completed
| null | null | false |
[
"Hey, that's weird.\n\nI suppose that makes a degree of sense though. Can you try this?\n\n``` python\nimport requests\nfrom requests.adapters import HTTPAdapter\n\nclass ProxyUAAdapter(HTTPAdapter):\n def proxy_headers(self, proxy):\n headers = super(ProxyUAAdapter, self).proxy_headers(proxy)\n headers['User-Agent'] = requests.utils.default_user_agent()\n return headers\n\ns = requests.Session()\ns.mount('http://', ProxyUAAdapter())\ns.mount('https://', ProxyUAAdapter())\n\n# Make requests through the session, e.g. s.get(url)\n```\n\nCan you confirm that this does in fact work?\n",
"Hi,\n\nYes that does indeed work, thank you very much!\nI am wondering if this is expected behaviour and if I am supposed to implement it like that or if this is a bug ?\n\nMy guess is that it should use the same user-agent it does for the rest of the connection also for the CONNECT part. What do you think?\n",
"> I am wondering if this is expected behaviour and if I am supposed to implement it like that or if this is a bug ?\n\nHeh, that's difficult. I'd say that the best way to describe this is that it's an oversight. The extension hooks are in place to do this, but I agree that the logic seems to be wrong here. One thing I do want to check is what happens with a plaintext HTTP request. Are you skilled enough with tcpdump or wireshark to snoop your HTTP requests?\n",
"Yes I am, just tell me what you want me to check and I can do that.\n\nYou are basically interested in a GET via proxy ?\n",
"Yeah, I want a GET to a `http://` website via the proxy, and then the tcpdump/wireshark of that. What I'm specifically worried about is having two different user-agents in the request.\n",
"I sent you the pcap file via e-mail.\n",
"Alright, based on a really quick look at the pcap file this seems to behave mostly right: that is, we don't appear to send two user-agent headers.\n\nHowever, it doesn't work well with our user-agent override. If you override the user-agent header from the CLI, the proxy gets the requests default user-agent. That sucks a little bit. It would be nice if we could adjust the code to use the user-agent provided by the user. Unfortunately, the Transport Adapter is potentially a bit low-level for that.\n\nI suppose the `proxy_headers` function could be passed the headers on the request and could search for a user-agent header. I don't like that much though.\n",
":+1: \n",
"Thanks for the review.\nI am wondering why doing that in the proxy_headers function would be bad idea?\n\nDo you think of any better solution ?\n",
"Two reasons.\n\nFirstly, it changes the signature of the `proxy_headers` function, which represents an API change. That hurts people subclassing the adapter if they've overridden that method, which means we'd need to defer the change to 3.0.0.\n\nSecondly, the `proxy_headers` function really shouldn't need to know that information.\n\nI wonder if we can just pass the scheme instead and use that. That doesn't avoid problem 1, but it does restrict the scope of problem 2.\n",
"So I've been looking at this for 2.9.0, and I just can't think of an API change that doesn't wreck the API of the transport adapter. So I think we need to move this to 3.0.0.\n",
"As an interim solution, I created PR https://github.com/requests/requests/pull/4794 to allow headers to be passed in without breaking the API.",
"I have the same issue using pip behind a proxy. It blocks the CONNECT because there is no USER_AGENT.\r\n\r\n```\r\nCONNECT download.pytorch.org:443 HTTP/1.0\r\nProxy-Authorization: Basic \r\n\r\nHTTP/1.1 403 Forbidden...\r\n```\r\n\r\n",
"I submitted a PR #4794 that fixed this a year ago but the maintainers would rather wait for 3.0 to close this issue, which has been open for over 4 years.\r\n\r\nIt may just be me, but I'd rather have a \"hacky\" fix to a real problem instead of waiting half a decade to address the issue. Perfect is the enemy of good.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/2707
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2707/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2707/comments
|
https://api.github.com/repos/psf/requests/issues/2707/events
|
https://github.com/psf/requests/issues/2707
| 99,147,163 |
MDU6SXNzdWU5OTE0NzE2Mw==
| 2,707 |
parse_header_links fails for momento link headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4",
"events_url": "https://api.github.com/users/jayvdb/events{/privacy}",
"followers_url": "https://api.github.com/users/jayvdb/followers",
"following_url": "https://api.github.com/users/jayvdb/following{/other_user}",
"gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jayvdb",
"id": 15092,
"login": "jayvdb",
"node_id": "MDQ6VXNlcjE1MDky",
"organizations_url": "https://api.github.com/users/jayvdb/orgs",
"received_events_url": "https://api.github.com/users/jayvdb/received_events",
"repos_url": "https://api.github.com/users/jayvdb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jayvdb",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 4 |
2015-08-05T08:05:52Z
|
2021-09-08T15:00:45Z
|
2016-09-23T06:43:09Z
|
CONTRIBUTOR
|
resolved
|
The link header for http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister is not being parsed correctly by `parse_header_links`, and that function doesnt look completely RFC 5988 compliant.
```
>>> r = requests.get('http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister')
>>> r.headers['link']
'<http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister>; rel="original latest-version",<http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeGate/Tyrion_Lannister>; rel="timegate",<http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeMap/Tyrion_Lannister>; rel="timemap"; type="application/link-format"; from="Mon, 23 Apr 2007 20:26:15 GMT"; until="Fri, 06 Sep 2013 17:19:06 GMT",<http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=1714>; rel="first memento"; datetime="Mon, 23 Apr 2007 20:26:15 GMT",<http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=107222>; rel="last memento"; datetime="Fri, 06 Sep 2013 17:19:06 GMT"'
>>> pp = pprint.PrettyPrinter(indent=4)
>>> pp.pprint(r.links)
{ '06 Sep 2013 17:19:06 GMT': { 'url': '06 Sep 2013 17:19:06 GMT'},
'23 Apr 2007 20:26:15 GMT': { 'url': '23 Apr 2007 20:26:15 GMT'},
'first memento': { 'datetime': 'Mon',
'rel': 'first memento',
'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=1714'},
'last memento': { 'datetime': 'Fri',
'rel': 'last memento',
'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=107222'},
'original latest-version': { 'rel': 'original latest-version',
'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister'},
'timegate': { 'rel': 'timegate',
'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeGate/Tyrion_Lannister'},
'timemap': { 'from': 'Mon',
'rel': 'timemap',
'type': 'application/link-format',
'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeMap/Tyrion_Lannister'}}
```
Oddly this was the subject of #2250, as it appears the comma's are still causing the problem, so it may be that some corner cases still need to be handled better, and some unit tests written to confirm the problem is actually solved. Happy to help with either.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4",
"events_url": "https://api.github.com/users/jayvdb/events{/privacy}",
"followers_url": "https://api.github.com/users/jayvdb/followers",
"following_url": "https://api.github.com/users/jayvdb/following{/other_user}",
"gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jayvdb",
"id": 15092,
"login": "jayvdb",
"node_id": "MDQ6VXNlcjE1MDky",
"organizations_url": "https://api.github.com/users/jayvdb/orgs",
"received_events_url": "https://api.github.com/users/jayvdb/received_events",
"repos_url": "https://api.github.com/users/jayvdb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jayvdb",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2707/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2707/timeline
| null |
completed
| null | null | false |
[
"I'm not really surprised that the links parser isn't perfect. Contributions are welcome to improve it: otherwise, it'll go on our backlog until one of us has time to take a swing at doing it properly.\n",
"And to be clear we explicitly aren't 5988 complaint because Kenneth decided he did not want to handle the case where one rel could have multiple links. There's an effort to add that to the requests-toolbelt though.\n",
"This has already been fixed by https://github.com/kennethreitz/requests/pull/2271.\n",
"Indeed. I cant see any problems with the output using 2.11.1, with that website.\n\n``` python\n>>> import requests, pprint\n>>> pp = pprint.PrettyPrinter(indent=4)\n>>> requests.__version__\n'2.11.1'\n>>> r = requests.get('http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister')\n>>> pp.pprint(r.headers['link'])\n('<http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister>; rel=\"original '\n 'latest-version\",<http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeGate/Tyrion_Lannister>; '\n 'rel=\"timegate\",<http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeMap/Tyrion_Lannister>; '\n 'rel=\"timemap\"; type=\"application/link-format\"; from=\"Mon, 23 Apr 2007 '\n '20:26:15 GMT\"; until=\"Fri, 06 Sep 2013 17:19:06 '\n 'GMT\",<http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=1714>; '\n 'rel=\"first memento\"; datetime=\"Mon, 23 Apr 2007 20:26:15 '\n 'GMT\",<http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=107222>; '\n 'rel=\"last memento\"; datetime=\"Fri, 06 Sep 2013 17:19:06 GMT\"')\n>>> pp.pprint(r.links)\n{ 'first memento': { 'datetime': 'Mon, 23 Apr 2007 20:26:15 GMT',\n 'rel': 'first memento',\n 'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=1714'},\n 'last memento': { 'datetime': 'Fri, 06 Sep 2013 17:19:06 GMT',\n 'rel': 'last memento',\n 'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php?title=Tyrion_Lannister&oldid=107222'},\n 'original latest-version': { 'rel': 'original latest-version',\n 'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Tyrion_Lannister'},\n 'timegate': { 'rel': 'timegate',\n 'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeGate/Tyrion_Lannister'},\n 'timemap': { 'from': 'Mon, 23 Apr 2007 20:26:15 GMT',\n 'rel': 'timemap',\n 'type': 'application/link-format',\n 'until': 'Fri, 06 Sep 2013 17:19:06 GMT',\n 'url': 'http://ws-dl-05.cs.odu.edu/demo/index.php/Special:TimeMap/Tyrion_Lannister'}}\n```\n\n@shawnmjones, maybe there are other 
websites to test with for compliance?\n"
] |
https://api.github.com/repos/psf/requests/issues/2706
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2706/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2706/comments
|
https://api.github.com/repos/psf/requests/issues/2706/events
|
https://github.com/psf/requests/pull/2706
| 99,069,689 |
MDExOlB1bGxSZXF1ZXN0NDE2MjE5NTQ=
| 2,706 |
Fix merge setting for not preserving original order of dict parameters
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10278601?v=4",
"events_url": "https://api.github.com/users/ak1r4/events{/privacy}",
"followers_url": "https://api.github.com/users/ak1r4/followers",
"following_url": "https://api.github.com/users/ak1r4/following{/other_user}",
"gists_url": "https://api.github.com/users/ak1r4/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ak1r4",
"id": 10278601,
"login": "ak1r4",
"node_id": "MDQ6VXNlcjEwMjc4NjAx",
"organizations_url": "https://api.github.com/users/ak1r4/orgs",
"received_events_url": "https://api.github.com/users/ak1r4/received_events",
"repos_url": "https://api.github.com/users/ak1r4/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ak1r4/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ak1r4/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ak1r4",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 13 |
2015-08-04T21:50:38Z
|
2021-09-08T07:00:44Z
|
2015-08-15T15:39:46Z
|
CONTRIBUTOR
|
resolved
|
Fix a bug introduced by https://github.com/kennethreitz/requests/pull/1921
Bug: the ordered dictionary (the default `dict_class`) is accidentally converted back to normal dictionary.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2706/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2706/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2706.diff",
"html_url": "https://github.com/psf/requests/pull/2706",
"merged_at": "2015-08-15T15:39:46Z",
"patch_url": "https://github.com/psf/requests/pull/2706.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2706"
}
| true |
[
"Can you explain how this is a bug? In what context do you expect order to preserved but it isn't?\n",
"I think the bug here is that we don't use the chosen dict subclass, which means that an ordered dict might be lost. I think I'm tentatively +1 on this change.\n",
"Based on the _test_ I'm guessing this affects `params`, `files`, `data`, and `headers`. In that case I'm +1 too, but I'd still like confirmation rather than accepting a fix without a bug description.\n",
"@sigmavirus24 @Lukasa This bug was found when we generate signature based on the order of parameters (signature would be a hash of the url, for example), while on the server side, the signature is validated using the same process based on the query string passed to the server. In this case, the order of the parameters were changed by requests library underneath.\n\nOn the other hand, the approach #1921 took is not recommended either, because the order of keys even in normal dictionaries depends on when the insertion and deletion happens, so it is better not to reconstruct the dictionary, do a `del` instead. See the example below:\n\n``` python\n>>> d = {'b': 1, 'a': 1}\n>>> d.update({'d': 1, 'z': 1, 'f': 1})\n>>> d.items()\n[('a', 1), ('b', 1), ('z', 1), ('f', 1), ('d', 1)]\n>>> d1 = dict(d)\n>>> d1.items()\n[('a', 1), ('d', 1), ('b', 1), ('z', 1), ('f', 1)]\n>>> d1 == d\nTrue\n>>> d1.items() == d.items()\nFalse\n```\n",
"@sigmavirus24 @Lukasa Any thoughts on this PR?\n",
"I'm :+1:: @sigmavirus24?\n",
":+1: from me too\n",
"Alright, let's try to land this in 2.8.0. @sigmavirus24 I'll go ahead and merge it to the 2.8.0 branch and update the changelog appropriately.\n",
"Ok, merged into the 2.8.0 branch in 5d7392f140ed476cdc99f4178e425ce2fbb05884.\n",
":sparkles: :cake: :sparkles:\n",
"This introduced a bug on Python 3; you are deleting keys while iterating over the dictionary items so Python 3 will throw a `RuntimeError: dictionary changed size during iteration` exception. Easily reproduced by using `requests.get('http://httpbin.org/get', headers={'foo': None})`:\n\n``` python\n>>> requests.get('http://httpbin.org/get', headers={'foo': None})\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/api.py\", line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/sessions.py\", line 455, in request\n prep = self.prepare_request(req)\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/sessions.py\", line 382, in prepare_request\n headers=merge_setting(request.headers, self.headers, dict_class=CaseInsensitiveDict),\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/sessions.py\", line 66, in merge_setting\n for (k, v) in merged_setting.items():\n File \"/Users/mj/Development/Library/buildout.python/parts/opt/lib/python3.4/_collections_abc.py\", line 503, in __iter__\n for key in self._mapping:\n File \"/Users/mj/Development/venvs/stackoverflow-3.4/requests/requests/structures.py\", line 60, in <genexpr>\n return (casedkey for casedkey, mappedvalue in self._store.values())\nRuntimeError: dictionary changed size during iteration\n```\n",
"A work-around would be to collect the keys first:\n\n``` python\nnone_keys = [k for k, v in merged_setting.items() if v is None]\nfor key in none_keys:\n del merged_setting[key]\n```\n",
"Fix available in #2737; ironically the test suite already caught this but I guess it wasn't yet run on 3.x? I'm ignoring the `test_expires_valid_str` failure I'm seeing, just so you know. :-P\n"
] |
https://api.github.com/repos/psf/requests/issues/2705
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2705/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2705/comments
|
https://api.github.com/repos/psf/requests/issues/2705/events
|
https://github.com/psf/requests/issues/2705
| 98,965,184 |
MDU6SXNzdWU5ODk2NTE4NA==
| 2,705 |
Using requests.get with unverified certificates
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3463456?v=4",
"events_url": "https://api.github.com/users/jakubvojacek/events{/privacy}",
"followers_url": "https://api.github.com/users/jakubvojacek/followers",
"following_url": "https://api.github.com/users/jakubvojacek/following{/other_user}",
"gists_url": "https://api.github.com/users/jakubvojacek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jakubvojacek",
"id": 3463456,
"login": "jakubvojacek",
"node_id": "MDQ6VXNlcjM0NjM0NTY=",
"organizations_url": "https://api.github.com/users/jakubvojacek/orgs",
"received_events_url": "https://api.github.com/users/jakubvojacek/received_events",
"repos_url": "https://api.github.com/users/jakubvojacek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jakubvojacek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jakubvojacek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jakubvojacek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-08-04T13:11:47Z
|
2021-09-08T23:00:43Z
|
2015-08-04T20:43:25Z
|
NONE
|
resolved
|
Hello, I am trying to use requests module to get & post our intranet websevice that has unverified certificates. We had no issues until the new version of python that actually does verify certificates by default. What is the correct way (or is it even supported) of doing so?
What I am currently doing is that I get the certificate and public key this way:
`openssl s_client -connect 192.168.137.1:4443 | openssl x509 -pubkey > ca.crt`
The content of `ca.crt` then looks as follows:
```
-----BEGIN PUBLIC KEY-----
...
-----END PUBLIC KEY-----
-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----
```
And the request I send looks like:
```
requests.get('https://192.168.137.1:4443', verify=True, cert='ca.crt')
```
which gives me error:
```
Traceback (most recent call last):
File "req.py", line 2, in <module>
requests.get('https://192.168.137.1:4443', verify=False, cert='testicek')
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 431, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 336265225] _ssl.c:351: error:140B0009:SSL routines:SSL_CTX_use_PrivateKey_file:PEM lib
```
Can you please guide me what I am doing wrong?
Thank you for all your support!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3463456?v=4",
"events_url": "https://api.github.com/users/jakubvojacek/events{/privacy}",
"followers_url": "https://api.github.com/users/jakubvojacek/followers",
"following_url": "https://api.github.com/users/jakubvojacek/following{/other_user}",
"gists_url": "https://api.github.com/users/jakubvojacek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jakubvojacek",
"id": 3463456,
"login": "jakubvojacek",
"node_id": "MDQ6VXNlcjM0NjM0NTY=",
"organizations_url": "https://api.github.com/users/jakubvojacek/orgs",
"received_events_url": "https://api.github.com/users/jakubvojacek/received_events",
"repos_url": "https://api.github.com/users/jakubvojacek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jakubvojacek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jakubvojacek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jakubvojacek",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2705/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2705/timeline
| null |
completed
| null | null | false |
[
"Thanks for raising this issue!\n\nThe correct form of the request is:\n\n``` python\nrequests.get('https://192.168.137.1:4443', verify='ca.crt')\n```\n\nThe `cert` keyword argument is for client certificates.\n",
"Thank you, I just tested it, the code now looks as follows:\n\n```\nimport requests\nrequests.get('https://192.168.137.1:4443', verify='ca.crt')\n```\n\nHowever when I attempt to run it, I get error:\n\n```\nTraceback (most recent call last):\n File \"req.py\", line 2, in <module>\n requests.get('https://192.168.137.1:4443', verify='testicek')\n File \"/usr/local/lib/python2.7/dist-packages/requests/api.py\", line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"/usr/local/lib/python2.7/dist-packages/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/local/lib/python2.7/dist-packages/requests/adapters.py\", line 431, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n```\n\nIs it possible that the `ca.crt` that was generated with the previous mentioned command is incorrect? \n\nThank you!\n",
"Yeah, that command is wrong. You don't want the public keys, you want the certificates themselves. It's kinda tricky to write the command, but if you run `openssl s_client -showcerts -connect 192.168.137.1:4443`, and then grab each of the `BEGIN CERTIFICATE/END CERTIFICATE` blocks and put them together in a `pem` file. Give that a shot.\n",
"Thank you! That was exactly what I needed, you're a life saver!\n"
] |
https://api.github.com/repos/psf/requests/issues/2704
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2704/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2704/comments
|
https://api.github.com/repos/psf/requests/issues/2704/events
|
https://github.com/psf/requests/pull/2704
| 98,964,102 |
MDExOlB1bGxSZXF1ZXN0NDE1Njc0Nzk=
| 2,704 |
Mention SNI being backported to Python2.7.9 in FAQ
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1011548?v=4",
"events_url": "https://api.github.com/users/sYnfo/events{/privacy}",
"followers_url": "https://api.github.com/users/sYnfo/followers",
"following_url": "https://api.github.com/users/sYnfo/following{/other_user}",
"gists_url": "https://api.github.com/users/sYnfo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sYnfo",
"id": 1011548,
"login": "sYnfo",
"node_id": "MDQ6VXNlcjEwMTE1NDg=",
"organizations_url": "https://api.github.com/users/sYnfo/orgs",
"received_events_url": "https://api.github.com/users/sYnfo/received_events",
"repos_url": "https://api.github.com/users/sYnfo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sYnfo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sYnfo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sYnfo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-08-04T13:08:33Z
|
2021-09-08T07:00:47Z
|
2015-08-04T13:54:30Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2704/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2704/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2704.diff",
"html_url": "https://github.com/psf/requests/pull/2704",
"merged_at": "2015-08-04T13:54:30Z",
"patch_url": "https://github.com/psf/requests/pull/2704.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2704"
}
| true |
[
"Thanks! \\o/\n",
":cake: :sparkles: :cake:\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2703
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2703/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2703/comments
|
https://api.github.com/repos/psf/requests/issues/2703/events
|
https://github.com/psf/requests/issues/2703
| 98,864,415 |
MDU6SXNzdWU5ODg2NDQxNQ==
| 2,703 |
Broken parsing of authenticated URI for Proxy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1562904?v=4",
"events_url": "https://api.github.com/users/kharmalord/events{/privacy}",
"followers_url": "https://api.github.com/users/kharmalord/followers",
"following_url": "https://api.github.com/users/kharmalord/following{/other_user}",
"gists_url": "https://api.github.com/users/kharmalord/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kharmalord",
"id": 1562904,
"login": "kharmalord",
"node_id": "MDQ6VXNlcjE1NjI5MDQ=",
"organizations_url": "https://api.github.com/users/kharmalord/orgs",
"received_events_url": "https://api.github.com/users/kharmalord/received_events",
"repos_url": "https://api.github.com/users/kharmalord/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kharmalord/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kharmalord/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kharmalord",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-08-04T00:38:50Z
|
2021-09-08T21:00:49Z
|
2015-11-05T10:35:55Z
|
NONE
|
resolved
|
Opening issue (appears to be related to issue https://github.com/kennethreitz/requests/issues/1856) as this issue is still breaking for me.
When I run the following it breaks:
``` py
>>> import requests
>>> url = "http://sitereview.bluecoat.com/rest/categorization"
>>> data = {'url':'google.com'}
>>> proxy = { "http": "http://test.username:testp@ssword#@proxy.url.com:8080"}
>>> header = {'user-agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9; rv:33.0) Gecko/20100101 Firefox/33.0'}
>>> request = requests.post(url, proxies=proxy, headers=header, data=data)
```
The error it is giving is as follows:
```
requests.exceptions.ConnectionError: HTTPConnectionPool(host='ssword', port=80): Max retries exceeded with url: http://sitereview.bluecoat.com/rest/categorization (Caused by ProxyError('Cannot connect to proxy.', gaierror(-2, 'Name or service not known')))
```
If I change the password to `testp@ssword` instead of `testp@ssword#`, it works just fine.
It may be that I am rather new but I would appreciate any help or insight you can provide.
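
As discussed in the resolution below, `@` and `#` are not valid in the userinfo portion of a URL unless percent-encoded. A minimal stdlib sketch of encoding the credentials before building the proxy URL (Python 3 `urllib.parse.quote`; on Python 2 this would be `urllib.quote_plus`; the credentials and proxy host are the illustrative ones from the report, not real values):

```python
from urllib.parse import quote

# Percent-encode the credentials so reserved characters like '@' and '#'
# cannot be mistaken for URL syntax ('@' -> %40, '#' -> %23).
username = quote("test.username", safe="")
password = quote("testp@ssword#", safe="")

proxy_url = "http://{0}:{1}@proxy.url.com:8080".format(username, password)
proxies = {"http": proxy_url}

print(proxy_url)
```

With the credentials encoded this way, the URL parses unambiguously and the host is no longer misread as `ssword`.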
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2703/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2703/timeline
| null |
completed
| null | null | false |
[
"What version of requests are you using?\n",
"I am using version 2.5.1. I also tried it with the latest version in the pip repo's which I believe is version 2.7 Same issue on both versions.\n",
"So it's definitely the case that requests doesn't handle this well. In the short term, try encoding the fragment to `%23`. Longer term, we should work out how we can make this work correctly.\n",
"Alright so I was able to get this clean by adding the following to the password before it is put inside the proxies variable that gets passed to request:\n\nclean_pass = urllib.quote_plus(password)\n\nI found that quote_plus worked more effectively than urlencode as urlencode was throwing errors on the password with the error of \"TypeError: not a valid non-string sequence or mapping object\". \n\nJust know that quote_plus worked for me. It may give you folks a better possible solution (who knows).\n",
"So, I don't think there is a \"good way\" to handle this. Looking at the [User Info portion of RFC 3986](https://tools.ietf.org/html/rfc3986#section-3.2.1), you'll see that the user info is composed of:\n\n```\n unreserved = ALPHA / DIGIT / \"-\" / \".\" / \"_\" / \"~\"\n pct-encoded = \"%\" HEXDIG HEXDIG\n sub-delims = \"!\" / \"$\" / \"&\" / \"'\" / \"(\" / \")\"\n / \"*\" / \"+\" / \",\" / \";\" / \"=\"\n```\n\nAnd `:`. That means `@` and `#` are not valid characters unless they're percent-encoded, i.e., the user calls `urllib.quote_plus` on that information. The fact that we happen to parse the URI in such a way that is forgiving towards users with `@`s in their userinfo is nice but technically incorrect based on the specification. I don't think we have an obligation to further violate the spec to allow for `#` in user-info as well.\n",
"@sigmavirus24 \n\nWhile I don't disagree with you it may be a good idea to update the documentation to state that the quote_plus (or equivalent) is required if passwords are going to contain those characters. This may be the simplest way to resolve this issue. That way future users don't get confused.\n",
"Hmm, documenting our way our of this is really sad. I feel like we should be able to do something better than that, although I cannot for the life of me work out what that should be at the moment.\n",
"As far as a possible solution on your end the only thing I was able to come up was to break apart the string before it is parsed in the get_auth_from_url(url) function of the utils.py and encode the password function somewhere before the parsed = urlparse(url) was being called. It appears from my testing that that is where it is being broken.\n\nJust tossing out any options I can think of (even if they are hacky).\n"
] |
https://api.github.com/repos/psf/requests/issues/2702
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2702/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2702/comments
|
https://api.github.com/repos/psf/requests/issues/2702/events
|
https://github.com/psf/requests/issues/2702
| 98,713,924 |
MDU6SXNzdWU5ODcxMzkyNA==
| 2,702 |
Allow to modify query parameters in authentication classes
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/48936?v=4",
"events_url": "https://api.github.com/users/thedrow/events{/privacy}",
"followers_url": "https://api.github.com/users/thedrow/followers",
"following_url": "https://api.github.com/users/thedrow/following{/other_user}",
"gists_url": "https://api.github.com/users/thedrow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/thedrow",
"id": 48936,
"login": "thedrow",
"node_id": "MDQ6VXNlcjQ4OTM2",
"organizations_url": "https://api.github.com/users/thedrow/orgs",
"received_events_url": "https://api.github.com/users/thedrow/received_events",
"repos_url": "https://api.github.com/users/thedrow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/thedrow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thedrow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/thedrow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-08-03T10:26:17Z
|
2021-09-08T23:00:44Z
|
2015-08-03T13:19:39Z
|
NONE
|
resolved
|
Some APIs like https://www.virustotal.com/en/documentation/private-api require you to pass the API key in a query parameter which is not possible at the moment because the request is prepared before the authentication class is applied.
Using the following authentication class:
``` python
class VirusTotalAuth(auth.AuthBase):
def __init__(self, api_key):
self.api_key = api_key
def __call__(self, r):
r.params['apikey'] = self.api_key
return r
```
Produces this error:
```
AttributeError: 'PreparedRequest' object has no attribute 'params'
```
Which makes sense because you can't change a request's parameters after it has been prepared.
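
As the resolution below notes, the prepared request's `url` attribute can still be rewritten: parse the URL, merge in the parameter, and unparse. A sketch of that parse/modify/unparse step using only the stdlib (the `AuthBase` wiring around it would be as in the issue; the URL and key value here are purely illustrative):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

def add_query_param(url, key, value):
    """Return `url` with `key=value` merged into its query string.

    This is the same dance an AuthBase subclass would perform on
    PreparedRequest.url inside __call__, then assign back to r.url.
    """
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query[key] = value
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_query_param("https://example.com/report?resource=abc", "apikey", "SECRET"))
```
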
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2702/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2702/timeline
| null |
completed
| null | null | false |
[
"> Which makes sense because you can't change a request's parameters after it has been prepared.\n\nYou can change a request's parameters. The current way to do so (since you're actually receiving a `PreparedRequest` instance) is to:\n1. Parse the url\n2. Change the parameters\n3. Unparse the url\n4. Set the `url` attribute\n\nThis is well documented in other similar issues.\n\nThat said, I would posit that an Authentication Handler for this purpose is not ideal. Once you've authenticated, it makes more sense to add the API key to a `Session`'s `params` dictionary so it is added to every request there-after automatically.\n"
] |
https://api.github.com/repos/psf/requests/issues/2701
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2701/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2701/comments
|
https://api.github.com/repos/psf/requests/issues/2701/events
|
https://github.com/psf/requests/issues/2701
| 98,671,701 |
MDU6SXNzdWU5ODY3MTcwMQ==
| 2,701 |
Option to disable percent encoding disabled?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-08-03T04:59:27Z
|
2021-09-08T23:00:44Z
|
2015-08-03T13:12:28Z
|
NONE
|
resolved
|
Hi,
I know it's not good practice to disable percent encoding, but the server I'm connecting to doesn't support it. There seemed to be an option using requests.defaults in an earlier version, but it seems to be gone now. Is there any way to disable percent encoding? Thanks.
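
There is no global switch, but as the resolution below points out, the PreparedRequest flow lets you set a hand-built URL. The encoding-control step itself can be sketched with the stdlib alone: `urllib.parse.quote` takes a `safe` parameter deciding which characters survive unescaped (the path below is a made-up example):

```python
from urllib.parse import quote

raw = "/path/with|pipe and{braces}"

# Default: only '/' is exempt; other reserved characters get escaped.
print(quote(raw))                # /path/with%7Cpipe%20and%7Bbraces%7D
# Widening `safe` keeps characters the target server expects verbatim.
print(quote(raw, safe="/|{} "))  # /path/with|pipe and{braces}
```

A URL built this way can then be placed on a `PreparedRequest` directly, bypassing the library's own encoding.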
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2701/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2701/timeline
| null |
completed
| null | null | false |
[
"This has been discussed before in:\n- https://github.com/kennethreitz/requests/pull/757\n- https://github.com/kennethreitz/requests/issues/1454\n\nAnd I'm sure other issues (but those are just the first two that a search of the issues found). Our current behaviour suits easily more than 95% of our users and thus it will stay. If you need to bypass (or change) how we encode the URL, the [PreparedRequest flow](http://docs.python-requests.org/en/latest/user/advanced/#prepared-requests) is what you need. For questions about that, please use [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n"
] |
https://api.github.com/repos/psf/requests/issues/2700
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2700/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2700/comments
|
https://api.github.com/repos/psf/requests/issues/2700/events
|
https://github.com/psf/requests/issues/2700
| 98,494,157 |
MDU6SXNzdWU5ODQ5NDE1Nw==
| 2,700 |
inclusion of timeout parameter on PUT causes error 14, 'Bad address'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2319207?v=4",
"events_url": "https://api.github.com/users/EliRibble/events{/privacy}",
"followers_url": "https://api.github.com/users/EliRibble/followers",
"following_url": "https://api.github.com/users/EliRibble/following{/other_user}",
"gists_url": "https://api.github.com/users/EliRibble/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/EliRibble",
"id": 2319207,
"login": "EliRibble",
"node_id": "MDQ6VXNlcjIzMTkyMDc=",
"organizations_url": "https://api.github.com/users/EliRibble/orgs",
"received_events_url": "https://api.github.com/users/EliRibble/received_events",
"repos_url": "https://api.github.com/users/EliRibble/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/EliRibble/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EliRibble/subscriptions",
"type": "User",
"url": "https://api.github.com/users/EliRibble",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2015-08-01T00:51:46Z
|
2021-09-08T21:00:49Z
|
2015-11-05T10:36:12Z
|
NONE
|
resolved
|
Here's my minimal reproducing script:
``` python
#!/usr/bin/env python
import requests
url = 'https://prod-vision-analysis.s3.amazonaws.com/3ad6c771-52aa-4a1a-b2b6-6cdb17068321-40-real?Signature=khZjzyCjahKuK7z0QFrLhdIP6iE%3D&Expires=1438388766&AWSAccessKeyId=AKIAJEDT3K6WCNEGLKTA'
data = 190079 # works
data = 190080 # does not
image = b'1' * data
requests.put(url, data=image[:data], headers={'Content-Type': 'image/png'}, timeout=60, verify=False)
```
Executing that script, provided my signed S3 URL doesn't expire, will cause the following stack trace on my mac:
```
/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py:768: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)
Traceback (most recent call last):
File "./test-upload.py", line 9, in <module>
requests.put(url, data=image[:data], headers={'Content-Type': 'image/png'}, timeout=60, verify=False)
File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 122, in put
return request('put', url, data=data, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 415, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(14, 'Bad address'))
```
If I change the script to use 190079 rather than 190080 for the data length, the request goes through just fine.
If I change the script to not have a timeout, the request also goes through just fine.
If I change the script to set the timeout to any other value, it still fails; the timeout just has to be non-zero, non-None, and long enough that the request can reasonably complete.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2700/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2700/timeline
| null |
completed
| null | null | false |
[
"So I just ran your script as is and I get a 403, not a ConnectionError. So I guess authentication needs to succeed before this behaviour can be confirmed.\n",
"Yeah, sorry, as mentioned it's a signed URL with a timeout. I'll generate a custom URL for this script that times out in like a year and repost\n",
"Okay, I created a new URL that has a one-year timeout, which should work. Interestingly I had to bump the payload by one to reproduce. Here's the new minimally reproducing script:\n\n``` python\n#!/usr/bin/env python\nimport requests\n\nurl = 'https://dev-vision-analysis.s3.amazonaws.com:443/c61888ab-58ea-467f-ab8e-201592b075b9-10-real?Signature=Qwe31z3yJvttQPaqamFbyzlNwV0%3D&Expires=1470328603&AWSAccessKeyId=AKIAJFMV35HPPDAUG66A'\n\ndata = 190080 # works\ndata = 190081 # does not\n\nimage = b'1' * data\nrequests.put(url, data=image[:data], headers={'Content-Type': 'image/png'}, timeout=1.0)\n```\n",
"Odd, I'm not getting a bad address message.\n\n``` py\nTraceback (most recent call last):\n File \"test2700.py\", line 10, in <module>\n requests.put(url, data=image[:data], headers={'Content-Type': 'image/png'}, timeout=1.0)\n File \"/Users/ian/sandbox/requests/requests/api.py\", line 122, in put\n return request('put', url, data=data, **kwargs)\n File \"/Users/ian/sandbox/requests/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/ian/sandbox/requests/requests/sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/ian/sandbox/requests/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/ian/sandbox/requests/requests/adapters.py\", line 418, in send\n raise ConnectTimeout(e, request=request)\nrequests.exceptions.ConnectTimeout: HTTPSConnectionPool(host='dev-vision-analysis.s3.amazonaws.com', port=443): Max retries exceeded with url: /c61888ab-58ea-467f-ab8e-201592b075b9-10-real?Signature=Qwe31z3yJvttQPaqamFbyzlNwV0%3D&Expires=1470328603&AWSAccessKeyId=AKIAJFMV35HPPDAUG66A (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x1025b1210>, 'Connection to dev-vision-analysis.s3.amazonaws.com timed out. (connect timeout=1.0)'))\n```\n\nIt does cause a connection time out. That said, when I use the smaller data size (by 1 byte) then I get a Read timeout.\n",
"The read timeout I could understand if there's some kind of limit imposed by S3 (the service handling the PUT request). I'll look into that. Is there something more I can test on my end to find out why I'd be getting a 'Bad address' error?\n",
"> The read timeout I could understand if there's some kind of limit imposed by S3 (the service handling the PUT request).\n\nI'm confused by this. The timeout is from the timeout parameter.\n\n> Is there something more I can test on my end to find out why I'd be getting a 'Bad address' error?\n\nYou could try doing \n\n``` py\nimport socket\n\ns = socket.socket()\ns.settimeout(1.0)\ns.connect(('dev-vision-analysis.s3.amazonaws.com', 443))\n```\n\nThat _should_ emulate the `ReadTimeout`. That said, emulating the `ConnectTimeout` will be a bit more work and I'm still researching how we implement that.\n",
"> > The read timeout I could understand if there's some kind of limit imposed by S3 (the service handling >>the PUT request).\n> \n> I'm confused by this. The timeout is from the timeout parameter.\n\nYeah, don't worry about it, you're right. I was just trying to brainstorm ideas for why we might see the behavior I'm seeing and didn't think about what actually causes a `ReadTimeout`\n\n> You could try doing\n\nDid that. The first time I ran it I got a `ReadTimeout` as you expected. After that successive attempts did not raise an error at all.\n",
"> After that successive attempts did not raise an error at all.\n\nCan you show your successive attempts? Did you re-run the script or call `.connect(` over and over again?\n",
"I created a script with these contents:\n\n``` python\n(ve)12:42:52 {AE-697} ~/src/archer$ cat test-connect.py\n#!/usr/bin/env python\n\nimport socket\n\ns = socket.socket()\ns.settimeout(1.0)\ns.connect(('dev-vision-analysis.s3.amazonaws.com', 443))\nprint('made it through')\n```\n\nGot this console output:\n\n```\n12:43:51 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:00 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:01 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:03 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:04 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:05 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:06 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n12:44:07 {AE-697} ~/src/archer$ ./test-connect.py\nmade it through\n```\n\nI can't give you the output on the first run when it actually emitted the exception since I didn't save it and I can't reproduce it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2699
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2699/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2699/comments
|
https://api.github.com/repos/psf/requests/issues/2699/events
|
https://github.com/psf/requests/issues/2699
| 98,258,031 |
MDU6SXNzdWU5ODI1ODAzMQ==
| 2,699 |
timeout=False puts the socket in non-blocking mode, causing weird errors.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/181693?v=4",
"events_url": "https://api.github.com/users/offbyone/events{/privacy}",
"followers_url": "https://api.github.com/users/offbyone/followers",
"following_url": "https://api.github.com/users/offbyone/following{/other_user}",
"gists_url": "https://api.github.com/users/offbyone/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/offbyone",
"id": 181693,
"login": "offbyone",
"node_id": "MDQ6VXNlcjE4MTY5Mw==",
"organizations_url": "https://api.github.com/users/offbyone/orgs",
"received_events_url": "https://api.github.com/users/offbyone/received_events",
"repos_url": "https://api.github.com/users/offbyone/repos",
"site_admin": true,
"starred_url": "https://api.github.com/users/offbyone/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/offbyone/subscriptions",
"type": "User",
"url": "https://api.github.com/users/offbyone",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 56 |
2015-07-30T20:33:15Z
|
2021-09-08T14:00:37Z
|
2016-11-15T17:10:14Z
|
NONE
|
resolved
|
I'm receiving a strange connection error using Requests 2.7.0
connecting to an HTTP service as part of an integration test. Below is
the call stack when I call into `session.request` in the API (for
reasons of NDA I can't provide the rest of the callstack).
The error is below in full detail, but in short, send() in requests is emitting `ConnectionError: ('Connection aborted.', error(115, 'Operation now in progress'))`
Python=2.7
Requests=2.7.0
The parameters being passed in to this are:
verb=POST
url=(an HTTP url for an internal service)
data=(a dict)
verify=False
cert=None
headers={'Content-Encoding': 'amz-1.0', 'Connection': 'keep-alive', 'Accept': 'application/json, text/javascript, */*', 'User-Agent': 'A thing', 'Host': 'hostname', 'Pragma': 'no-cache', 'Cache-Control': 'no-cache', 'Content-Type': 'application/json'} # plus some service headers used by our service
timeout=False
proxies=None
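
The title's diagnosis can be demonstrated with the stdlib alone: Python's socket layer accepts any number-like timeout, and since `bool` is an `int` subclass, `False` is coerced to `0.0`, which means non-blocking mode — hence errno 115 (`EINPROGRESS`, "Operation now in progress") surfacing from the connect. A minimal sketch:

```python
import socket

s = socket.socket()

# settimeout(None) -> blocking; settimeout(0.0) -> non-blocking.
# False is coerced to 0.0, silently flipping the socket into
# non-blocking mode instead of "no timeout".
s.settimeout(False)
print(s.gettimeout())   # 0.0
print(s.getblocking())  # False

s.close()
```

(`socket.getblocking()` is available from Python 3.7; on older versions, `gettimeout() == 0.0` shows the same thing.)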
> package-cache/packages/Requests/lib/python2.7/site-packages/requests/api.py:50: in request
> response = session.request(method=method, url=url, **kwargs)
> package-cache/packages/Requests/lib/python2.7/site-packages/requests/sessions.py:465: in request
> resp = self.send(prep, **send_kwargs)
> package-cache/packages/Requests/lib/python2.7/site-packages/requests/sessions.py:573: in send
> r = adapter.send(request, **kwargs)
> ---
>
> self = <requests.adapters.HTTPAdapter object at 0x7ff845dccd50>, request = <PreparedRequest [POST]>, stream = False, timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7ff845dd0cd0>, verify = False, cert = None
> proxies = {}
>
> ```
> def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
> """Sends PreparedRequest object. Returns Response object.
>
> :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
> :param stream: (optional) Whether to stream the request content.
> :param timeout: (optional) How long to wait for the server to send
> data before giving up, as a float, or a (`connect timeout, read
> timeout <user/advanced.html#timeouts>`_) tuple.
> :type timeout: float or tuple
> :param verify: (optional) Whether to verify SSL certificates.
> :param cert: (optional) Any user-provided SSL certificate to be trusted.
> :param proxies: (optional) The proxies dictionary to apply to the request.
> """
>
> conn = self.get_connection(request.url, proxies)
>
> self.cert_verify(conn, request.url, verify, cert)
> url = self.request_url(request, proxies)
> self.add_headers(request)
>
> chunked = not (request.body is None or 'Content-Length' in request.headers)
>
> if isinstance(timeout, tuple):
> try:
> connect, read = timeout
> timeout = TimeoutSauce(connect=connect, read=read)
> except ValueError as e:
> # this may raise a string formatting error.
> err = ("Invalid timeout {0}. Pass a (connect, read) "
> "timeout tuple, or a single float to set "
> "both timeouts to the same value".format(timeout))
> raise ValueError(err)
> else:
> timeout = TimeoutSauce(connect=timeout, read=timeout)
>
> try:
> if not chunked:
> resp = conn.urlopen(
> method=request.method,
> url=url,
> body=request.body,
> headers=request.headers,
> redirect=False,
> assert_same_host=False,
> preload_content=False,
> decode_content=False,
> retries=self.max_retries,
> timeout=timeout
> )
> # Send the request.
> else:
> if hasattr(conn, 'proxy_pool'):
> conn = conn.proxy_pool
>
> low_conn = conn._get_conn(timeout=timeout)
>
> try:
> low_conn.putrequest(request.method,
> url,
> skip_accept_encoding=True)
>
> for header, value in request.headers.items():
> low_conn.putheader(header, value)
>
> low_conn.endheaders()
>
> for i in request.body:
> low_conn.send(hex(len(i))[2:].encode('utf-8'))
> low_conn.send(b'\r\n')
> low_conn.send(i)
> low_conn.send(b'\r\n')
> low_conn.send(b'0\r\n\r\n')
>
> r = low_conn.getresponse()
> resp = HTTPResponse.from_httplib(
> r,
> pool=conn,
> connection=low_conn,
> preload_content=False,
> decode_content=False
> )
> except:
> # If we hit any problems here, clean up the connection.
> # Then, reraise so that we can handle the actual exception.
> low_conn.close()
> raise
> else:
> # All is well, return the connection to the pool.
> conn._put_conn(low_conn)
>
> except (ProtocolError, socket.error) as err:
> ```
>
> > ```
> > raise ConnectionError(err, request=request)
> > ```
> >
> > E ConnectionError: ('Connection aborted.', error(115, 'Operation now in progress'))
>
> cert = None
> chunked = False
> conn = <requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x7ff844c68c10>
> err = ProtocolError('Connection aborted.', error(115, 'Operation now in progress'))
> proxies = {}
> request = <PreparedRequest [POST]>
> self = <requests.adapters.HTTPAdapter object at 0x7ff845dccd50>
> stream = False
> timeout = <requests.packages.urllib3.util.timeout.Timeout object at 0x7ff845dd0cd0>
> url = '/'
> verify = False
>
> package-cache/packages/Requests/lib/python2.7/site-packages/requests/adapters.py:415: ConnectionError
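The chunked-transfer framing loop in the adapter source quoted above (each chunk prefixed with its length in hex, then terminated by `0\r\n\r\n`) can be sketched standalone. This is a hedged illustration of the wire format, not requests' actual implementation; `frame_chunked` is a name invented here:

```python
def frame_chunked(chunks):
    """Frame an iterable of byte chunks using HTTP/1.1 chunked
    transfer encoding: hex length, CRLF, data, CRLF, and a final
    zero-length chunk to terminate the body."""
    out = bytearray()
    for chunk in chunks:
        out += hex(len(chunk))[2:].encode("ascii")  # length in hex, '0x' stripped
        out += b"\r\n"
        out += chunk
        out += b"\r\n"
    out += b"0\r\n\r\n"  # zero-length terminator chunk
    return bytes(out)

body = frame_chunked([b"hello", b" world"])
# b'5\r\nhello\r\n6\r\n world\r\n0\r\n\r\n'
```

This mirrors the `low_conn.send(hex(len(i))[2:]...)` loop in the traceback, which requests only takes when the request body has no `Content-Length`.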
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2699/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2699/timeline
| null |
completed
| null | null | false |
[
"(Hi Chris!)\n\nWell this is weird. Are you passing any socket options?\n",
"None explicitly, no, and I checked in pdb; the options are the `default_socket_options`\n",
"Basically, I've bottomed out on my socket-related knowhow. This code works for other services, so it's certainly possible that it's something related to this particular service endpoint, but I don't know what it would be.\n",
"Ok.\n\nSo, my basic understanding of this error is as follows. First, the error in question is EINPROGRESS (errno 115). I've seen this happen with a socket in non-blocking mode on a `connect()` call: it's very common in that case. However, if you're not passing explicit socket options then we aren't putting the socket in non-blocking mode, and it's hard to see why you'd do that (lots of stuff would go wrong if you did).\n\nCan we clarify some stuff? Are you running in a deployment with PyOpenSSL, or just with the standard ssl library?\n",
"I believe without PyOpenSSL, but... checking now...\n\nAs far as I know the only instance of PyOpenSSL on my sys.path is in requests/contrib, which I assume is not what you're talking about.\n",
"> As far as I know the only instance of PyOpenSSL on my sys.path is in requests/contrib, which I assume is not what you're talking about.\n\nCorrect. =) What I want to know is whether PyOpenSSL the package is accessible: that is, that running `python -c 'import OpenSSL'` fails in some way.\n\nSo, in that case I recommend running `strace` on your process. If the bug is readily reproducible then this will be particularly useful, because we can see the failing syscall, but I also want to see all syscalls relating to the socket in question. Let's see if it's actually being set to non-blocking mode or not.\n",
"(To be clear, setting a socket nonblocking will involve a `fnctl` call that can be seen in strace.)\n",
"Okay, gimme a few; it's a 200Mb strace file all-in; narrowing this down will take me a bit.\n",
"No rush. =)\n",
"```\nsocket(PF_NETLINK, SOCK_RAW, 0) = 10\nbind(10, {sa_family=AF_NETLINK, pid=0, groups=00000000}, 12) = 0\ngetsockname(10, {sa_family=AF_NETLINK, pid=13834, groups=00000000}, [12]) = 0\nsendto(10, \"\\24\\0\\0\\0\\26\\0\\1\\3s\\217\\272U\\0\\0\\0\\0\\0\\0\\0\\0\", 20, 0, {sa_family=AF_NETLINK, pid=0, groups=00000000}, 12) = 20\nrecvmsg(10, {msg_name(12)={sa_family=AF_NETLINK, pid=0, groups=00000000}, msg_iov(1)=[{\"D\\0\\0\\0\\24\\0\\2\\0s\\217\\272U\\n6\\0\\0\\2\\10\\200\\376\\1\\0\\0\\0\\10\\0\\1\\0\\177\\0\\0\\1\"..., 4096}], msg_controllen=0, msg_flags=0}, 0) = 220\nrecvmsg(10, {msg_name(12)={sa_family=AF_NETLINK, pid=0, groups=00000000}, msg_iov(1)=[{\"@\\0\\0\\0\\24\\0\\2\\0s\\217\\272U\\n6\\0\\0\\n\\200\\200\\376\\1\\0\\0\\0\\24\\0\\1\\0\\0\\0\\0\\0\"..., 4096}], msg_controllen=0, msg_flags=0}, 0) = 192\nrecvmsg(10, {msg_name(12)={sa_family=AF_NETLINK, pid=0, groups=00000000}, msg_iov(1)=[{\"\\24\\0\\0\\0\\3\\0\\2\\0s\\217\\272U\\n6\\0\\0\\0\\0\\0\\0\\1\\0\\0\\0\\24\\0\\1\\0\\0\\0\\0\\0\"..., 4096}], msg_controllen=0, msg_flags=0}, 0) = 20\nclose(10) = 0\nsocket(PF_FILE, SOCK_STREAM|SOCK_CLOEXEC|SOCK_NONBLOCK, 0) = 10\nconnect(10, {sa_family=AF_FILE, path=\"/var/run/nscd/socket\"}, 110) = 0\nsendto(10, \"\\2\\0\\0\\0\\r\\0\\0\\0\\6\\0\\0\\0hosts\\0\", 18, MSG_NOSIGNAL, NULL, 0) = 18\npoll([{fd=10, events=POLLIN|POLLERR|POLLHUP}], 1, 5000) = 1 ([{fd=10, revents=POLLIN|POLLHUP}])\nrecvmsg(10, {msg_name(0)=NULL, msg_iov(2)=[{\"hosts\\0\", 6}, {\"H\\344\\17\\0\\0\\0\\0\\0\", 8}], msg_controllen=24, {cmsg_len=20, cmsg_level=SOL_SOCKET, cmsg_type=SCM_RIGHTS, {11}}, msg_flags=MSG_CMSG_CLOEXEC}, MSG_CMSG_CLOEXEC) = 14\nmmap(NULL, 1041480, PROT_READ, MAP_SHARED, 11, 0) = 0x7f866d0b6000\nclose(11) = 0\nclose(10) = 0\nsocket(PF_INET, SOCK_STREAM, IPPROTO_TCP) = 10\nsetsockopt(10, SOL_TCP, TCP_NODELAY, [1], 4) = 0\nfcntl(10, F_GETFL) = 0x2 (flags O_RDWR)\nfcntl(10, F_SETFL, O_RDWR|O_NONBLOCK) = 0\nconnect(10, {sa_family=AF_INET, sin_port=htons(80), 
sin_addr=inet_addr(\"10.12.164.47\")}, 16) = -1 EINPROGRESS (Operation now in progress)\nclose(10) = 0\n```\n\nSo, it looks like something _is_ setting it.\n",
"Fun!\n\nSo, the `setsockopt` call is us setting the \"default socket options\". I don't believe we call `setblocking` in our code. Out of interest, does any of yours make that call?\n\nOne way or another, it's happening somewhere between the point where we create the socket and call connect. In a vanilla requests/urllib3 that's really only a very small number of lines of code: requests.packages.urllib3.util.connection:68 to requests.packages.urllib3.util.connection:78.\n\nOne thing that might be worth doing is running in pdb and setting a breakpoint there, and also open strace in another window for that program, then step through and see if you can see where the fcntl call is coming from.\n",
"I'll do that & report back. For what it's worth, this is the code calling requests:\n\n```\n reply = requests.request(verb.lower(), uri.geturl(), data=http_data,\n verify=verify_cert, cert=cert, headers=headers,\n auth=http_auth, timeout=self._timeout,\n proxies=self._proxies)\n```\n\nDoesn't seem to be anything there about sockets :/\n",
"No, I doubt there is, but I worry about monkeypatching and other kinds of fun. =D No rush, take your time on this.\n",
"Based on the strace, it looks like socket.settimeout is doing it:\n\n```\nwrite(1, \"-> sock.settimeout(timeout)\\n\", 28) = 28\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(1, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(1, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nrt_sigprocmask(SIG_BLOCK, [INT], [], 8) = 0\nioctl(0, TIOCGWINSZ, {ws_row=0, ws_col=0, ws_xpixel=0, ws_ypixel=0}) = 0\nioctl(0, TIOCSWINSZ, {ws_row=0, ws_col=0, ws_xpixel=0, ws_ypixel=0}) = 0\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(0, SNDCTL_TMR_STOP or TCSETSW, {B38400 opost isig -icanon -echo ...}) = 0\nrt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0\nrt_sigaction(SIGWINCH, {0x7fafa61156a0, [], SA_RESTORER|SA_RESTART, 0x7fafb36daae0}, {SIG_DFL, [], SA_RESTORER, 0x7fafb36daae0}, 8) = 0\nrt_sigprocmask(SIG_BLOCK, [INT], [], 8) = 0\nwrite(1, \"(Pdb) \", 6) = 6\nrt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0\nselect(1, [0], NULL, NULL, NULL) = 1 (in [0])\nrt_sigprocmask(SIG_BLOCK, NULL, [], 8) = 0\nrt_sigprocmask(SIG_BLOCK, NULL, [], 8) = 0\nread(0, \"\\n\", 1) = 1\nwrite(1, \"\\n\", 1) = 1\n```\n",
"That strace doesn't appear to have any fcntl calls in it at all...\n",
"Shit. I missed them on paste. One sec.\n",
"```\nwrite(1, \"-> sock.settimeout(timeout)\\n\", 28) = 28\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(1, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(1, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nrt_sigprocmask(SIG_BLOCK, [INT], [], 8) = 0\nioctl(0, TIOCGWINSZ, {ws_row=0, ws_col=0, ws_xpixel=0, ws_ypixel=0}) = 0\nioctl(0, TIOCSWINSZ, {ws_row=0, ws_col=0, ws_xpixel=0, ws_ypixel=0}) = 0\nioctl(0, SNDCTL_TMR_TIMEBASE or TCGETS, {B38400 opost isig icanon echo ...}) = 0\nioctl(0, SNDCTL_TMR_STOP or TCSETSW, {B38400 opost isig -icanon -echo ...}) = 0\nrt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0\nrt_sigaction(SIGWINCH, {0x7fafa61156a0, [], SA_RESTORER|SA_RESTART, 0x7fafb36daae0}, {SIG_DFL, [], SA_RESTORER, 0x7fafb36daae0}, 8) = 0\nrt_sigprocmask(SIG_BLOCK, [INT], [], 8) = 0\nwrite(1, \"(Pdb) \", 6) = 6\nrt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0\nselect(1, [0], NULL, NULL, NULL) = 1 (in [0])\nrt_sigprocmask(SIG_BLOCK, NULL, [], 8) = 0\nrt_sigprocmask(SIG_BLOCK, NULL, [], 8) = 0\nread(0, \"\\n\", 1) = 1\nwrite(1, \"\\n\", 1) = 1\nrt_sigprocmask(SIG_BLOCK, [INT], [], 8) = 0\nioctl(0, SNDCTL_TMR_STOP or TCSETSW, {B38400 opost isig icanon echo ...}) = 0\nrt_sigprocmask(SIG_SETMASK, [], NULL, 8) = 0\nrt_sigaction(SIGWINCH, {SIG_DFL, [], SA_RESTORER, 0x7fafb36daae0}, {0x7fafa61156a0, [], SA_RESTORER|SA_RESTART, 0x7fafb36daae0}, 8) = 0\nfcntl(10, F_GETFL) = 0x2 (flags O_RDWR)\nfcntl(10, F_SETFL, O_RDWR|O_NONBLOCK) = 0\nwrite(1, \"> /home/chrisros/workspace/CDSer\"..., 199) = 199\nwrite(1, \"-> if source_address:\\n\", 22) = 22\n```\n",
"Note the prompt boundaries for the pdb prompt; that's how I'm triangulating\n",
"_Aha_! Check this weaselly section of the Python docs (emphasis mine):\n\n> Set a timeout on blocking socket operations. The value argument can be a nonnegative float expressing seconds, or None. If a float is given, subsequent socket operations will raise a timeout exception if the timeout period value has elapsed before the operation has completed. Setting a timeout of None disables timeouts on socket operations. **s.settimeout(0.0) is equivalent to s.setblocking(0)**; s.settimeout(None) is equivalent to s.setblocking(1).\n\nSo, what's the value of `self._timeout` in this call?\n",
":clap: @Lukasa \n",
"`False`\n",
"Yeah, that's not a valid value for that argument. ;) _And_, when evaluated in a numerical context, is equivalent to `0`, which is _probably_ the source of this bug.\n\nWant to try catching that at your level and seeing if the problem goes away? If it does, we should consider whether we can police this in the API.\n",
"Heh, yup, the `Timeout` object coerces its argument to float, and in Python:\n\n``` python\n>>> float(False)\n0.0\n```\n\nSometimes I hate this language.\n",
"> Want to try catching that at your level and seeing if the problem goes away? If it does, we should consider whether we can police this in the API.\n\nThat would be a 3.0.0 change (to raise a ValueError if it isn't `None`, `tuple`, `int`, or `float`).\n",
"@sigmavirus24 Sounds good to me.\n\nSeparately, urllib3 may want a fix for this, because it is _also_ affected. /cc @shazow\n",
"What are we fixing? in urllib3, `timeout=False` has semantic meaning iirc.\n",
"@sigmavirus24 Actually, now that I think about it that's overly aggressive. Generally, things that are `float`-y or `int`-y should be totally acceptable. I think we just explicitly want to forbid `True`/`False`, because they happen to be `int`-y but really shouldn't be.\n",
"@shazow Not according to the docstring for the `Timeout` class, which is where this happens:\n\n```\n Timeouts can be defined as a default for a pool::\n timeout = Timeout(connect=2.0, read=7.0)\n http = PoolManager(timeout=timeout)\n response = http.request('GET', 'http://example.com/')\n Or per-request (which overrides the default for the pool)::\n response = http.request('GET', 'http://example.com/', timeout=Timeout(10))\n Timeouts can be disabled by setting all the parameters to ``None``::\n no_timeout = Timeout(connect=None, read=None)\n response = http.request('GET', 'http://example.com/, timeout=no_timeout)\n :param total:\n This combines the connect and read timeouts into one; the read timeout\n will be set to the time leftover from the connect attempt. In the\n event that both a connect timeout and a total are specified, or a read\n timeout and a total are specified, the shorter timeout will be applied.\n Defaults to None.\n :type total: integer, float, or None\n :param connect:\n The maximum amount of time to wait for a connection attempt to a server\n to succeed. Omitting the parameter will default the connect timeout to\n the system default, probably `the global default timeout in socket.py\n <http://hg.python.org/cpython/file/603b4d593758/Lib/socket.py#l535>`_.\n None will set an infinite timeout for connection attempts.\n :type connect: integer, float, or None\n :param read:\n The maximum amount of time to wait between consecutive\n read operations for a response from the server. Omitting\n the parameter will default the read timeout to the system\n default, probably `the global default timeout in socket.py\n <http://hg.python.org/cpython/file/603b4d593758/Lib/socket.py#l535>`_.\n None will set an infinite timeout.\n :type read: integer, float, or None\n```\n\nNow, admittedly, the docstring does not allow `True`/`False`, but those can be integer or float if evaluated in that context, so you may want to police them. 
I can kinda see that `timeout=False` putting the socket in non-blocking mode might make sense in some kind of weird world, but I cannot see how `timeout=True` should put the socket into one-second-timeout mode.\n",
"`timeout=None` works for the 'set nonblocking' concept better than `timeout=False` imo\n",
"Erm I'm thinking of retries=False.\n"
] |
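The root cause diagnosed in the thread above — `timeout=False` being coerced to `0.0`, which puts the socket into non-blocking mode — can be reproduced in a few lines. This sketch only demonstrates the coercion and its effect on a raw socket, not requests' own timeout handling:

```python
import socket

# bool is an int subclass, so False survives float coercion silently:
assert float(False) == 0.0

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.settimeout(float(False))    # per the docs, settimeout(0.0) == setblocking(False)
assert s.gettimeout() == 0.0  # non-blocking: connect() can now fail with
                              # EINPROGRESS (errno 115) mid-handshake
s.close()
```

This is why the strace showed `fcntl(10, F_SETFL, O_RDWR|O_NONBLOCK)` and the subsequent `connect()` returned `EINPROGRESS`.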
https://api.github.com/repos/psf/requests/issues/2698
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2698/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2698/comments
|
https://api.github.com/repos/psf/requests/issues/2698/events
|
https://github.com/psf/requests/issues/2698
| 98,194,342 |
MDU6SXNzdWU5ODE5NDM0Mg==
| 2,698 |
404 for timeouts advanced documentation links
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/131207?v=4",
"events_url": "https://api.github.com/users/nilya/events{/privacy}",
"followers_url": "https://api.github.com/users/nilya/followers",
"following_url": "https://api.github.com/users/nilya/following{/other_user}",
"gists_url": "https://api.github.com/users/nilya/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nilya",
"id": 131207,
"login": "nilya",
"node_id": "MDQ6VXNlcjEzMTIwNw==",
"organizations_url": "https://api.github.com/users/nilya/orgs",
"received_events_url": "https://api.github.com/users/nilya/received_events",
"repos_url": "https://api.github.com/users/nilya/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nilya/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nilya/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nilya",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-30T15:12:40Z
|
2021-09-08T23:00:40Z
|
2015-08-15T13:24:20Z
|
NONE
|
resolved
|
Three files [in this search](https://github.com/kennethreitz/requests/search?l=python&q=<user/advanced.html#timeouts>&type=Code&utf8=✓) have links to `<user/advanced.html#timeouts>`, which work fine for documentation generated on a local machine.
But on http://docs.python-requests.org/ the page URLs contain no ".html", so a 404 Page Not Found is returned.
I think docs.python-requests.org would be fine if you changed the links to `<user/advanced/#timeouts>`, but then documentation built on a local machine would be broken.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2698/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2698/timeline
| null |
completed
| null | null | false |
[
"We should be using `:ref:` in our docs, so sphinx will handle this for us. Thanks for catching this @nilya \n"
] |
https://api.github.com/repos/psf/requests/issues/2697
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2697/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2697/comments
|
https://api.github.com/repos/psf/requests/issues/2697/events
|
https://github.com/psf/requests/issues/2697
| 97,925,783 |
MDU6SXNzdWU5NzkyNTc4Mw==
| 2,697 |
requests.get is partially ignoring proxies parameter?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13555069?v=4",
"events_url": "https://api.github.com/users/DelboyJay/events{/privacy}",
"followers_url": "https://api.github.com/users/DelboyJay/followers",
"following_url": "https://api.github.com/users/DelboyJay/following{/other_user}",
"gists_url": "https://api.github.com/users/DelboyJay/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DelboyJay",
"id": 13555069,
"login": "DelboyJay",
"node_id": "MDQ6VXNlcjEzNTU1MDY5",
"organizations_url": "https://api.github.com/users/DelboyJay/orgs",
"received_events_url": "https://api.github.com/users/DelboyJay/received_events",
"repos_url": "https://api.github.com/users/DelboyJay/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DelboyJay/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DelboyJay/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DelboyJay",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-29T12:36:15Z
|
2021-09-08T22:00:55Z
|
2015-08-31T07:01:02Z
|
NONE
|
resolved
|
I have a problem with Requests 2.7.0 and proxies. If I use the following code without setting the *_PROXY environment variables, the request takes up to 2 minutes to get the information I'm requesting, but it does work. If I then set the HTTP_PROXY and HTTPS_PROXY env variables, it takes 2 seconds to complete. It should be noted that the browsers on the machine are set up to use the proxy, and the requests code does indeed pick up these settings from the registry as well as from the values I specify via the proxies parameter.
proxies = {
"http": "http://%s:3128/" % ipaddr,
"https": "http://%s:3128/" % ipaddr
}
r = requests.get("http://nvd.nist.gov/download/nvd-rss.xml", proxies=proxies)
I have profiled the code in pyCharm without use of the env variables and it says the following:
Name Call Count Time(ms) OwnTime(ms)
built-in method gethostbyaddr 3 41998 41998
built-in method gethostbyname 3 38378 38378
Debugging into sessions.py, it seems that line 614, which calls get_environ_proxies, takes quite a bit of time because it indirectly calls socket.gethostbyname and socket.getfqdn in proxy_bypass_registry (request.py:2516); the former throws an OSError, but getfqdn succeeds, giving the same answer as the rawHost variable.
Running the same test with the HTTP_PROXY and HTTPS_PROXY environment variables set shows in the profiler that gethostbyaddr and gethostbyname take no time at all, and the whole call takes 2 seconds.
Am I doing something silly here, or is this something that can be fixed in requests? The proxy settings in the registry, and those passed via the proxies parameter, appear to be ignored in parts of the code.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2697/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2697/timeline
| null |
completed
| null | null | false |
[
"This sounds like you're experiencing some kind of timeout in the standard library's `urllib.proxy_bypass()` method: specifically, you seem to be experiencing some _really_ long DNS timeouts.\n\nWe're not strictly ignoring the proxies parameter, but we're trying not to be too clever about it in the code: we basically allow the system to provide us with the list of proxies that it wants to use and then merge those with the list of specified proxies. It's the most generic approach to the problem.\n\nIf you want to speed this up you can use a `Session` and set `Session.trust_env = False`, which will avoid this call altogether. Otherwise, we need to investigate why `urllib.proxy_bypass` is so expensive on your system when it is usually so cheap.\n"
] |
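The workaround suggested in the reply above — using a `Session` with `trust_env = False` to skip the expensive environment/registry proxy lookup — looks roughly like this. The proxy address is a placeholder, and this is a sketch of the suggested approach rather than a tested fix for the reporter's setup:

```python
import requests

session = requests.Session()
session.trust_env = False  # skip get_environ_proxies()/proxy_bypass() entirely
session.proxies = {
    "http": "http://10.0.0.1:3128/",   # placeholder proxy address
    "https": "http://10.0.0.1:3128/",
}

# r = session.get("http://nvd.nist.gov/download/nvd-rss.xml")
```

With `trust_env` disabled, only the explicitly configured `session.proxies` are used, so the slow `gethostbyaddr`/`gethostbyname` calls seen in the profile never run.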
https://api.github.com/repos/psf/requests/issues/2696
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2696/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2696/comments
|
https://api.github.com/repos/psf/requests/issues/2696/events
|
https://github.com/psf/requests/pull/2696
| 97,921,056 |
MDExOlB1bGxSZXF1ZXN0NDExMTQ5OTM=
| 2,696 |
Use debug instead of info to log new connection
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1469982?v=4",
"events_url": "https://api.github.com/users/luca3m/events{/privacy}",
"followers_url": "https://api.github.com/users/luca3m/followers",
"following_url": "https://api.github.com/users/luca3m/following{/other_user}",
"gists_url": "https://api.github.com/users/luca3m/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/luca3m",
"id": 1469982,
"login": "luca3m",
"node_id": "MDQ6VXNlcjE0Njk5ODI=",
"organizations_url": "https://api.github.com/users/luca3m/orgs",
"received_events_url": "https://api.github.com/users/luca3m/received_events",
"repos_url": "https://api.github.com/users/luca3m/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/luca3m/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luca3m/subscriptions",
"type": "User",
"url": "https://api.github.com/users/luca3m",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-29T12:01:06Z
|
2021-09-08T07:00:48Z
|
2015-07-29T12:11:34Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2696/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2696/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2696.diff",
"html_url": "https://github.com/psf/requests/pull/2696",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2696.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2696"
}
| true |
[
"This belongs on urllib3, not here.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2695
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2695/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2695/comments
|
https://api.github.com/repos/psf/requests/issues/2695/events
|
https://github.com/psf/requests/issues/2695
| 97,892,445 |
MDU6SXNzdWU5Nzg5MjQ0NQ==
| 2,695 |
how to prepare response object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8228290?v=4",
"events_url": "https://api.github.com/users/aliraza0337/events{/privacy}",
"followers_url": "https://api.github.com/users/aliraza0337/followers",
"following_url": "https://api.github.com/users/aliraza0337/following{/other_user}",
"gists_url": "https://api.github.com/users/aliraza0337/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aliraza0337",
"id": 8228290,
"login": "aliraza0337",
"node_id": "MDQ6VXNlcjgyMjgyOTA=",
"organizations_url": "https://api.github.com/users/aliraza0337/orgs",
"received_events_url": "https://api.github.com/users/aliraza0337/received_events",
"repos_url": "https://api.github.com/users/aliraza0337/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aliraza0337/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aliraza0337/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aliraza0337",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-29T09:13:59Z
|
2021-09-08T23:00:45Z
|
2015-07-29T12:30:18Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2695/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2695/timeline
| null |
completed
| null | null | false |
[
"The defect tracker is not the place to ask questions. Please use [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests) for questions.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2694
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2694/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2694/comments
|
https://api.github.com/repos/psf/requests/issues/2694/events
|
https://github.com/psf/requests/issues/2694
| 97,889,789 |
MDU6SXNzdWU5Nzg4OTc4OQ==
| 2,694 |
Address family not supported by protocol
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1587290?v=4",
"events_url": "https://api.github.com/users/Johnathon332/events{/privacy}",
"followers_url": "https://api.github.com/users/Johnathon332/followers",
"following_url": "https://api.github.com/users/Johnathon332/following{/other_user}",
"gists_url": "https://api.github.com/users/Johnathon332/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Johnathon332",
"id": 1587290,
"login": "Johnathon332",
"node_id": "MDQ6VXNlcjE1ODcyOTA=",
"organizations_url": "https://api.github.com/users/Johnathon332/orgs",
"received_events_url": "https://api.github.com/users/Johnathon332/received_events",
"repos_url": "https://api.github.com/users/Johnathon332/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Johnathon332/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Johnathon332/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Johnathon332",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-29T08:56:27Z
|
2021-09-08T23:00:45Z
|
2015-07-29T12:37:10Z
|
NONE
|
resolved
|
I am getting this error when making a request to my web API, which is hosted on another machine under IIS Express so that I can debug my application. When I run my Python code I get:

```
requests.exceptions.ConnectionError: ('Connection aborted.', error(97, 'Address family not supported by protocol'))
```

Here is the traceback:

``` python
Traceback (most recent call last):
  File "Allocator.py", line 201, in <module>
    Main()
  File "Allocator.py", line 154, in Main
    requestResult = MakeRequest("")
  File "Allocator.py", line 36, in MakeRequest
    result = requests.get(constURLString + "?lastUpdate=" + urlParameters)
  File "/usr/local/lib/python2.7/dist-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 69, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests-2.7.0-py2.7.egg/requests/api.py", line 50, in request
    response = session.request(method=method, url=url, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 465, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 573, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/requests-2.7.0-py2.7.egg/requests/adapters.py", line 415, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(97, 'Address family not supported by protocol'))
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2694/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2694/timeline
| null |
completed
| null | null | false |
[
"Closing to centralize discussion on [StackOverflow](https://stackoverflow.com/questions/31695430/address-family-not-supported-by-protocol-making-request-to-webapi) given there is more detail in the question than was provided in the bug report.\n"
] |
https://api.github.com/repos/psf/requests/issues/2693
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2693/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2693/comments
|
https://api.github.com/repos/psf/requests/issues/2693/events
|
https://github.com/psf/requests/issues/2693
| 97,859,950 |
MDU6SXNzdWU5Nzg1OTk1MA==
| 2,693 |
Double POST requests sent at the same time when a delay is encountered.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11473261?v=4",
"events_url": "https://api.github.com/users/itsHaddad/events{/privacy}",
"followers_url": "https://api.github.com/users/itsHaddad/followers",
"following_url": "https://api.github.com/users/itsHaddad/following{/other_user}",
"gists_url": "https://api.github.com/users/itsHaddad/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/itsHaddad",
"id": 11473261,
"login": "itsHaddad",
"node_id": "MDQ6VXNlcjExNDczMjYx",
"organizations_url": "https://api.github.com/users/itsHaddad/orgs",
"received_events_url": "https://api.github.com/users/itsHaddad/received_events",
"repos_url": "https://api.github.com/users/itsHaddad/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/itsHaddad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itsHaddad/subscriptions",
"type": "User",
"url": "https://api.github.com/users/itsHaddad",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2015-07-29T04:54:45Z
|
2021-09-08T23:00:44Z
|
2015-08-03T02:24:53Z
|
NONE
|
resolved
|
I have code that sends GET and POST requests every 10 seconds. But when there is a delay, the POST line posts twice. I tried to debug this to the best of my ability and could only conclude that either the library sends the request twice, or it is something to do with TCP (but then the library should deal with it, right?).

The code (a general sketch of my implementation):

``` python
import requests
import time

while True:
    source = requests.get(url)
    result = source.json()
    resp = requests.post(another_url, json=result, headers=headers)
    time.sleep(10)
```

By default this should be OK, but I don't understand why I get duplicate entries posted in the same second.

If it is something to do with my code then please advise me; otherwise I hope this bug can be fixed.
Thank you so much for this awesome library.
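The discussion in the comments converges on the loop's timing logic, not the library. A deadline-based scheduler avoids the failure mode described there (an overrunning iteration immediately re-entering the loop and producing two POSTs in the same second). This is a hedged sketch, not the reporter's actual code; `run_poller`, `SLEEP_TIME`, and the injectable `clock`/`sleep` parameters are illustrative names added here so the logic can be exercised without real time passing.

``` python
import time

SLEEP_TIME = 10  # seconds between polls (illustrative value)

def run_poller(poll, clock=time.monotonic, sleep=time.sleep, iterations=None):
    """Call poll() at most once per SLEEP_TIME window.

    Tracking a fixed deadline instead of re-measuring elapsed time
    prevents an accidental immediate re-run (and duplicate POST) when
    one iteration overruns its window.
    """
    deadline = clock()
    count = 0
    while iterations is None or count < iterations:
        poll()  # e.g. one GET + one POST
        count += 1
        deadline += SLEEP_TIME
        remaining = deadline - clock()
        if remaining > 0:
            sleep(remaining)
        else:
            # Iteration overran its slot: skip sleeping, realign schedule.
            deadline = clock()
```

With a simulated clock, polls land exactly 10 simulated seconds apart even though each poll itself "takes" one second.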
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2693/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2693/timeline
| null |
completed
| null | null | false |
[
"Just to clarify: can you confirm you're not getting redirected? Check `resp.history` to see if there's anything in it.\n",
"Thank you for your answer sir. \n\nActually I don't really know about this, I just included the resp.history and source.history into my code and I am getting [] [] for a few loops already. Maybe I need to leave it until that bug occurs and then I can will report back?!\n",
"That feels like it'd be useful if you can spare the time. =)\n",
"Definitely (Y). I will try to leave it running for as much as I can. Hopefully I can get it up for a day. Usually it happens a few times a day. \n",
"\n\nFor this test, the database is supposed to receive 5 parameters each 10 seconds. The picture shows the times when I got 10 parameters in 1 second, which is the time when the server received double entries in the same second. I checked the data, and whenever the data received is doubled, it appears that it is actually just a duplicate, and has exactly the same values of the other entry at that specific second. \n\nRegarding the resp.history and source.history, I got them all as empty lists for each loop, even at the time when the problem occurred. \n",
"Can I confirm you aren't using a custom Transport Adapter, and that you haven't changed any constants in `requests.adapters`?\n",
"I calculated the difference of the between my computer and the server and it is 1 minute an 43 seconds. This allowed me to find this: \n\n\n\nLet me go a bit further into my code:\n\nThis is how I control the delay, but then I figured that instantly repeating the loop might cause this problem. so I put a delay under if in a previous test, but I still got the same thing. I am retesting again with a delay of 1 and 5 seconds under if to get more robust results. \n\nI am not using a custom Transport Adapter, and I have not changed any constants in `requests.adapters`\n",
"I edited my previous comment. Thank you. \n",
"So, that code looks ... interesting. I'm going to guess that it looks something like\n\n``` py\nwhile True:\n start_time_loop = time.time()\n time.sleep(10)\n\n if time.time() - start_time_loop > 10:\n pass\n else:\n time.sleep(abs(10 - (time.time() - start_time_loop)))\n\n r = requests.post(...)\n```\n\nIf I were you I'd rewrite this as\n\n``` py\nSLEEP_TIME = 10 # in seconds\nwhile True:\n start_time_loop = time.time()\n time.sleep(SLEEP_TIME)\n\n time_slept = time.time() - start_time_loop\n if time_slept < SLEEP_TIME:\n time.sleep(SLEEP_TIME - time_slept)\n\n r = requests.post(...)\n```\n\nBasically, you're eliminating the usage of `pass` since you only want to do something if the difference is less than 10. That said, if you're anticipating that sleep could return early there is clearly something in here that you're not telling us about. Some kind of event library perhaps (eventlet, gevent, etc.) and depending on how you're using requests, this could be relevant.\n",
"Thank you so much for taking the time, sir. \n\nFirst of all, I want to apologize, this is actually my first issue here in github and I didn't know that github followed stackoverflow way, and that's why my code up there looks weird. I will rewrite it now. \n\n```\nimport requests\nimport time\nwhile True:\nstart_time_loop = time.time() \n source = requests.get(url)\n result = source.json()\n resp = request.post(another_url,json = result,headers) \n\n# in this situation, the loop might restart immediately resulting in an extra entry at the same second\n if time.time()-start_time_loop> 10: pass # there should be some small delay here, I guess! \n# if not, then just consider the processing time and anything else, like the delay in posting sometimes\n else : time.sleep(abs(10 - (time.time() - start_time_loop))) \n```\n\nYour code actually gave me an idea, why not make the delay first? But then I remembered that I need to consider the time it takes to process all that including posting time ( I am GETting from a sensor, so there's no delay there, but I am POSTing to a server). And as I don't have a timeout as you may have noticed, that it is due to the fact that I want TCP to take over and control here in case there's some read timeout. Because apparently, sometimes when a read timeout occurs, and retry, what happens is that I will will create another connection, and when the server finally responds, all the connections created will be received at the same time. But that's another problem that has been solved by eliminating the timeout. \n\nCurrently, I am trying to eliminate the double entries. It happened in the _comment_ where I mentioned the `requests.adapters`.\n\nBtw, you can say that this is basically my code, I am only using `requests` and `time` libraries. \n",
"I may have come across the reason why this is happening, which could mostly be a mistake from my part. Please don't spend more time on this. I will request for _topic closed_ once I verify. Please forgive me. \n",
"No need to apologise: bug reports opened in good faith are always fine. =)\n",
"@Lukasa I wanna ask though, how is `requests` sending the POST requests? My friend has a C code that sends with me to the same table and I just noticed that in one situation, he was sending while my code suddenly stopped sending for around a minute. I am guessing that my code got stuck in one POST request for that minute before it went through. We are both connected to the same network. \n\nIn this case, is there a way to make this more efficient? If I put a timeout, it will just create connections during that minute and then all of them will be received at the same time!\n",
"\n\nThis is an example from the database to what I mean. I am \"1.5-PC\" and he's the other one.\n",
"I found out that this issue has go nothing to do with the library. Thank you for your patience. I am still trying to figure out how to deal to the problem I mentioned in the last comment, should I open a new issue for it?\n",
"@itsHaddad you can use a timeout certainly. It looks like it would result in a connection error because connecting would timeout (perhaps). Please don't open an issue for how to properly use a timeout. StackOverflow can help you with that.\n",
"I understand. I tried the timeout previosuly but apparently in the case of a read timeout it would still keep trying to send for some time. Here @ www.python-requests.org/en/latest/api/#requests.Response.close it says `Note: Should not normally need to be called explicitly.` \n"
] |
https://api.github.com/repos/psf/requests/issues/2692
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2692/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2692/comments
|
https://api.github.com/repos/psf/requests/issues/2692/events
|
https://github.com/psf/requests/pull/2692
| 97,590,677 |
MDExOlB1bGxSZXF1ZXN0NDA5NjY1NTA=
| 2,692 |
Dealing with removal of Authorization header on redirect and diff domain
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7142475?v=4",
"events_url": "https://api.github.com/users/casahome2000/events{/privacy}",
"followers_url": "https://api.github.com/users/casahome2000/followers",
"following_url": "https://api.github.com/users/casahome2000/following{/other_user}",
"gists_url": "https://api.github.com/users/casahome2000/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/casahome2000",
"id": 7142475,
"login": "casahome2000",
"node_id": "MDQ6VXNlcjcxNDI0NzU=",
"organizations_url": "https://api.github.com/users/casahome2000/orgs",
"received_events_url": "https://api.github.com/users/casahome2000/received_events",
"repos_url": "https://api.github.com/users/casahome2000/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/casahome2000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/casahome2000/subscriptions",
"type": "User",
"url": "https://api.github.com/users/casahome2000",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-07-28T01:07:29Z
|
2021-09-08T07:00:48Z
|
2015-07-28T02:34:36Z
|
NONE
|
resolved
|
For services that require an Authorization header, all redirected calls to a different domain result in 403 Forbidden. Authorization should be preserved just as authentication is.
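The maintainers' reply cites CVE-2015-2296: requests deliberately drops the Authorization header when a redirect crosses to a different host, so credentials sent to one server cannot leak to another. A simplified sketch of that kind of cross-host check (this is not requests' actual `Session.should_strip_auth`, which additionally special-cases default http/https ports; the function name and structure here are illustrative):

``` python
from urllib.parse import urlparse

def should_strip_auth(old_url, new_url):
    """Return True when the Authorization header should be dropped.

    Credentials presented to one origin must not follow a redirect to
    a different hostname, scheme, or port.
    """
    old, new = urlparse(old_url), urlparse(new_url)
    return (old.hostname != new.hostname
            or old.scheme != new.scheme
            or old.port != new.port)
```

Same-origin redirects keep the header; anything else strips it.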
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2692/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2692/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2692.diff",
"html_url": "https://github.com/psf/requests/pull/2692",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2692.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2692"
}
| true |
[
"Sorry, we won't be accepting this. This is specifically written to fix a [vulnerability](http://www.cvedetails.com/cve/CVE-2015-2296/) that was reported to us recently. Without knowing your specific use-case I can't advise you on how to replicate the original behaviour, but if you post specifics to [our Q&A area](https://stackoverflow.com/questions/tagged/python-requests) you'll get an excellent answer.\n",
"Thanks for looking into this request as well as providing the reference as to why it's not a good idea. Appreciate the time.\n"
] |
https://api.github.com/repos/psf/requests/issues/2691
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2691/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2691/comments
|
https://api.github.com/repos/psf/requests/issues/2691/events
|
https://github.com/psf/requests/issues/2691
| 97,143,126 |
MDU6SXNzdWU5NzE0MzEyNg==
| 2,691 |
Can't upload files bigger than 2GB (streaming upload)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13482613?v=4",
"events_url": "https://api.github.com/users/tondy1/events{/privacy}",
"followers_url": "https://api.github.com/users/tondy1/followers",
"following_url": "https://api.github.com/users/tondy1/following{/other_user}",
"gists_url": "https://api.github.com/users/tondy1/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tondy1",
"id": 13482613,
"login": "tondy1",
"node_id": "MDQ6VXNlcjEzNDgyNjEz",
"organizations_url": "https://api.github.com/users/tondy1/orgs",
"received_events_url": "https://api.github.com/users/tondy1/received_events",
"repos_url": "https://api.github.com/users/tondy1/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tondy1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tondy1/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tondy1",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2015-07-24T20:43:40Z
|
2021-09-08T12:00:48Z
|
2015-07-24T20:59:27Z
|
NONE
|
resolved
|
Tried this code and it works flawlessly with files < 2GB, but as soon as I use files > 2GB it gives me an error.

``` python
e = MultipartEncoder(fields=dict(
    file=(rfbase, open(filename, 'rb'), 'text/plain')
))
m = MultipartEncoderMonitor(e, my_callback)
uploadInfo = requests.request('POST', uploadURL, data=m, headers={'Content-Type': m.content_type})
```

It gives me the following error:

``` python
  File "D:/Upload-Tool/req.py", line 86, in upload
    uploadInfo = requests.request("POST", uploadURL,data=m,headers={'Content-Type': m.content_type})
  File "C:\Python27\lib\site-packages\requests\api.py", line 50, in request
    response = session.request(method=method, url=url, **kwargs)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 444, in request
    data = data or {},
  File "C:\Python27\lib\site-packages\requests_toolbelt\multipart\encoder.py", line 292, in __len__
    return len(self.encoder)
  File "C:\Python27\lib\site-packages\requests_toolbelt\multipart\encoder.py", line 96, in __len__
    return self._len or self._calculate_length()
  File "C:\Python27\lib\site-packages\requests_toolbelt\multipart\encoder.py", line 111, in _calculate_length
    ) + boundary_len + 4
  File "C:\Python27\lib\site-packages\requests_toolbelt\multipart\encoder.py", line 110, in <genexpr>
    (boundary_len + len(p) + 4) for p in self.parts
  File "C:\Python27\lib\site-packages\requests_toolbelt\multipart\encoder.py", line 377, in __len__
    return len(self.headers) + super_len(self.body)
  File "C:\Python27\lib\site-packages\requests\utils.py", line 52, in super_len
    return len(o)
OverflowError: long int too large to convert to int
```

Please help if my code has some errors :)

edit: sorry, totally forgot to upgrade the toolbelt :) thx. a lot
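The comments pin this down: on Python 2, `len()` must return a value that fits a C int, so any `__len__` reporting more than ~2GB raises `OverflowError` (the toolbelt 0.4.0 fix, and the final commenter's workaround of a plain `len` attribute, both sidestep `__len__`). A hedged sketch of the general technique of sizing a stream without calling `len()` at all; the helper name `stream_length` is an assumption, not an API from requests or the toolbelt:

``` python
import io
import os

def stream_length(fileobj):
    """Remaining byte count of a seekable file-like object.

    seek/tell return plain integers of arbitrary size, so this works
    for bodies larger than 2GB where Python 2's len() protocol could
    not.
    """
    current = fileobj.tell()
    end = fileobj.seek(0, os.SEEK_END)   # jump to end, note the offset
    fileobj.seek(current)                # restore the read position
    return end - current
```

The read position is left untouched, so the measurement can happen right before streaming the body.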
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2691/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2691/timeline
| null |
completed
| null | null | false |
[
"That's a bizarre error. It seems like Python 2 doesn't like having len() called on really large files.\n",
"Seems like there's literally a requirement that `len()` not return anything that can't fit into an integer:\n\n``` python\n>>> class TestClass(object):\n... def __len__(self):\n... return 100000000000000000000000000000000L\n... \n>>> c = TestClass()\n>>> len(c)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nOverflowError: long int too large to convert to int\n```\n",
"I think in principle we could solve this by rearranging the calls in `super_len`, but that puts us at risk of breaking something else.\n",
"Hmm, on my system files in Python 2 have no length. So this may actually be a requests-toolbelt bug.\n",
"Aha, this is actually a duplicate of sigmavirus24/requests-toolbelt#80. Please upgrade the requests-toolbelt to 0.4.0.\n",
"Hello, i have the same beahviour but i don't understand the traceback ...\r\nFor information, the file size is 2715536587, i use my own upload handler with a __len__ method.\r\nRequests version: requests==2.13.0\r\n\r\n```python\r\nTraceback (most recent call last):\r\n File \"/myapp/tools/__init__.py\", line 103, in _wrap_run\r\n self._real_run()\r\n File \"/usr/lib/python2.7/threading.py\", line 763, in run\r\n self.__target(*self.__args, **self.__kwargs)\r\n File \"/myapp/tools/__init__.py\", line 118, in process_low\r\n task(*args)\r\n File \"/myapp/propagate.py\", line 188, in process_upload\r\n f.transferred, error = self._upload(f, rdiff=True if changed and self.psync.rsync_mode else False)\r\n File \"/myapp/propagate.py\", line 477, in _upload\r\n up.start()\r\n File \"/myapp/transfer.py\", line 290, in start\r\n read_timeout=30)\r\n File \"/myapp/http.py\", line 30, in wrapped\r\n return f(self, *args, **kwargs)\r\n File \"/myapp/http.py\", line 152, in post\r\n timeout=(self.connect_timeout, read_timeout))\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 535, in post\r\n return self.request('POST', url, data=data, json=json, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 467, in request\r\n data = data or {},\r\nOverflowError: long int too large to convert to int\r\n```",
"My guess is that this is happening because of the Boolean check on the data, asking for len and getting the wrong answer. What Python version is this?",
"Python 2.7.9 (default, Aug 13 2016, 17:56:53)\r\ni checked, the size returned by the len method is : 2715536587L",
"I tried to fix the evaluation in requests, for the variable data (with is None or not), but i am blocked again by the super len evaluation:\r\n\r\n```python\r\nTraceback (most recent call last):\r\n File \"/myapp/transfer.py\", line 288, in start\r\n read_timeout=30)\r\n File \"/myapp/http.py\", line 30, in wrapped\r\n return f(self, *args, **kwargs)\r\n File \"/myapp/http.py\", line 152, in post\r\n timeout=(self.connect_timeout, read_timeout))\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 535, in post\r\n return self.request('POST', url, data=data, json=json, **kwargs)\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 474, in request\r\n prep = self.prepare_request(req)\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 407, in prepare_request\r\n hooks=merge_hooks(request.hooks, self.hooks),\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/models.py\", line 305, in prepare\r\n self.prepare_body(data, files, json)\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/models.py\", line 471, in prepare_body\r\n length = super_len(data)\r\n File \"/usr/local/lib/python2.7/dist-packages/requests/utils.py\", line 55, in super_len\r\n total_length = len(o)\r\nOverflowError: long int too large to convert to int\r\n)\r\n```",
"Resolved by deleting __len__ method and set len as class attribute"
] |
https://api.github.com/repos/psf/requests/issues/2690
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2690/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2690/comments
|
https://api.github.com/repos/psf/requests/issues/2690/events
|
https://github.com/psf/requests/issues/2690
| 97,054,690 |
MDU6SXNzdWU5NzA1NDY5MA==
| 2,690 |
Broken commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/35176?v=4",
"events_url": "https://api.github.com/users/anatolyborodin/events{/privacy}",
"followers_url": "https://api.github.com/users/anatolyborodin/followers",
"following_url": "https://api.github.com/users/anatolyborodin/following{/other_user}",
"gists_url": "https://api.github.com/users/anatolyborodin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/anatolyborodin",
"id": 35176,
"login": "anatolyborodin",
"node_id": "MDQ6VXNlcjM1MTc2",
"organizations_url": "https://api.github.com/users/anatolyborodin/orgs",
"received_events_url": "https://api.github.com/users/anatolyborodin/received_events",
"repos_url": "https://api.github.com/users/anatolyborodin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/anatolyborodin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/anatolyborodin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/anatolyborodin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-07-24T13:37:04Z
|
2021-08-27T00:08:22Z
|
2015-07-24T13:47:28Z
|
NONE
|
resolved
|
"git fsck" thinks [this commit](https://github.com/kennethreitz/requests/commit/5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8) is broken:
```
error in commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8: invalid author/committer line - bad time zone
```
```
commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8
tree 0d1971c86d047f9530aee85a50b8fc1ca12723ba
parent 9471b0ab889a4684f87952cb951f7f4dc3f59dda
author Shrikant Sharat Kandula <...> 1313584730 +051800
committer Shrikant Sharat Kandula <...> 1313584730 +051800
```
Would it be possible to repair the history?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2690/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2690/timeline
| null |
completed
| null | null | false |
[
"No it isn't possible to repair the history. That will change the SHA value of every commit after it and update the timestamps on ALL of them. It will invalidate every tag since that commit and it will cause far too many headaches.\n",
"To be clear, it is _technically_ possible to repair this, also if repairing it didn't affect the repository's history from that point on, I would be very much in favor of it, but it irrevocably breaks expectations around our history and will cause more trouble than not. You appear to be the first person to have checked this and found it to be an issue.\n\nEven if `git fsck` reports the commit as being damaged, `git` continues to work just fine. This doesn't realistically affect the use of the repository or any person's ability to contribute to it. I appreciate you reporting this, but it simply isn't something we're going to fix.\n",
"> You appear to be the first person to have checked this and found it to be an issue.\n\nOh, really? I thought I'm just too lazy to find an old and closed issue.\n\nI just happen to have \"transfer.fsckObjects = true\" in my ~/.gitignore...\n\n@sigmavirus24, yep, this looks like a very popular repo. Forking just for one commit would be a solution, but... Ok, I'll just disable the checks in .git/config after the checkout.\n",
"@anatolyborodin Recent versions of git (`master`, but not yet in any released version) have support for configurable fsck checks. So you will be able to set `receive.fsck.badtimezone = warn`, for example, if you would like the usual checks, but would prefer to ignore this particular (minor) problem.\n"
] |
https://api.github.com/repos/psf/requests/issues/2689
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2689/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2689/comments
|
https://api.github.com/repos/psf/requests/issues/2689/events
|
https://github.com/psf/requests/issues/2689
| 97,039,521 |
MDU6SXNzdWU5NzAzOTUyMQ==
| 2,689 |
convert requests command
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6317026?v=4",
"events_url": "https://api.github.com/users/a4fr/events{/privacy}",
"followers_url": "https://api.github.com/users/a4fr/followers",
"following_url": "https://api.github.com/users/a4fr/following{/other_user}",
"gists_url": "https://api.github.com/users/a4fr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/a4fr",
"id": 6317026,
"login": "a4fr",
"node_id": "MDQ6VXNlcjYzMTcwMjY=",
"organizations_url": "https://api.github.com/users/a4fr/orgs",
"received_events_url": "https://api.github.com/users/a4fr/received_events",
"repos_url": "https://api.github.com/users/a4fr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/a4fr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/a4fr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/a4fr",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| true | null |
[] | null | 1 |
2015-07-24T12:11:49Z
|
2021-09-08T09:00:45Z
|
2015-07-24T12:52:45Z
|
NONE
|
resolved
|
I'm using the `requests` module in my app, but when I run it on Google App Engine it raises an **SSL Connection** exception:
```
File "/base/data/home/apps/s~myapp/1.385939809966824449/lib/requests/adapters.py", line 429, in send
raise SSLError(e, request=request)
SSLError: Can't connect to HTTPS URL because the SSL module is not available.
```
I googled this problem and read some answers saying the SSL module is not on the _GAE white list_.
I can send a simple GET request with `urllib`, and that works fine.
What is equivalent of `requests.request('GET', request_url, params=params, files=files)` in `urllib`?
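For reference, a minimal standard-library sketch of the GET-with-query-string part (the URL and params below are placeholders; the `files=` multipart upload has no direct `urllib` equivalent and is not covered here):

```python
# Sketch: GET with query parameters using only the standard library.
# Works on both Python 2 and 3; multipart file uploads are NOT handled.
try:  # Python 3
    from urllib.parse import urlencode
    from urllib.request import urlopen
except ImportError:  # Python 2
    from urllib import urlencode, urlopen

request_url = 'https://example.com/api'  # placeholder URL
params = {'key': 'value'}

# requests builds this query string for you; with urllib it is manual:
full_url = request_url + '?' + urlencode(params)
# response = urlopen(full_url)  # performs the GET
# body = response.read()
```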
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2689/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2689/timeline
| null |
completed
| null | null | false |
[
"Requests does not consider Google App Engine a supported platform at this time, so I'm afraid we can't help you here. I recommend asking your question on StackOverflow. =) Sorry!\n"
] |
https://api.github.com/repos/psf/requests/issues/2688
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2688/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2688/comments
|
https://api.github.com/repos/psf/requests/issues/2688/events
|
https://github.com/psf/requests/issues/2688
| 96,968,182 |
MDU6SXNzdWU5Njk2ODE4Mg==
| 2,688 |
When path of requests contains non-ascii chars, it failed loading ca_certs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1819074?v=4",
"events_url": "https://api.github.com/users/yyjdelete/events{/privacy}",
"followers_url": "https://api.github.com/users/yyjdelete/followers",
"following_url": "https://api.github.com/users/yyjdelete/following{/other_user}",
"gists_url": "https://api.github.com/users/yyjdelete/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/yyjdelete",
"id": 1819074,
"login": "yyjdelete",
"node_id": "MDQ6VXNlcjE4MTkwNzQ=",
"organizations_url": "https://api.github.com/users/yyjdelete/orgs",
"received_events_url": "https://api.github.com/users/yyjdelete/received_events",
"repos_url": "https://api.github.com/users/yyjdelete/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/yyjdelete/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yyjdelete/subscriptions",
"type": "User",
"url": "https://api.github.com/users/yyjdelete",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 28 |
2015-07-24T05:20:07Z
|
2021-09-05T00:07:01Z
|
2019-02-14T14:06:18Z
|
NONE
|
resolved
|
Tested with Python 3.4.3, requests 2.6.0 (used by [wakatime](https://github.com/wakatime/wakatime)), Windows 7.
Also tried with requests 2.7.0.
```
Traceback (most recent call last):
File "cli.py", line 18, in <module>
sys.exit(wakatime.main(sys.argv))
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\base.py", line 460, in main
if send_heartbeat(**kwargs):
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\base.py", line 377, in send_heartbeat
proxies=proxies)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\sessions.py", line 507, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\sessions.py", line 464, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\adapters.py", line 370, in send
timeout=timeout
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\packages\urllib3\connectionpool.py", line 544, in urlopen
body=body, headers=headers)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\packages\urllib3\connectionpool.py", line 341, in _make_request
self._validate_conn(conn)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\packages\urllib3\connectionpool.py", line 762, in _validate_conn
conn.connect()
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\packages\urllib3\connection.py", line 238, in connect
ssl_version=resolved_ssl_version)
File "C:\Users\**\AppData\Roaming\Microsoft Corporation\Microsoft\xae Visual Studio\xae 2015\14.0.23107.0\wakatime-master\wakatime\packages\requests\packages\urllib3\util\ssl_.py", line 256, in ssl_wrap_socket
context.load_verify_locations(ca_certs)
TypeError: cafile should be a valid filesystem path
```
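As the discussion below works out, a Python 3 workaround is to pass the path as UTF-8 encoded bytes, which OpenSSL's BIO layer recognizes even when the filesystem encoding (mbcs here) cannot represent the path. A hedged sketch — `load_ca_bundle` is a hypothetical helper name, not part of requests:

```python
def load_ca_bundle(context, cafile):
    """Load a CA bundle into an SSLContext, retrying with a UTF-8 byte
    path if the filesystem encoding cannot represent it (hypothetical
    helper illustrating the workaround from this thread)."""
    try:
        context.load_verify_locations(cafile=cafile)
    except TypeError:
        # e.g. "cafile should be a valid filesystem path" on Windows/mbcs;
        # OpenSSL detects UTF-8 encoded byte paths.
        context.load_verify_locations(cafile=cafile.encode('utf-8'))
```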
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3725578?v=4",
"events_url": "https://api.github.com/users/Projjol-zz/events{/privacy}",
"followers_url": "https://api.github.com/users/Projjol-zz/followers",
"following_url": "https://api.github.com/users/Projjol-zz/following{/other_user}",
"gists_url": "https://api.github.com/users/Projjol-zz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Projjol-zz",
"id": 3725578,
"login": "Projjol-zz",
"node_id": "MDQ6VXNlcjM3MjU1Nzg=",
"organizations_url": "https://api.github.com/users/Projjol-zz/orgs",
"received_events_url": "https://api.github.com/users/Projjol-zz/received_events",
"repos_url": "https://api.github.com/users/Projjol-zz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Projjol-zz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Projjol-zz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Projjol-zz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2688/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2688/timeline
| null |
completed
| null | null | false |
[
"OS language:zh-cn, default encoding: GBK\n",
"Can you show me some sample code please? I want to see if you're using a unicode string or a byte string.\n",
"Just an simple code\n\n``` python\n#!/usr/bin/env python\n#fileencoding: utf-8\n\nimport sys\nfrom packages import requests\n\nr = requests.get('https://www.github.com/')\nsys.stdout.write(r.text)\n\n```\n\n---\n\nC:\\Python34\\python.exe --version\n`Python 3.4.3`\n\n---\n\nC:\\Users**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft® Visual Studio® 2015\\14.0.23107.0\\wakatime-master\\wakatime>C:\\Python34\\python.exe test.py\n\n```\nTraceback (most recent call last):\n File \"test.py\", line 7, in <module>\n r = requests.get('https://www.github.com/')\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\api.py\",line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\api.py\",line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\adapters.py\", line 371, in send\n timeout=timeout\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\packages\\urllib3\\connectionpool.py\", line 544, in urlopen\n 
body=body, headers=headers)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\packages\\urllib3\\connectionpool.py\", line 341, in _make_request\n self._validate_conn(conn)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\packages\\urllib3\\connectionpool.py\", line 761, in _validate_conn\n conn.connect()\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\packages\\urllib3\\connection.py\", line 238, in connect\n ssl_version=resolved_ssl_version)\n File \"C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft\\xae Visual Studio\\xae 2**5\\14.0.23107.0\\wakatime-master\\wakatime\\packages\\requests\\packages\\urllib3\\util\\ssl_.py\", line 267, in ssl_wrap_socket\n context.load_verify_locations(ca_certs)\nTypeError: cafile should be a valid filesystem path\n```\n\n\"**\" is my userName and be removed, \"\\xae\" is \"®\" in the filesystem.\n\n---\n\nAnd when move folder `C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft® Visual Studio® 2015\\14.0.23107.0\\wakatime-master\\wakatime` to `C:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\wakatime-master\\wakatime`(without non-ascii ®), it works well.\n",
"Ah, I see, the problem is that there's a non-ascii character in the location that requests is installed in. That makes sense.\n\nSo the path we generate is generated by doing `os.path.join(os.path.dirname(__file__), 'cacert.pem')`. Given that requests hasn't touched this at all, it looks like it's possible to get this wrong.\n\nI'd like you to quickly `cd` into the directory that wakatime is in, and run the following in a Python interpreter and show me the output (censoring your username):\n\n``` python\n>>> import os\n>>> import os.path\n>>> os.getcwd()\n>>> os.path.dirname(os.getcwd())\n```\n\nRight now this looks to me like the SSL library is not capable of handling Windows paths properly. @tiran?\n",
"```\nC:\\Users\\**\\AppData\\Roaming\\Microsoft Corporation\\Microsoft® Visual Studio® 2015\\14.0.23107.0\\wakatime-master\\wakatime>C:\\Python34\\python.exe\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600 64 bit (AMD64)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import os\n>>> import os.path\n>>> os.getcwd()\n'C:\\\\Users\\\\**\\\\AppData\\\\Roaming\\\\Microsoft Corporation\\\\Microsoft\\xae Visual Studio\\xae 2015\\\\14.0.23107.0\\\\wakatime-master\\\\wakatime'\n>>> os.path.dirname(os.getcwd())\n'C:\\\\Users\\\\**\\\\AppData\\\\Roaming\\\\Microsoft Corporation\\\\Microsoft\\xae Visual Studio\\xae 2015\\\\14.0.23107.0\\\\wakatime-master'\n>>>\n```\n",
"Right, so this error seems to be entirely in the stdlib. For some reason we're not able to pass a path that we got from the stdlib _to_ the stdlib. I'm hoping that when @tiran wakes up he'll confirm that I'm right, and then we can chase this up in the stdlib.\n",
"The SSL module uses PyUnicode_FSConverter() to convert the path from unicode to bytes using the FS default encoding. What's the output of \n\n```\n>>> import os, sys\n>>> sys.getfilesystemencoding()\n>>> os.path.abspath(os.getcwd()).encode(sys.getfilesystemencoding())\n```\n\non your system?\n\nThe problem might be another step further down the stack. OpenSSL accepts one char arguments. It has no wide char variants for e.g. file names. BIO_new_file() has some magic to detect UTF-8 encoded bytes. You could get lucky if you encode the file name from unicode to UTF-8 first.\n",
"Inside a Windows8 VM I was able to create a folder named `test®` and run some tests using kennethreitz/requests@3385c789991197d9b08da1de203615f6f40c2738.\n\n```\nC:\\testr>\\Python34\\python.exe\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:43:06) [MSC v.1600 32 bit (In\ntel)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import sys, os\n>>> os.getcwd()\n'C:\\\\test\\xae'\n>>> sys.getfilesystemencoding()\n'mbcs'\n>>> os.path.abspath(os.getcwd()).encode(sys.getfilesystemencoding())\nb'C:\\\\test\\xae'\n>>>\n```\n\nHowever, I'm not able to reproduce the cafile `TypeError`. The request to github succeeds, but when printing the response content I get a `UnicodeEncodeError`:\n\n```\nC:\\testr>\\Python34\\python.exe\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:43:06) [MSC v.1600 32 bit (In\ntel)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> r = requests.get('https://www.github.com/')\n>>> import sys\n>>> sys.stdout.write(r.text)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"C:\\Python34\\lib\\encodings\\cp437.py\", line 19, in encode\n return codecs.charmap_encode(input,self.errors,encoding_map)[0]\nUnicodeEncodeError: 'charmap' codec can't encode character '\\u2019' in position\n10879: character maps to <undefined>\n>>>\n```\n\nMy first guess is we are using a different filesystem encoding.\n",
"Tested on another PC with Win10\nC:\\Users\\*********\\AppData\\Roaming\\Microsoft Corporation\\Microsoft® Visual Studio® 2015\\14.0.23107.0\\wakatime-masteratime>\"C:\\Python34\\python\"\n\n``` python\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600 64 bit (AMD64)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import os, sys, locale\n>>> sys.getfilesystemencoding()\n'mbcs'\n>>> sys.getdefaultencoding()\n'utf-8'\n>>> os.path.abspath(os.getcwd()).encode(sys.getfilesystemencoding())\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nUnicodeEncodeError: 'mbcs' codec can't encode characters in position 0--1: invalid character\n>>> locale.getdefaultlocale()\n('zh_CN', 'cp936')\n>>> locale.getpreferredencoding()\n'cp936'\n>>> os.path.abspath(os.getcwd()).encode('cp936')#also known as gbk\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nUnicodeEncodeError: 'gbk' codec can't encode character '\\xae' in position 66: illegal multibyte sequence\n```\n",
"Well, this is tricky. That's a real mess of encodings, none of which are going to play real nice together.\n",
"## I wonder if we should page Nick Coghlan and/or Steve Dower. I have no clue how we might handle this\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Yeah, good idea: I paged Nick.\n",
"My first suggestion is that any experimentation with this be undertaken using either IPython Notebook or with https://pypi.python.org/pypi/win_unicode_console enabled. Otherwise you're going to run into confusingly unrelated issues related to the console's stdin and stdout handling that may obscure the underlying issues in interacting with OpenSSL (if you _do_ use win_unicode_console, feedback on the related CPython tracker issues would be appreciated - it would be nice to see that as the default console behaviour in 3.6).\n\nMy second suggestion would be to try _something else_ that uses OpenSSL on Windows (MinGW, perhaps?) and see if it can open that cert file in that location. If OpenSSL is still internally using the 8-bit mbcs Windows APIs rather than the UTF-16-LE ones, then it may be necessary to find a way load the cert bundle that lets Python code take care of the path handling.\n\nIn relation to that, the ssl module code in 3.4+ and 2.7.9+ to load certs from the Windows cert store may be relevant, although Christian Heimes & Steve Dower would be better folks to ask about that.\n",
"Oops, @tiran already confirmed that OpenSSL only accepts 8-bit paths, so there's no need to experiment to determine that.\n\nThe issues addressed by win_unicode_console are still a potentially confounding factor in some of the experiments above.\n\nAt this point, I'd say the suggestion of force encoding the cert path to UTF-8 (or UTF-8-BOM?) to try to trigger OpenSSL's encoding detection may be worth trying, but it will likely only work in this case if that actually prompts OpenSSL to switch to using the UTF-16-LE Windows APIs.\n",
"path with remote chars success with path.encode('utf8')\n\nTest with an path that include ACSII + cp437??(remote) encoding chars(®) + cp936(local) encoding chars(中文)\n\n```\nC:\\test®中文>C:\\Python34\\py.exe\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600 64 bit (AMD64)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import os, sys, locale\n>>> os.getcwd()\n'C:\\\\test\\xae中文'\n>>> sys.getdefaultencoding()\n'utf-8'\n>>> sys.getfilesystemencoding()\n'mbcs'\n>>> locale.getdefaultlocale()\n('zh_CN', 'cp936')\n>>> locale.getpreferredencoding()\n'cp936'\n>>> import win_unicode_console\n>>> win_unicode_console.enable()\n>>> os.getcwd()\n'C:\\\\test®中文'\n```\n\nput cacert.pem at 'C:\\test®中文\\'\n\n```\nPython 3.4.3 (v3.4.3:9b73f1c3e601, Feb 24 2015, 22:44:40) [MSC v.1600 64 bit (AMD64)] on win32\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import os, ssl, sys\n>>> origpath = os.getcwd()\n>>> origpath\n'C:\\\\Python34'\n>>> relpath = 'cacert.pem'\n>>> dir = 'C:\\\\test\\xae中文\\\\'\n>>> abspath = dir + relpath\n>>>\n>>> from ssl import SSLContext\n>>> context = SSLContext(ssl.PROTOCOL_SSLv23)\n>>> #failed with abspath\n... context.load_verify_locations(abspath)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nTypeError: cafile should be a valid filesystem path\n>>> #but work with encode('utf8')\n... context.load_verify_locations(abspath.encode('utf8'))\n>>> #also work with encode('utf8').decode('mbcs')\n... context.load_verify_locations(abspath.encode('utf8').decode('mbcs'))\n>>> abspath.encode('utf8').decode('mbcs')\n'C:\\\\test庐涓\\ue15f枃\\\\'\n>>> #also work with relpath if relpath is ASCII, but os.chdir does not need encode??\n... 
os.chdir(dir)\n>>> context.load_verify_locations(relpath)\n```\n\nAnd if there is no 'C:\\test®中文\\cacert.pem', it will try loading'C:\\test庐涓\\ue15f枃\\cacert.pem'(tested with monitor)\nFor path with only ASCII and local encoding chars, it also works without encode('utf8'), and seems do not make an fallback\n",
"So, is the suggested workaround that we should consider paths unicode strings? Is that valid? Can we always assume that `sys.getfilesystemencoding()` will be right?\n",
"We also have to remember Python2.\n`str` in Python3 is unicode, so calling `.encode('utf-8')` on a `str` in Python3 works, but turns that `str` into `bytes`.\n`str` in Python2 is a [string literal](https://docs.python.org/2/reference/lexical_analysis.html#strings) and needs to be decoded into unicode before it can be encoded. Running those tests in Python2 should raise a `UnicodeDecodeError` when calling `.encode('utf-8')`.\n\nTo support Python2, we need to:\n- detect if running in Python2 from `is_py2` in [requests/compat.py](https://github.com/kennethreitz/requests/blob/master/requests/compat.py#L19)\n- if is_py2, decode the `str` using the correct character encoding\n- encode as unicode with `.encode('utf-8')`\n\nI'm not sure the correct character encoding to use when decoding `str`, does anyone know? Should be one from `sys.getdefaultencoding()`, `sys.getfilesystemencoding()`, or `locale.getdefaultlocale()`.\n",
"It seems python2 can't recognize path with non-local encoding chars and replace them with '?', which can be recovered.\nTested with Portable Python 2.7.3.2\n`Python 2.7.3 (default, Apr 10 2012, 23:31:26) [MSC v.1500 32 bit (Intel)] on win32`\n",
"@yyjdelete we have a solution for Python3.4 (always encode as utf-8) but do we have a working solution for Python2? Can you run these tests on Python2:\n\n``` python\nimport ssl, sys\nfrom ssl import SSLContext\n\ncontext = SSLContext(ssl.PROTOCOL_SSLv23)\ncontext.load_verify_locations(abspath)\ncontext.load_verify_locations(abspath.decode(sys.getfilesystemencoding()))\ncontext.load_verify_locations(abspath.decode(sys.getfilesystemencoding()).encode('utf8'))\n```\n",
"I made an mistake. In python2, `os.getcwdu()` and unicode string should be used instead of normal version, so we can still get the path.\n\n@alanhamlett \nIt's hard to test this case, for old python which doesn't has `ssl.SSLContext`(like python 2.7), wrapper in `urllib3/util/ssl_.py` is used instead, which doesn't crash with `load_verify_locations`\n\n---\n\nBTW: Is `os.path.supports_unicode_filenames` should be considered for the case?\n\nSorry for my poor English, and I'm not an python programmer.\n",
"Python 2.7.9+ includes the newer SSL module backported from Python 3.4, and may hit the same problem as the Python 3 version.\n",
"I'm not sure if it really works well with python2 without newer SSL module, because I didn't find an way to run .py with python2 at the path, for that python2 seems not use unicode to load .py and can not recognize the file with remote encoding.\nMaybe I will do some test in interpreter later.\n\n---\n\nTested(NOTE: in fact Python2 not work with .py, only some test with interpreter)\n(Portable )Python 2.7.3.2(without newer SSL module): Failed, only 'ascii' is accepted.\nPython 2.7.10: work with local encoding/unicode(the same as 3.4)\n",
"Ah, OK - Python 2 not running at all from that path is actually what I expected, but I interpreted some of your earlier comments as indicating it was working.\n\nThere are all sorts of \"ASCII only\" assumptions in the Python 2 import machinery and startup code that were only addressed properly through the more pervasive adoption of Unicode in Python 3.\n",
"@yyjdelete what is the value returned from `os.path.supports_unicode_filenames` on your machine?\n",
"@alanhamlett\n`os.path.supports_unicode_filenames` is `True` in both py2 and py3\n",
"This overwritten `where` function got rid of the `TypeError: cafile should be a valid filesystem path` exception for me:\n\nhttps://github.com/wakatime/wakatime/blob/bba2015d9b6797f8eca94fef6dbd964c27734891/wakatime/packages/requests/certs.py#L25\n\nTested using a Windows 8.1 virtual machine with Simpified Chinese language. Was able to reproduce the `TypeError` exception only using Python 3.x, when testing with Python 2.x the original code worked.\n",
"Reproduced that problem on Windows 8.1 x64 with Python 3.5.1.\n\nI had a pem file at the following path: `C:\\Users\\غازي\\AppData\\Local\\Temp\\_غازي_70e5wbxo\\cacert.pem`.\n\n```\nc = ssl.SSLContext(ssl.PROTOCOL_TLSv1_2)\n\nc.load_verify_locations(cafile=r\"C:\\Users\\غازي\\AppData\\Local\\Temp\\_غازي_70e5wbxo\\cacert.pem\")\n> TypeError: cafile should be a valid filesystem path\n\nc.load_verify_locations(cafile=r\"C:\\Users\\غازي\\AppData\\Local\\Temp\\_غازي_70e5wbxo\\cacert.pem\".encode(sys.getfilesystemencoding()))\n> UnicodeEncodeError: 'mbcs' codec can't encode characters in positions 0--1: invalid character\n\nc.load_verify_locations(cafile=r\"C:\\Users\\غازي\\AppData\\Local\\Temp\\_غازي_70e5wbxo\\cacert.pem\".encode('utf-8'))\n> ok\n```\n",
"Closing due to inactivity."
] |
https://api.github.com/repos/psf/requests/issues/2687
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2687/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2687/comments
|
https://api.github.com/repos/psf/requests/issues/2687/events
|
https://github.com/psf/requests/issues/2687
| 96,840,115 |
MDU6SXNzdWU5Njg0MDExNQ==
| 2,687 |
Upgrade to urllib3 1.11 to fix connection pool exhaustion
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/887132?v=4",
"events_url": "https://api.github.com/users/rrva/events{/privacy}",
"followers_url": "https://api.github.com/users/rrva/followers",
"following_url": "https://api.github.com/users/rrva/following{/other_user}",
"gists_url": "https://api.github.com/users/rrva/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rrva",
"id": 887132,
"login": "rrva",
"node_id": "MDQ6VXNlcjg4NzEzMg==",
"organizations_url": "https://api.github.com/users/rrva/orgs",
"received_events_url": "https://api.github.com/users/rrva/received_events",
"repos_url": "https://api.github.com/users/rrva/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rrva/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rrva/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rrva",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-07-23T15:28:22Z
|
2021-09-08T23:00:46Z
|
2015-07-23T15:44:48Z
|
NONE
|
resolved
|
An important fix has been made to urllib3 which fixes http connection pool exhaustion when connections fail:
https://github.com/shazow/urllib3/issues/644
Can you release a new version which uses urllib3 1.11?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2687/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2687/timeline
| null |
completed
| null | null | false |
[
"We plan to: see #2678.\n\nFor future reference, two of the three requests core developers are also urllib3 core developers. urllib3 does not sneakily release software without us noticing. ;)\n",
"## I'm not a core developer on urllib3. :-P... Or is Kenneth?\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Pfft, sure you are.\n",
"PR #2678 is currently using a checkout of urrlib3 from some commit that is slightly earlier than 1.11. I understand 1.11 wasn't released yet when the PR was first created. But since I don't see any comments on that PR indicating either way, I was wondering: will you be updating the PR to use the released version, or is there a reason the PR is still using the checkout that it's using?\n",
"We will update the PR. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2686
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2686/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2686/comments
|
https://api.github.com/repos/psf/requests/issues/2686/events
|
https://github.com/psf/requests/issues/2686
| 96,485,548 |
MDU6SXNzdWU5NjQ4NTU0OA==
| 2,686 |
Misleading POST documentation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/186461?v=4",
"events_url": "https://api.github.com/users/melbic/events{/privacy}",
"followers_url": "https://api.github.com/users/melbic/followers",
"following_url": "https://api.github.com/users/melbic/following{/other_user}",
"gists_url": "https://api.github.com/users/melbic/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/melbic",
"id": 186461,
"login": "melbic",
"node_id": "MDQ6VXNlcjE4NjQ2MQ==",
"organizations_url": "https://api.github.com/users/melbic/orgs",
"received_events_url": "https://api.github.com/users/melbic/received_events",
"repos_url": "https://api.github.com/users/melbic/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/melbic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/melbic/subscriptions",
"type": "User",
"url": "https://api.github.com/users/melbic",
"user_view_type": "public"
}
|
[
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
},
{
"color": "fad8c7",
"default": false,
"description": null,
"id": 136616769,
"name": "Documentation",
"node_id": "MDU6TGFiZWwxMzY2MTY3Njk=",
"url": "https://api.github.com/repos/psf/requests/labels/Documentation"
}
] |
closed
| true | null |
[] | null | 3 |
2015-07-22T06:32:15Z
|
2021-09-08T23:00:41Z
|
2015-08-14T19:30:04Z
|
NONE
|
resolved
|
The GitHub example at [#more-complicated-post-requests](http://docs.python-requests.org/en/latest/user/quickstart/#more-complicated-post-requests) is very misleading. It would be great to mention the `json` parameter there. As it stands, to find out what it does you have to navigate all the way to the `PreparedRequest` class.
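For illustration, the `json` parameter serializes the passed object with `json.dumps()` and sets the `Content-Type` header; preparing a request locally shows this without any network call (the URL is a placeholder):

```python
import json
import requests

payload = {'name': 'example', 'id': 1}

# json= serializes the dict and sets the Content-Type header automatically.
prepared = requests.Request('POST', 'https://example.com/api', json=payload).prepare()

print(prepared.headers['Content-Type'])      # application/json
print(json.loads(prepared.body) == payload)  # True
```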
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2686/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2686/timeline
| null |
completed
| null | null | false |
[
"The documentation should be something like:\n`:param json: (optional) json data, the passed in object gets serialized to json (json.dumps()) and the content type of the request is set to 'application/json'.`\n",
"@melbic Yeah, sorry, that documentation was written before the `json` parameter was added.\n\nWould you like to propose a pull request with some new wording?\n",
"I would rather that pull request be non-destructive. In other words, please do not remove what's already there, merely add. Many people use the \"latest\" documentation against older versions of requests. We don't really maintain old versions of the documentation so we need the documentation to be somewhat backwards compatible. The section you mention is exactly that. **Adding** information about the `json` parameter is fine, but please note the version for that section so people know \"I'm on 2.2.1 installed on Ubuntu 14.04, because the package archives are so wildly out of date, I can't use that feature because it was introduced in 2.4.0 (or whenever we actually introduced it).\"\n"
] |
https://api.github.com/repos/psf/requests/issues/2685
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2685/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2685/comments
|
https://api.github.com/repos/psf/requests/issues/2685/events
|
https://github.com/psf/requests/issues/2685
| 96,444,600 |
MDU6SXNzdWU5NjQ0NDYwMA==
| 2,685 |
[Request] Simple argument to set an overall timeout for the request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7551671?v=4",
"events_url": "https://api.github.com/users/boolbag/events{/privacy}",
"followers_url": "https://api.github.com/users/boolbag/followers",
"following_url": "https://api.github.com/users/boolbag/following{/other_user}",
"gists_url": "https://api.github.com/users/boolbag/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/boolbag",
"id": 7551671,
"login": "boolbag",
"node_id": "MDQ6VXNlcjc1NTE2NzE=",
"organizations_url": "https://api.github.com/users/boolbag/orgs",
"received_events_url": "https://api.github.com/users/boolbag/received_events",
"repos_url": "https://api.github.com/users/boolbag/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/boolbag/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/boolbag/subscriptions",
"type": "User",
"url": "https://api.github.com/users/boolbag",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-22T00:15:38Z
|
2021-09-08T23:00:47Z
|
2015-07-22T06:02:45Z
|
NONE
|
resolved
|
The `timeout` parameter is useful but does not provide a way to set an overall timeout for the request.
But in keeping with _simple is beautiful_, it would be terrific if we could do:
`requests.get(url, overall_timeout=10)`
Thanks in advance for considering it. :)
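One way to approximate an overall timeout from outside the library is to run the call in a worker thread and bound the wait. This is a sketch, not part of the requests API, and note the caveat that the abandoned worker keeps running in the background:

```python
import concurrent.futures

def with_overall_timeout(func, timeout, *args, **kwargs):
    """Run func(*args, **kwargs) but give up after `timeout` seconds.

    Note: the worker thread is not killed on timeout; it is merely abandoned.
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(func, *args, **kwargs)
    try:
        return future.result(timeout=timeout)  # raises TimeoutError when late
    finally:
        pool.shutdown(wait=False)

# Usage with requests would look like:
#   resp = with_overall_timeout(requests.get, 10, url)
```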
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7551671?v=4",
"events_url": "https://api.github.com/users/boolbag/events{/privacy}",
"followers_url": "https://api.github.com/users/boolbag/followers",
"following_url": "https://api.github.com/users/boolbag/following{/other_user}",
"gists_url": "https://api.github.com/users/boolbag/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/boolbag",
"id": 7551671,
"login": "boolbag",
"node_id": "MDQ6VXNlcjc1NTE2NzE=",
"organizations_url": "https://api.github.com/users/boolbag/orgs",
"received_events_url": "https://api.github.com/users/boolbag/received_events",
"repos_url": "https://api.github.com/users/boolbag/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/boolbag/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/boolbag/subscriptions",
"type": "User",
"url": "https://api.github.com/users/boolbag",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2685/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2685/timeline
| null |
completed
| null | null | false |
[
"As a workaround, the `@stopit.threading_timeoutable` seems to work.\n"
] |
https://api.github.com/repos/psf/requests/issues/2684
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2684/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2684/comments
|
https://api.github.com/repos/psf/requests/issues/2684/events
|
https://github.com/psf/requests/issues/2684
| 96,443,817 |
MDU6SXNzdWU5NjQ0MzgxNw==
| 2,684 |
[Request] Simple argument to limit the size of a downloaded resource
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7551671?v=4",
"events_url": "https://api.github.com/users/boolbag/events{/privacy}",
"followers_url": "https://api.github.com/users/boolbag/followers",
"following_url": "https://api.github.com/users/boolbag/following{/other_user}",
"gists_url": "https://api.github.com/users/boolbag/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/boolbag",
"id": 7551671,
"login": "boolbag",
"node_id": "MDQ6VXNlcjc1NTE2NzE=",
"organizations_url": "https://api.github.com/users/boolbag/orgs",
"received_events_url": "https://api.github.com/users/boolbag/received_events",
"repos_url": "https://api.github.com/users/boolbag/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/boolbag/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/boolbag/subscriptions",
"type": "User",
"url": "https://api.github.com/users/boolbag",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2015-07-22T00:08:51Z
|
2021-09-08T23:00:46Z
|
2015-07-22T04:17:57Z
|
NONE
|
resolved
|
When we use `stream=True` to cap the size of the response, we lose all the bells and whistles of the response object once the response is consumed. For instance, we can no longer return the response to the caller then inspect `status_code`.
In keeping with _simple is beautiful_, it would be terrific if we could do:
`requests.get(url, max_response_size=1024*1024)`
Thanks in advance for considering it. :)
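Separating the size check from the network call keeps the logic testable; `read_capped` below is a hypothetical helper, not a requests API:

```python
def read_capped(chunks, max_size):
    # Accumulate an iterable of byte chunks, refusing to grow past max_size.
    content = b''
    for chunk in chunks:
        content += chunk
        if len(content) > max_size:
            raise ValueError('response exceeded %d bytes' % max_size)
    return content

# With requests this would be driven by a streamed response, e.g.:
#   r = requests.get(url, stream=True)
#   body = read_capped(r.iter_content(1024), max_size=1024 * 1024)
```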
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2684/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2684/timeline
| null |
completed
| null | null | false |
[
"This is a duplicate of https://github.com/kennethreitz/requests/issues/2327\n\nPlease use GitHub's issue search before filing future feature requests.\n",
"@sigmavirus24 the link you posted is interesting, but I don't see that it addresses:\n\n> For instance, we can no longer return the response to the caller then inspect status_code.\n\nThe linked issue had an insightful explanation by @jvanasco of why this feature is needed, then no response... The issue was closed.\n\nIf `requests` means to live up to **http for humans**, then... (quoting the first post):\n\nIn keeping with simple is beautiful, it would be terrific if we could do:\n`requests.get(url, max_response_size=1024*1024)`\n",
"> For instance, we can no longer return the response to the caller then inspect status_code.\n\nWhat does this mean? Sure you can! `status_code` is a static value set once and then left unchanged, streaming the response has _nothing_ to do with it. Can you elaborate by providing some sample code that demonstrates the problem?\n\n> If requests means to live up to http for humans, then... (quoting the first post):\n> \n> In keeping with simple is beautiful, it would be terrific if we could do:\n> `requests.get(url, max_response_size=1024*1024)`\n\nSure.\n\nIt'd also be terrific if we could do `requests.get(url, total_timeout=5)`. And also `requests.get(url, start_delay=5, max_retries=10)` (see #2682). And also `requests.get(url, websocket_callback=handler)` (see #2634). And also `requests.get(url, local_port=6667)` (see #2563). And also `requests.get(url, socket_options=[(socket.SO_KEEPALIVE, 1)])` (see #2447). And...\n\nBut the problem is now you get this line of code:\n\n``` python\nr = requests.post(\n url=url,\n data=data,\n max_response_size=1024*1024,\n total_timeout=15,\n start_delay=5,\n max_retries=10,\n websocket_callback=handler,\n local_port=6667,\n socket_options=[(socket.SO_KEEPALIVE, 1)],\n proxies=proxies,\n timeout=6,\n stream=True,\n)\n```\n\nThat line of code is many things: long, weird, mutually contradictory (I specify timeouts in two different manners there, and I've set stream=True but somehow expect a total timeout to be enforced without it being clear what 'total' really means here, and what the hell is the point of passing SO_KEEPALIVE in this context, and really I'm going to get a websocket upgrade from a POST?). But what it isn't is _simple_.\n\nDo you see the point? There are lots of features we _could_ add whose individual interfaces are simple, and that by themselves represent very little burden. But, if we do that for all of them, then we end up like curl. [Here](http://curl.haxx.se/docs/manpage.html) is curl's manpage. 
curl is a fantastic tool, but I think we can both agree that its interface is not simple.\n\nSimplicity is about constraint. Simplicity is about _not_ doing things, not about doing them. Keeping a project simple means resisting scope creep. It means resisting attempts to add functionality that can be kept outside the library, when doing so does not cause problems for the vast majority of users.\n\nRequests should do things that have excellent _bang for buck_. We should do things that are a bit complex (so that users implementing it by themselves might get it wrong), and that are extremely broadly useful (so that many users would have to implement it themselves). This feature request is not those things.\n\nIf users are really worried about a file being accidentally massive, then they should use `stream=True` and track the size themselves. Otherwise, if you pass a response object around a lot, you might suddenly find you get 'unexpected size' errors being thrown from weird places in your code all because of an accidental LoC elsewhere in your program.\n\nRequests' job is primarily to concern itself with making HTTP easy to use. It's not about ensuring well-formedness of data. This is an _application_ concern, and the _application_ should police it.\n\nI'm entirely happy to have a documentation addition that provides a recipe for how to do this, or an addition to the requests-toolbelt. But requests itself has a higher bar for feature requests, and right now I'm not sure this meets it.\n",
"@Lukasa \n\nThank you very much for your extraordinarily detailed and cogent reply. \n\nI fully get your point.\n\nI'm more than happy to impose checks on `requests` from outside. I just thought that what I needed was not possible because I had misunderstood how to do it.\n\n> For instance, we can no longer return the response to the caller then inspect status_code.\n\nWhat I probably should have said there is that you can't return the response and inspect the text. I now see that I had misunderstood how to use `stream=True`. My understanding now is that `response.text` gets consumed while we stream, so we have to rebuild it separately, but the other attributes of the response are still available. So instead of returning the response, we can return a tuple with the response and the rebuilt text. Is that right?\n\nYour idea of posting a recipe is a good one, as the few lines currently in the docs are not enough for a hobbyist like me.\n\nThe following seems to work. It follows a [recipe by Martijn Pieters](http://stackoverflow.com/q/23514256/), except that in Python 3 it only worked for me once I set `content` to a byte string.\n\nIs this how you suggest it should be done?\n\nTo work with unicode inside instead of bytes, do I just `content.decode(r.encoding)` at the end? By the way for that page, it doesn't work. 
I thought `r.encoding` was the same as `chardet`, but I get something different and end up having to `content.decode(chardet.detect(content)['encoding'])` Maybe that detection discrepancy is specific to streams?\n\n```\nimport requests\nimport chardet\n\nurl = 'http://bbc.co.uk'\n\nmax_size = 1024*1024*10 #10MB\nr = requests.get(url, stream=True, timeout=10)\nr.raise_for_status()\n\nsize = 0\ncontent = b''\nfor chunk in r.iter_content(1024):\n size += len(chunk)\n if size > max_size:\n raise ValueError('Maximum exceeded.')\n content += chunk\n\nprint(\"content type\", type(content)) # <class 'bytes'>\nprint(\"response: \", r) # <Response [200]>\nprint(\"status code:\", r.status_code) # 200\nprint(\"detected encoding:\", r.encoding) # ISO-8859-1\ndetected_encoding = chardet.detect(content)['encoding']\nprint(\"it should be the same as chardet:\", detected_encoding) # utf-8\nucontent = content.decode(detected_encoding)\nprint(\"converted content\", type(ucontent)) # <class 'str'>\nprint()\nprint(ucontent)\n```\n\nThank you again for taking the time to explain the interface design.\n",
"> Thank you very much for your extraordinarily detailed and cogent reply.\n\nThank you. =) Fortunately, I've been thinking a lot about this recently because I'm speaking on exactly this topic at PyCon UK this year!\n\n> What I probably should have said there is that you can't return the response and inspect the text. I now see that I had misunderstood how to use stream=True. My understanding now is that response.text gets consumed while we stream, so we have to rebuild it separately, but the other attributes of the response are still available. So instead of returning the response, we can return a tuple with the response and the rebuilt text. Is that right?\n\nCorrect. You can even go further and add the content back to the response object when you're done with it: essentially you're just manually building up the `_content` instance variable, so you can probably just patch it back on when you're done.\n\n> The following seems to work. It follows a recipe by Martijn Pieters, except that in Python 3 it only worked for me once I set content to a byte string.\n\nYeah, you'll have to do that. `content` is always bytes.\n\n> To work with unicode inside instead of bytes, do I just content.decode(r.encoding) at the end?\n\nYou could _try_ doing this:\n\n``` python\nr._content = content\nr._content_consumed = True\n```\n\nThat _should_ give you back a completely standard `Response` object at that point. These are _technically_ implementation details, but for the long term I expect them to be entirely safe. That's not a _promise_, but it is a suspicion.\n\nOnce you've done that, `r.text` should work exactly as it has done in the past: no need to do the decode.\n\n> Is this how you suggest it should be done?\n\nYep, pretty much! =)\n",
"> I'm speaking on exactly this topic at PyCon UK this year!\n\nHa, didn't know you were one of our illustrious tutors/mentors/evangelists.\nJust downloaded your _Hyperactive: HTTP/2 and Python_ video from Montreal, looking forward to watching it later today. :)\n\nThank you for giving me some ideas about how to do things differently by accessing `requests` internals.\n\nWishing you a fun weekend.\n"
] |
https://api.github.com/repos/psf/requests/issues/2683
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2683/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2683/comments
|
https://api.github.com/repos/psf/requests/issues/2683/events
|
https://github.com/psf/requests/issues/2683
| 96,283,765 |
MDU6SXNzdWU5NjI4Mzc2NQ==
| 2,683 |
monkey patch to fix the headers parse in python3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7009837?v=4",
"events_url": "https://api.github.com/users/littlezz/events{/privacy}",
"followers_url": "https://api.github.com/users/littlezz/followers",
"following_url": "https://api.github.com/users/littlezz/following{/other_user}",
"gists_url": "https://api.github.com/users/littlezz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/littlezz",
"id": 7009837,
"login": "littlezz",
"node_id": "MDQ6VXNlcjcwMDk4Mzc=",
"organizations_url": "https://api.github.com/users/littlezz/orgs",
"received_events_url": "https://api.github.com/users/littlezz/received_events",
"repos_url": "https://api.github.com/users/littlezz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/littlezz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/littlezz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/littlezz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-21T10:34:29Z
|
2021-09-08T23:00:48Z
|
2015-07-21T11:20:33Z
|
NONE
|
resolved
|
Python 3 uses latin-1 to decode headers; if a header contains Chinese text encoded as UTF-8 but decoded as latin-1, the result may contain `\x85`.
``` python
In [276]: '锅团子圣诞树.jpg'.encode('utf8').decode('latin1')
Out[276]: 'é\x94\x85å\x9b¢å\xad\x90å\x9c£è¯\x9eæ\xa0\x91.jpg'
In [278]: '\x85' in '锅团子圣诞树.jpg'.encode('utf8').decode('latin1')
Out[278]: True
```
In `email.feedparser.BufferedSubFile`, the `push` method splits data with `str.splitlines`, which also splits on `\x85`. (https://docs.python.org/3.5/library/stdtypes.html#str.splitlines)
This causes the header content after the `\x85` to be lost.
I wrote a simple server that returns a Chinese header encoded as UTF-8.
``` python
from flask import Flask, make_response
app = Flask(__name__)
@app.route('/rt')
def rt():
r = make_response()
r.headers['chinese-header'] = '锅团子圣诞树.jpg'.encode('utf8')
return r
if __name__ == '__main__':
app.run(port=8088, debug=True)
```
and then get it.
``` python
In [275]: requests.get('http://127.0.0.1:8088/rt').headers['chinese-header']
Out[275]: 'é\x94\x85'
```
The content after `\x85` is lost.
I wrote a function to replace the `push` method; it replaces `str.splitlines` with a splitter that only splits on `\r`, `\n`, and `\r\n`.
``` python
__author__ = 'zz'
from email.feedparser import BufferedSubFile
import re
from itertools import zip_longest
sep = re.compile(r'(\r\n|\r|\n)')
def py3_splitlines(s):
split_group = sep.split(s)
return [g1 + g2 for g1, g2 in zip_longest(split_group[::2], split_group[1::2], fillvalue='')]
# monkey patch the push method
def push(self, data):
"""Push some new data into this object."""
# Crack into lines, but preserve the linesep characters on the end of each
# parts = data.splitlines(True)
# use py3_splitlines instead of the str.splitlines
parts = py3_splitlines(data)
if not parts or not parts[0].endswith(('\n', '\r')):
# No new complete lines, so just accumulate partials
self._partial += parts
return
if self._partial:
# If there are previous leftovers, complete them now
self._partial.append(parts[0])
# and here
parts[0:1] = py3_splitlines(''.join(self._partial))
del self._partial[:]
# If the last element of the list does not end in a newline, then treat
# it as a partial line. We only check for '\n' here because a line
# ending with '\r' might be a line that was split in the middle of a
# '\r\n' sequence (see bugs 1555570 and 1721862).
if not parts[-1].endswith('\n'):
self._partial = [parts.pop()]
self.pushlines(parts)
BufferedSubFile.push = push
```
After replacing `splitlines` with `py3_splitlines` in the `push` method:
``` python
In [280]: requests.get('http://127.0.0.1:8088/rt').headers['chinese-header']
Out[280]: 'é\x94\x85å\x9b¢å\xad\x90å\x9c£è¯\x9eæ\xa0\x91.jpg'
```
We can then re-encode the headers and get the correct value (see https://github.com/kennethreitz/requests/pull/2655).
So I think we could use a monkey patch to fix the `BufferedSubFile.push` method when people use Python 3.
I also found that the `str.splitlines` call in `BufferedSubFile.push` may be a bug (http://bugs.python.org/issue22233),
but so far the Python source code hasn't changed.
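The re-encoding trick from the linked pull request can be sketched as a plain round trip (it only recovers the full value once the patched `push` stops truncating at `\x85`):

```python
original = '锅团子圣诞树.jpg'

# What the stdlib header parser hands back: UTF-8 bytes decoded as latin-1.
mojibake = original.encode('utf-8').decode('latin-1')

# latin-1 maps bytes 0-255 one-to-one, so the round trip is lossless.
recovered = mojibake.encode('latin-1').decode('utf-8')
print(recovered)  # 锅团子圣诞树.jpg
```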
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2683/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2683/timeline
| null |
completed
| null | null | false |
[
"I do not believe that we should monkeypatch the standard library in requests: that's an extremely dangerous practice. If it's a bug, it should be fixed upstream.\n\nThanks for the report! :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2682
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2682/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2682/comments
|
https://api.github.com/repos/psf/requests/issues/2682/events
|
https://github.com/psf/requests/issues/2682
| 96,279,849 |
MDU6SXNzdWU5NjI3OTg0OQ==
| 2,682 |
implement delayed retry support to failed requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/102495?v=4",
"events_url": "https://api.github.com/users/ssbarnea/events{/privacy}",
"followers_url": "https://api.github.com/users/ssbarnea/followers",
"following_url": "https://api.github.com/users/ssbarnea/following{/other_user}",
"gists_url": "https://api.github.com/users/ssbarnea/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssbarnea",
"id": 102495,
"login": "ssbarnea",
"node_id": "MDQ6VXNlcjEwMjQ5NQ==",
"organizations_url": "https://api.github.com/users/ssbarnea/orgs",
"received_events_url": "https://api.github.com/users/ssbarnea/received_events",
"repos_url": "https://api.github.com/users/ssbarnea/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssbarnea/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssbarnea/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssbarnea",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-21T10:15:02Z
|
2021-09-08T23:00:48Z
|
2015-07-21T11:30:38Z
|
CONTRIBUTOR
|
resolved
|
Currently the requests library supports a retry mechanism where you can specify how many retries to make until it fails, which is great unless the service's downtime lasts only a few seconds.
In order to make usage of requests more resilient to temporary errors, we need an option to delay further retries, preferably with a fibonacci-like increasing delay between sequential retries.
For example, `start_delay=5` seconds and `max_retries=10` would use these delays: 5, 10, 15, 25, 40, 65, 105, ...
I have observed that lots of people encounter the same issue, and I think it is much better to have this as part of requests than for everyone to hack their own HTTP adapter to implement it.
I would go so far as to argue for having it enabled by default when max_retries is specified.
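urllib3's `Retry` already supports a delayed backoff via `backoff_factor` (exponential, rather than the fibonacci-like schedule requested). A sketch of wiring it into a session follows; the import path for the vendored urllib3 varies by requests version:

```python
import requests
from requests.adapters import HTTPAdapter

try:
    from urllib3.util.retry import Retry
except ImportError:
    # Older requests versions vendor urllib3 under requests.packages.
    from requests.packages.urllib3.util.retry import Retry

# Sleeps between attempts grow as backoff_factor * 2**(retry - 1).
retry = Retry(total=10, backoff_factor=5)
adapter = HTTPAdapter(max_retries=retry)

session = requests.Session()
session.mount('http://', adapter)
session.mount('https://', adapter)
```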
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/2682/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2682/timeline
| null |
completed
| null | null | false |
[
"Thanks for the suggestion!\n\nThe supported way to do this in requests is to use a HTTP adapter. I notice you've spotted this but consider it to be a 'hack'. It's not: the requests project considers the HTTP adapter a first-class part of our interface.\n\nThe change is very simple:\n\n``` python\nfrom requests.adapters import HTTPAdapter\nfrom requests.packages.urllib3.util.retry import Retry\n\nclass RetryAdapter(HTTPAdapter):\n def __init__(self, *args, **kwargs):\n super(RetryAdapter, self).__init__(*args, **kwargs)\n self.max_retries = Retry(total=self.max_retries, backoff_factor=5)\n\ns = requests.Session()\ns.mount('http://', RetryAdapter())\ns.mount('https://', RetryAdapter())\n```\n\nThis is a very simple adapter and could easily be dropped in. We may even want to add it to the requests-toolbelt: a pull request there would be very welcome. (/cc @sigmavirus24)\n\nThanks for the suggestion!\n"
] |
https://api.github.com/repos/psf/requests/issues/2681
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2681/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2681/comments
|
https://api.github.com/repos/psf/requests/issues/2681/events
|
https://github.com/psf/requests/issues/2681
| 96,271,565 |
MDU6SXNzdWU5NjI3MTU2NQ==
| 2,681 |
pyopenssl: maximum recursion depth exceeded while calling a Python object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/291289?v=4",
"events_url": "https://api.github.com/users/gsakkis/events{/privacy}",
"followers_url": "https://api.github.com/users/gsakkis/followers",
"following_url": "https://api.github.com/users/gsakkis/following{/other_user}",
"gists_url": "https://api.github.com/users/gsakkis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gsakkis",
"id": 291289,
"login": "gsakkis",
"node_id": "MDQ6VXNlcjI5MTI4OQ==",
"organizations_url": "https://api.github.com/users/gsakkis/orgs",
"received_events_url": "https://api.github.com/users/gsakkis/received_events",
"repos_url": "https://api.github.com/users/gsakkis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gsakkis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gsakkis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gsakkis",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-07-21T09:32:18Z
|
2021-09-08T23:00:48Z
|
2015-07-21T11:19:02Z
|
NONE
|
resolved
|
I recently installed `requests[security]` to get rid of the [InsecurePlatformWarning](http://stackoverflow.com/a/29099439) and have been getting intermittent `RuntimeError`s in pyopenssl. Here's a sample stacktrace:
```
Stacktrace (most recent call last):
File "requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "requests/sessions.py", line 605, in send
r.content
File "requests/models.py", line 750, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "requests/models.py", line 673, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "requests/packages/urllib3/response.py", line 307, in stream
data = self.read(amt=amt, decode_content=decode_content)
File "requests/packages/urllib3/response.py", line 243, in read
data = self._fp.read(amt)
File "python2.7/httplib.py", line 567, in read
s = self.fp.read(amt)
File "python2.7/socket.py", line 380, in read
data = self._sock.recv(left)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 188, in recv
return self.recv(*args, **kwargs)
File "requests/packages/urllib3/contrib/pyopenssl.py", line 171, in recv
data = self.connection.recv(*args, **kwargs)
File "OpenSSL/SSL.py", line 1318, in recv
buf = _ffi.new("char[]", bufsiz)
File "cffi/api.py", line 235, in new
if isinstance(cdecl, basestring):
```
Any ideas?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2681/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2681/timeline
| null |
completed
| null | null | false |
[
"After a quick look at [WrappedSocket.recv()](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/contrib/pyopenssl.py#L169), seems like converting recursion to iteration could fix it. If that's the case I could make a pull request.\n",
"As you've spotted this is a urllib3 bug, so I'm closing to focus discussion there.\n",
"Thanks. On a related note, another regression I noticed since installing `requests[security]` is that `WrappedSocket.recv()` may raise `socket.timeout` and this isn't wrapped into a `requests.Timeout` that I've been handling. Should I open a new issue for it?\n",
"What's the full stacktrace?\n",
"Here's one example:\n\n```\n File \"requests/sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"requests/sessions.py\", line 594, in send\n history = [resp for resp in gen] if allow_redirects else []\n File \"requests/sessions.py\", line 196, in resolve_redirects\n **adapter_kwargs\n File \"requests/sessions.py\", line 605, in send\n r.content\n File \"requests/models.py\", line 750, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"requests/models.py\", line 673, in generate\n for chunk in self.raw.stream(chunk_size, decode_content=True):\n File \"requests/packages/urllib3/response.py\", line 303, in stream\n for line in self.read_chunked(amt, decode_content=decode_content):\n File \"requests/packages/urllib3/response.py\", line 447, in read_chunked\n self._update_chunk_length()\n File \"requests/packages/urllib3/response.py\", line 394, in _update_chunk_length\n line = self._fp.fp.readline()\n File \"python2.7/socket.py\", line 447, in readline\n data = self._sock.recv(self._rbufsize)\n```\n",
"I _think_ this is another manifestation of the problem fixed in shazow/urllib3#674, and so will be fixed by the 2.8.0 release of requests.\n",
"Indeed it looks like it's the same problem. Thanks, looking forward to 2.8.0!\n",
"@gsakkis feel free to try out https://github.com/kennethreitz/requests/pull/2678 to make sure it fixes the issue for you\n"
] |
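The first comment in the record above suggests converting the recursive `WantReadError` retry in `WrappedSocket.recv()` into iteration, which would avoid the deep recursion visible in the traceback. A minimal sketch of that idea follows; `WantRead` and `recv_iterative` are hypothetical names for illustration, not the actual urllib3 API:

```python
# Hypothetical sketch: retry a non-blocking read in a loop instead of
# recursing, so a slow peer cannot grow the Python call stack.
class WantRead(Exception):
    """Stands in for pyopenssl's WantReadError in this sketch."""
    pass

def recv_iterative(read_once):
    """Call read_once() until it returns data, looping on WantRead.

    read_once is any zero-argument callable that either returns bytes
    or raises WantRead when the socket is not yet readable.
    """
    while True:
        try:
            return read_once()
        except WantRead:
            # In real code this is where you would select()/wait on the
            # socket before retrying; here we simply loop.
            continue
```

Each retry is one loop iteration rather than one extra stack frame, which is the whole point of the fix discussed in the thread.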
https://api.github.com/repos/psf/requests/issues/2680
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2680/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2680/comments
|
https://api.github.com/repos/psf/requests/issues/2680/events
|
https://github.com/psf/requests/pull/2680
| 95,865,840 |
MDExOlB1bGxSZXF1ZXN0NDAyOTk4NjY=
| 2,680 |
Fix docs for passing a list of values for a query string
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/184212?v=4",
"events_url": "https://api.github.com/users/peplin/events{/privacy}",
"followers_url": "https://api.github.com/users/peplin/followers",
"following_url": "https://api.github.com/users/peplin/following{/other_user}",
"gists_url": "https://api.github.com/users/peplin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/peplin",
"id": 184212,
"login": "peplin",
"node_id": "MDQ6VXNlcjE4NDIxMg==",
"organizations_url": "https://api.github.com/users/peplin/orgs",
"received_events_url": "https://api.github.com/users/peplin/received_events",
"repos_url": "https://api.github.com/users/peplin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/peplin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/peplin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/peplin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-18T22:50:57Z
|
2021-09-08T07:00:48Z
|
2015-07-19T09:57:01Z
|
CONTRIBUTOR
|
resolved
|
The special `[]` notation at the end of the field name is not necessary to get a field to appear with multiple values in the query string.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2680/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2680/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2680.diff",
"html_url": "https://github.com/psf/requests/pull/2680",
"merged_at": "2015-07-19T09:57:01Z",
"patch_url": "https://github.com/psf/requests/pull/2680.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2680"
}
| true |
[
"Yeah, this is _mostly_ true. I'm going to merge this as-is @peplin, but I'd really love it if you'd follow up with another pull request that adds a paragraph about _why_ you might want to add the `[]` (namely, because many servers expect that notation).\n\nThanks so much for the contribution! :cake: :sparkles: :cake:\n"
] |
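The PR above documents that a plain list value already produces a repeated query-string key, without any `[]` suffix on the field name. The same encoding behavior can be illustrated with the standard library's `urlencode` (used here as a stand-in for what requests does internally):

```python
# Encoding a list value as repeated query-string keys, mirroring the
# behavior the docs fix describes. doseq=True expands sequence values.
from urllib.parse import urlencode

query = urlencode({"key": ["value1", "value2"]}, doseq=True)
print(query)  # key=value1&key=value2
```

Servers that expect PHP/Rails-style `key[]=value` notation still need the brackets spelled out in the field name, which is the caveat raised in the merge comment.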
https://api.github.com/repos/psf/requests/issues/2679
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2679/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2679/comments
|
https://api.github.com/repos/psf/requests/issues/2679/events
|
https://github.com/psf/requests/pull/2679
| 95,835,189 |
MDExOlB1bGxSZXF1ZXN0NDAyOTM4Nzg=
| 2,679 |
Remove broken CaseInsensitiveDict repr test
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2015-10-12T10:32:06Z",
"closed_issues": 7,
"created_at": "2015-04-29T13:03:39Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/25",
"id": 1089203,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/25/labels",
"node_id": "MDk6TWlsZXN0b25lMTA4OTIwMw==",
"number": 25,
"open_issues": 0,
"state": "closed",
"title": "2.8.0",
"updated_at": "2015-10-12T10:32:06Z",
"url": "https://api.github.com/repos/psf/requests/milestones/25"
}
| 1 |
2015-07-18T15:48:09Z
|
2021-09-08T07:00:49Z
|
2015-07-18T15:48:26Z
|
CONTRIBUTOR
|
resolved
|
Fixes #2668
Supersedes #2669
Closes #2669
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2679/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2679/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2679.diff",
"html_url": "https://github.com/psf/requests/pull/2679",
"merged_at": "2015-07-18T15:48:26Z",
"patch_url": "https://github.com/psf/requests/pull/2679.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2679"
}
| true |
[
"Was just about to do this myself. :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2678
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2678/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2678/comments
|
https://api.github.com/repos/psf/requests/issues/2678/events
|
https://github.com/psf/requests/pull/2678
| 95,835,082 |
MDExOlB1bGxSZXF1ZXN0NDAyOTM4NDQ=
| 2,678 |
Proposed 2.8.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 15 |
2015-07-18T15:45:52Z
|
2021-09-08T06:00:56Z
|
2015-10-05T14:09:41Z
|
CONTRIBUTOR
|
resolved
|
Includes
- #2674
- #2567
- #2523
- #2706
- Bump for urllib3
Reminder to look at http://ci.kennethreitz.org/job/requests-pr/ for build status
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2678/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2678/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2678.diff",
"html_url": "https://github.com/psf/requests/pull/2678",
"merged_at": "2015-10-05T14:09:41Z",
"patch_url": "https://github.com/psf/requests/pull/2678.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2678"
}
| true |
[
"So http://ci.kennethreitz.org/job/requests-pr/714/PYTHON=3.3/console is an interesting error. I'll work on tracking it down\n",
"Here's hoping http://ci.sigmavir.us/job/requests-real/ picks up on this. @kennethreitz's CI is not picking up on the new commits.\n",
"Ah, here we go: http://ci.kennethreitz.org/job/requests-pr/715/\n",
"Let's be extra cautious of #2567\n",
"We are. That's why we're shipping it in 2.8.0: it should avoid us immediately breaking a ton of production software. =)\n",
"@kennethreitz #2567 is already running on Debian (and maybe ubuntu?) against Requests 2.7.0 and has yet to produce any bug reports there. It's been running there for _at least_ a month. Also iirc, Cramer signed off on it and pip has been using something similar to #2567 for a while now.\n",
"what does @mitsuhiko think?\n",
"Replied to #2567.\n",
"Any traction on this release? Would love to see https://github.com/shazow/urllib3/issues/556 fixed :)\n",
"> Any traction on this release? Would love to see shazow/urllib3#556 fixed :)\n\n+1\n",
"## Sorry Laurent. What are you trying to add 1 to? This isn't JavaScript so we don't do auto type coercion and that comment doesn't parse.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Heh, sorry, was just trying to add my vote as we just hit the problem, in case this PR needs some love ;)\n",
"Comments aren't for voting though. We have this planned we just don't have it entirely ready. This was also just to have a branch to have all the open PRs merged into one place so we can test all of them together. This isn't necessarily the PR that will actually merge to create 2.8.0\n",
"@sigmavirus24 IIRC Ubuntu either pulls from Debian testing or unstable during a new OS release cycle, so if #2567 was put in to one of those branches it will be in Ubuntu by now. Also you would be able to check the patches that are in the source for packaging as well. just my 2 cents\n",
"Ok, in preparation for releasing 2.8.0 I am going to merge this now. I'm aiming to get 2.8.0 out the door by the middle of this week, and merging this will help force me to do that.\n"
] |
https://api.github.com/repos/psf/requests/issues/2677
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2677/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2677/comments
|
https://api.github.com/repos/psf/requests/issues/2677/events
|
https://github.com/psf/requests/issues/2677
| 95,668,523 |
MDU6SXNzdWU5NTY2ODUyMw==
| 2,677 |
can't use proxy-authorization in header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13382211?v=4",
"events_url": "https://api.github.com/users/matthew-summers/events{/privacy}",
"followers_url": "https://api.github.com/users/matthew-summers/followers",
"following_url": "https://api.github.com/users/matthew-summers/following{/other_user}",
"gists_url": "https://api.github.com/users/matthew-summers/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/matthew-summers",
"id": 13382211,
"login": "matthew-summers",
"node_id": "MDQ6VXNlcjEzMzgyMjEx",
"organizations_url": "https://api.github.com/users/matthew-summers/orgs",
"received_events_url": "https://api.github.com/users/matthew-summers/received_events",
"repos_url": "https://api.github.com/users/matthew-summers/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/matthew-summers/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthew-summers/subscriptions",
"type": "User",
"url": "https://api.github.com/users/matthew-summers",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-07-17T14:32:28Z
|
2021-09-08T23:00:50Z
|
2015-07-17T14:44:58Z
|
NONE
|
resolved
|
hi, i try to use Proxy-Authorization header with my request, but it doesn't appear in request headers. If i change header name to 'Proxy-Authorization1', for example, everything is ok, i can see it in output.
``` python
proxy = {'http': 'http://ip:port/'}
token = base64.encodestring('user:pass').strip()
headers = {'Proxy-Authorization': 'Basic ' + token, 'Proxy-Connection': 'Keep-Alive', 'Accept': '*/*'}
r = requests.get('http://url', headers=headers, proxies=proxy)
```
the reason why i'm trying to do this - with authentication like 'http://user:[email protected]' or via HTTPBasicAuth request fails with 407 Authentication Required
so, is there a way to explicitly define Proxy-Authorization header ?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13382211?v=4",
"events_url": "https://api.github.com/users/matthew-summers/events{/privacy}",
"followers_url": "https://api.github.com/users/matthew-summers/followers",
"following_url": "https://api.github.com/users/matthew-summers/following{/other_user}",
"gists_url": "https://api.github.com/users/matthew-summers/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/matthew-summers",
"id": 13382211,
"login": "matthew-summers",
"node_id": "MDQ6VXNlcjEzMzgyMjEx",
"organizations_url": "https://api.github.com/users/matthew-summers/orgs",
"received_events_url": "https://api.github.com/users/matthew-summers/received_events",
"repos_url": "https://api.github.com/users/matthew-summers/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/matthew-summers/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matthew-summers/subscriptions",
"type": "User",
"url": "https://api.github.com/users/matthew-summers",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2677/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2677/timeline
| null |
completed
| null | null | false |
[
"Well, in the sample code above you aren't actually telling requests which proxy to talk to. =)\n\nThis is easiest to do by using the username and password in the proxy URL, and then not providing the header at all:\n\n``` python\nproxy = {'http': 'http://user:pass@ip:port/'}\nr = requests.get('http://url', proxies=proxy) \n```\n",
"yep, tried this way, but proxies were still returning 407. however, now everything is ok again, i begin to suspect this is proxy's issue. \n"
] |
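The issue body above builds a Basic credential with the Python 2-only `base64.encodestring`. For reference, a Python 3 equivalent of constructing that header value (the `user:pass` credentials are placeholders, as in the original report):

```python
# Building a Basic auth value from user:pass credentials, Python 3 style.
# b64encode works on bytes and returns bytes, hence the encode/decode.
import base64

token = base64.b64encode(b"user:pass").decode("ascii")
header_value = "Basic " + token
print(header_value)  # Basic dXNlcjpwYXNz
```

As the maintainer's reply notes, the simpler route is usually to embed the credentials in the proxy URL (`http://user:pass@ip:port/`) and let requests generate the `Proxy-Authorization` header itself.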
https://api.github.com/repos/psf/requests/issues/2676
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2676/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2676/comments
|
https://api.github.com/repos/psf/requests/issues/2676/events
|
https://github.com/psf/requests/pull/2676
| 95,633,493 |
MDExOlB1bGxSZXF1ZXN0NDAyMTk4NTE=
| 2,676 |
Convert non-string header values to their string representation.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1142203?v=4",
"events_url": "https://api.github.com/users/sunu/events{/privacy}",
"followers_url": "https://api.github.com/users/sunu/followers",
"following_url": "https://api.github.com/users/sunu/following{/other_user}",
"gists_url": "https://api.github.com/users/sunu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sunu",
"id": 1142203,
"login": "sunu",
"node_id": "MDQ6VXNlcjExNDIyMDM=",
"organizations_url": "https://api.github.com/users/sunu/orgs",
"received_events_url": "https://api.github.com/users/sunu/received_events",
"repos_url": "https://api.github.com/users/sunu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sunu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sunu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sunu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-07-17T10:46:30Z
|
2021-09-08T07:00:50Z
|
2015-07-18T15:26:20Z
|
NONE
|
resolved
|
Fixes https://github.com/kennethreitz/requests/issues/2675
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1142203?v=4",
"events_url": "https://api.github.com/users/sunu/events{/privacy}",
"followers_url": "https://api.github.com/users/sunu/followers",
"following_url": "https://api.github.com/users/sunu/following{/other_user}",
"gists_url": "https://api.github.com/users/sunu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sunu",
"id": 1142203,
"login": "sunu",
"node_id": "MDQ6VXNlcjExNDIyMDM=",
"organizations_url": "https://api.github.com/users/sunu/orgs",
"received_events_url": "https://api.github.com/users/sunu/received_events",
"repos_url": "https://api.github.com/users/sunu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sunu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sunu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sunu",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2676/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2676/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2676.diff",
"html_url": "https://github.com/psf/requests/pull/2676",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2676.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2676"
}
| true |
[
"The change you've made is in urllib3. @shazow may want this change, in which case we should accept it over at the main urllib3 repository. If he does not want it, we should instead move it into the requests code.\n",
"-1 on having this in urllib3's urlopen, that's far too low-level to mangle user input. Requests is probably the more appropriate layer for this.\n\nAlternatively, it could live in `HTTPHeaderDict` once we get it usable as input as well as output, but that's blocked on https://github.com/shazow/urllib3/pull/633.\n",
"That's pretty much what I thought.\n\nIn that case, I think the right place for this is `requests.models.PreparedRequest.prepare_headers()`.\n",
"Thanks for the review @Lukasa and @shazow :) I've updated the PR now. \nhttps://github.com/kennethreitz/requests/blob/master/test_requests.py#L1275 fails for me on Python 3 because the keys are not in order, but looking at the implementation the keys are not expected to be in order? \n",
"Yup, it's a known issue. =)\n",
"@Lukasa I've updated the PR.\n",
"See discussion on #2675 before continuing work here. I'm not sure we want to encourage this.\n",
"Closing based on discussion on #2675 \nAlas, it couldn't be my first contribution to requests. May be some time in the future :)\n"
] |
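The PR above proposed coercing non-string header values (such as the float in #2675) to strings during header preparation. It was ultimately rejected, but the idea it floated can be sketched as a small helper; `coerce_header_values` is a hypothetical name, not part of the requests API:

```python
# Hypothetical sketch of the coercion PR #2676 proposed: convert any
# non-string header value to its str() representation before sending.
def coerce_header_values(headers):
    return {
        key: value if isinstance(value, str) else str(value)
        for key, value in headers.items()
    }

print(coerce_header_values({"x-wait": 0.2, "Accept": "*/*"}))
```

In practice the discussion settled on callers stringifying their own header values, since silently coercing arbitrary objects can mask bugs.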
https://api.github.com/repos/psf/requests/issues/2675
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2675/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2675/comments
|
https://api.github.com/repos/psf/requests/issues/2675/events
|
https://github.com/psf/requests/issues/2675
| 95,622,350 |
MDU6SXNzdWU5NTYyMjM1MA==
| 2,675 |
Using header with a float value in a request results in error on Python 3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1142203?v=4",
"events_url": "https://api.github.com/users/sunu/events{/privacy}",
"followers_url": "https://api.github.com/users/sunu/followers",
"following_url": "https://api.github.com/users/sunu/following{/other_user}",
"gists_url": "https://api.github.com/users/sunu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sunu",
"id": 1142203,
"login": "sunu",
"node_id": "MDQ6VXNlcjExNDIyMDM=",
"organizations_url": "https://api.github.com/users/sunu/orgs",
"received_events_url": "https://api.github.com/users/sunu/received_events",
"repos_url": "https://api.github.com/users/sunu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sunu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sunu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sunu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-07-17T09:33:39Z
|
2021-09-08T23:00:49Z
|
2015-07-18T15:24:21Z
|
NONE
|
resolved
|
I have an example below.
This seems to be a bug in `http.client` as it doesn't convert floats to string representation. But as a workaround we can may be explicitly convert any floats in the header to string in [`urlopen`](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L421)?
```
$ ipython
Python 3.4.0 (default, Apr 11 2014, 13:05:11)
Type "copyright", "credits" or "license" for more information.
IPython 3.2.0 -- An enhanced Interactive Python.
? -> Introduction and overview of IPython's features.
%quickref -> Quick reference.
help -> Python's own help system.
object? -> Details about 'object', use 'object??' for extra details.
In [1]: import requests
In [2]: requests.__version__
Out[2]: '2.7.0'
In [3]: url = "http://sunu.in"
In [4]: headers = {'x-wait': 0.2}
In [5]: requests.get(url, headers=headers)
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-5-951b2e0c9a38> in <module>()
----> 1 requests.get(url, headers=headers)
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/api.py in get(url, params, **kwargs)
67
68 kwargs.setdefault('allow_redirects', True)
---> 69 return request('get', url, params=params, **kwargs)
70
71
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/api.py in request(method, url, **kwargs)
48
49 session = sessions.Session()
---> 50 response = session.request(method=method, url=url, **kwargs)
51 # By explicitly closing the session, we avoid leaving sockets open which
52 # can trigger a ResourceWarning in some cases, and look like a memory leak
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/sessions.py in request(self, method, url, params, data, headers, cookies, files, auth, timeout, allow_redirects, proxies, hooks, stream, verify, cert, json)
463 }
464 send_kwargs.update(settings)
--> 465 resp = self.send(prep, **send_kwargs)
466
467 return resp
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/sessions.py in send(self, request, **kwargs)
571
572 # Send the request
--> 573 r = adapter.send(request, **kwargs)
574
575 # Total elapsed time of the request (approximately)
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/adapters.py in send(self, request, stream, timeout, verify, cert, proxies)
368 decode_content=False,
369 retries=self.max_retries,
--> 370 timeout=timeout
371 )
372
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py in urlopen(self, method, url, body, headers, retries, redirect, assert_same_host, timeout, pool_timeout, release_conn, **response_kw)
546 httplib_response = self._make_request(conn, method, url,
547 timeout=timeout_obj,
--> 548 body=body, headers=headers)
549
550 # If we're going to release the connection in ``finally:``, then
/home/sunu/gsoc/splash/env3/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py in _make_request(self, conn, method, url, timeout, **httplib_request_kw)
347 # conn.request() calls httplib.*.request, not the method in
348 # urllib3.request. It also calls makefile (recv) on the socket.
--> 349 conn.request(method, url, **httplib_request_kw)
350
351 # Reset the timeout for the recv() on the socket
/usr/lib/python3.4/http/client.py in request(self, method, url, body, headers)
1063 def request(self, method, url, body=None, headers={}):
1064 """Send a complete request to the server."""
-> 1065 self._send_request(method, url, body, headers)
1066
1067 def _set_content_length(self, body):
/usr/lib/python3.4/http/client.py in _send_request(self, method, url, body, headers)
1096 self._set_content_length(body)
1097 for hdr, value in headers.items():
-> 1098 self.putheader(hdr, value)
1099 if isinstance(body, str):
1100 # RFC 2616 Section 3.7.1 says that text default has a
/usr/lib/python3.4/http/client.py in putheader(self, header, *values)
1042 elif isinstance(one_value, int):
1043 values[i] = str(one_value).encode('ascii')
-> 1044 value = b'\r\n\t'.join(values)
1045 header = header + b': ' + value
1046 self._output(header)
TypeError: sequence item 0: expected bytes, bytearray, or an object with the buffer interface, float found
```
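For context, the application-level fix the discussion converges on can be sketched as follows. This is a hypothetical helper (not part of requests), and note the caveat raised below: `str()` output for floats is an implementation detail of the Python runtime.

```python
# Hypothetical helper: requests headers must be strings, so coerce any
# non-string values (floats, ints) to str at the application level
# before handing the dict to requests.
def stringify_headers(headers):
    """Return a copy of ``headers`` with every value converted to str."""
    return {name: str(value) for name, value in headers.items()}


headers = stringify_headers({'x-wait': 0.2, 'x-retries': 3})
print(headers)
```

With this in place, `requests.get(url, headers=stringify_headers(raw_headers))` avoids the `putheader` TypeError shown in the traceback above.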
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1142203?v=4",
"events_url": "https://api.github.com/users/sunu/events{/privacy}",
"followers_url": "https://api.github.com/users/sunu/followers",
"following_url": "https://api.github.com/users/sunu/following{/other_user}",
"gists_url": "https://api.github.com/users/sunu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sunu",
"id": 1142203,
"login": "sunu",
"node_id": "MDQ6VXNlcjExNDIyMDM=",
"organizations_url": "https://api.github.com/users/sunu/orgs",
"received_events_url": "https://api.github.com/users/sunu/received_events",
"repos_url": "https://api.github.com/users/sunu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sunu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sunu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sunu",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2675/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2675/timeline
| null |
completed
| null | null | false |
[
"This feels like a reasonable feature request to me, I have no objections to casting non-string values to strings.\n",
"@Lukasa Great! I'll submit a PR. Thanks for the quick response :)\n",
"Actually we've already rejected similar bugs and pull requests in the past: https://github.com/kennethreitz/requests/issues/986\n\nI'm inclined to agree with those decisions. Headers should fundamentally always be strings provided by the user. The implementation of `str(some_float)` can vary from implementation to implementation so while `str(0.2)` may produce `'0.2'` it probably wouldn't with pypy.js or some other implementation of Python.\n",
"Hmm .. I guess this makes sense. I'll handle this in the application level then. \nClosing the issue. Thanks for your time and input @Lukasa and @sigmavirus24 :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2674
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2674/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2674/comments
|
https://api.github.com/repos/psf/requests/issues/2674/events
|
https://github.com/psf/requests/pull/2674
| 95,613,235 |
MDExOlB1bGxSZXF1ZXN0NDAyMTAxODA=
| 2,674 |
Catch and wrap ClosedPoolError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/954858?v=4",
"events_url": "https://api.github.com/users/ArcTanSusan/events{/privacy}",
"followers_url": "https://api.github.com/users/ArcTanSusan/followers",
"following_url": "https://api.github.com/users/ArcTanSusan/following{/other_user}",
"gists_url": "https://api.github.com/users/ArcTanSusan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ArcTanSusan",
"id": 954858,
"login": "ArcTanSusan",
"node_id": "MDQ6VXNlcjk1NDg1OA==",
"organizations_url": "https://api.github.com/users/ArcTanSusan/orgs",
"received_events_url": "https://api.github.com/users/ArcTanSusan/received_events",
"repos_url": "https://api.github.com/users/ArcTanSusan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ArcTanSusan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArcTanSusan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ArcTanSusan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2015-10-12T10:32:06Z",
"closed_issues": 7,
"created_at": "2015-04-29T13:03:39Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/25",
"id": 1089203,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/25/labels",
"node_id": "MDk6TWlsZXN0b25lMTA4OTIwMw==",
"number": 25,
"open_issues": 0,
"state": "closed",
"title": "2.8.0",
"updated_at": "2015-10-12T10:32:06Z",
"url": "https://api.github.com/repos/psf/requests/milestones/25"
}
| 2 |
2015-07-17T08:33:52Z
|
2021-09-08T06:00:54Z
|
2015-10-05T14:09:46Z
|
CONTRIBUTOR
|
resolved
|
Partially resolves #1572: "urllib3 exceptions passing through requests
API". #1572
Inspired from @Lukasa's previous 2605be11d82d42438ac7c3993810c955bde74cef.
This is my first PR to requests library; feel free to give me feedback. I generally have a fast response time. Also, available on IRC and Twitter (@ArcTanSusan).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2674/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2674/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2674.diff",
"html_url": "https://github.com/psf/requests/pull/2674",
"merged_at": "2015-10-05T14:09:46Z",
"patch_url": "https://github.com/psf/requests/pull/2674.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2674"
}
| true |
[
"Aw no, we lost my awesome work?\n\n\n\nAll joking aside, this looks good to me, I'd be happy to merge it. However, it's an API change, so it _at least_ needs to go into 2.8.0 and may need to go into 3.0.0. @sigmavirus24?\n",
"2.8.0 seems reasonable to me. I don't think we need to wait for 3.0.0 to wrap a urllib3 exception\n"
] |
https://api.github.com/repos/psf/requests/issues/2673
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2673/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2673/comments
|
https://api.github.com/repos/psf/requests/issues/2673/events
|
https://github.com/psf/requests/issues/2673
| 95,485,672 |
MDU6SXNzdWU5NTQ4NTY3Mg==
| 2,673 |
Low-level exceptions leaking from Response.iter_content()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1150978?v=4",
"events_url": "https://api.github.com/users/jmoldow/events{/privacy}",
"followers_url": "https://api.github.com/users/jmoldow/followers",
"following_url": "https://api.github.com/users/jmoldow/following{/other_user}",
"gists_url": "https://api.github.com/users/jmoldow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jmoldow",
"id": 1150978,
"login": "jmoldow",
"node_id": "MDQ6VXNlcjExNTA5Nzg=",
"organizations_url": "https://api.github.com/users/jmoldow/orgs",
"received_events_url": "https://api.github.com/users/jmoldow/received_events",
"repos_url": "https://api.github.com/users/jmoldow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jmoldow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jmoldow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jmoldow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-16T17:32:45Z
|
2021-09-08T23:00:50Z
|
2015-07-16T17:35:25Z
|
NONE
|
resolved
|
Up through v2.6.0, `Response.iter_content()` would catch all expected exceptions coming from `self.raw.stream()`, and wrap them in instances of various subclasses of `RequestException`.
In v2.6.1, urllib3 was updated to 1.10.3, which has issue https://github.com/shazow/urllib3/issues/673 that causes `self.raw.stream()` to raise low-level `httplib.HTTPException`s, particularly `httplib.IncompleteRead`. Because of this, as of v2.6.1 of requests, `Response.iter_content()` leaks these low-level exceptions, instead of catching them and wrapping them in `RequestException` instances.
Issue https://github.com/shazow/urllib3/issues/673 has been fixed in master, but hasn't been released yet.
To fix this, requests could catch `httplib.HTTPException` along with `urllib3.exceptions.ProtocolError`, converting both into `ChunkedEncodingError`. Or requests can upgrade to the newest version of urllib3 when it is released.
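The catch-and-wrap pattern the report proposes can be sketched caller-side with stdlib pieces only. The wrapper exception below is a hypothetical stand-in for `requests.exceptions.ChunkedEncodingError`, and the helper name is illustrative, not part of requests:

```python
import http.client


class WrappedStreamError(Exception):
    """Hypothetical stand-in for requests.exceptions.ChunkedEncodingError."""


def iter_stream_safely(chunks):
    # Drain an iterator of body chunks, converting low-level httplib
    # exceptions (e.g. http.client.IncompleteRead) into a single wrapper
    # type, mirroring the catch-and-wrap this issue proposes for
    # Response.iter_content().
    try:
        for chunk in chunks:
            yield chunk
    except http.client.HTTPException as exc:
        raise WrappedStreamError(exc) from exc
```

A caller would then iterate `iter_stream_safely(response.iter_content(1024))` and handle only the wrapper type, rather than bare `httplib` exceptions.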
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2673/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2673/timeline
| null |
completed
| null | null | false |
[
"Requests will upgrade to the newest version of urllib3 when it is released. =)\n\nThanks for the report!\n"
] |
https://api.github.com/repos/psf/requests/issues/2672
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2672/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2672/comments
|
https://api.github.com/repos/psf/requests/issues/2672/events
|
https://github.com/psf/requests/pull/2672
| 95,304,552 |
MDExOlB1bGxSZXF1ZXN0NDAwNzcxNTI=
| 2,672 |
Fix quickstart "Custom Headers" example intro
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1192314?v=4",
"events_url": "https://api.github.com/users/petedmarsh/events{/privacy}",
"followers_url": "https://api.github.com/users/petedmarsh/followers",
"following_url": "https://api.github.com/users/petedmarsh/following{/other_user}",
"gists_url": "https://api.github.com/users/petedmarsh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/petedmarsh",
"id": 1192314,
"login": "petedmarsh",
"node_id": "MDQ6VXNlcjExOTIzMTQ=",
"organizations_url": "https://api.github.com/users/petedmarsh/orgs",
"received_events_url": "https://api.github.com/users/petedmarsh/received_events",
"repos_url": "https://api.github.com/users/petedmarsh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/petedmarsh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/petedmarsh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/petedmarsh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-07-15T22:31:06Z
|
2021-09-08T07:00:50Z
|
2015-07-16T01:51:07Z
|
CONTRIBUTOR
|
resolved
|
Previously this section prefaced an example with:
```
For example, we didn't specify our content-type
```
But, the actual example set a custom user-agent header on the request. This
changes it to say "user-agent" instead which matches the given example.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2672/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2672/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2672.diff",
"html_url": "https://github.com/psf/requests/pull/2672",
"merged_at": "2015-07-16T01:51:07Z",
"patch_url": "https://github.com/psf/requests/pull/2672.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2672"
}
| true |
[
"Alternatively the example could be changed to actually set the content-type header instead of the user-agent header - either way I think they should be consistent :)\n",
"Thanks @petedmarsh, I missed this when I changed the example from content-type to user-agent!\n"
] |
https://api.github.com/repos/psf/requests/issues/2671
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2671/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2671/comments
|
https://api.github.com/repos/psf/requests/issues/2671/events
|
https://github.com/psf/requests/issues/2671
| 95,244,150 |
MDU6SXNzdWU5NTI0NDE1MA==
| 2,671 |
How to build requests with custom version OpenSSL?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2422745?v=4",
"events_url": "https://api.github.com/users/cecemel/events{/privacy}",
"followers_url": "https://api.github.com/users/cecemel/followers",
"following_url": "https://api.github.com/users/cecemel/following{/other_user}",
"gists_url": "https://api.github.com/users/cecemel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cecemel",
"id": 2422745,
"login": "cecemel",
"node_id": "MDQ6VXNlcjI0MjI3NDU=",
"organizations_url": "https://api.github.com/users/cecemel/orgs",
"received_events_url": "https://api.github.com/users/cecemel/received_events",
"repos_url": "https://api.github.com/users/cecemel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cecemel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cecemel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cecemel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-15T17:11:34Z
|
2021-09-08T23:00:51Z
|
2015-07-15T18:39:03Z
|
NONE
|
resolved
|
HI,
what should I do to build/install requests with a custom build of OpenSSL?
thx
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2671/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2671/timeline
| null |
completed
| null | null | false |
[
"Questions belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n"
] |
https://api.github.com/repos/psf/requests/issues/2670
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2670/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2670/comments
|
https://api.github.com/repos/psf/requests/issues/2670/events
|
https://github.com/psf/requests/pull/2670
| 95,237,125 |
MDExOlB1bGxSZXF1ZXN0NDAwNDI4NzU=
| 2,670 |
Re-enable Travis-CI builds
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4",
"events_url": "https://api.github.com/users/jayvdb/events{/privacy}",
"followers_url": "https://api.github.com/users/jayvdb/followers",
"following_url": "https://api.github.com/users/jayvdb/following{/other_user}",
"gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jayvdb",
"id": 15092,
"login": "jayvdb",
"node_id": "MDQ6VXNlcjE1MDky",
"organizations_url": "https://api.github.com/users/jayvdb/orgs",
"received_events_url": "https://api.github.com/users/jayvdb/received_events",
"repos_url": "https://api.github.com/users/jayvdb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jayvdb",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2015-07-15T16:36:35Z
|
2021-09-08T07:00:51Z
|
2015-07-15T16:42:19Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2670/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2670/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2670.diff",
"html_url": "https://github.com/psf/requests/pull/2670",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2670.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2670"
}
| true |
[
"Ideally this isn't merged until #2668 is fixed.\n",
"We will not be turning on Travis CI. ci.kennethreitz.org just needs a proper kick. /cc @kennethreitz \n",
"The main repo doesn't need to enable Travis. This helps other developers testing on their forks.\n",
"The main repo should be testing pull requests, which will help other developers on their forks. =)\n",
"That only helps developers when they submit a pull request. Some developers like to check their work against all configurations in the build matrix before doing a pull request.\nIf travis was enabled for pull requests, https://github.com/kennethreitz/requests/pull/2666 would not have been merged as travis would have commented on the pull request that it was broken.\nIf http://ci.kennethreitz.org/ is the preferred approach, it should be submitting comments to pull requests, or people with merge rights should be checking it manually before merging.\n",
"> If http://ci.kennethreitz.org/ is the preferred approach, it should be submitting comments to pull requests, or people with merge rights should be checking it manually before merging.\n\nNo-one is denying that. ci.kennethreitz.org is currently broken. It should be functioning correctly. Currently it's the preferred approach, but obviously we need to get it working again.\n\nAs to developers on their forks, we encourage developers to open pull requests early if they want to do testing. Again, all of this assumes our process is functioning correctly: right now it is not. This is an organisational issue within the project, and re-adding travis will not fix it, especially because Travis is _very_ flaky when running our tests and regularly fails when it shouldn't.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2669
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2669/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2669/comments
|
https://api.github.com/repos/psf/requests/issues/2669/events
|
https://github.com/psf/requests/pull/2669
| 95,236,488 |
MDExOlB1bGxSZXF1ZXN0NDAwNDI2MDA=
| 2,669 |
Test CaseInsensitiveDict.__repr__ unordered
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4",
"events_url": "https://api.github.com/users/jayvdb/events{/privacy}",
"followers_url": "https://api.github.com/users/jayvdb/followers",
"following_url": "https://api.github.com/users/jayvdb/following{/other_user}",
"gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jayvdb",
"id": 15092,
"login": "jayvdb",
"node_id": "MDQ6VXNlcjE1MDky",
"organizations_url": "https://api.github.com/users/jayvdb/orgs",
"received_events_url": "https://api.github.com/users/jayvdb/received_events",
"repos_url": "https://api.github.com/users/jayvdb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jayvdb",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-07-15T16:34:07Z
|
2021-09-08T07:00:49Z
|
2015-07-18T15:48:26Z
|
CONTRIBUTOR
|
resolved
|
InsensitiveDict.__repr__ does not order the members, which on
Python 3 causes TestCaseInsensitiveDict.test_repr to fail.
Fixes issue #2668
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2669/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2669/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2669.diff",
"html_url": "https://github.com/psf/requests/pull/2669",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2669.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2669"
}
| true |
[
"We should just remove this test. As I said when it was added, there's no need to have it.\n",
"I'm happy to remove this test rather than mess about with it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2668
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2668/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2668/comments
|
https://api.github.com/repos/psf/requests/issues/2668/events
|
https://github.com/psf/requests/issues/2668
| 95,235,853 |
MDU6SXNzdWU5NTIzNTg1Mw==
| 2,668 |
CaseInsensitiveDict.__repr__ is unordered
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15092?v=4",
"events_url": "https://api.github.com/users/jayvdb/events{/privacy}",
"followers_url": "https://api.github.com/users/jayvdb/followers",
"following_url": "https://api.github.com/users/jayvdb/following{/other_user}",
"gists_url": "https://api.github.com/users/jayvdb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jayvdb",
"id": 15092,
"login": "jayvdb",
"node_id": "MDQ6VXNlcjE1MDky",
"organizations_url": "https://api.github.com/users/jayvdb/orgs",
"received_events_url": "https://api.github.com/users/jayvdb/received_events",
"repos_url": "https://api.github.com/users/jayvdb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jayvdb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jayvdb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jayvdb",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2015-10-12T10:32:06Z",
"closed_issues": 7,
"created_at": "2015-04-29T13:03:39Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/25",
"id": 1089203,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/25/labels",
"node_id": "MDk6TWlsZXN0b25lMTA4OTIwMw==",
"number": 25,
"open_issues": 0,
"state": "closed",
"title": "2.8.0",
"updated_at": "2015-10-12T10:32:06Z",
"url": "https://api.github.com/repos/psf/requests/milestones/25"
}
| 1 |
2015-07-15T16:31:30Z
|
2021-09-08T23:00:49Z
|
2015-07-18T15:48:26Z
|
CONTRIBUTOR
|
resolved
|
`CaseInsensitiveDict.__repr__` uses dict, and therefore the results are unordered.
However `TestCaseInsensitiveDict.test_repr` attempts to compare the `__repr__` with an ordered representation. This fails on Python 3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2668/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2668/timeline
| null |
completed
| null | null | false |
[
"The alternative is to order the `__repr__`, but by what order..?\n"
] |
https://api.github.com/repos/psf/requests/issues/2667
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2667/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2667/comments
|
https://api.github.com/repos/psf/requests/issues/2667/events
|
https://github.com/psf/requests/issues/2667
| 94,844,066 |
MDU6SXNzdWU5NDg0NDA2Ng==
| 2,667 |
insecure platform warning and failure
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/563900?v=4",
"events_url": "https://api.github.com/users/logicminds/events{/privacy}",
"followers_url": "https://api.github.com/users/logicminds/followers",
"following_url": "https://api.github.com/users/logicminds/following{/other_user}",
"gists_url": "https://api.github.com/users/logicminds/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/logicminds",
"id": 563900,
"login": "logicminds",
"node_id": "MDQ6VXNlcjU2MzkwMA==",
"organizations_url": "https://api.github.com/users/logicminds/orgs",
"received_events_url": "https://api.github.com/users/logicminds/received_events",
"repos_url": "https://api.github.com/users/logicminds/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/logicminds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/logicminds/subscriptions",
"type": "User",
"url": "https://api.github.com/users/logicminds",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-14T00:59:18Z
|
2021-09-08T23:00:51Z
|
2015-07-14T06:57:02Z
|
NONE
|
resolved
|
This might have something to do with the 2.4.0 release and using certifi, but I have no idea. Similar to #2255 I am seeing an issue with executing the samples in the readme.
Why this happens with a non-SSL request, I have no idea.
```
import requests
r = requests.get('http://en.wikipedia.org/wiki/Monty_Python')
/usr/lib/python2.6/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
>>> r = requests.get('http://en.wikipedia.org/wiki/Monty_Python', verify=False)
/usr/lib/python2.6/site-packages/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
/usr/lib/python2.6/site-packages/requests/packages/urllib3/connectionpool.py:768: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html
InsecureRequestWarning)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2667/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2667/timeline
| null |
completed
| null | null | false |
[
"Wikipedia redirects you to a HTTPS endpoint. If you check `r.url` you'll see you ended up at a HTTPS version of that page, which is how the insecure request was made. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2666
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2666/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2666/comments
|
https://api.github.com/repos/psf/requests/issues/2666/events
|
https://github.com/psf/requests/pull/2666
| 94,710,706 |
MDExOlB1bGxSZXF1ZXN0Mzk4MDY3Nzk=
| 2,666 |
Adds extra tests for CaseInsensitiveDict
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/29039?v=4",
"events_url": "https://api.github.com/users/gvangool/events{/privacy}",
"followers_url": "https://api.github.com/users/gvangool/followers",
"following_url": "https://api.github.com/users/gvangool/following{/other_user}",
"gists_url": "https://api.github.com/users/gvangool/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gvangool",
"id": 29039,
"login": "gvangool",
"node_id": "MDQ6VXNlcjI5MDM5",
"organizations_url": "https://api.github.com/users/gvangool/orgs",
"received_events_url": "https://api.github.com/users/gvangool/received_events",
"repos_url": "https://api.github.com/users/gvangool/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gvangool/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gvangool/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gvangool",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-07-13T12:35:19Z
|
2021-09-08T07:00:51Z
|
2015-07-14T06:57:58Z
|
CONTRIBUTOR
|
resolved
|
This adds extra tests for `requests.structures.CaseInsensitiveDict`
- Test for NotImplemented in `__eq__`
- Adds test for `copy()`
- Adds test for `__repr__()`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2666/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2666/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2666.diff",
"html_url": "https://github.com/psf/requests/pull/2666",
"merged_at": "2015-07-14T06:57:58Z",
"patch_url": "https://github.com/psf/requests/pull/2666.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2666"
}
| true |
[
"@Lukasa how does this look to you? Looks fine to me. I don't mind a couple extra tests personally.\n",
"I have one note, but I'm happy to add in extra tests.\n",
"@Lukasa I believe I've addressed your concern for that :)\n",
"Yeah, that looks good to me. \\o/\n"
] |
https://api.github.com/repos/psf/requests/issues/2665
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2665/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2665/comments
|
https://api.github.com/repos/psf/requests/issues/2665/events
|
https://github.com/psf/requests/issues/2665
| 94,548,834 |
MDU6SXNzdWU5NDU0ODgzNA==
| 2,665 |
cookiejar_from_dict support SimpleCookie
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/735150?v=4",
"events_url": "https://api.github.com/users/spumer/events{/privacy}",
"followers_url": "https://api.github.com/users/spumer/followers",
"following_url": "https://api.github.com/users/spumer/following{/other_user}",
"gists_url": "https://api.github.com/users/spumer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/spumer",
"id": 735150,
"login": "spumer",
"node_id": "MDQ6VXNlcjczNTE1MA==",
"organizations_url": "https://api.github.com/users/spumer/orgs",
"received_events_url": "https://api.github.com/users/spumer/received_events",
"repos_url": "https://api.github.com/users/spumer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/spumer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/spumer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/spumer",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-07-12T08:59:06Z
|
2021-09-08T23:00:51Z
|
2015-07-12T11:04:44Z
|
NONE
|
resolved
|
Hi there! :)
I have a suggestion about support `http.cookies.SimpleCookie` by `requests.cookies.cookiejar_from_dict`.
Example:
``` python
import http.cookies
import requests
import requests.cookies
cookie_str = 'Key1=Value1;Key2=Value2' # e.g can be obtain from cfg
cookie = http.cookies.SimpleCookie(cookie_str)
sess = requests.session()
sess.cookies = requests.cookies.cookiejar_from_dict(cookie)
```
`SimpleCookie` has a mapping interface and `RequestsCookieJar` can assign a cookie with a `Morsel` value, but `cookiejar_from_dict` doesn't do this right now.
I'm thinking about this fixup, because `RequestsCookieJar` is just an extension of the common `CookieJar`:
``` python
cookiejar.set_cookie(create_cookie(name, cookie_dict[name]))
->
RequestsCookieJar.set(cookiejar, name, cookie_dict[name])
```
But the method name strongly says `from_dict`, and `SimpleCookie` is not a dict, just a mapping. Maybe we need a new method. I don't know. What do you think?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2665/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2665/timeline
| null |
completed
| null | null | false |
[
"Have you tested your example code? It works fine for me...\n",
"Yes, of course! Code works fine before you send first request. See the picture.\nIf you need I can send full report (.html doc) (rendered by mako lib).\n\nOn the screen you see Python\\lib\\http\\cookiejar.py : 1289\n\n",
"In which case, it's not clear to me what the bug is. Given that `cookiejar_from_dict` works fine with `SimpleCookie`, what problem do we have?\n",
"`SimpleCookie` handle thier values as `Morsel` object, and when we pass `SimpleCookie` object to cookiejar_from_dict, value write without \"unpacking\".\n",
"I think this is not a bug, i just want to do a enchancment request. I suggest my solution, but i think this is not clear enough.\n",
"Aha, ok, good. =) Now we have a problem.\n\nHowever, I think this is a sufficiently simple problem that it doesn't warrant a helper method in requests. It's two lines of code to get the behaviour you want:\n\n``` python\nfor item in cookie.items():\n sess.cookies.set(*item)\n```\n",
"Ou! You are right :+1: Thanks\n",
"Glad I could help! :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2664
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2664/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2664/comments
|
https://api.github.com/repos/psf/requests/issues/2664/events
|
https://github.com/psf/requests/issues/2664
| 94,306,485 |
MDU6SXNzdWU5NDMwNjQ4NQ==
| 2,664 |
`PUT` to Amazon S3 is failing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/640792?v=4",
"events_url": "https://api.github.com/users/avinassh/events{/privacy}",
"followers_url": "https://api.github.com/users/avinassh/followers",
"following_url": "https://api.github.com/users/avinassh/following{/other_user}",
"gists_url": "https://api.github.com/users/avinassh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/avinassh",
"id": 640792,
"login": "avinassh",
"node_id": "MDQ6VXNlcjY0MDc5Mg==",
"organizations_url": "https://api.github.com/users/avinassh/orgs",
"received_events_url": "https://api.github.com/users/avinassh/received_events",
"repos_url": "https://api.github.com/users/avinassh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/avinassh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avinassh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/avinassh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-07-10T13:23:10Z
|
2021-09-08T12:01:05Z
|
2015-07-10T16:11:51Z
|
NONE
|
resolved
|
I am trying to upload a file to Amazon S3 with Python Requests (Python is v2.7.9 and requests is v2.7). Following the curl command which works perfectly:
```
curl --request PUT --upload-file img.png https://mybucket-dev.s3.amazonaws.com/6b89e187-26fa-11e5-a04f-a45e60d45b53?Signature=Ow%3D&Expires=1436595966&AWSAccessKeyId=AQ
```
However when I tried to do same with requests, it fails with 403 with following error:
```
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message>
```
Here's what I tried:
```
url = https://mybucket-dev.s3.amazonaws.com/6b89e187-26fa-11e5-a04f-a45e60d45b53?Signature=Ow%3D&Expires=1436595966&AWSAccessKeyId=AQ
headers = {'Content-Length': '52369', 'Host': 'mybucket-dev.s3.amazonaws.com', 'Expect': '100-continue', 'Accept': '*/*', 'User-Agent': 'curl/7.37.1'}
payload={'Expires': '1436595966', 'AWSAccessKeyId': 'AQ', 'Signature': 'Ow%3D'}
requests.put(url, files={'file': base64_encoded_image})
requests.put(url, files={'upload_file': base64_encoded_image})
requests.put(url, files={'file': base64_encoded_image}, headers=headers)
requests.put(url, files={'file': base64_encoded_image}, headers=headers, data=payload)
```
They all fail, with same error. Here's curl in verbose mode:
```
* Hostname was NOT found in DNS cache
* Trying 54.231.168.134...
* Connected to mybucket-dev.s3.amazonaws.com (54.231.168.134) port 443 (#0)
* TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA
* Server certificate: *.s3.amazonaws.com
* Server certificate: VeriSign Class 3 Secure Server CA - G3
* Server certificate: VeriSign Class 3 Public Primary Certification Authority - G5
> PUT /6b89e187-26fa-11e5-a04f-a45e60d45b53?Signature=Ow%3D&Expires=1436595966&AWSAccessKeyId=AQ HTTP/1.1
> User-Agent: curl/7.37.1
> Host: mybucket-dev.s3.amazonaws.com
> Accept: */*
> Content-Length: 52369
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
* We are completely uploaded and fine
< HTTP/1.1 200 OK
< x-amz-id-2: 5lLCQ3FVrTBg2vkyk44E+MecQJb2OGiloO0+2pKePtxPgZptKECNlUyYN43sl4LBNe9f8idh/cc=
< x-amz-request-id: 636A24D53DEB5215
< Date: Fri, 10 Jul 2015 12:04:44 GMT
< ETag: "5802130d4320b56a72afe720e2c323a7"
< Content-Length: 0
* Server AmazonS3 is not blacklisted
< Server: AmazonS3
<
* Connection #0 to host mybucket-dev.s3.amazonaws.com left intact
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/640792?v=4",
"events_url": "https://api.github.com/users/avinassh/events{/privacy}",
"followers_url": "https://api.github.com/users/avinassh/followers",
"following_url": "https://api.github.com/users/avinassh/following{/other_user}",
"gists_url": "https://api.github.com/users/avinassh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/avinassh",
"id": 640792,
"login": "avinassh",
"node_id": "MDQ6VXNlcjY0MDc5Mg==",
"organizations_url": "https://api.github.com/users/avinassh/orgs",
"received_events_url": "https://api.github.com/users/avinassh/received_events",
"repos_url": "https://api.github.com/users/avinassh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/avinassh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avinassh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/avinassh",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2664/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2664/timeline
| null |
completed
| null | null | false |
[
"Well, step 1 is to avoid sending `Expect: 100 Continue`. We don't handle them properly. Please also don't send your own `Content-Length` header: requests will do that for you.\n\nThe real problem is that I think your upload is going to be multipart/form-encoded, but curl is uploading the file directly. Try: `requests.put(url, data=open('img.png', 'rb'))`\n",
"So, I made a random image and did\n\n```\ncurl --request PUT --upload-file img.png https://httpbin.org/put\n```\n\nTo verify that curl was doing exactly what I expected (which it does). It uploads the file as raw data.\n\nThe `files=` parameter in requests does `multipart/form-data` uploads, not raw data uploads. To replicate the same behaviour in requests all you need to do is the following:\n\n``` py\nwith open('img.png', 'rb') as data:\n requests.put(url, data=data)\n```\n\nRequests will handle setting the content-length and everything else.\n",
"works! Thank you very much guys :dancer: :smile: \n",
"With using trying an zip file I get the following, any clues: (requests-2.13.0) \r\n\r\n```\r\nwith open('default.zip', 'rb') as data:\r\n requests.put(url, data=data)\r\n\r\n```\r\n\r\nOutput:\r\n\r\n```\r\nC:\\svn\\libraries\\cpp>python req_put.py default.zip\r\nMD5: 5dc0658d93e942fa7d1fa443e04bba83\r\nSHA1: 18e7980d9d9ac544ac3e684a4d051932c3a3a336\r\nTraceback (most recent call last):\r\n File \"req_put.py\", line 102, in <module>\r\n resp = requests.put(uri, headers=headers, data=data) # data=data , params=payload)\r\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 124, in put\r\n return request('put', url, data=data, **kwargs)\r\n File \"C:\\Python27\\lib\\site-packages\\requests\\api.py\", line 56, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 488, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 609, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 473, in send\r\n raise ConnectionError(err, request=request)\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', error(10054, 'An existing connection was forcibly closed by the remote host'))\r\n\r\n```"
] |
https://api.github.com/repos/psf/requests/issues/2663
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2663/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2663/comments
|
https://api.github.com/repos/psf/requests/issues/2663/events
|
https://github.com/psf/requests/pull/2663
| 94,246,369 |
MDExOlB1bGxSZXF1ZXN0Mzk2NjE1NzE=
| 2,663 |
Allow non-latin1 credentials
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/41663?v=4",
"events_url": "https://api.github.com/users/mar10/events{/privacy}",
"followers_url": "https://api.github.com/users/mar10/followers",
"following_url": "https://api.github.com/users/mar10/following{/other_user}",
"gists_url": "https://api.github.com/users/mar10/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mar10",
"id": 41663,
"login": "mar10",
"node_id": "MDQ6VXNlcjQxNjYz",
"organizations_url": "https://api.github.com/users/mar10/orgs",
"received_events_url": "https://api.github.com/users/mar10/received_events",
"repos_url": "https://api.github.com/users/mar10/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mar10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mar10/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mar10",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-07-10T08:14:54Z
|
2021-09-08T07:00:51Z
|
2015-07-13T21:01:31Z
|
NONE
|
resolved
|
Referring to #1926
Maybe I am wrong, but my understanding was that header fields must be latin-1 encoded.
This is always true for Basic Authentication headers, since base64-encoded strings consist of plain ASCII.
I would think however that `<username>:<password>` may contain special characters, as long as client and server assume the same encoding (for example utf8).
This code currently encodes the credentials:
``` py
def _basic_auth_str(username, password):
"""Returns a Basic Auth string."""
authstr = 'Basic ' + to_native_string(
b64encode(('%s:%s' % (username, password)).encode('latin1')).strip()
)
return authstr
```
but could be changed to encode the base64 header string instead:
``` py
...
b64encode(('%s:%s' % (username, password))).encode('latin1').strip()
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/41663?v=4",
"events_url": "https://api.github.com/users/mar10/events{/privacy}",
"followers_url": "https://api.github.com/users/mar10/followers",
"following_url": "https://api.github.com/users/mar10/following{/other_user}",
"gists_url": "https://api.github.com/users/mar10/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mar10",
"id": 41663,
"login": "mar10",
"node_id": "MDQ6VXNlcjQxNjYz",
"organizations_url": "https://api.github.com/users/mar10/orgs",
"received_events_url": "https://api.github.com/users/mar10/received_events",
"repos_url": "https://api.github.com/users/mar10/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mar10/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mar10/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mar10",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2663/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2663/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2663.diff",
"html_url": "https://github.com/psf/requests/pull/2663",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2663.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2663"
}
| true |
[
"I _think_ I'm ok with this change. I think we need to confirm it functions as expected on Python 3, though.\n",
"> I think we need to confirm it functions as expected on Python 3, though.\n\nDid you address _me_ with '_we_'?\n\nI guess there should be tests that pass username and passwords as binary strings, unicode, and mixed.\n(The `.encode('latin1')` could probably be replaced by `.encode(ascii)` to point out that it's purpose is now only to convert the `b64encode()` result to a binary ascii sequence.)\n\nAnyway, my main concern would be a risk analysis, since I am not 100% sure what the consequences of this change will be. \nThe current implementation (if I understand correctly) would make it impossible to pass a Chinese username, for example. My suggested change would allow the client to pass any string if he decides to use UTF-8.\n\nThe problem seems to be due to a gap in the original spec . I looked around a bit and found some references:\n\nhttp://stackoverflow.com/a/7243567/19166 , which seems to refer to this latest version https://tools.ietf.org/html/draft-ietf-httpauth-basicauth-update-07\n\nMaybe you want to discuss with your team.\n",
"> Did you address me with 'we'?\n\nI addressed the collective world of people paying attention to this bug. =) It was code for \"I have identified this missing work item but do not have time to deal with it right now.\"\n\n> https://tools.ietf.org/html/draft-ietf-httpauth-basicauth-update-07\n\nThat draft has expired, sadly.\n\nI don't think this change is in violation of any RFC, I think it's reasonable. I may want to try to test it with a few other UAs and see what they do, e.g. curl, before deciding on this as a proposal.\n",
"Cool :) There is no hurry from my side, since we currently pass the credentials directly using `headers` instead of `auth`. \nThanks for taking the time and working on a really great library!\n\n(Btw, draft 07 will expire sept.'15)\n",
"Ok, this change doesn't work. On Python 3, `b64encode` takes a bytestring. This means the encode _has_ to happen first.\n\nOn balance, I think using Latin1 for this space of the code is roughly right, until such time as the `charset` enhancement to Basic Auth lands.\n",
"Ok, I see. \nMy original idea was, to let the caller choose its preferred encoding by (optionally) passing bytestrings to the `auth` argument. But then the `\"%s\"` approach breaks on Py3:\n\n``` py\n>>> user = b\"joe\"\n>>> password = b\"secret\"\n>>> \"%s:%s\" % (user, password)\n\"b'joe':b'secret'\"\n>>> \n```\n\nMaybe s.th. like this could work:\n\n``` py\nb64encode(to_binary(username) + to_binary(\":\") + to_binary(password)).strip()\n```\n\nwhere `to_binary(x)` encodes unicode using latin1 but is a no-op if _x_ is a already a binarystring.\nBut i am not sure if it is worth the effort...\n\nSimply allowing to override the 'latin1' default encoding in your original code could be a simpler solution.\nOr we may simply close this PR :)\n",
"I think if there is really a concern about Latin 1, users should override the auth logic themselves and provide the header manually. With requests things like this are all about trade-offs, and in this case I reckon we hit about 80% of the use-case here with none of the complexity.\n\nAt this point, unless there's any objections, I think this might be fine as-is.\n",
"no objections, we may close it (not sure if Western Europe + US users make up for 80% though ;-)\n"
] |
https://api.github.com/repos/psf/requests/issues/2662
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2662/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2662/comments
|
https://api.github.com/repos/psf/requests/issues/2662/events
|
https://github.com/psf/requests/pull/2662
| 93,715,972 |
MDExOlB1bGxSZXF1ZXN0Mzk0NDc2Njc=
| 2,662 |
fix some small spell problem
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6214861?v=4",
"events_url": "https://api.github.com/users/mudongliang/events{/privacy}",
"followers_url": "https://api.github.com/users/mudongliang/followers",
"following_url": "https://api.github.com/users/mudongliang/following{/other_user}",
"gists_url": "https://api.github.com/users/mudongliang/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mudongliang",
"id": 6214861,
"login": "mudongliang",
"node_id": "MDQ6VXNlcjYyMTQ4NjE=",
"organizations_url": "https://api.github.com/users/mudongliang/orgs",
"received_events_url": "https://api.github.com/users/mudongliang/received_events",
"repos_url": "https://api.github.com/users/mudongliang/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mudongliang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mudongliang/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mudongliang",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-07-08T07:12:00Z
|
2021-09-08T07:00:52Z
|
2015-07-08T07:23:32Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2662/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2662/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2662.diff",
"html_url": "https://github.com/psf/requests/pull/2662",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2662.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2662"
}
| true |
[
"Thanks for this @mudongliang!\n\nUnfortunately, neither of those is actually incorrect, they're just informal English.\n\n> (or branch off of it)\n\nTo \"branch off\" of something is to create a new branch from it. Thus, to \"branch off of it [master]\" is to create a new branch from the master branch.\n\n> Send a pull request and bug the maintainer until it gets merged or published.\n\nIn this case, \"bug\" does not mean the noun \"bug\" (a defect or imperfection), it means the verb \"to bug\" (to bother, to annoy, to pester). The sentence therefore means \"Send a pull request and annoy the maintainer until it gets merged or published\".\n\nSorry we can't merge this, but thanks for opening it! :cake:\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2661
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2661/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2661/comments
|
https://api.github.com/repos/psf/requests/issues/2661/events
|
https://github.com/psf/requests/pull/2661
| 92,902,034 |
MDExOlB1bGxSZXF1ZXN0MzkxODU4NjU=
| 2,661 |
Only pass useful timeouts to _get_conn
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 3 |
2015-07-03T16:04:53Z
|
2021-09-08T07:00:52Z
|
2015-07-04T20:19:37Z
|
MEMBER
|
resolved
|
So `_get_conn` doesn't use a urllib3 `Timeout` object, it only takes a float. It also only does stuff if `block` is `True`. Given that by default we use `block=False`, it won't do anything, so let's not use it. However, if someone changes that with a global constant they may want to be able to change this too, so let's make it a named constant.
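The type mismatch behind this change can be sketched in isolation. The `Timeout` class below is a minimal stand-in for illustration, not urllib3's real implementation; the point is that only a plain float (or `None`) should ever reach `_get_conn`:

```python
class Timeout:
    """Minimal stand-in for urllib3's Timeout object (an assumption for
    illustration -- the real class carries connect/read/total values)."""
    def __init__(self, connect=None, read=None):
        self.connect = connect
        self.read = read


def pool_timeout(timeout):
    """Return a value safe to hand to _get_conn: a plain float or None.

    _get_conn only accepts a float, and only consults it when the pool
    was created with block=True (not the default), so a Timeout object
    is dropped rather than passed through.
    """
    if isinstance(timeout, Timeout):
        return None
    return timeout


# A Timeout object is filtered out; floats and None pass straight through.
print(pool_timeout(Timeout(connect=3.0)), pool_timeout(5.0))
```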
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2661/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2661/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2661.diff",
"html_url": "https://github.com/psf/requests/pull/2661",
"merged_at": "2015-07-04T20:19:37Z",
"patch_url": "https://github.com/psf/requests/pull/2661.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2661"
}
| true |
[
"/cc @shazow \n",
":custard:\n",
"LGTM\n"
] |
https://api.github.com/repos/psf/requests/issues/2660
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2660/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2660/comments
|
https://api.github.com/repos/psf/requests/issues/2660/events
|
https://github.com/psf/requests/issues/2660
| 92,899,058 |
MDU6SXNzdWU5Mjg5OTA1OA==
| 2,660 |
HTTPAdapter timeout type error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/198012?v=4",
"events_url": "https://api.github.com/users/gbishop/events{/privacy}",
"followers_url": "https://api.github.com/users/gbishop/followers",
"following_url": "https://api.github.com/users/gbishop/following{/other_user}",
"gists_url": "https://api.github.com/users/gbishop/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gbishop",
"id": 198012,
"login": "gbishop",
"node_id": "MDQ6VXNlcjE5ODAxMg==",
"organizations_url": "https://api.github.com/users/gbishop/orgs",
"received_events_url": "https://api.github.com/users/gbishop/received_events",
"repos_url": "https://api.github.com/users/gbishop/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gbishop/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gbishop/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gbishop",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-07-03T15:44:22Z
|
2021-09-08T23:00:52Z
|
2015-07-09T17:37:28Z
|
NONE
|
resolved
|
At https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L378 you call
conn._get_conn(timeout=timeout)
The timeout object you pass is a Timeout but that internal method in urllib3 expects a float. This results in a type error about float + Timeout.
A float works there but I wasn't sure which to use.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2660/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2660/timeline
| null |
completed
| null | null | false |
[
"Yeah, I think I spotted this before. I have no idea why we never fixed it, but it's definitely a bug.\n",
"I found it while chasing the Connection pool is full warning on chunked\nposts. No parameter settings to HTTPAdaptor seem to help.\n\nOn Fri, Jul 3, 2015 at 11:55 AM, Cory Benfield [email protected]\nwrote:\n\n> Yeah, I think I spotted this before. I have no idea why we never fixed it,\n> but it's definitely a bug.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2660#issuecomment-118380232\n> .\n",
"See #2661 for what I think is a fix for this.\n",
"Yes, that is equivalent to my hacked test.\n\nOn Fri, Jul 3, 2015 at 12:05 PM, Cory Benfield [email protected]\nwrote:\n\n> See #2661 https://github.com/kennethreitz/requests/pull/2661 for what I\n> think is a fix for this.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2660#issuecomment-118382859\n> .\n",
"Fixed. =) Thanks @gbishop!\n"
] |
https://api.github.com/repos/psf/requests/issues/2659
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2659/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2659/comments
|
https://api.github.com/repos/psf/requests/issues/2659/events
|
https://github.com/psf/requests/issues/2659
| 92,788,456 |
MDU6SXNzdWU5Mjc4ODQ1Ng==
| 2,659 |
Extend "verify" to accept directory argument
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4121436?v=4",
"events_url": "https://api.github.com/users/staticglobal/events{/privacy}",
"followers_url": "https://api.github.com/users/staticglobal/followers",
"following_url": "https://api.github.com/users/staticglobal/following{/other_user}",
"gists_url": "https://api.github.com/users/staticglobal/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/staticglobal",
"id": 4121436,
"login": "staticglobal",
"node_id": "MDQ6VXNlcjQxMjE0MzY=",
"organizations_url": "https://api.github.com/users/staticglobal/orgs",
"received_events_url": "https://api.github.com/users/staticglobal/received_events",
"repos_url": "https://api.github.com/users/staticglobal/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/staticglobal/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/staticglobal/subscriptions",
"type": "User",
"url": "https://api.github.com/users/staticglobal",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2015-07-03T04:32:03Z
|
2021-09-08T21:00:46Z
|
2015-11-07T21:39:41Z
|
NONE
|
resolved
|
On my systems I exclusively work with collections of CA certificates stored as individual files in a single directory (with appropriate c_rehash symlinks) rather than the large monolithic "CA BUNDLE" file with every cert concatenated inside. Most *nix distributions provide both: a directory of certs at /etc/ssl/certs (or similar) as well as the single large file.
Monolithic files are probably easiest for most turnkey users who are only going to be connecting to a public server that uses a popular CA. They can simply download a single file from the internet and point the API at it to get up and running. However, the directory-based approach is much more manageable for users who need or want to customize their CA selection: removing some old untrusted CAs, or including a personal or corporate CA.
The openssl and pyopenssl APIs provide for both. load_verify_locations() accepts both a cafile and capath argument. You may specify one or the other or both. However, requests always passes "verify" as the cafile argument, and correctly documents that the input must be a file. If you try to pass a directory anyway, openssl gives you "SSLError: [Errno 21] Is a directory".
Please expand the definition of the "verify" parameter to also accept directory arguments. If the argument is a directory, pass it as the capath argument to load_verify_locations(), else continue passing it as cafile.
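The requested behaviour can be sketched with the standard library alone: `ssl.SSLContext.load_verify_locations()` already distinguishes `cafile` from `capath`, so the only new logic needed is an `os.path.isdir()` check on the `verify` value. This is a sketch of the idea, not requests' actual code path:

```python
import os
import ssl
import tempfile


def make_context(verify):
    """Build an SSLContext, treating `verify` as a CA directory if it
    is one, and as a CA bundle file otherwise -- the dispatch this
    issue asks requests to adopt, shown here with the stdlib only."""
    ctx = ssl.create_default_context()
    if os.path.isdir(verify):
        # c_rehash-style directory of individual certs
        ctx.load_verify_locations(capath=verify)
    else:
        # monolithic concatenated bundle file
        ctx.load_verify_locations(cafile=verify)
    return ctx


# Pointing at a directory routes it through the capath argument.
ca_dir = tempfile.mkdtemp()
ctx = make_context(ca_dir)
```

OpenSSL scans a `capath` directory lazily at verification time, which is why loading an (even empty) directory succeeds immediately.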
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2659/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2659/timeline
| null |
completed
| null | null | false |
[
"I like this.\n",
"Alright, cool, let's do it.\n",
"I think this is an issue for `urllib3` -- if I understand correctly (and it's very possible I don't!), when `requests` wants to send a request, it [asks for an `adapter`](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L567) -- that adapter wraps a `urllib3` connection via `get_connection()`. When that `urllib3` connection has [its socket wrapped](https://github.com/shazow/urllib3/blob/master/urllib3/connection.py#L237), the call to `load_verify_locations` [is made](https://github.com/shazow/urllib3/blob/master/urllib3/util/ssl_.py#L264), but it simply passes the first parameter (which in this case is just a file location). So in order to pass a `capath` argument through, `urllib3`'s `ssl_wrap_socket`'s function would need to change.\n",
"That's entirely correct. However, this is an issue that we want in requests, so for the moment we drive it here. =) An _enhancement_ is required in urllib3, but the exact form that will take is not yet known.\n",
"Ah gotcha, thanks for the clarification!\n",
"Ok, let's get started on this.\n",
"The urllib3 part of this work is in shazow/urllib3#701.\n",
"We can't progress this until urllib3 next pushes a release that contains the appropriate change, so that we can update our bundled dependency. This is on hold until that time.\n",
"See #2858.\n"
] |
https://api.github.com/repos/psf/requests/issues/2658
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2658/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2658/comments
|
https://api.github.com/repos/psf/requests/issues/2658/events
|
https://github.com/psf/requests/issues/2658
| 92,743,662 |
MDU6SXNzdWU5Mjc0MzY2Mg==
| 2,658 |
Docs inconsistent with code when it comes to params
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1186534?v=4",
"events_url": "https://api.github.com/users/djha-skin/events{/privacy}",
"followers_url": "https://api.github.com/users/djha-skin/followers",
"following_url": "https://api.github.com/users/djha-skin/following{/other_user}",
"gists_url": "https://api.github.com/users/djha-skin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/djha-skin",
"id": 1186534,
"login": "djha-skin",
"node_id": "MDQ6VXNlcjExODY1MzQ=",
"organizations_url": "https://api.github.com/users/djha-skin/orgs",
"received_events_url": "https://api.github.com/users/djha-skin/received_events",
"repos_url": "https://api.github.com/users/djha-skin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/djha-skin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/djha-skin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/djha-skin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-07-02T21:53:13Z
|
2021-09-08T23:00:52Z
|
2015-07-02T23:19:37Z
|
NONE
|
resolved
|
The docs say that if you want to include a list in parameters, you have to append `[]` to the key name:
> In order to pass a list of items as a value you must mark the key as referring to a list like string by
> appending [] to the key:
>
> ```
> >>> payload = {'key1': 'value1', 'key2[]': ['value2', 'value3']}
> >>> r = requests.get("http://httpbin.org/get", params=payload)
> >>> print(r.url)
> http://httpbin.org/get?key1=value1&key2%5B%5D=value2&key2%5B%5D=value3
> ```
but the code actually seems to support lists transparently, without the appended brackets. I am so glad it does this, by the way: some REST APIs I make calls to are out of my control, and I can't just tell them to include brackets in their keys.
```
75 def _encode_params(data):
76 """Encode parameters in a piece of data.
77
78 Will successfully encode parameters when passed as a dict or a list of
79 2-tuples. Order is retained if data is a list of 2-tuples but arbitrary
80 if parameters are supplied as a dict.
81 """
82
83 if isinstance(data, (str, bytes)):
84 return data
85 elif hasattr(data, 'read'):
86 return data
87 elif hasattr(data, '__iter__'):
88 result = []
89 for k, vs in to_key_val_list(data):
90 if isinstance(vs, basestring) or not hasattr(vs, '__iter__'):
91 vs = [vs]
92 for v in vs:
93 if v is not None:
94 result.append(
95 (k.encode('utf-8') if isinstance(k, str) else k,
96 v.encode('utf-8') if isinstance(v, str) else v))
97 return urlencode(result, doseq=True)
98 else:
99 return data
```
To be certain: _Does requests natively support lists, or am I bonkers?_
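The behaviour comes straight from the stdlib call at the end of the quoted snippet: `urlencode(..., doseq=True)` expands a list value into repeated keys, no `[]` suffix required. Shown here with Python 3's `urllib.parse` (Python 2 used `urllib.urlencode`):

```python
from urllib.parse import urlencode

payload = {'key1': 'value1', 'key2': ['value2', 'value3']}

# doseq=True is what lets the list value expand into repeated keys --
# the same flag _encode_params passes in the snippet above.
query = urlencode(payload, doseq=True)
print(query)  # key1=value1&key2=value2&key2=value3
```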
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2658/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2658/timeline
| null |
completed
| null | null | false |
[
"You need to inspect `to_key_val_list` to get the full description. `to_key_val_list` will take a dictionary like `{'foo': 'bar', 'biz': 'baz'}` and turn it into `[('foo', 'bar'), ('biz', 'baz')]` (although the order may not be preserved. Alternatively, if `data` in this case is already of the latter form, it'll just return it. That's the kind of list that `params` will support. If the documentation of that has disappeared then it should be replaced, but you cannot pass `['foo', 'bar', 'bogus']`. \n\nA quick test shows that \n\n``` py\n>>> r = requests.get('https://httpbin.org/get', params=[('a', 'b'), ('a', 'c')])\n>>> r.request.url\n'https://httpbin.org/get?a=b&a=c'\n```\n\nWorks fine. The above usecase, however, pertains to specific APIs that expect array-like syntax for query parameters.\n",
"The docs intimate that you can pass something like `{'a':['b', 'c']}` into params. That is the kind of list I am referring to: one associated with a key in a python dictionary, like so:\n\n```\n>>> r = requests.get('https://httpbin.org/get', params={'a': ['b', 'c']})\n>>> r.request.url\n'https://httpbin.org/get?a=b&a=c'\n```\n\nAs you can see, it works fine. However, the docs say you have to add a `[]` to get this kind of behavior to the key:\n\n```\n>>> r = requests.get('https://httpbin.org/get', params={'a[]': ['b', 'c']})\n>>> r.request.url\n'https://httpbin.org/get?a%5B%5D=b&a%5B%5D=c'\n```\n\nwhich would be useless, but thankfully the correct behavior is already present. It is just that the docs need to be updated.\n"
] |
https://api.github.com/repos/psf/requests/issues/2657
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2657/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2657/comments
|
https://api.github.com/repos/psf/requests/issues/2657/events
|
https://github.com/psf/requests/issues/2657
| 91,725,046 |
MDU6SXNzdWU5MTcyNTA0Ng==
| 2,657 |
Requests with german umlauts go failed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/236668?v=4",
"events_url": "https://api.github.com/users/OlafRadicke/events{/privacy}",
"followers_url": "https://api.github.com/users/OlafRadicke/followers",
"following_url": "https://api.github.com/users/OlafRadicke/following{/other_user}",
"gists_url": "https://api.github.com/users/OlafRadicke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/OlafRadicke",
"id": 236668,
"login": "OlafRadicke",
"node_id": "MDQ6VXNlcjIzNjY2OA==",
"organizations_url": "https://api.github.com/users/OlafRadicke/orgs",
"received_events_url": "https://api.github.com/users/OlafRadicke/received_events",
"repos_url": "https://api.github.com/users/OlafRadicke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/OlafRadicke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OlafRadicke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/OlafRadicke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-06-29T07:37:07Z
|
2021-09-08T23:00:53Z
|
2015-06-29T09:50:13Z
|
NONE
|
resolved
|
Hi, I'm having trouble with German umlauts in my JSON requests. An example:

```
[or@prag ~]$ python3
Python 3.4.1 (default, Nov 3 2014, 14:38:10)
[GCC 4.9.1 20140930 (Red Hat 4.9.1-11)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> request_uri = "http://127.0.0.1:5984/kochrezepte/pesto_alla_genovese_4"
>>> headers = {'content-type': 'application/json'}
>>> json_doc = '{"titel":"Pesto alla genovese","zutaten": ["Basilikum","Knoblauch","Pinienkerne","Käse","Öl","Salz","Pfeffer"], "garzeit": 15 }'
>>> respon = requests.put(request_uri, auth=("olaf", "olaf"), data=json_doc)
>>> print( respon.text )
{"error":"bad_request","reason":"invalid_json"}
```

Now without umlauts:

```
>>> json_doc = '{"titel":"Pesto alla genovese","zutaten": ["Basilikum","Knoblauch","Pinienkerne","Kaese","Oel","Salz","Pfeffer"], "garzeit": 15 }'
>>> request_uri = "http://127.0.0.1:5984/kochrezepte/pesto_alla_genovese_5"
>>> respon = requests.put(request_uri, auth=("olaf", "olaf"), data=json_doc)
>>> print( respon.text )
{"ok":true,"id":"pesto_alla_genovese_5","rev":"1-7a7a44a33eace1f7f8b518ffe80bd482"}
```

With curl I don't get errors:

```
[or@prag ~]$ curl -X PUT http://olaf:[email protected]:5984/kochrezepte/pesto_alla_genovese_6 -d '{"titel":"Pesto alla genovese","zutaten": ["Basilikum","Knoblauch","Pinienkerne","Käse","Öl","Salz","Pfeffer"], "garzeit": 15 }'
{"ok":true,"id":"pesto_alla_genovese_6","rev":"1-cb3146caa4ed3b042803ae184278cc1f"}
```

CouchDB says:

```
[root@prag or]# tail /var/log/couchdb/couch.log
[Mon, 29 Jun 2015 07:28:04 GMT] [debug] [<0.116.0>] OAuth Params: []
[Mon, 29 Jun 2015 07:28:04 GMT] [error] [<0.116.0>] attempted upload of invalid JSON (set log_level to debug to log it)
[Mon, 29 Jun 2015 07:28:04 GMT] [debug] [<0.116.0>] Invalid JSON: {{error,
    {85,
    "lexical error: invalid bytes in UTF8 string.\n"}},
    <<"{\"titel\":\"Pesto alla genovese\",\"zutaten\": [\"Basilikum\",\"Knoblauch\",\"Pinienkerne\",\"K�se\",\"�l\",\"Salz\",\"Pfeffer\"], \"garzeit\": 15 }">>}
[Mon, 29 Jun 2015 07:28:04 GMT] [info] [<0.116.0>] 127.0.0.1 - - PUT /kochrezepte/pesto_alla_genovese_7 400
[Mon, 29 Jun 2015 07:28:04 GMT] [debug] [<0.116.0>] httpd 400 error response:
{"error":"bad_request","reason":"invalid_json"}
```

So it looks like a unicode issue. Is my Python code wrong, or the requests lib?
Thank you
Olaf (from germany, augsburg)
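A stdlib-only way to take the body encoding out of the picture is to serialise and encode the document yourself before handing it to requests. This is a sketch of that workaround, not of requests internals; the `requests.put` call is shown commented out because it needs a live CouchDB:

```python
import json

doc = {"titel": "Pesto alla genovese",
       "zutaten": ["Basilikum", "Knoblauch", "Pinienkerne",
                   "Käse", "Öl", "Salz", "Pfeffer"],
       "garzeit": 15}

# ensure_ascii=False keeps the umlauts as real characters, and encoding
# the result explicitly removes any ambiguity about the wire encoding.
body = json.dumps(doc, ensure_ascii=False).encode("utf-8")

# requests.put(request_uri, auth=("olaf", "olaf"), data=body,
#              headers={"content-type": "application/json"})
```

Because `body` is already `bytes`, no library along the way can re-encode it, so CouchDB receives valid UTF-8.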
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2657/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2657/timeline
| null |
completed
| null | null | false |
[
"I think it's just that your request is wrong. When you use the `data` keyword we send the data as application/x-www-form-urlencoded data, not as JSON. Try using the `json` keyword, like this:\n\n``` python\nimport requests\nrequest_uri = \"http://127.0.0.1:5984/kochrezepte/pesto_alla_genovese_4\"\njson_doc = '{\"titel\":\"Pesto alla genovese\",\"zutaten\": [\"Basilikum\",\"Knoblauch\",\"Pinienkerne\",\"Käse\",\"Öl\",\"Salz\",\"Pfeffer\"], \"garzeit\": 15 }'\nrespon = requests.put(request_uri, auth=(\"olaf\", \"olaf\"), json=json_doc)\n```\n",
"Hm, yes it's work...\n\n> [or@prag ~]$ python3\n> Python 3.4.1 (default, Nov 3 2014, 14:38:10) \n> [GCC 4.9.1 20140930 (Red Hat 4.9.1-11)] on linux\n> Type \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n> \n> > > import requests\n> > > request_uri = \"http://127.0.0.1:5984/kochrezepte/pesto_alla_genovese_10\"\n> > > headers = {'content-type': 'application/json'}\n> > > json_doc = '{\"titel\":\"Pesto alla genovese\",\"zutaten\": [\"Basilikum\",\"Knoblauch\",\"Pinienkerne\",\"Käse\",\"Öl\",\"Salz\",\"Pfeffer\"], \"garzeit\": 15 }'\n> > > import json\n> > > respon = requests.put(request_uri, auth=(\"olaf\", \"olaf\"), json=json.loads(json_doc))\n> > > print(respon.text)\n> > > {\"ok\":true,\"id\":\"pesto_alla_genovese_10\",\"rev\":\"1-4367a12038ab5809395199002de7e76d\"}\n\nBut i can't not find the documentation of this cool feature. And if I look in the sources: https://github.com/kennethreitz/requests/blob/master/requests/api.py than I can only find \"data=None\". How could I find the?\n\nThank you\n\nOlaf\n",
"You can find it [here](https://github.com/kennethreitz/requests/blob/master/requests/api.py#L24). You can also find it [in the documentation](http://docs.python-requests.org/en/latest/user/quickstart/#more-complicated-post-requests).\n",
"Thank you. That is very helpful.\n"
] |
https://api.github.com/repos/psf/requests/issues/2656
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2656/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2656/comments
|
https://api.github.com/repos/psf/requests/issues/2656/events
|
https://github.com/psf/requests/pull/2656
| 91,671,789 |
MDExOlB1bGxSZXF1ZXN0Mzg3MzcyMzA=
| 2,656 |
Allow get_netrc_auth to raise parse/permission errors to caller
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-06-29T02:18:17Z
|
2021-09-08T07:00:53Z
|
2015-06-29T07:24:21Z
|
CONTRIBUTOR
|
resolved
|
If the netrc file exists but cannot be parsed or read, get_netrc_auth
silently fails.
Add a new argument `ignore_errors` which when set to False will cause
any parse/permission errors to be raised to the caller. The default
value is True, which means the default behavior is unchanged.
Fixes #2654
Change-Id: I7436aaaf593178673ab84fd9e7ab4bcb0e3fe75e
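The two behaviours can be exercised with the stdlib `netrc` module alone. The file contents below are a deliberately malformed example (the `fssdfs` token mirrors the "bad follower token" traceback quoted in the review comment):

```python
import os
import tempfile
from netrc import netrc, NetrcParseError

# Write a deliberately malformed netrc file: 'fssdfs' is not a valid
# follower token after a machine entry.
fd, path = tempfile.mkstemp(suffix=".netrc")
with os.fdopen(fd, "w") as f:
    f.write("machine example.com fssdfs\n")

# ignore_errors=True semantics: the caller swallows the failure and
# behaves as if no credentials exist. ignore_errors=False semantics:
# the NetrcParseError propagates, as in the traceback above.
try:
    netrc(path)
    parse_failed = False
except NetrcParseError as exc:
    parse_failed = True
    print("parse error raised:", exc)
finally:
    os.unlink(path)
```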
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2656/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2656/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2656.diff",
"html_url": "https://github.com/psf/requests/pull/2656",
"merged_at": "2015-06-29T07:24:21Z",
"patch_url": "https://github.com/psf/requests/pull/2656.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2656"
}
| true |
[
"With a badly formed netrc file, `netrc.netrc` fails:\n\n```\n>>> from netrc import netrc\n>>> netrc()\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/lib/python2.7/netrc.py\", line 32, in __init__\n self._parse(file, fp)\n File \"/usr/lib/python2.7/netrc.py\", line 94, in _parse\n file, lexer.lineno)\nnetrc.NetrcParseError: bad follower token 'fssdfs' (/home/user/.netrc, line 23)\n```\n\nDefault behavior of `requests.utils.get_netrc_auth` is to silently fail:\n\n```\n>>> from requests.utils import get_netrc_auth\n>>> get_netrc_auth(\"www.example.com\")\n```\n\nWith this patch, passing `ignore_errors=False` allows the parse error to be raised:\n\n```\n>>> get_netrc_auth(\"www.example.com\", ignore_errors=False)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"requests/utils.py\", line 101, in get_netrc_auth\n _netrc = netrc(netrc_path).authenticators(host)\n File \"/usr/lib/python2.7/netrc.py\", line 32, in __init__\n self._parse(file, fp)\n File \"/usr/lib/python2.7/netrc.py\", line 94, in _parse\n file, lexer.lineno)\nnetrc.NetrcParseError: bad follower token 'fssdfs' (/home/user/.netrc, line 23)\n>>> \n```\n",
"This is simple enough. Personally I'd rather have something like\n\n``` py\nfrom requests import utils\n\nutils.get_netrc_auth('www.example.com', raise_errors=True)\n```\n\nI've found people tend to become more confused when passing `False` as an override instead of `True`.\n",
"@sigmavirus24 thanks for the feedback. Added a new commit to change the logic as suggested.\n",
"Cool. LGTM. I'll let @Lukasa weigh in.\n\nThanks @dpursehouse \n",
"This LGTM.\n"
] |
https://api.github.com/repos/psf/requests/issues/2655
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2655/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2655/comments
|
https://api.github.com/repos/psf/requests/issues/2655/events
|
https://github.com/psf/requests/pull/2655
| 91,611,905 |
MDExOlB1bGxSZXF1ZXN0Mzg3Mjc1ODM=
| 2,655 |
Handle complex redirect URIs on Python 3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2015-06-28T15:58:44Z
|
2021-09-08T06:01:07Z
|
2015-09-01T08:32:21Z
|
MEMBER
|
resolved
|
Resolves #2653.
This is one of those annoying changes that's almost impossible to test because of the sheer complexity of our redirect handling code. This also doesn't make it any simpler, sadly.
As to what version we merge this into, I proposed it against the master branch. I don't think it belongs in 3.0.0 (`resolve_redirects` isn't really part of our public API), but it could definitely break people's stuff. Next minor release feels appropriate, but I'd like to hear what you think @sigmavirus24.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2655/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2655/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2655.diff",
"html_url": "https://github.com/psf/requests/pull/2655",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2655.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2655"
}
| true |
[
"The tests fail miserably on this :(\n\nOne way to fix the failing tests is to either change `unquote_unreserved` to call split with a `b'%'`, or to immediately decode the URL to `utf-8`, e.g.,\n\n``` py\nif is_py3 and isinstance(url, str):\n url = url.encode('latin1').decode('utf8')\n```\n\nPersonally, I'd like those to be on separate lines to make it easier to find errors in the future, but that's just me =P.\n\nPersonally, I'm a bit ... skiddish to include this in 2.8. I think 3.0 is the _safer_ option because we don't regularly test against internationalized domains. Then again, I have to wonder how many people are using requests on Python 3 with redirects that behave like this since this is the first report we've had like this in the 2+ years that we've had this code in place. Perhaps that makes it safe, but I'm not a fan of that logic. =/\n",
"Wouldn't it be better to file a bug report on Python 3? I always thought that Python 3 had some kind of \"UTF-8 everywhere where possible\" policy, so I wonder why they decode with ISO-8859-1.\n",
"> Wouldn't it be better to file a bug report on Python 3? I always thought that Python 3 had some kind of \"UTF-8 everywhere where possible\" policy, so I wonder why they decode with ISO-8859-1.\n\nNo that's not the policy. Even so, it doesn't make sense in the context of HTTP/1.1 where headers were (until the latest revision of the specification late last year) always supposed to be Latin-1 (a.k.a, ISO-8859-1). http.client is doing the right thing on Python 3 by returning text as was described in the original version of the HTTP/1.1 specification\n",
"Yeah, this definitely isn't _wrong_ for Python 3, it's trying to do the best it can, it just happens to be inconsistent. I'll think some more, I think I can restructure the code to get it to work properly.\n",
"Ok, I've fixed the broken tests. I agree with you though @sigmavirus24, I think this is scary enough to want to go into 3.0.0, so I'm closing this PR and opening a new one with the correct target.\n",
"Please see #2754 for the new version of this change.\n"
] |
https://api.github.com/repos/psf/requests/issues/2654
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2654/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2654/comments
|
https://api.github.com/repos/psf/requests/issues/2654/events
|
https://github.com/psf/requests/issues/2654
| 91,610,831 |
MDU6SXNzdWU5MTYxMDgzMQ==
| 2,654 |
utils.get_netrc_auth silently fails when netrc exists but fails to parse
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] |
{
"closed_at": "2015-10-12T10:32:06Z",
"closed_issues": 7,
"created_at": "2015-04-29T13:03:39Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/25",
"id": 1089203,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/25/labels",
"node_id": "MDk6TWlsZXN0b25lMTA4OTIwMw==",
"number": 25,
"open_issues": 0,
"state": "closed",
"title": "2.8.0",
"updated_at": "2015-10-12T10:32:06Z",
"url": "https://api.github.com/repos/psf/requests/milestones/25"
}
| 2 |
2015-06-28T15:44:22Z
|
2021-09-08T23:00:53Z
|
2015-06-29T07:24:21Z
|
CONTRIBUTOR
|
resolved
|
My .netrc contains a line for the github auth, [like this](https://gist.github.com/wikimatze/9790374).
It turns out that `netrc.netrc()` doesn't like that:
```
>>> from netrc import netrc
>>> netrc()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/netrc.py", line 35, in __init__
self._parse(file, fp, default_netrc)
File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/netrc.py", line 117, in _parse
file, lexer.lineno)
netrc.NetrcParseError: bad follower token 'protocol' (/Users/david/.netrc, line 9)
```
`get_netrc_auth` catches the `NetrcParseError` [but just ignores it](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L106).
At least having it emit a warning would have saved some hair-pulling.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2654/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2654/timeline
| null |
completed
| null | null | false |
[
"Yeah, this is tricky. You can see the reason we added this logic in #449.\n\nGiven that you're using the function explicitly, maybe the best option is to add a kwarg to `get_netrc_auth` that allows it to be used in an explodey way (when users like yourself want to be explicit), but that allows us to continue to use it in a not-explodey way.\n",
"Because this can be done in a backwards compatible way, I've tagged this optimistically for 2.8.0\n"
] |
https://api.github.com/repos/psf/requests/issues/2653
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2653/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2653/comments
|
https://api.github.com/repos/psf/requests/issues/2653/events
|
https://github.com/psf/requests/issues/2653
| 91,461,009 |
MDU6SXNzdWU5MTQ2MTAwOQ==
| 2,653 |
.htaccess redirect to non-ASCII folder does not work
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1542137?v=4",
"events_url": "https://api.github.com/users/gerritsangel/events{/privacy}",
"followers_url": "https://api.github.com/users/gerritsangel/followers",
"following_url": "https://api.github.com/users/gerritsangel/following{/other_user}",
"gists_url": "https://api.github.com/users/gerritsangel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gerritsangel",
"id": 1542137,
"login": "gerritsangel",
"node_id": "MDQ6VXNlcjE1NDIxMzc=",
"organizations_url": "https://api.github.com/users/gerritsangel/orgs",
"received_events_url": "https://api.github.com/users/gerritsangel/received_events",
"repos_url": "https://api.github.com/users/gerritsangel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gerritsangel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gerritsangel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gerritsangel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2015-06-27T12:51:51Z
|
2021-09-08T13:05:41Z
|
2016-12-09T21:17:02Z
|
NONE
|
resolved
|
Hello,
I have the following setup on a shared hoster:
- Apache 2.2.15
- A Japanese language .みんな (.minna; xn--q9jyb4c) IDN domain.
- A blog which is in the subfolder ブログ (blog)
- A redirect in the .htaccess file like this: `Redirect /index.html /ブログ/`
So I usually open the domain http://test.みんな and the server redirects to http://test.みんな/ブログ. This works fine in Firefox etc.
With requests, I get the following error (Python 3.4 with Requests 2.7.0 on a Japanese Ubuntu 15.04):
```
'<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">\n<html><head>\n<title>404 Not Found</title>\n</head><body>\n<h1>Not Found</h1>\n<p>The requested URL /ãÂ\x83Â\x96ãÂ\x83Â\xadãÂ\x82°/ was not found on this server.</p>\n<hr>\n<address>Apache/2.2.15 (CentOS) Server at test.xn--q9jyb4c Port 80</address>\n</body></html>\n'
```
So I guess the request lib gets a redirect from a server with Japanese characters, but then fails to convert the characters correctly. If I do `requests.get(http://test.みんな/ブログ)` directly it works, only the redirect does not.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2653/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2653/timeline
| null |
completed
| null | null | false |
[
"Is that error output from response.content or response.text? If it's text, can you show me response.content please?\n",
"Could you also show us\n\n``` py\nprint(response.request.url)\nprint(response.history)\nfor resp in response.history:\n print('---')\n print('Request URI: {}'.format(resp.request.url))\n print('Status: {}'.format(resp.status_code))\n print('Location: {}'.format(resp.headers['Location']))\n```\n",
"Hello, thanks for your answer.\n\n@Lukasa \nThis is both from response.text and from response.content:\n\n``` python\nIn [3]: r = requests.get(\"http://test.みんな\")\nIn [5]: r.text\nOut[5]: '<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\\n<html><head>\\n<title>404 Not Found</title>\\n</head><body>\\n<h1>Not Found</h1>\\n<p>The requested URL /ãÂ\\x83Â\\x96ãÂ\\x83Â\\xadãÂ\\x82°/ was not found on this server.</p>\\n<hr>\\n<address>Apache/2.2.15 (CentOS) Server at test.xn--q9jyb4c Port 80</address>\\n</body></html>\\n'\nIn [6]: r.content\nOut[6]: b'<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\\n<html><head>\\n<title>404 Not Found</title>\\n</head><body>\\n<h1>Not Found</h1>\\n<p>The requested URL /\\xc3\\xa3\\xc2\\x83\\xc2\\x96\\xc3\\xa3\\xc2\\x83\\xc2\\xad\\xc3\\xa3\\xc2\\x82\\xc2\\xb0/ was not found on this server.</p>\\n<hr>\\n<address>Apache/2.2.15 (CentOS) Server at test.xn--q9jyb4c Port 80</address>\\n</body></html>\\n'\n```\n\n@sigmavirus24 \n\n``` python\nIn [9]: r.request.url\nOut[9]: 'http://test.xn--q9jyb4c/%C3%A3%C2%83%C2%96%C3%A3%C2%83%C2%AD%C3%A3%C2%82%C2%B0/'\n\nIn [11]: r.history\nOut[11]: [<Response [302]>]\n\nIn [13]: for resp in r.history:\n ....: print('---')\n ....: print('Request URI: {}'.format(resp.request.url))\n ....: print('Status: {}'.format(resp.status_code))\n ....: print('Location: {}'.format(resp.headers['Location']))\n---\nRequest URI: http://test.xn--q9jyb4c/\nStatus: 302\nLocation: http://test.xn--q9jyb4c/ããã°/\n```\n\nMaybe it is also a problem that the Apache server sends the header in ISO-8859-1?\nBut this would likely be a problem of the shared hoster setup, I guess?\n\n``` python\nIn [21]: r.history[0].headers\nOut[21]: {'server': 'Apache/2.2.15 (CentOS)', 'content-type': 'text/html; charset=iso-8859-1', 'location': 'http://test.xn--q9jyb4c/ã\\x83\\x96ã\\x83\\xadã\\x82°/', 'content-length': '328', 'date': 'Sat, 27 Jun 2015 19:02:37 GMT', 'connection': 'close'}\n```\n\nI am sorry that I have obfuscated my original domain. Can I somehow privately send it to you? I would not really like to have it here on Github for all eternity, but of course I have no problem sending it directly to you so that you can check it out.\n",
"So it looks like the redirect URI is encoded in shift-JIS. Requests receives those bytes and puts them back on the wire. I wonder if we're hurting when we round trip.\n\nYou can mail me at cory [at] lukasa [dot] co [dot] uk.\n",
"Ok, this is a Python 3 bug. Everything works fine if I use Python 2. This is because on Python 2 the bytestring Location header is treated as a bytestring, which we turn into a bytestring URI, which we then correctly percent-encode and send back to urllib3.\n\nPython 3 doesn't work like that. If I use httplib directly:\n\n``` python\n>>> import http.client\n>>> c = http.client.HTTPConnection('変哲もない.みんな', 80)\n>>> c.request('GET', '/')\n>>> r = c.getresponse()\n>>> r.getheader('Location')\n'http://xn--n8jyd3c767qtje.xn--q9jyb4c/ã\\x83\\x96ã\\x83\\xadã\\x82°/'\n```\n\nNotice that this is a unicode string, but it's weirdly a Latin-1 encoded string. I think somewhere in our stack we're re-encoding this as UTF-8, where we should re-encode it as Latin-1.\n",
"Ok, this problem seems to boil down to our `requote_uri` function:\n\n``` python\n>>> url = 'http://xn--n8jyd3c767qtje.xn--q9jyb4c/ã\\x83\\x96ã\\x83\\xadã\\x82°/'\n>>> requote_uri(url)\n'http://xn--n8jyd3c767qtje.xn--q9jyb4c/%C3%A3%C2%83%C2%96%C3%A3%C2%83%C2%AD%C3%A3%C2%82%C2%B0/'\n```\n\nThis is the wrong URI: specifically, it has been treated as utf-8 and it should have been treated as latin-1.\n",
"The problem actually appears to be with passing the string directly to `urllib.parse.quote`:\n\n``` python\n>>> from urllib.parse import quote\n>>> >>> quote('/ã\\x83\\x96ã\\x83\\xadã\\x82°/')\n'/%C3%A3%C2%83%C2%96%C3%A3%C2%83%C2%AD%C3%A3%C2%82%C2%B0/'\n>>> quote('/ã\\x83\\x96ã\\x83\\xadã\\x82°/'.encode('latin1'))\n'/%E3%83%96%E3%83%AD%E3%82%B0/'\n```\n\nSo, my bet is that `quote` makes a UTF-8 assumption that is simply not valid.\n",
"Yup, just checked the code: that's exactly what it does.\n\nSo, this is a bit tricky now. I _think_ we want to round-trip through `Latin1`, but I have to work out where best to do it.\n",
"Or is it rather a problem of the Apache setup on my web hoster? Because it replies as ISO-8859-1, not as UTF-8.\n",
"No, I think Python screwed this up. Out of interest, if it's easy, can you set it to reply with UTF-8 and see if that changes anything?\n",
"Sorry, I guess not, I don't have root access. It is a shared hoster (albeit a really cool one; www.uberspace.de). \n",
"My suspicion is that Python's httplib is decoding the header as latin 1, which means if we re-encode with latin 1 we'll get exactly the bytes your server sent. I need to confirm that though. Time to dumpster dive through the code. ;)\n",
"Yup, httplib definitely decodes as 'iso-8859-1' on Python 3. Ok, we can do special case hellishness to fix this. =D\n",
"It's a little frustrating that these two parts of the stdlib are inconsistent, but there we go.\n",
"Ok, I believe #2655 contains a fix for this issue @TheHellstorm. Feel free to check. =)\n",
"The new fix is at #2754.\n",
"I think we can close this out as fixed on proposed/3.0.0 with #2754."
] |
https://api.github.com/repos/psf/requests/issues/2652
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2652/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2652/comments
|
https://api.github.com/repos/psf/requests/issues/2652/events
|
https://github.com/psf/requests/issues/2652
| 90,935,727 |
MDU6SXNzdWU5MDkzNTcyNw==
| 2,652 |
Documentation of verify parameter inconsistent with examples on stackoverflow
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2481775?v=4",
"events_url": "https://api.github.com/users/nmgeek/events{/privacy}",
"followers_url": "https://api.github.com/users/nmgeek/followers",
"following_url": "https://api.github.com/users/nmgeek/following{/other_user}",
"gists_url": "https://api.github.com/users/nmgeek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nmgeek",
"id": 2481775,
"login": "nmgeek",
"node_id": "MDQ6VXNlcjI0ODE3NzU=",
"organizations_url": "https://api.github.com/users/nmgeek/orgs",
"received_events_url": "https://api.github.com/users/nmgeek/received_events",
"repos_url": "https://api.github.com/users/nmgeek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nmgeek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nmgeek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nmgeek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-25T11:32:03Z
|
2021-09-08T22:00:51Z
|
2015-09-01T01:34:11Z
|
NONE
|
resolved
|
requests is a very nice replacement for ugly code you would write using urllib and urllib2. After converting my code to requests and testing extensively I found a glitch when packaging my code with pyinstaller (I guess the same thing happens with py2exe). When connecting to https sites requests apparently depends upon a certificate file which is hard to find in the release tree. (When you package python code with pyinstaller you must also package any required data files.)
Posted solutions to this problem show setting the 'verify' parameter for the post and get methods to the path to a cacert.pem file grabbed from the certifi python package. I tried this solution and it works. (I also tried the documented cert parameter pointing to a cert and key file pair used by my django server but that did not work.) One such answer is at http://stackoverflow.com/questions/10667960/python-requests-throwing-up-sslerror
This demonstrates that the verify parameter has undocumented functionality: it accepts more than the True and False values shown in the documentation.
Could you correct the documentation to document this functionality? And hopefully you could explain the difference between setting a pem file with the verify parameter vs the cert parameter.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2652/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2652/timeline
| null |
completed
| null | null | false |
[
"@nmgeek You may want to take a look at http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification where it is documented: 'You can pass verify the path to a CA_BUNDLE file with certificates of trusted CAs.'\n\nThis really could use an example though.\n",
"@t-8ch is right, this _is_ documented, just not very well. As to your point of `verify` vs `cert`, that's covered in the same section of the prose documentation that @t-8ch linked to.\n"
] |
https://api.github.com/repos/psf/requests/issues/2651
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2651/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2651/comments
|
https://api.github.com/repos/psf/requests/issues/2651/events
|
https://github.com/psf/requests/issues/2651
| 90,806,236 |
MDU6SXNzdWU5MDgwNjIzNg==
| 2,651 |
Cannot make URL query string with a parameter without a value
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1110390?v=4",
"events_url": "https://api.github.com/users/agilevic/events{/privacy}",
"followers_url": "https://api.github.com/users/agilevic/followers",
"following_url": "https://api.github.com/users/agilevic/following{/other_user}",
"gists_url": "https://api.github.com/users/agilevic/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/agilevic",
"id": 1110390,
"login": "agilevic",
"node_id": "MDQ6VXNlcjExMTAzOTA=",
"organizations_url": "https://api.github.com/users/agilevic/orgs",
"received_events_url": "https://api.github.com/users/agilevic/received_events",
"repos_url": "https://api.github.com/users/agilevic/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/agilevic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/agilevic/subscriptions",
"type": "User",
"url": "https://api.github.com/users/agilevic",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
},
{
"color": "1cff91",
"default": false,
"description": "",
"id": 860696300,
"name": "3.0",
"node_id": "MDU6TGFiZWw4NjA2OTYzMDA=",
"url": "https://api.github.com/repos/psf/requests/labels/3.0"
}
] |
closed
| true | null |
[] | null | 34 |
2015-06-24T23:35:06Z
|
2021-02-08T02:01:02Z
|
2015-11-05T17:59:07Z
|
NONE
|
resolved
|
A URL query string may contain a parameter which has no value, e.g. http://host/path/?foo or http://host/path/?a=1&foo. Currently Requests does not provide support for that.
```
In [68]: d
Out[68]: {'a': 1, 'foo': None}
In [69]: tl
Out[69]: [('a', 1), ('foo',)]
In [70]: RequestEncodingMixin._encode_params(d)
Out[70]: 'a=1'
In [71]: RequestEncodingMixin._encode_params(tl)
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-71-5d4dac855108> in <module>()
----> 1 RequestEncodingMixin._encode_params(tl)
/home/f557010/jpm/local/lib/python2.7/site-packages/requests/models.pyc in _encode_params(data)
87 elif hasattr(data, '__iter__'):
88 result = []
---> 89 for k, vs in to_key_val_list(data):
90 if isinstance(vs, basestring) or not hasattr(vs, '__iter__'):
91 vs = [vs]
ValueError: need more than 1 value to unpack
```
Expected:
```
'a=1&foo'
```
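
A workaround until this is supported natively: since `params` also accepts a pre-encoded string, you can build the query string yourself. This is a sketch in Python 3 terms; the helper name is made up and is not part of Requests:

```python
from urllib.parse import urlencode

def encode_allowing_bare_keys(pairs):
    """Encode (key, value) pairs; a None value becomes a bare key with no '='.

    Hypothetical helper, not part of Requests itself.
    """
    parts = []
    for key, value in pairs:
        if value is None:
            parts.append(key)  # emit the key alone, e.g. 'foo'
        else:
            parts.append(urlencode([(key, value)]))
    return "&".join(parts)

print(encode_allowing_bare_keys([("a", 1), ("foo", None)]))  # a=1&foo
```

The resulting string can then be passed straight through, e.g. `requests.get(url, params=encode_allowing_bare_keys(pairs))`.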
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 4,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/2651/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2651/timeline
| null |
completed
| null | null | false |
[
"I can see some value in this. For API reasons it could only ever work with the 'list of tuples' approach, but I'd be ok with us adding support for this. @sigmavirus24?\n",
"I don't think we use it (yet) but it seems that [urllib3 uses `urlencode`](https://github.com/kennethreitz/requests/blob/8b5e457b756b2ab4c02473f7a42c2e0201ecc7e9/requests/packages/urllib3/request.py#L80) and [so do we](https://github.com/kennethreitz/requests/blob/f5dacf84468ab7e0631cc61a3f1431a32e3e143c/requests/models.py#L97)\n\nIf we examine how that behaves, you can see that it doesn't like either of the proposed ways of working with this. `{'foo': None}` will \"work\" but does not do the right thing. This is probably why we avoided this before. That said, RFC 3986 has a very ... loose definition of the [query part](https://tools.ietf.org/html/rfc3986#section-3.4) of a URI, so in my opinion we _should_ handle it. That said, I'm not sure there's any tools to readily allow us to handle it. =/\n\n``` py\n>>> u = urlparse.urlparse('http://example.com/foo?bar')\n>>> u\nParseResult(scheme='http', netloc='example.com', path='/foo', params='', query='bar', fragment='')\n>>> urlparse.parse_qs(u.query)\n{}\n>>> urllib.urlencode({'foo': None})\n'foo=None'\n>>> urllib.urlencode([('foo',)])\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/Cellar/python/2.7.9/Frameworks/Python.framework/Versions/2.7/lib/python2.7/urllib.py\", line 1336, in urlencode\n for k, v in query:\nValueError: need more than 1 value to unpack\n```\n",
"Related: \"urlencode of a None value uses the string 'None'\" – https://bugs.python.org/issue18857\n",
"Give @piotr-dobrogost's input, @agilevic have you tried using the empty string as the value there? Does it work with your API server?\n",
"Bump. =)\n",
"``` py\n>>> import requests\n>>> r = requests.get('https://httpbin.org/get', params={'foo': ''})\n>>> r.request.url\n'https://httpbin.org/get?foo='\n```\n",
"It's not the same as having a parameter without a value. Yours renders the = sign after the parameter name. Some applications will work, but as a matter of providing a complete solution this exact case should be addressed. That urllib doesn't do it is of no significance. Requests does many things better than standard libraries for HTTP - that is its reason to exist.\n",
"@agilevic What would your proposed API design for this feature be?\n",
"Here is a crazy thought:\n\n```\n>>> import requests\n>>> r = requests.get('https://httpbin.org/get', params={'foo': None})\n>>> r.request.url\n'https://httpbin.org/get?foo'\n```\n\nThat is what I think should be happening.\n\nWhat is actually happening:\n\n```\n'https://httpbin.org/get'\n```\n\n:(\n",
"@frnhr So the reason that doesn't work with our API is that setting a key to `None` is the signal for \"please remove this key from the map\". We have that signal because some parameters can be persisted on a Session object itself, and users occasionally want to be able to suppress those parameters on a per-request basis.\n",
"I'm afraid I don't see what the session object has to do with this bit of API, sincerely. But ok, maybe`False` then, or even some `SpecialImportableObject` instead of `None`?\n\n> On 17 Sep 2016, at 07:47, Cory Benfield [email protected] wrote:\n> \n> @frnhr So the reason that doesn't work with our API is that setting a key to None is the signal for \"please remove this key from the map\". We have that signal because some parameters can be persisted on a Session object itself, and users occasionally want to be able to suppress those parameters on a per-request basis.\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub, or mute the thread.\n",
"@frnhr The `Session` API is relevant because the `requests.` API is built on top of the `Session` API: it's a subset of that functionality, a convenience wrapper.\n\nSo we certainly could do it, but I'm not sure to what extent it's worthwhile. When not using key-value mapping, you should just pass a string to the `params` field: `params=\"foo\"`.\n",
"What if we were to make a specific typed argument that could indicate to requests that the param is to be added without a value?\n\n```\n# defined somewhere in requests\nclass ValuelessParam(object):\n pass\n....\n....\n\nparams = {'foo': 'some_param', 'bar': requests.ValuelessParam()}\n\nrequests.get('http://something', params=params)\n\n# url should be 'http://something?foo=some_param&bar'\n```\n\nIts not None, its not a scalar constant... so it should be backwards compatible. Under the hood... we could check for that value type and specially append this parameter to the constructed url.\n",
"So while that will definitely work, I don't think that API will get past Kenneth.\n",
"Yeah, No. I'd rather support a built-in like `None`, and I'm not sure that's the best idea — but it could work well. `False` would not. \n",
"`None` won't work either: the codebase already gives it the meaning of \"unset the value set at the Session level.\n",
"That would be a pretty major change though, and I don't think it would benefit many people. Perhaps an empty tuple could be considered. (e.g. `(,)`.\n",
"Not sure if this was resolved, but I found this thread trying to do the exact thing of adding a key with out a value. \"QueryAll\" in my case, but I have had a number of causes in my RestAPI Automation to make use of this kind of function.\r\n\r\n",
"what happens if you pass `{'QueryAll': ''}`?",
"I get &QueryAll= at the end. \r\n",
"Looks like it's kinda up to the API on how it handles the open \"=\" then, PasswordState's API took it as long as it was at the end of my params list, if I moved it to the beginning it error-ed.\r\n",
"in 3.0, i think we can consider making emptry string not result in an =",
"That would be great :-D So far my Passwordstate project can move forwward with the QueryAll= format at the end of teh params string so I am back on track. I'll watch this thread :-D\r\n\r\nThanks Kenneth!\r\n",
"Have not heard or seen anything yet, but have not circled back around to see if any updates had been released.\r\n\r\nNick\r\n\r\nFrom: Alex Zagoro <[email protected]>\r\nSent: Saturday, September 22, 2018 10:19 AM\r\nTo: requests/requests <[email protected]>\r\nCc: Ellson, Nick <[email protected]>; Comment <[email protected]>\r\nSubject: Re: [requests/requests] Cannot make URL query string with a parameter without a value (#2651)\r\n\r\n\r\nheya, any updates on this?\r\n\r\n",
"I'm running into this problem at the moment as well. Has anyone else found any way around this problem? ",
"I believe there is still no solution in requests for this simple, but very common feature. weird.",
"Unfortunately facing the same issue.\r\n**Wouldn't it be a better pattern or option to be able to inject/pass an optional custom formatter/encoder or a specific flag for handling None behaviour?**\r\n\r\n@Lukasa You seem to know the codebase better than most of us, what do you think about it?\r\n\r\nWouldn't that resolve in a non-breaking-change, @kennethreitz? Defaults could be set to mimic 2.x behaviour.",
"It appears that this is still adding the = when an empty string is passed in. Time to bang out a work-around! ",
"So I'm guessing this is still a problem?",
"If you have parameters that have no value, you can list them as part of the url and any additional parameters will append with `&` instead of starting with `?`:\r\n```\r\nIn [3]: r = requests.get(\"http://www.google.com\", params={'test': 'true'})\r\n\r\nIn [4]: r.url\r\nOut[4]: 'http://www.google.com/?test=true'\r\n\r\nIn [5]: r = requests.get(\"http://www.google.com?test2\", params={'test': 'true'})\r\n\r\nIn [6]: r.url\r\nOut[6]: 'http://www.google.com/?test2&test=true'\r\n```"
] |
https://api.github.com/repos/psf/requests/issues/2650
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2650/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2650/comments
|
https://api.github.com/repos/psf/requests/issues/2650/events
|
https://github.com/psf/requests/pull/2650
| 90,304,864 |
MDExOlB1bGxSZXF1ZXN0MzgzMDU2MjQ=
| 2,650 |
Display content as part of HTTP error messages
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1921476?v=4",
"events_url": "https://api.github.com/users/kuldeeprishi/events{/privacy}",
"followers_url": "https://api.github.com/users/kuldeeprishi/followers",
"following_url": "https://api.github.com/users/kuldeeprishi/following{/other_user}",
"gists_url": "https://api.github.com/users/kuldeeprishi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kuldeeprishi",
"id": 1921476,
"login": "kuldeeprishi",
"node_id": "MDQ6VXNlcjE5MjE0NzY=",
"organizations_url": "https://api.github.com/users/kuldeeprishi/orgs",
"received_events_url": "https://api.github.com/users/kuldeeprishi/received_events",
"repos_url": "https://api.github.com/users/kuldeeprishi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kuldeeprishi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kuldeeprishi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kuldeeprishi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-23T06:46:48Z
|
2021-09-08T07:00:53Z
|
2015-06-23T07:05:43Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2650/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2650/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2650.diff",
"html_url": "https://github.com/psf/requests/pull/2650",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2650.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2650"
}
| true |
[
"Unlike #2648 I think this is a not a good idea.\n\nThe content of a message is potentially extremely large or unbounded. It should not appear in tracebacks. Additionally, if you've set `stream=True` because you're expecting a large response in any form then the body may not have been downloaded yet, in which case we will actually block behind obtaining the request body before we throw the exception. Generally I'd say an exception we _know_ we want to throw should not block behind I/O before being thrown.\n\nIf you want to log this information out yourself you can do: catch the exception from `raise_for_status` and then consume the content yourself. =)\n",
"Ok Thanks for the feedback :)\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2649
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2649/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2649/comments
|
https://api.github.com/repos/psf/requests/issues/2649/events
|
https://github.com/psf/requests/issues/2649
| 90,273,302 |
MDU6SXNzdWU5MDI3MzMwMg==
| 2,649 |
Requests seems to get stuck when used with futures.ThreadPoolExecutor
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10950287?v=4",
"events_url": "https://api.github.com/users/sezginriggs/events{/privacy}",
"followers_url": "https://api.github.com/users/sezginriggs/followers",
"following_url": "https://api.github.com/users/sezginriggs/following{/other_user}",
"gists_url": "https://api.github.com/users/sezginriggs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sezginriggs",
"id": 10950287,
"login": "sezginriggs",
"node_id": "MDQ6VXNlcjEwOTUwMjg3",
"organizations_url": "https://api.github.com/users/sezginriggs/orgs",
"received_events_url": "https://api.github.com/users/sezginriggs/received_events",
"repos_url": "https://api.github.com/users/sezginriggs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sezginriggs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sezginriggs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sezginriggs",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
}
] |
closed
| true | null |
[] | null | 24 |
2015-06-23T03:07:29Z
|
2021-09-07T00:06:25Z
|
2017-07-30T00:22:47Z
|
NONE
|
resolved
|
Hello,
I'm using Python 2.7.9 with futures (3.0.3) and requests (2.7.0) on Debian (also tested on Win8; the results are the same).
The problem is that Requests doesn't time out and gets stuck, so my threads never finish their jobs and stop processing the queue.
I'm trying to make a multi-threaded web crawler: I fetch to-be-crawled URLs from the frontier (which returns a JSON list of domains) and populate a queue with them.
After this I'm populating Thread Pool with the code below
```
while not url_queue.empty():
queue_data = url_queue.get()
task_pool.submit(processItem, queue_data)
```
In the processItem() function, I fetch the URL with get_data() and mark the queue item done with task_done().
My get_data() function is as follows:
```
def get_data(fqdn):
try:
response = requests.get("http://"+fqdn, headers=headers, allow_redirects=True, timeout=3)
if response.status_code == requests.codes.ok:
result = response.text
else:
result = ""
except requests.exceptions.RequestException as e:
print "ERROR OCCURED:"
print fqdn
print e.message
result = ""
return result
```
If I comment out get_data() in processItem(), all threads and the queue work fine. If I uncomment it, most requests work fine but some get stuck, and that affects the whole queue and script because queue.join() waits for the threads to complete their requests. I suspect it's a bug in the requests module, since everything works fine without calling get_data() and requests doesn't time out the GET request.
Any help will be greatly appreciated... Thank you very much..
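
An aside on the `timeout` argument (not a confirmed diagnosis of the hang): Requests also accepts a `(connect, read)` tuple, where the read timeout bounds each individual socket read. A sketch of the fetch function with an explicit tuple timeout — the values are illustrative and `headers` is omitted:

```python
import requests

def get_data(fqdn):
    # (connect timeout, read timeout) -- the read timeout applies to each
    # socket read, so a silent server cannot hang the worker thread forever.
    try:
        response = requests.get("http://" + fqdn, allow_redirects=True,
                                timeout=(3.05, 10))
        if response.status_code == requests.codes.ok:
            return response.text
        return ""
    except requests.exceptions.RequestException:
        return ""
```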
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/2649/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2649/timeline
| null |
completed
| null | null | false |
[
"Have you tried [requests-futures](/ross/requests-futures)? If not, could you reproduce this with requests-futures?\n",
"Additionally, is it possible for us to obtain a traceback of where we're getting stuck? I want to know if we're getting stuck during the connection phase (in which case it's our fault) or if we're getting stuck in the read phase (in which case it's httplib's fault).\n",
"@Lukasa I would like to provide traceback but how can I do that? I'm getting lots of exceptions because it's uses 200 threads. Generally like these but not in particular order.. And after it stucks and does nothing.\n\n---\n\n('Connection aborted.', gaierror(-2, 'Name or service not known'))\nERROR OCCURED:\nkarawanghosting.net\n('Connection aborted.', gaierror(-2, 'Name or service not known'))\nERROR OCCURED:\nbjjzk.net\nHTTPConnectionPool(host='bjjzk.net', port=80): Max retries exceeded with url: / (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x7f97dc0c1ad0>, 'Connection to bjjzk.net timed out. (connect timeout=3)'))\n\n---\n",
"Can you broaden your `except` statement to catch all exceptions of any kind, and then re-run your code? I want to see if anything isn't getting logged out.\n",
"It's weird, I changed \"except requests.exceptions.RequestException as e\" to \"except Exception as e:\" and now it seems working fine (at least still works for longer time than before)\n",
"So my theory is that if a thread dies because of an exception it can't get waited on.\n",
"After long time it stucked again and unfortunately doesn't give any different exception than many others. :( Do you have any other ideas to try?\n",
"```\n ('Connection aborted.', gaierror(-2, 'Name or service not known'))\n```\n\nIndicates a DNS issue.\n\n```\n HTTPConnectionPool(host='bjjzk.net', port=80): Max retries exceeded with url: / (Caused by ConnectTimeoutError(, 'Connection to bjjzk.net timed out. (connect timeout=3)'))\n```\n\nSounds like you're being ratelimited by `bjjzk.net`.\n\nHave you tried using [requests-futures](/ross/requests-futures)?\n",
"No I didn't tried requests-futures yet and it's not the only domain that I crawl. I'm crawling millions of domains and accessing them for only one time, so being ratelimited is not likely.\n\nNow I did a little change on get_data() function, declared \"result\" variable with empty string on top of try-catch block. I'm not sure is there any chance to pass try-catch block with or without any exception and returning null result but I wanted to try.\n",
"Same issue here.\n\nAlso developing a web crawler intended to process a continuous stream of URLs.\n\nMy code behaviour is something like the following:\n\n``` python\nfrom concurrent.futures import ThreadPoolExecutor\nimport logging\nimport random\nimport time\n\nimport requests\n\nNTHREADS = 2\nDELAY_SECONDS = 0.5\nURLS = ['https://google.com', 'http://yahoo.com', 'http://github.com', 'https://bing.com']\n\nlogging.basicConfig(format='%(asctime)s : %(levelname)s : %(message)s', level=logging.INFO)\n\ndef callback():\n    response = requests.get(random.choice(URLS), timeout=120)\n    logging.info('status_code=%d ok=%s', response.status_code, response.ok)\n\nwith ThreadPoolExecutor(NTHREADS) as executor:\n    while True:\n        time.sleep(DELAY_SECONDS) # do not hit the site too hard\n        queued_works = executor._work_queue.qsize()\n        logging.info('queued works: %s', queued_works)\n        if queued_works < 10: # do not flood executor's queue\n            executor.submit(callback)\n```\n\nI wasn't able to reproduce this very same error for this small list of URLs, but on my production environment (after some time running - let's say 2~3 hours), the log messages starts looking this:\n\n```\n2015-10-01 16:51:41,488 : INFO : queued works: 10\n2015-10-01 16:51:41,489 : INFO : queued works: 10\n2015-10-01 16:51:41,489 : INFO : queued works: 10\n2015-10-01 16:51:41,489 : INFO : queued works: 10\n2015-10-01 16:51:41,490 : INFO : queued works: 10\n2015-10-01 16:51:41,490 : INFO : queued works: 10\n2015-10-01 16:51:41,491 : INFO : queued works: 10\n2015-10-01 16:51:41,491 : INFO : queued works: 10\n2015-10-01 16:51:41,492 : INFO : queued works: 10\n2015-10-01 16:51:41,492 : INFO : queued works: 10\n2015-10-01 16:51:41,492 : INFO : queued works: 10\n2015-10-01 16:51:41,493 : INFO : queued works: 10\n....\n(and goes like this forever - like, not even a few days it would stop)\n```\n\nI checked ThreadPoolExecutor's [implementation](https://hg.python.org/cpython/file/3.5/Lib/concurrent/futures/thread.py) and I'm pretty convinced the problem is NOT related to it. The code just seems to get stuck on line 55:\n\n``` python\nresult = self.fn(*self.args, **self.kwargs)\n```\n\n**edit**: by \"the issue is not related to ThreadPoolExecutor\", I mean: it doesn't matter if callback() raises an exception or not; it's supposed to work just fine. The thing is that _WorkItem:run() method never stops.\n\n**edit 2**: python 2.7\n",
"@eltermann Can you please add timeouts to your requests call and see if the problem persists?\n",
"@Lukasa, this snippet doesn't have timeout, but my call does. Edited the snippet anyway.\n",
"@eltermann Interesting. It would be really insightful to try to get stacks from those threads.\n",
"@Lukasa, what do you recommend to print an useful stack? And where to place it?\n",
"Good question. Try [this](https://docs.python.org/3/library/faulthandler.html#dumping-the-traceback).\n",
"@sezginriggs so have you resolve the problem ? I am stuck with it also.\n",
"@metrue, I changed my approach to use processes instead of threads -- also, because I found that [python does not really parallelize threads execution because of GIL](https://www.google.com.br/search?q=python+does+not+really+parallelize+threads+execution+because+of+GIL).\n\nYou have two choices:\n1. you deal with multiprocesses yourself in your python code -- I recommend looking at how [scrapy](https://github.com/scrapy/scrapy) does the parallelization (even though it uses twisted under the hood and not requests)\n2. (and that's what I did) you write a simple \"stream-consumer\" python program and let something else do the parallelization (something like Kafka or Storm) -- then, you start multiple processes for your \"stream-consumer\" and voilá\n",
"@eltermann , I do know 'python does not really parallelize threads execution because of GIL'. So \bI am using ProcessPoolExecutor instead of ThreadPoolExecutor, But Still, requests also stucks.\n",
"If you are using a process pool executor you _must not_ use a Session that is shared across those processes. \n",
"@Lukasa \n\nRight, I realized that, I am not a Python expert, but I wonder what's the best practice of sharing data (let's say a global task queue) between those process ?\n",
"@metrue to maintain a thread-safe/multiprocess-safe queue, you can use the standard library's `Queue` implementation. If you're on Python 2\n\n``` py\nimport Queue\n\ntask_queue = Queue.Queue()\n```\n\nif you're on Python 3\n\n``` py\nimport queue\n\ntask_queue = queue.Queue()\n```\n",
"Have you fixed this problem?\r\n\r\nBecause I think, that I had and have the same problem: [example](http://stackoverflow.com/questions/40891497/threadpoolexecutor-requests-deadlock). Sometimes this example has a deadlock.\r\n",
"@antongulikov We have not. We are still missing a big chunk of debugging data as discussed earlier in the thread.",
"after you make the requests, make sure to kill the process. `driver.close` AND `driver.quit`. That should both keep your mem stable across all those requests and keep you from getting jobs stuck due to mem issues. "
] |
https://api.github.com/repos/psf/requests/issues/2648
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2648/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2648/comments
|
https://api.github.com/repos/psf/requests/issues/2648/events
|
https://github.com/psf/requests/pull/2648
| 90,205,820 |
MDExOlB1bGxSZXF1ZXN0MzgyNzMxNjE=
| 2,648 |
Display URL as part of HTTP error messages
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/305268?v=4",
"events_url": "https://api.github.com/users/msabramo/events{/privacy}",
"followers_url": "https://api.github.com/users/msabramo/followers",
"following_url": "https://api.github.com/users/msabramo/following{/other_user}",
"gists_url": "https://api.github.com/users/msabramo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/msabramo",
"id": 305268,
"login": "msabramo",
"node_id": "MDQ6VXNlcjMwNTI2OA==",
"organizations_url": "https://api.github.com/users/msabramo/orgs",
"received_events_url": "https://api.github.com/users/msabramo/received_events",
"repos_url": "https://api.github.com/users/msabramo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/msabramo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/msabramo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/msabramo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-22T20:33:10Z
|
2021-09-08T07:00:53Z
|
2015-06-22T21:16:28Z
|
CONTRIBUTOR
|
resolved
|
It seems convenient to include the URL in the error message in case you get an unexpected error.
E.g.:
```
In [1]: import requests
In [2]: resp = requests.get('http://www.google.com/eofdfdfdfdfd')
In [3]: resp
Out[3]: <Response [404]>
In [4]: resp.raise_for_status()
---------------------------------------------------------------------------
HTTPError Traceback (most recent call last)
<ipython-input-4-00e7077cfb5b> in <module>()
----> 1 resp.raise_for_status()
/Users/marca/dev/git-repos/requests/requests/models.py in raise_for_status(self)
835
836 if http_error_msg:
--> 837 raise HTTPError(http_error_msg, response=self)
838
839 def close(self):
HTTPError: 404 Client Error: Not Found for url: http://www.google.com/eofdfdfdfdfd
```
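
With the URL now part of the message, a typical consumption pattern is to catch the error and log both the message and the failing response. The helper below is illustrative; `raise_for_status()` and the `response` attribute on `HTTPError` are existing Requests API:

```python
import requests

def describe(resp):
    """Return 'ok', or a message naming the URL that failed."""
    try:
        resp.raise_for_status()
    except requests.exceptions.HTTPError as exc:
        # str(exc) already ends with '... for url: <url>'; the full
        # Response object is still available as exc.response.
        return "failed (%s): %s" % (exc.response.status_code, exc)
    return "ok"
```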
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/2648/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2648/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2648.diff",
"html_url": "https://github.com/psf/requests/pull/2648",
"merged_at": "2015-06-22T21:16:28Z",
"patch_url": "https://github.com/psf/requests/pull/2648.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2648"
}
| true |
[
"Seems reasonable enough to me. I'm happy to take this. @sigmavirus24?\n",
"Seems like a good UX improvement to me. :+1: \n"
] |
https://api.github.com/repos/psf/requests/issues/2647
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2647/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2647/comments
|
https://api.github.com/repos/psf/requests/issues/2647/events
|
https://github.com/psf/requests/pull/2647
| 90,021,155 |
MDExOlB1bGxSZXF1ZXN0MzgyMDI1MDU=
| 2,647 |
Update AUTHORS.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1743810?v=4",
"events_url": "https://api.github.com/users/neosab/events{/privacy}",
"followers_url": "https://api.github.com/users/neosab/followers",
"following_url": "https://api.github.com/users/neosab/following{/other_user}",
"gists_url": "https://api.github.com/users/neosab/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/neosab",
"id": 1743810,
"login": "neosab",
"node_id": "MDQ6VXNlcjE3NDM4MTA=",
"organizations_url": "https://api.github.com/users/neosab/orgs",
"received_events_url": "https://api.github.com/users/neosab/received_events",
"repos_url": "https://api.github.com/users/neosab/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/neosab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/neosab/subscriptions",
"type": "User",
"url": "https://api.github.com/users/neosab",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-06-22T06:09:58Z
|
2021-09-08T07:00:54Z
|
2015-06-22T07:08:49Z
|
NONE
|
resolved
|
For #2631
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2647/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2647/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2647.diff",
"html_url": "https://github.com/psf/requests/pull/2647",
"merged_at": "2015-06-22T07:08:49Z",
"patch_url": "https://github.com/psf/requests/pull/2647.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2647"
}
| true |
[
"\\o/ :cake: :sparkles: :cake:\n",
":joy: Thanks @Lukasa \n",
"WOOOT\n"
] |
https://api.github.com/repos/psf/requests/issues/2646
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2646/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2646/comments
|
https://api.github.com/repos/psf/requests/issues/2646/events
|
https://github.com/psf/requests/pull/2646
| 89,917,567 |
MDExOlB1bGxSZXF1ZXN0MzgxODQ5Mzg=
| 2,646 |
Add release notes for PR 2631
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-21T14:22:24Z
|
2021-09-08T07:00:54Z
|
2015-06-21T15:15:17Z
|
CONTRIBUTOR
|
resolved
|
/cc @neosab @Lukasa
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2646/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2646/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2646.diff",
"html_url": "https://github.com/psf/requests/pull/2646",
"merged_at": "2015-06-21T15:15:17Z",
"patch_url": "https://github.com/psf/requests/pull/2646.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2646"
}
| true |
[
":cake: :sparkles:\n",
"Thanks @sigmavirus24 \n"
] |
https://api.github.com/repos/psf/requests/issues/2645
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2645/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2645/comments
|
https://api.github.com/repos/psf/requests/issues/2645/events
|
https://github.com/psf/requests/issues/2645
| 89,629,370 |
MDU6SXNzdWU4OTYyOTM3MA==
| 2,645 |
[RFC] Change behaviour of Response.ok in 3.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 19 |
2015-06-19T18:29:16Z
|
2024-05-20T14:36:12Z
|
2024-05-20T14:36:12Z
|
CONTRIBUTOR
| null |
# Preamble
Please note that this a request for comments, not something that we will _definitely_ do in Requests.
# Problem
Right now we have an attribute defined on a Response object, `ok`. This attribute [currently](https://github.com/kennethreitz/requests/blob/9bbab338fdbb562b923ba2d8a80f0bfba697fa41/requests/models.py#L617..L623) calls `self.raise_for_status()` and catches the [exception](https://github.com/kennethreitz/requests/blob/9bbab338fdbb562b923ba2d8a80f0bfba697fa41/requests/models.py#L825..L837) raised. `raise_for_status` appropriately only raises an exception for status codes in the 4xx or 5xx range. This means that a Response with a status code in the 2xx and 3xx range will be "ok". The problem is that "ok" has a certain association in HTTP with 2xx responses, specifically 200 responses. This _may_ (I haven't looked to see if anyone has had problems with this) be misleading, especially if the user is combining their usage of the `ok` attribute with `allow_redirects=False`.
# Proposed Solution
Instead of using `raise_for_status` to determine the "ok-ness" of a response, we should compare the status code of the response directly. This will do two things:
1. It will narrow the definition of `Response.ok`
2. It will make using `Response.ok` faster. Currently we add a new stack, potentially throw and catch an exception, and then return. With a simple comparison, we could just do: `return 200 <= self.status_code < 300` which is much faster. This argument, however, doesn't hold much weight for me personally.
I'd really love @Lukasa and @kennethreitz's opinions here as well as anyone else willing to share their opinion publicly.
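
For concreteness, here is a minimal sketch of the narrowed property (hypothetical stand-in code, not the current requests implementation):

``` python
class Response:
    """Stripped-down stand-in for requests.models.Response."""

    def __init__(self, status_code):
        self.status_code = status_code

    @property
    def ok(self):
        # Proposed: a plain range comparison instead of calling
        # raise_for_status() and catching the exception it raises.
        return 200 <= self.status_code < 300


print(Response(204).ok)  # True
print(Response(302).ok)  # False under this proposal (currently True)
```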
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2645/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2645/timeline
| null |
completed
| null | null | false |
[
"I think that `ok` is just poorly named. I'd be more interested in having predicates that cover the major response ranges: `response.success` , `response.redirect`, `response.client_error`, `response.server_error`. We could then have overlap fields, like `response.error` (a superset of `client_error` and `server_error`).\n\nThat said, if we think that's too heavyweight, I'm still in favour of reducing the scope of `response.ok`.\n",
"Yeah I agree that `ok` is poorly named. I'm not sure a bunch of new attributes is the right way to do this, but I'm not against it. We already have `response.is_redirect()` which, to be fair, could/should be changed to an attribute.\n",
"Against a bunch of new attributes. I'm all in favor of people doing a direct comparison themselves.\n\nI think the current implementation is fine, and also allows for some flexibility when it comes to creative connection abstractions. Not sure if that's a good thing to do or not. \n\nPerhaps a better way to go about this would be to add a parameter to raise (e.g. `raise_for_status(inline=True)`, which would return True/False) .\n",
"Alternate proposal:\n\nStep 1: Heavily document the behaviour of the `ok` attribute, cache the result on the first run (like `content`)\n\nStep 2: Add to the status_codes module a sort of Python 3-esque range class. In Python 3, range returns a range object which allows you to easily do `200 in range(200, 300)` without iterative over the whole thing. We could do something similar in `status_codes` that would be cheap and work on Python 2 and 3 the exact same way (instead of creating a bunch of ~99 item lists at import.\n\nAlternatively, provide lists of the known status codes, e.g., `success = [200, 201, ..., 208, 226]` The latter is a bit more maintenance and the former is a bit magical, albeit very simple.\n",
"@sigmavirus24 that sounds great to me. \n\n`if response.status_code in requests.codes.success`\n",
"Love it.\n\nI propose renaming this issue to be clearer, and marking it 'contributor friendly'. It's a nice starting issue for someone. =)\n",
"Hi, I was playing with this issue and there's one thing I noticed. The response for the 3xx status codes isn't uniform. A GET with codes 301, 302, 303 and 307 ends up with a 200 response. The other ones receive the actual code (304/304 and so on). I tested this with httpbin/status/code_number.\n\nIf the issue is going to be solved this way, the success attribute must account for this behavior, right?\n",
"@Itwilloutliveyou I'm sorry, I don't think I fully understand what you're getting at here: can you elaborate or rephrase?\n",
"The solution proposed is to add a success attribute listing all the status codes that are considered ok (200, 201, 202, ..., 226). Then, we compare the status code of the response with the ones listed in success.\n\nWhat happens is that the response for a 301 redirect, for example, is a status code 200. The same happens for other status codes of the 3xx family. @sigmavirus24 came with the idea that one of the reasons for that change is to narrow the definition of Response.ok. If that's the case, wouldn't it be better to rethink the outcome of those 3xx cases?\n",
"I think @sigmavirus24's point is just that 3XX status codes should probably not be considered \"ok\": they aren't. But requests _always_ follows redirects, so the only way to _end up_ at a 3XX status code is if the 3XX code is not an unambiguous redirect: i.e., it does not have a `Location` header.\n\nThe reason your testing showed no following for 304 and friends is because 304 is not a defined HTTP status code, so httpbin was not generating a Location header for it. That means requests ends up with the 304 code and cannot follow it any further. @sigmavirus24 rightly proposes that that response should not be \"ok\": it is almost certainly not.\n",
"I see, if a 301 redirect ends up on a successful retrieval of some\nresource, then the final status is a 200 code.\n\nAlright, my attempt is probably right. Tonight I'll make the PR, hope\neverything is ok!\n\nThanks!\nAm 16.05.2016 04:10 schrieb \"Cory Benfield\" [email protected]:\n\n> I think @sigmavirus24 https://github.com/sigmavirus24's point is just\n> that 3XX status codes should probably not be considered \"ok\": they aren't.\n> But requests _always_ follows redirects, so the only way to _end up_ at a\n> 3XX status code is if the 3XX code is not an unambiguous redirect: i.e., it\n> does not have a Location header.\n> \n> The reason your testing showed no following for 304 and friends is because\n> 304 is not a defined HTTP status code, so httpbin was not generating a\n> Location header for it. That means requests ends up with the 304 code and\n> cannot follow it any further. @sigmavirus24\n> https://github.com/sigmavirus24 right proposes that that response\n> should not be \"ok\": it is almost certainly not.\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2645#issuecomment-219363390\n",
"@Lukasa when designed, 'ok' meant 'not an error'. 300 is not an error. I still think should be the case. \n\ne.g. '.ok' isnt OMG ALL IS PERFECT, it is NOTHING APPEARS IMMEDIATELY BROKEN\n",
"so, with that logic, a redirect is a perfectly valid and ok response.\n\nwe already have `is_redirect`, so a `success` version of that would prob be fine to add. but we shouldn't change the conceptual behavior of `ok`.\n",
"if only there was a 'maybe' boolean value\n",
"```\n>>> r.ok\n0.9\n```\n",
"What about:\n- `response.status.is_success()`\n- `response.status.is_redirect()`\n- ...\n- `int(response.status)` for converting to `response.status_code`.\n",
"> e.g. '.ok' isnt OMG ALL IS PERFECT, it is NOTHING APPEARS IMMEDIATELY BROKEN\r\n\r\nThat's what I thought it meant too, but that doesn't come across to me.\r\n\r\nJust to play Devil's Advocate on this topic and tie it in with the boolean issue-\r\n\r\nAs far as I'm concerned, unless it only signifies a HTTP 200 OK, every response that is *properly formed* is `ok` and should not evaluate in a boolean context to `not True`.\r\n\r\nIf I receive a properly formed 404 or 500, awesome. That is `ok` by me, because the response might have a redirect history or contain useful information in the headers or content. To me, a \"not ok\" response is one that is improperly formed, but hasn't raised an exception (yet).\r\n\r\nFor some context:\r\n\r\n* If `requests` processes a URL shortener that causes multiple redirects, but ends on a 4xx or 5xx page, the `response` is `not True` and `not ok` -- yet it has a rich history that gets totally blown away **unless** someone reads a lot of documentation.\r\n\r\n* If one encounters a response with 4xx with a body containing info about rate-limiting or authorization, it is `not True` and `not ok`, and that information also might be missed unless someone really reads the docs.\r\n\r\nThe current behavior and some of the above changes don't seem appropriate for `response.ok` as they would still be misleading to a lot of people; many of the above suggestions seem better implemented as `response.http_okay` and `response.http_okayish` -- where it is semantically clear that the http code is \"okayish\", as the http code could not be okay but the response itself is fine.",
"Honest to goodness I just consider this entire issue an excellent example of why `.ok` is an unsuccessful property name, and why responses should not be boolean. There are too many entirely reasonable reads of what a response being \"ok\" might mean.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/2644
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2644/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2644/comments
|
https://api.github.com/repos/psf/requests/issues/2644/events
|
https://github.com/psf/requests/issues/2644
| 89,478,843 |
MDU6SXNzdWU4OTQ3ODg0Mw==
| 2,644 |
incorrect status_code
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10746779?v=4",
"events_url": "https://api.github.com/users/jincept/events{/privacy}",
"followers_url": "https://api.github.com/users/jincept/followers",
"following_url": "https://api.github.com/users/jincept/following{/other_user}",
"gists_url": "https://api.github.com/users/jincept/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jincept",
"id": 10746779,
"login": "jincept",
"node_id": "MDQ6VXNlcjEwNzQ2Nzc5",
"organizations_url": "https://api.github.com/users/jincept/orgs",
"received_events_url": "https://api.github.com/users/jincept/received_events",
"repos_url": "https://api.github.com/users/jincept/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jincept/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jincept/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jincept",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-19T06:18:08Z
|
2021-09-08T23:00:54Z
|
2015-06-19T07:27:43Z
|
NONE
|
resolved
|
``` python
import requests
import time
import random

url = 'http://myserver.com'
n = 0
while True:
    resp = requests.get(url)
    now = time.time()
    n += 1
    print(n, resp, resp.status_code, time.ctime(now))
    print(resp.headers)
    r = random.randint(1, 5)
    time.sleep(r)
    print('\t\t\t slept', r, 'seconds')
```

Output:

```
1 <Response [200]> 200 Thu Jun 18 23:15:57 2015
{'pragma': 'no-cache', 'set-cookie': 'OutlookSession=d2b7de620d414931b8100369275493d4; path=/; secure; HttpOnly', 'expires': '-1', 'x-owa-version': '14.3.181.6', 'x-powered-by': 'ASP.NET', 'date': 'Fri, 19 Jun 2015 06:15:56 GMT', 'content-type': 'text/html; charset=utf-8', 'vary': 'Accept-Encoding', 'content-encoding': 'gzip', 'content-length': '3251', 'cache-control': 'no-cache, no-store'}
			 slept 5 seconds
2 <Response [200]> 200 Thu Jun 18 23:16:02 2015
{'pragma': 'no-cache', 'set-cookie': 'OutlookSession=9093d72589364d9894173a5ed0475a45; path=/; secure; HttpOnly', 'expires': '-1', 'x-owa-version': '14.3.181.6', 'x-powered-by': 'ASP.NET', 'date': 'Fri, 19 Jun 2015 06:16:01 GMT', 'content-type': 'text/html; charset=utf-8', 'vary': 'Accept-Encoding', 'content-encoding': 'gzip', 'content-length': '3251', 'cache-control': 'no-cache, no-store'}
.....
```

While the actual response code is 302, as seen in the wireshark capture:

```
HTTP/1.1 302 Object Moved\r\n
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2644/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2644/timeline
| null |
completed
| null | null | false |
[
"\n\n",
"This is not a bug.\n\nRequests will follow all redirects until the chain is complete. What has happened here is that you've been redirected from HTTP to HTTPS by the second 302, which means your wireshark filter does not show the next transaction. =)\n\nIf you print `resp.history` you'll see the full redirect chain we followed.\n"
] |
https://api.github.com/repos/psf/requests/issues/2643
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2643/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2643/comments
|
https://api.github.com/repos/psf/requests/issues/2643/events
|
https://github.com/psf/requests/issues/2643
| 88,726,381 |
MDU6SXNzdWU4ODcyNjM4MQ==
| 2,643 |
deflate failure on google appengine above requests 2.3.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/824379?v=4",
"events_url": "https://api.github.com/users/tarvitz/events{/privacy}",
"followers_url": "https://api.github.com/users/tarvitz/followers",
"following_url": "https://api.github.com/users/tarvitz/following{/other_user}",
"gists_url": "https://api.github.com/users/tarvitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tarvitz",
"id": 824379,
"login": "tarvitz",
"node_id": "MDQ6VXNlcjgyNDM3OQ==",
"organizations_url": "https://api.github.com/users/tarvitz/orgs",
"received_events_url": "https://api.github.com/users/tarvitz/received_events",
"repos_url": "https://api.github.com/users/tarvitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tarvitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tarvitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tarvitz",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| true | null |
[] | null | 3 |
2015-06-16T13:20:50Z
|
2021-09-08T09:00:46Z
|
2015-06-16T13:24:15Z
|
NONE
|
resolved
|
If a response is served with gzip and/or deflate content encoding (Content-Encoding: "gzip, deflate"), then requests on Google App Engine fails at this line:
packages/urllib3/response.py:440
``` python
if self._original_response and self._original_response._method.upper() == 'HEAD':
```
This happens because self._original_response is replaced by the Google SDK and has a different type from the standard httplib response objects.
gae:
``` python
<google.appengine.dist27.gae_override.httplib.HTTPResponse instance at 0x7f66f51cbf80>
```
origin:
``` python
<httplib.HTTPResponse instance at 0x7f59742b0638>
```
Moreover, original_response._method is not a str (it's actually an int).
This breaks content decompression: requests processes the request normally, but response.content ends up containing the binary (compressed) data.
Tested on 2.7.0 branch. 2.3.0 works well.
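To illustrate the failure mode, a hypothetical defensive rewrite of that check (not the actual urllib3 code): since GAE's httplib replacement stores the request method as an int, calling .upper() on it raises AttributeError, while coercing to str first tolerates both shapes.

``` python
# Hypothetical defensive variant: normalise the method before comparing,
# so an int _method (as set by GAE's httplib override) doesn't blow up.
def looks_like_head(method):
    return str(method).upper() == 'HEAD'


print(looks_like_head('head'))  # True
print(looks_like_head(3))       # False (no AttributeError)
```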
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2643/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2643/timeline
| null |
completed
| null | null | false |
[
"Sorry, this is a bug in [urllib3](https://github.com/shazow/urllib3), do you mind opening an issue for it on that repository?\n",
"This should be fixed when https://github.com/shazow/urllib3/pull/620 is merged and we pull it in for a new release.\n",
"np, thanks for advance :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2642
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2642/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2642/comments
|
https://api.github.com/repos/psf/requests/issues/2642/events
|
https://github.com/psf/requests/issues/2642
| 88,438,041 |
MDU6SXNzdWU4ODQzODA0MQ==
| 2,642 |
BadStatusLine on Server restart
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/414336?v=4",
"events_url": "https://api.github.com/users/guettli/events{/privacy}",
"followers_url": "https://api.github.com/users/guettli/followers",
"following_url": "https://api.github.com/users/guettli/following{/other_user}",
"gists_url": "https://api.github.com/users/guettli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/guettli",
"id": 414336,
"login": "guettli",
"node_id": "MDQ6VXNlcjQxNDMzNg==",
"organizations_url": "https://api.github.com/users/guettli/orgs",
"received_events_url": "https://api.github.com/users/guettli/received_events",
"repos_url": "https://api.github.com/users/guettli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/guettli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guettli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/guettli",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2015-06-15T14:09:52Z
|
2021-09-08T23:00:54Z
|
2015-06-18T12:27:49Z
|
NONE
|
resolved
|
Hi,
from time to time we get this error:
```
File "/home/foo_esm_di226/lib/python2.7/site-packages/requests/api.py", line 49, in request
response = session.request(method=method, url=url, **kwargs)
File "/home/foo_esm_di226/lib/python2.7/site-packages/requests/sessions.py", line 461, in request
resp = self.send(prep, **send_kwargs)
File "/home/foo_esm_di226/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/home/foo_esm_di226/lib/python2.7/site-packages/requests/adapters.py", line 415, in send
raise ConnectionError(err, request=request)
ConnectionError: ('Connection aborted.', BadStatusLine("''",))
```
This happens when the requests library tries to connect during an Apache server restart.
I think it would be nice if the requests library retried in this case.
But I guess you see it differently.
Would you accept a patch which retries on an empty HTTP status line?
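For illustration, one way a caller can retry idempotent requests on connection-level errors today, sketched with a stand-in fetch callable rather than a live server (this is application-side code, not part of requests):

``` python
import time


def get_with_retries(fetch, attempts=3, backoff=0.5):
    """Call `fetch` (any zero-argument callable performing an idempotent
    request) and retry with exponential backoff on connection-level
    failures, such as the ConnectionError wrapping BadStatusLine above."""
    for attempt in range(attempts):
        try:
            return fetch()
        except ConnectionError:
            if attempt == attempts - 1:
                raise
            time.sleep(backoff * (2 ** attempt))


# Simulate a server that drops the first two connections (e.g. while
# Apache restarts) and then recovers.
state = {'calls': 0}

def flaky():
    state['calls'] += 1
    if state['calls'] < 3:
        raise ConnectionError('Connection aborted.')
    return '200 OK'


print(get_with_retries(flaky))  # prints: 200 OK
```

In real code the same effect is usually achieved by mounting an HTTPAdapter configured with urllib3's Retry onto a Session, which keeps the retry policy out of the call sites.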
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/414336?v=4",
"events_url": "https://api.github.com/users/guettli/events{/privacy}",
"followers_url": "https://api.github.com/users/guettli/followers",
"following_url": "https://api.github.com/users/guettli/following{/other_user}",
"gists_url": "https://api.github.com/users/guettli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/guettli",
"id": 414336,
"login": "guettli",
"node_id": "MDQ6VXNlcjQxNDMzNg==",
"organizations_url": "https://api.github.com/users/guettli/orgs",
"received_events_url": "https://api.github.com/users/guettli/received_events",
"repos_url": "https://api.github.com/users/guettli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/guettli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guettli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/guettli",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2642/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2642/timeline
| null |
completed
| null | null | false |
[
"1. We don't retry most errors by default\n2. If you use a session, you can [configure retries on your own](http://www.coglib.com/~icordasc/blog/2014/12/retries-in-requests.html) without having to wait for a release of requests\n3. Not every `BadStatusLine` error is retry-able. In fact, for the most, a `BadStatusLine` can mean a very large number of things and if we retry those automatically for users it could end up in their request taking much longer than they intend. Timeouts would apply to each request in the retried chain as opposed to the whole chain so they wouldn't even be able to set a timeout to limit the execution of the retries without relying on a third-party library.\n4. I don't remember if urllib3's retries (which power requests' retries) will even work with a `BadStatusLine` error or if they need to be based around HTTP status codes\n",
"BadStatusLine usually indicates the request made it to the server, so it's\na tricky retry situation; you can generally retry idempotent requests (GET,\nPUT, DELETE) but not anything side effecting.\n\nThe \"Connection aborted\" message is a little misleading, the connection\nworked just fine I believe, the server just sent back the empty string\ninstead of an HTTP response..\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Mon, Jun 15, 2015 at 7:17 AM, Ian Cordasco [email protected]\nwrote:\n\n> 1. We don't retry most errors by default\n> 2. If you use a session, you can configure retries on your own\n> http://www.coglib.com/%7Eicordasc/blog/2014/12/retries-in-requests.html\n> without having to wait for a release of requests\n> 3. Not every BadStatusLine error is retry-able. In fact, for the most,\n> a BadStatusLine can mean a very large number of things and if we retry\n> those automatically for users it could end up in their request taking much\n> longer than they intend. Timeouts would apply to each request in the\n> retried chain as opposed to the whole chain so they wouldn't even be able\n> to set a timeout to limit the execution of the retries without relying on a\n> third-party library.\n> 4. I don't remember if urllib3's retries (which power requests'\n> retries) will even work with a BadStatusLine error or if they need to\n> be based around HTTP status codes\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2642#issuecomment-112085194\n> .\n",
"> The \"Connection aborted\" message is a little misleading, the connection worked just fine I believe\n\nI disagree. We rely on `httplib`/`http.client` still and in [some cases](https://hg.python.org/cpython/file/b2a37f0ef915/Lib/http/client.py#l264) it closes the connection before raising a `BadStatusLine`. We cannot always introspect whether it has so `Connection aborted` is fairly accurate.\n",
"`BadStatusLine` is extremely unclear. It can occur in a number of situations, and makes no guarantees about whether the request made it through. It just fundamentally means httplib failed to parse the header line.\n\nIn _this_ instance it's unlikely that the server \"sent back the empty string\" because that's not a thing you can meaningfully do in TCP (I guess you could try emitting a zero length packet? not sure if that's valid TCP). What this means is that the connection got closed, likely by the shutdown of the server, without any data left in the send buffer. That will have caused a zero-length read.\n\nA good example of why we _shouldn't_ blindly retry in this case is that plenty of servers will reject requests they don't like by just closing the connection, which will cause exactly this error. This kind of thing tells you nothing about whether the request succeeded. This means @kevinburke's advice is right (only retry idempotent requests), but not quite for the right reason.\n\n</pedantry>\n",
"What is your advice for me - a requests lib user?\n",
"My advice is that you should either follow [this blog post](http://www.coglib.com/~icordasc/blog/2014/12/retries-in-requests.html), or you should be catching exceptions around your requests and choosing to retry yourself.\n\nRequests cannot protect you from the network exploding underneath you. The best we can do is give you tools to help manage it, which that blog post does. =) Sadly, networks are unreliable beasts, and it's dangerous to pretend otherwise.\n",
"Do you have any advice on how to create a BadStatusLine in a test?\n\nI know how to use mocking, but I am not sure where to inject a matching \"raise ...\" line.\n\nThe exception in the real world looks like this:\n\n```\n\n File \"/home/foo_esm_d/src/foo/footests/test_foo.py\", line 35, in testAAA_need_authentication\n response=requests.get(url, verify=False)\n File \"/home/foo_esm_d/lib/python2.7/site-packages/requests/api.py\", line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"/home/foo_esm_d/lib/python2.7/site-packages/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/home/foo_esm_d/lib/python2.7/site-packages/requests/sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/foo_esm_d/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/home/foo_esm_d/lib/python2.7/site-packages/requests/adapters.py\", line 415, in send\n raise ConnectionError(err, request=request)\nConnectionError: ('Connection aborted.', BadStatusLine(\"''\",))\n```\n",
"Yup, it's easy. Have a socket server that does this (note `accept()` returns a `(connection, address)` tuple):\n\n``` python\nimport socket\n\ns = socket.socket()\ns.bind(('0.0.0.0', 8080))\ns.listen(5)\n\nwhile True:\n b, _ = s.accept()\n\n while True:\n data = b.recv(65535)\n if b'\\r\\n\\r\\n' in data:\n break\n\n b.close()\n```\n\nThen, hit the server with a call from requests. Should cause exactly the bug you're looking at.\n",
"On our continuous integration server several runs happen in parallel. It is likely that port 8080 is already in use. :-(\n\nIs there a way to use the requests lib (client part) via unix-domain sockets?\n",
"Port 8080 is not special, you can change it to whatever you want in your system. Alternatively, you can set it to zero which means 'any port', and then use `s.getsockname()[1]` to find out what port you were assigned.\n",
"@Lukasa thank you very much! I was not aware of the \"zero means any port\" method.\n\nFor the unix-sockets way I found this:\n\nhttps://pypi.python.org/pypi/requests-unixsocket/\n"
] |
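The retry setup these comments point to (the blog-post approach of mounting an `HTTPAdapter` on a `Session`) can be sketched roughly like this; the retry count and backoff value are illustrative choices, not recommendations:

``` python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry failed requests a few times with exponential backoff. As noted
# above, a BadStatusLine-style failure is only safely retriable for
# idempotent methods, so tune the Retry policy accordingly in real code.
retries = Retry(total=3, backoff_factor=0.5)
adapter = HTTPAdapter(max_retries=retries)

session = requests.Session()
session.mount("http://", adapter)
session.mount("https://", adapter)
```

Every request issued through `session` then goes through the mounted adapter, so the retry policy applies uniformly without touching individual call sites.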
https://api.github.com/repos/psf/requests/issues/2641
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2641/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2641/comments
|
https://api.github.com/repos/psf/requests/issues/2641/events
|
https://github.com/psf/requests/pull/2641
| 88,295,341 |
MDExOlB1bGxSZXF1ZXN0Mzc2NTQ2NDU=
| 2,641 |
make api.py support keepalive
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/548686?v=4",
"events_url": "https://api.github.com/users/duanhongyi/events{/privacy}",
"followers_url": "https://api.github.com/users/duanhongyi/followers",
"following_url": "https://api.github.com/users/duanhongyi/following{/other_user}",
"gists_url": "https://api.github.com/users/duanhongyi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/duanhongyi",
"id": 548686,
"login": "duanhongyi",
"node_id": "MDQ6VXNlcjU0ODY4Ng==",
"organizations_url": "https://api.github.com/users/duanhongyi/orgs",
"received_events_url": "https://api.github.com/users/duanhongyi/received_events",
"repos_url": "https://api.github.com/users/duanhongyi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/duanhongyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/duanhongyi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/duanhongyi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-06-15T02:41:51Z
|
2021-09-08T07:00:55Z
|
2015-06-15T09:21:23Z
|
NONE
|
resolved
|
Use an object pool, making api.py support for keepalive
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/548686?v=4",
"events_url": "https://api.github.com/users/duanhongyi/events{/privacy}",
"followers_url": "https://api.github.com/users/duanhongyi/followers",
"following_url": "https://api.github.com/users/duanhongyi/following{/other_user}",
"gists_url": "https://api.github.com/users/duanhongyi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/duanhongyi",
"id": 548686,
"login": "duanhongyi",
"node_id": "MDQ6VXNlcjU0ODY4Ng==",
"organizations_url": "https://api.github.com/users/duanhongyi/orgs",
"received_events_url": "https://api.github.com/users/duanhongyi/received_events",
"repos_url": "https://api.github.com/users/duanhongyi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/duanhongyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/duanhongyi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/duanhongyi",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2641/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2641/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2641.diff",
"html_url": "https://github.com/psf/requests/pull/2641",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2641.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2641"
}
| true |
[
"Thanks for this @duanhongyi!\n\nHowever, I'm :-1: on this idea. My primary objection is that this adds implicit global state. Generally speaking I don't think libraries should maintain any internal state at all except what is truly necessary for their function. Wherever possible we should provide 'state objects' to users that the user has control over, ensuring that users own the lifetimes of objects: we already do this.\n\nI think having a pool of sessions in the background is a bad idea. It'll lead to bug reports from users who aren't expecting it and it'll lead to complaints from users who don't like the memory profile it causes. Most importantly, it makes it _very very difficult_ to obtain reproducible behaviour out of the top-level API, because what exactly happens on a given request is actually dependent on the entire lifetime of the program up until that point. `requests.get()` may or may not work depending on whether there are cookies present in the particular `Session` you're using. It may or may not work depending on whether the remote server supports keepalive connections (a surprising number don't handle it well).\n\nI'll let @sigmavirus24 express an opinion as well, but I'm sorry, I doubt we'll merge this.\n",
"@Lukasa \n\nI agree with you, but the global shared socket connection, which is common in programming languages such as Java, is also practical. and perhaps adapters.py is more appropriate for doing this.\n",
"Any user who wants it can easily achieve a very similar effect:\n\n```\nfrom requests import session\nrequests = session()  # a module-level Session standing in for requests.get etc.\n```\n\nThis gives the implicit shared pool that everyone wants. Bear in mind the session API is a strict superset of the `api.py` API, so it should be a simple replacement.\n",
"I agree with @Lukasa. This will also likely negatively affect things like grequests which build on the api module's design. I also don't think we want this in `requests/adapters.py`.\n"
] |
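The session-based alternative suggested in this review can be sketched as follows; the commented-out calls would perform real network I/O, and the URL is only illustrative:

``` python
import requests

# A Session owns a urllib3 connection pool, so consecutive requests to
# the same host reuse TCP (and TLS) connections. That is the keepalive
# behaviour this pull request tried to add to the module-level API, but
# with an object lifetime the caller controls explicitly.
session = requests.Session()
# resp_a = session.get("https://example.org/")  # opens a connection
# resp_b = session.get("https://example.org/")  # reuses it
```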
https://api.github.com/repos/psf/requests/issues/2640
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2640/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2640/comments
|
https://api.github.com/repos/psf/requests/issues/2640/events
|
https://github.com/psf/requests/pull/2640
| 87,951,537 |
MDExOlB1bGxSZXF1ZXN0Mzc2MjA3Njg=
| 2,640 |
Avoid double releasing chunked upload connections
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-06-13T07:11:39Z
|
2021-09-08T07:00:55Z
|
2015-06-13T13:54:40Z
|
MEMBER
|
resolved
|
Resolves #2636
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2640/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2640/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2640.diff",
"html_url": "https://github.com/psf/requests/pull/2640",
"merged_at": "2015-06-13T13:54:40Z",
"patch_url": "https://github.com/psf/requests/pull/2640.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2640"
}
| true |
[
":sparkles: :cake: :sparkles: \n"
] |
https://api.github.com/repos/psf/requests/issues/2639
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2639/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2639/comments
|
https://api.github.com/repos/psf/requests/issues/2639/events
|
https://github.com/psf/requests/issues/2639
| 87,837,829 |
MDU6SXNzdWU4NzgzNzgyOQ==
| 2,639 |
Traceback when filename contains non-ascii characters
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4114154?v=4",
"events_url": "https://api.github.com/users/zhaoguixu/events{/privacy}",
"followers_url": "https://api.github.com/users/zhaoguixu/followers",
"following_url": "https://api.github.com/users/zhaoguixu/following{/other_user}",
"gists_url": "https://api.github.com/users/zhaoguixu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhaoguixu",
"id": 4114154,
"login": "zhaoguixu",
"node_id": "MDQ6VXNlcjQxMTQxNTQ=",
"organizations_url": "https://api.github.com/users/zhaoguixu/orgs",
"received_events_url": "https://api.github.com/users/zhaoguixu/received_events",
"repos_url": "https://api.github.com/users/zhaoguixu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhaoguixu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhaoguixu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhaoguixu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-06-12T20:15:01Z
|
2021-09-08T18:00:54Z
|
2016-04-16T04:13:54Z
|
NONE
|
resolved
|
A simple test case:
``` python
request = requests.Request(method='GET', url='https://httpbin.org/get')
request.files = {'f': ('føø', '\xff')}
prepared = request.prepare()
requests.Session().send(prepared)
```
```
prepared = request.prepare()
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 249, in prepare
hooks=self.hooks,
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 296, in prepare
self.prepare_body(data, files)
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 434, in prepare_body
(body, content_type) = self._encode_files(files, data)
File "/usr/lib/python2.7/dist-packages/requests/models.py", line 148, in _encode_files
rf.make_multipart(content_type=ft)
File "/usr/lib/python2.7/dist-packages/urllib3/fields.py", line 176, in make_multipart
self.headers['Content-Disposition'] += '; '.join(['', self._render_parts((('name', self._name), ('filename', self._filename)))])
File "/usr/lib/python2.7/dist-packages/urllib3/fields.py", line 139, in _render_parts
parts.append(self._render_part(name, value))
File "/usr/lib/python2.7/dist-packages/urllib3/fields.py", line 119, in _render_part
return format_header_param(name, value)
File "/usr/lib/python2.7/dist-packages/urllib3/fields.py", line 44, in format_header_param
result.encode('ascii')
UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 11: ordinal not in range(128)
```
Python 2.x's terrible unicode behaviour implicitly casts the byte string to unicode: `result` contains non-ASCII characters, so it is decoded with the ASCII codec before the `encode` call even runs. Catching only `UnicodeEncodeError` in `format_header_param` is therefore not enough.
The API specification of this function says the value should be a unicode string, but it is a fairly low-level function. Requests seems to ignore this and lets user-supplied non-ASCII byte strings flow in, resulting in this traceback.
``` python
def format_header_param(name, value):
"""
Helper function to format and quote a single header parameter.
Particularly useful for header parameters which might contain
non-ASCII values, like file names. This follows RFC 2231, as
suggested by RFC 2388 Section 4.4.
:param name:
The name of the parameter, a string expected to be ASCII only.
:param value:
The value of the parameter, provided as a unicode string.
"""
if not any(ch in value for ch in '"\\\r\n'):
result = '%s="%s"' % (name, value)
try:
result.encode('ascii')
except UnicodeEncodeError:
pass
else:
return result
if not six.PY3: # Python 2:
value = value.encode('utf-8')
value = email.utils.encode_rfc2231(value, 'utf-8')
value = '%s*=%s' % (name, value)
return value
```
Thanks,
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2639/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2639/timeline
| null |
completed
| null | null | false |
[
"Hmm, this one is really tricky.\n\nI think the best fix here is actually in urllib3: we should catch `UnicodeDecodeError`. However, if we catch it, we should return `result`. The rationale is that the user has already provided us with a bytestring, so they presumably think they know what they're doing. @shazow, thoughts?\n",
"Does this work if the filename is properly encoded in unicode to begin with? If there is no pre-formatted input that works, then yes that should be fixed in urllib3.\n",
"Yes, it works when `request.files = {'f': (u'føø', u'\\xff')}`\n",
"Hmmm, so my general rule for urllib3, as a library-level module, is garbage-in-garbage-out. I try to avoid converting between bytes/unicode whenever it's direct input, though sometimes it's necessary if it's indirect. I can be convinced otherwise though.\n",
"> I try to avoid converting between bytes/unicode whenever it's direct input\n\nThe problem is that's not true of this function. This function explicitly does exactly that. My proposal is that when it can't, it should just let things keep going.\n",
"Oh woops didn't even realize format_header_param was in urllib3, thought it was a requests thing. In that case, yesh +1.\n",
"Will be fixed when we next pull in a new version of urllib3.\n"
] |
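As the comments note, the crash does not occur when the filename is passed as unicode. A minimal sketch of the working variant (request preparation only, no network traffic; the URL and field name are illustrative):

``` python
import requests

# With a unicode filename, urllib3's format_header_param can encode the
# Content-Disposition value (RFC 2231 style), so prepare() succeeds.
request = requests.Request(
    method='POST',
    url='https://httpbin.org/post',
    files={'f': (u'f\u00f8\u00f8', b'\xff')},
)
prepared = request.prepare()
```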
https://api.github.com/repos/psf/requests/issues/2638
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2638/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2638/comments
|
https://api.github.com/repos/psf/requests/issues/2638/events
|
https://github.com/psf/requests/issues/2638
| 87,745,606 |
MDU6SXNzdWU4Nzc0NTYwNg==
| 2,638 |
Behaviors are different when data is a list(dict) and text
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4114154?v=4",
"events_url": "https://api.github.com/users/zhaoguixu/events{/privacy}",
"followers_url": "https://api.github.com/users/zhaoguixu/followers",
"following_url": "https://api.github.com/users/zhaoguixu/following{/other_user}",
"gists_url": "https://api.github.com/users/zhaoguixu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhaoguixu",
"id": 4114154,
"login": "zhaoguixu",
"node_id": "MDQ6VXNlcjQxMTQxNTQ=",
"organizations_url": "https://api.github.com/users/zhaoguixu/orgs",
"received_events_url": "https://api.github.com/users/zhaoguixu/received_events",
"repos_url": "https://api.github.com/users/zhaoguixu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhaoguixu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhaoguixu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhaoguixu",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 5 |
2015-06-12T14:25:11Z
|
2015-06-12T20:36:12Z
| null |
NONE
| null |
When data is unicode, it gives a traceback.
A simple test case:
``` python
request = requests.Request(method='GET', url='https://httpbin.org/get')
request.data = u'x=føø'
prepared = request.prepare()
requests.Session().send(prepared)
```
```
Traceback (most recent call last):
...
requests.Session().send(prepared)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 566, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 331, in send
timeout=timeout
File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 558, in urlopen
body=body, headers=headers)
File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 383, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python2.7/httplib.py", line 975, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.7/httplib.py", line 1009, in _send_request
self.endheaders(body)
File "/usr/lib/python2.7/httplib.py", line 971, in endheaders
self._send_output(message_body)
File "/usr/lib/python2.7/httplib.py", line 835, in _send_output
self.send(message_body)
File "/usr/lib/python2.7/httplib.py", line 805, in send
self.sock.sendall(data)
File "/usr/lib/python2.7/ssl.py", line 329, in sendall
v = self.send(data[count:])
File "/usr/lib/python2.7/ssl.py", line 298, in send
v = self._sslobj.write(data)
UnicodeEncodeError: 'ascii' codec can't encode characters in position 3-4: ordinal not in range(128)
```
When data is a list, it works fine.
``` python
request = requests.Request(method='GET', url='https://httpbin.org/get')
request.data = [(u'x', u'føø')]
prepared = request.prepare()
requests.Session().send(prepared)
```
Requests appears to support unicode data, as shown in the code below. However, it does not work well, and the behavior differs between list-like and text-like data: one branch encodes values as UTF-8 while the other simply passes the data through (in `_encode_params`).
``` python
def prepare_body(self, data, files, json=None):
...
if files:
(body, content_type) = self._encode_files(files, data)
else:
if data and json is None:
body = self._encode_params(data) #===> check data
if isinstance(data, basestring) or hasattr(data, 'read'):
content_type = None
else:
content_type = 'application/x-www-form-urlencoded'
...
@staticmethod
def _encode_params(data):
if isinstance(data, (str, bytes)): #===> allows unicode
return data #===> simply return
elif hasattr(data, 'read'):
return data
elif hasattr(data, '__iter__'): #===> the behavior is different.
result = []
for k, vs in to_key_val_list(data):
if isinstance(vs, basestring) or not hasattr(vs, '__iter__'):
vs = [vs]
for v in vs:
if v is not None:
result.append(
(k.encode('utf-8') if isinstance(k, str) else k,
v.encode('utf-8') if isinstance(v, str) else v)) #===>you try to encode as 'utf8'
return urlencode(result, doseq=True)
else:
return data
```
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2638/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2638/timeline
| null | null | null | null | false |
[
"Yeah, agreed, this is a bug. We shouldn't just return, we should chuck the data through `to_native_str` first.\n\nHowever, that change is backward incompatible, so we should add this into 3.0.0.\n",
"So, I don't think that belongs in the `_encode_params` method though. That's mostly there to create an `application/x-www-form-urlencoded` body. The fact that it sees the data doesn't make it the right place, unless we change it's purpose. But that's more for people who are looking to pick this up.\n",
"So when `requests.params = u'x=føø'` contains non-ASCII characters, `_encode_params` seems buggy. I find that requests crashes easily whenever it meets non-ASCII characters. Perhaps it would be better if test coverage of this were higher. I think I have found another traceback related to this; I will raise a new issue for it. Thanks for the response.\n",
"The real problem is that mixing unicode and non-unicode data is a bad idea. Requests will work best if you encode your unicode data before you pass it to us, so we don't have to guess what you want.\n\nWith that said, if there are problems at our interface, we should know about them\n",
"However, requests gives the illusion of supporting both unicode and non-unicode data containing non-ASCII characters, but to what extent we don't know, and the specification is fuzzy. Indeed, requests is a great project and I really appreciate it. Hope it keeps getting better :100: Thanks, you guys.\n"
] |
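The maintainers' advice here, encode your unicode data before passing it in so requests does not have to guess, can be sketched like this (request preparation only, no network I/O):

``` python
import requests

# Encoding up front removes the ambiguity: requests receives bytes and
# _encode_params passes them through unchanged.
payload = u'x=f\u00f8\u00f8'.encode('utf-8')
request = requests.Request(method='POST',
                           url='https://httpbin.org/post',
                           data=payload)
prepared = request.prepare()
```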
https://api.github.com/repos/psf/requests/issues/2637
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2637/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2637/comments
|
https://api.github.com/repos/psf/requests/issues/2637/events
|
https://github.com/psf/requests/issues/2637
| 87,615,951 |
MDU6SXNzdWU4NzYxNTk1MQ==
| 2,637 |
Requests does not support encoding headers to ascii?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4114154?v=4",
"events_url": "https://api.github.com/users/zhaoguixu/events{/privacy}",
"followers_url": "https://api.github.com/users/zhaoguixu/followers",
"following_url": "https://api.github.com/users/zhaoguixu/following{/other_user}",
"gists_url": "https://api.github.com/users/zhaoguixu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhaoguixu",
"id": 4114154,
"login": "zhaoguixu",
"node_id": "MDQ6VXNlcjQxMTQxNTQ=",
"organizations_url": "https://api.github.com/users/zhaoguixu/orgs",
"received_events_url": "https://api.github.com/users/zhaoguixu/received_events",
"repos_url": "https://api.github.com/users/zhaoguixu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhaoguixu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhaoguixu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhaoguixu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-06-12T05:42:05Z
|
2021-09-08T23:00:56Z
|
2015-06-12T12:59:32Z
|
NONE
|
resolved
|
A simple test case:
``` python
request = requests.Request(method='GET', url='https://httpbin.org/get')
request.headers = {u'referer': u'http://xx.com?x=føø'}
prepared = request.prepare()
requests.Session().send(prepared)
```
```
Traceback (most recent call last):
...
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 566, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 331, in send
timeout=timeout
File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 558, in urlopen
body=body, headers=headers)
File "/usr/lib/python2.7/dist-packages/urllib3/connectionpool.py", line 383, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python2.7/httplib.py", line 975, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.7/httplib.py", line 1008, in _send_request
self.putheader(hdr, value)
File "/usr/lib/python2.7/httplib.py", line 955, in putheader
hdr = '%s: %s' % (header, '\r\n\t'.join([str(v) for v in values]))
UnicodeEncodeError: 'ascii' codec can't encode characters in position 17-18: ordinal not in range(128)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2637/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2637/timeline
| null |
completed
| null | null | false |
[
"Requests does not encode header values for you because we don't know what encoding you want to send. You'll need to pick an encoding for the header value yourself and encode it.\n"
] |
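One way to follow that advice is to percent-encode the header value before building the request. This sketch uses Python 3's `urllib.parse.quote` (on Python 2 the equivalent is `urllib.quote` on a UTF-8 encoded byte string); the choice of `safe` characters is a judgment call:

``` python
from urllib.parse import quote

# Percent-encode the non-ASCII characters so the header value is pure
# ASCII before httplib sees it; quote() emits UTF-8 escapes by default.
referer = u'http://xx.com?x=f\u00f8\u00f8'
ascii_referer = quote(referer, safe=':/?=&')
# headers = {'referer': ascii_referer}  # now safe to pass to requests
```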
https://api.github.com/repos/psf/requests/issues/2636
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2636/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2636/comments
|
https://api.github.com/repos/psf/requests/issues/2636/events
|
https://github.com/psf/requests/issues/2636
| 87,552,543 |
MDU6SXNzdWU4NzU1MjU0Mw==
| 2,636 |
Same HTTPConnection object returned to pool twice
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/450406?v=4",
"events_url": "https://api.github.com/users/gilesbrown/events{/privacy}",
"followers_url": "https://api.github.com/users/gilesbrown/followers",
"following_url": "https://api.github.com/users/gilesbrown/following{/other_user}",
"gists_url": "https://api.github.com/users/gilesbrown/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gilesbrown",
"id": 450406,
"login": "gilesbrown",
"node_id": "MDQ6VXNlcjQ1MDQwNg==",
"organizations_url": "https://api.github.com/users/gilesbrown/orgs",
"received_events_url": "https://api.github.com/users/gilesbrown/received_events",
"repos_url": "https://api.github.com/users/gilesbrown/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gilesbrown/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gilesbrown/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gilesbrown",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-06-12T00:34:08Z
|
2021-09-08T23:00:56Z
|
2015-06-13T13:54:40Z
|
NONE
|
resolved
|
Not totally sure how to reproduce this yet but I am definitely getting a case where I am getting the "Connection pool is full" message because the same connection object is being returned to the pool twice (and not surprisingly the second time the pool says that it is full).
I am using gevent (yes I realize that is probably crucial), but even so ...
The first time the connection gets returned is in requests.adapters:
410 else:
411 # All is well, return the connection to the pool.
412 conn._put_conn(low_conn)
The second time the connection gets returned is in requests.packages.urllib3.response:
285 finally:
286 if self._original_response and self._original_response.isclosed():
287 self.release_conn()
I'd love to be able to supply a simple example, but the code path is a little too convoluted for me to work out quickly, so I'm looking for hints/tips here if you have any.
Thanks,
Giles
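As the comments below note, the bug is triggered by a chunked upload. A minimal sketch of how to produce that code path without sending anything (the URL is a placeholder; requests switches to chunked transfer encoding when the body is a generator of unknown length):

```python
import requests

# A generator body has no known length, so requests sets
# Transfer-Encoding: chunked when preparing the request -- the code
# path the report above describes. Nothing is sent on the network here.
def gen():
    yield b"first chunk"
    yield b"second chunk"

req = requests.Request("PUT", "http://example.invalid/upload", data=gen())
prepared = req.prepare()
```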
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2636/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2636/timeline
| null |
completed
| null | null | false |
[
"I've seen this and wasn't able to come up with a simple case to reproduce it. /cc @shazow since we discussed it\n",
"Maybe we need a mutex around our release_conn() innards.\n",
"Actually it turns out that the bug is super easy to reproduce.\n\nhttps://gist.github.com/gilesbrown/3832fc29100213171968\n\nIt doesn't require gevent, all it requires is doing a chunked PUT (or POST etc).\n",
"Interesting. This line of code looks fishy to me:\n\n``` python\n# All is well, return the connection to the pool.\nconn._put_conn(low_conn)\n```\n\nWe do that internally, despite having set `preload_content=False`. We may well then consume the content elsewhere.\n\nI think that's just wrong, really, and it opens us up to bugs.\n",
"And, quite expectedly, commenting out that block causes everything to function properly. I just think that's wrong.\n",
"It should be noted that [I was the idiot](https://github.com/kennethreitz/requests/commit/31c0962e83c8f3b043b3faceff880f69a9a4dbdc) who put that release in the code in the first place, nearly two years ago. Past me is a jerk. =P\n",
"@Lukasa I did always like future-you better than past-you.\n",
"Well it's good that you're able to fix your fishy past so easily.\n"
] |
https://api.github.com/repos/psf/requests/issues/2635
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2635/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2635/comments
|
https://api.github.com/repos/psf/requests/issues/2635/events
|
https://github.com/psf/requests/issues/2635
| 87,113,389 |
MDU6SXNzdWU4NzExMzM4OQ==
| 2,635 |
With HTTPResponse body set to string, "ValueError: Unable to determine whether fp is closed"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3417501?v=4",
"events_url": "https://api.github.com/users/xEtherealx/events{/privacy}",
"followers_url": "https://api.github.com/users/xEtherealx/followers",
"following_url": "https://api.github.com/users/xEtherealx/following{/other_user}",
"gists_url": "https://api.github.com/users/xEtherealx/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xEtherealx",
"id": 3417501,
"login": "xEtherealx",
"node_id": "MDQ6VXNlcjM0MTc1MDE=",
"organizations_url": "https://api.github.com/users/xEtherealx/orgs",
"received_events_url": "https://api.github.com/users/xEtherealx/received_events",
"repos_url": "https://api.github.com/users/xEtherealx/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xEtherealx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xEtherealx/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xEtherealx",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2015-06-10T21:27:54Z
|
2021-09-08T23:00:55Z
|
2015-06-10T21:41:12Z
|
NONE
|
resolved
|
I'm doing something like
```
headers = {'a dict': 'test'}
body = 'some string'
http_resp = HTTPResponse(body=body,
                         headers=headers,
                         status=202,
                         version=0,
                         reason='Ok',
                         decode_content=False)
response = Response()
# Fallback to None if there's no status_code, for whatever reason.
response.status_code = getattr(http_resp, 'status', None)
response.encoding = get_encoding_from_headers(headers)
response.raw = http_resp
response.reason = response.raw.reason
response.url = url
# Hack to avoid ValueError on fp check (bug?)
response._content = response.raw._body
print response.content
```
The hack portion is required because I get a ValueError otherwise in the stream method:
```
while not is_fp_closed(self._fp)
```
Seems like there needs to be an exit on a stringish body.
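Following the resolution in the comments below, a sketch of building a standalone `requests.Response` without the hack. The key points, per the discussion, are passing a bytes file object (not a `str`) as the body and setting `preload_content=False` so the body is not consumed during construction:

```python
import io
import requests
from urllib3 import HTTPResponse

# Back a requests.Response with a urllib3 HTTPResponse wrapping an
# in-memory bytes buffer. preload_content=False leaves the body unread
# so requests can stream it; the result is cached on first access of
# response.content, so repeated reads work.
raw = HTTPResponse(
    body=io.BytesIO(b"some bytes"),
    headers={"Content-Type": "text/plain"},
    status=202,
    preload_content=False,
)
response = requests.Response()
response.raw = raw
response.status_code = raw.status
```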
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2635/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2635/timeline
| null |
completed
| null | null | false |
[
"You're mucking around in urllib3 internals. This should be filed over at [urllib3](/shazow/urllib3).\n",
"As an aside, if you're mucking around with urllib3 responses for testing purposes, take a look at how it's done in [betamax](/sigmavirus24/betamax).\n",
"I wouldn't call it mucking, but I think it's questionable whether requests should be calling the stream method when there's no file-like content... regardless of whether the is_fp_closed method in urllib is buggy or not.\n",
"Rather than provide urllib3 objects in this case, requests should let you just provide a file-like object as the underlying response. That should work absolutely fine.\n",
"To add context, I'm wrapping a third-party proxy-like service as a drop-in (ish) replacement for requests. In order to do so, I have to fork over a Response object.\n\nMy first thought was to wrap my string in a StringIO object; but when I do this, requests doesn't handle it properly either. My observations in this case are:\n- the HTTPResponse._body attribute is correctly set\n- Response.content returns None\n\nIf I recall, Response tries to re-use HTTPResponse._fp, but the file-like object's contents have already been read upon initialization of the HTTPResponse and therefore no longer exist.\n\nOn a second look, the additional call to read() happens within HTTPResponse as a result of the call to HHTPResponse.stream(). This is obviously a separate issue, but I guess the root issue here is that I can't seem to find a way to use the requests.Response object in a standalone fashion.\n",
"Do you mind taking a look at betamax (linked above) to see if that provides you some useful guidance?\n",
"Hint: `StringIO` is fundamentally wrong to be used as the fileobject for an `HTTPResponse` since the response will always be returning bytes. \n",
"Can you make a specific suggestion? I'd be happy to try it. I will check\nout betamax but haven't had a chance to do so yet.\nEdit: I see that betamax uses io.BytesIO -- I tried this with the same result as StringIO.\n\nOn Thu, Jun 11, 2015 at 7:59 AM, Ian Cordasco [email protected]\nwrote:\n\n> Hint: StringIO is fundamentally wrong to be used as the fileobject for an\n> HTTPResponse since the response will always be returning bytes.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2635#issuecomment-111164193\n> .\n",
"Ok, so I checked out betamax and it looks like what you're doing is setting preload_content=False to avoid the initial read on the file-like body. I guess that makes sense since an HTTPResponse is a file-like object, but is there a way to allow multiple sequential reads of Response.content?\n",
"If the `Response` object is from requests and is unaltered, it will cache the content so sequential accesses of the same response's content attribute should not be a problem. Unless I'm misunderstanding what you're asking.\n",
"That works, thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/2634
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2634/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2634/comments
|
https://api.github.com/repos/psf/requests/issues/2634/events
|
https://github.com/psf/requests/issues/2634
| 86,976,192 |
MDU6SXNzdWU4Njk3NjE5Mg==
| 2,634 |
Handle upgrade to WebSocket (Feature Request)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/273727?v=4",
"events_url": "https://api.github.com/users/marians/events{/privacy}",
"followers_url": "https://api.github.com/users/marians/followers",
"following_url": "https://api.github.com/users/marians/following{/other_user}",
"gists_url": "https://api.github.com/users/marians/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/marians",
"id": 273727,
"login": "marians",
"node_id": "MDQ6VXNlcjI3MzcyNw==",
"organizations_url": "https://api.github.com/users/marians/orgs",
"received_events_url": "https://api.github.com/users/marians/received_events",
"repos_url": "https://api.github.com/users/marians/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/marians/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/marians/subscriptions",
"type": "User",
"url": "https://api.github.com/users/marians",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-06-10T13:43:57Z
|
2021-09-08T23:00:56Z
|
2015-06-12T12:59:22Z
|
NONE
|
resolved
|
It would be awesome if there were a way to handle the response for a request that upgrades to a WebSocket connection, for example:
``` python
url = "https://host.domain.com/mypath"
headers = {
"Authorization": "myrealm <my-auth-token>",
"Connection": "upgrade",
"Upgrade": "websocket",
"Sec-Websocket-version": "13",
"Sec-Websocket-key": "<some-challenge-key>",
}
payload = {
"some_key": ['some', 'values']
}
def handler(message):
print(message)
response = requests.post(url, headers=headers, json=payload, websocket_callback=handler)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2634/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2634/timeline
| null |
completed
| null | null | false |
[
"I'm afraid that we're extremely unlikely to add support for this at the requests level in this form. What you'd have to do is use `stream=True` and then grab the socket out of the response object at a very low level.\n",
"I have to agree with @Lukasa. This is not on the roadmap for future versions of requests at this point.\n",
"All right, fair enough.\n"
] |
https://api.github.com/repos/psf/requests/issues/2633
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2633/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2633/comments
|
https://api.github.com/repos/psf/requests/issues/2633/events
|
https://github.com/psf/requests/issues/2633
| 86,547,006 |
MDU6SXNzdWU4NjU0NzAwNg==
| 2,633 |
Does requests support url anchor
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3670110?v=4",
"events_url": "https://api.github.com/users/kaizengliu/events{/privacy}",
"followers_url": "https://api.github.com/users/kaizengliu/followers",
"following_url": "https://api.github.com/users/kaizengliu/following{/other_user}",
"gists_url": "https://api.github.com/users/kaizengliu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kaizengliu",
"id": 3670110,
"login": "kaizengliu",
"node_id": "MDQ6VXNlcjM2NzAxMTA=",
"organizations_url": "https://api.github.com/users/kaizengliu/orgs",
"received_events_url": "https://api.github.com/users/kaizengliu/received_events",
"repos_url": "https://api.github.com/users/kaizengliu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kaizengliu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kaizengliu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kaizengliu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-06-09T10:34:50Z
|
2021-09-08T23:00:57Z
|
2015-06-09T13:03:06Z
|
NONE
|
resolved
|
Can the URL contain both parameters and an anchor, like http://xx.com?a=1&&b=2#here?
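As the maintainers demonstrate in the comments below, the fragment survives preparation even when requests encodes the query parameters. A minimal check (no request is sent; the URL is the example from the thread):

```python
import requests

# Prepare (without sending) a request whose URL carries a fragment
# while the query string is supplied via params; requests appends the
# encoded query before the fragment and keeps the fragment intact.
req = requests.Request(method="GET",
                       url="http://example.com/foo#bar",
                       params={"a": "b"})
prepared = req.prepare()
```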
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2633/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2633/timeline
| null |
completed
| null | null | false |
[
"It absolutely can, but you have to build the URL yourself. If you ask requests to encode the parameters for you (using params) then we'll throw the anchor away.\n\nIt seems like a reasonable request to keep hold of that anchor: @sigmavirus24?\n",
"I'm confused about what's being discussed:\n\n``` py\n>>> req = requests.Request(method='GET', url='http://xx.com?a=1&&b=2#here')\n>>> p = req.prepare()\n>>> p.url\n'http://xx.com/?a=1&&b=2#here'\n```\n\nIndicates that the URL is not affected. Are we talking about something like:\n\n``` py\nrequests.get('http://xx.com', params='a=1&&b=2#here')\n```\n\nThat shouldn't work because we expect params to be a dictionary\n\n``` py\nrequests.get('http://xx.com', params={'a': '1', 'b': '2#here'})\n```\n\nIn this case, we'd have to do a lot of pre-processing on the params to make sure the value with `#here` is at the end and not encoded... which may not be what people want.\n\nBut I'm probably completely missing the point.\n",
"The thing you _might_ expect to work is this:\n\n``` python\nrequests.get('http://hostname/path#fragment', params={'param': 'value'})\n```\n\nThat does not work.\n\nAll the other things function correctly.\n",
"Yeah, I'm confused:\n\n``` py\n>>> import requests\n>>> r = requests.Request(method='GET', url='http://example.com/foo#bar', params={'a': 'b'})\n>>> p = r.prepare()\n>>> p\n<PreparedRequest [GET]>\n>>> p.url\n'http://example.com/foo?a=b#bar'\n>>> r = requests.get('https://httpbin.org/get#fragment', params={'a': 'b'})\n>>> r.json()\n{u'origin': u'127.0.0.1', u'headers': {u'Host': u'httpbin.org', u'Accept-Encoding': u'gzip, deflate', u'Accept': u'*/*', u'User-Agent': u'python-requests/2.7.0 CPython/2.7.9 Darwin/14.1.0'}, u'args': {u'a': u'b'}, u'url': u'https://httpbin.org/get?a=b'}\n>>> r.url\nu'https://httpbin.org/get?a=b#fragment'\n```\n",
"Ah crap, I'm an idiot. I looked at the JSON response, forgetting that the fragment doesn't get transmitted to the server. Duh.\n\nThis all works totally fine!\n",
"@kaizengliu Please do not use the _bug_ tracker to ask questions. Our question forum is located [here](https://stackoverflow.com/questions/tagged/python-requests).\n",
"I got it. It can work in the way like this\n\n```\nrequests.Request(method='GET', url='http://example.com/foo#bar', params={'a': 'b'})\n```\n\nThank you very much. And I am sorry, I will ask question in [here](https://stackoverflow.com/questions/tagged/python-requests)\n"
] |
https://api.github.com/repos/psf/requests/issues/2632
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2632/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2632/comments
|
https://api.github.com/repos/psf/requests/issues/2632/events
|
https://github.com/psf/requests/issues/2632
| 85,812,645 |
MDU6SXNzdWU4NTgxMjY0NQ==
| 2,632 |
Partial content and compression : decoding error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2565212?v=4",
"events_url": "https://api.github.com/users/netheosgithub/events{/privacy}",
"followers_url": "https://api.github.com/users/netheosgithub/followers",
"following_url": "https://api.github.com/users/netheosgithub/following{/other_user}",
"gists_url": "https://api.github.com/users/netheosgithub/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/netheosgithub",
"id": 2565212,
"login": "netheosgithub",
"node_id": "MDQ6VXNlcjI1NjUyMTI=",
"organizations_url": "https://api.github.com/users/netheosgithub/orgs",
"received_events_url": "https://api.github.com/users/netheosgithub/received_events",
"repos_url": "https://api.github.com/users/netheosgithub/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/netheosgithub/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/netheosgithub/subscriptions",
"type": "User",
"url": "https://api.github.com/users/netheosgithub",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-06-06T21:11:26Z
|
2021-09-08T23:00:57Z
|
2015-06-06T21:24:32Z
|
NONE
|
resolved
|
By default requests generates HTTP requests with an 'Accept-Encoding: gzip, deflate' header.
If one wants to GET only the last 5 bytes of a file, adding a 'Range: -5' header should do the trick. However, the remote server may honor the Accept-Encoding and return the last 5 bytes of the gzipped content (instead of the gzip of the last 5 bytes, as one may think).
This is the difference between the Accept-Encoding and transfer-encoding 'TE' headers. 'Accept-Encoding: gzip' is for fetching the compressed entity; 'TE: gzip' indicates to the next proxy that we support a response with a 'Transfer-Encoding: gzip' header. In both cases the response body is compressed data; however, if a Range header specifies partial content, the difference is _when_ compression is applied:
- before slicing for Accept-Encoding
- after slicing for TE
[however this is theoretical, and not really supported by web servers and proxies]
In my case, the bytes received cannot be decoded and urllib3 raises this error:
File ".../site-packages/requests/models.py", line 678, in generate
raise ContentDecodingError(e)
requests.exceptions.ContentDecodingError: ('Received response with content-encoding: gzip, but failed to decode it.', error('Error -3 while decompressing: incorrect header check',))
The error is probably in urllib3.
Some requests questions:
Is it possible to disable compression?
Is it possible to get the raw gzipped response (without any decoding other than chunked)?
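Per the answer in the comments below, setting a header's value to None removes the session default, so no Accept-Encoding is sent at all. A sketch that verifies this at prepare time without touching the network (URL and Range value are placeholders):

```python
import requests

# Merging a header whose value is None removes the session's default,
# so the prepared request carries no Accept-Encoding header and the
# server cannot legitimately return compressed data.
session = requests.Session()
req = requests.Request("GET", "http://example.invalid/file",
                       headers={"Accept-Encoding": None,
                                "Range": "bytes=-5"})
prepared = session.prepare_request(req)
```

For the second question, with `stream=True` the undecoded body can then be read via `response.raw.read(decode_content=False)`, as the comments note.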
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2632/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2632/timeline
| null |
completed
| null | null | false |
[
"> Is it possible to disable compression ?\n\nYes. Set `headers={'Accept-Encoding': None}`.\n\n> Is it possible to get the raw gzipped response (without other decoding than chunked) ?\n\nYes. If you set `stream=True`, you can then call `response.raw.read(decode_content=False)`\n",
"In the future, please ask questions on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n"
] |
https://api.github.com/repos/psf/requests/issues/2631
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2631/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2631/comments
|
https://api.github.com/repos/psf/requests/issues/2631/events
|
https://github.com/psf/requests/pull/2631
| 85,444,835 |
MDExOlB1bGxSZXF1ZXN0MzcwMDMzNTY=
| 2,631 |
Handle empty chunks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1743810?v=4",
"events_url": "https://api.github.com/users/neosab/events{/privacy}",
"followers_url": "https://api.github.com/users/neosab/followers",
"following_url": "https://api.github.com/users/neosab/following{/other_user}",
"gists_url": "https://api.github.com/users/neosab/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/neosab",
"id": 1743810,
"login": "neosab",
"node_id": "MDQ6VXNlcjE3NDM4MTA=",
"organizations_url": "https://api.github.com/users/neosab/orgs",
"received_events_url": "https://api.github.com/users/neosab/received_events",
"repos_url": "https://api.github.com/users/neosab/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/neosab/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/neosab/subscriptions",
"type": "User",
"url": "https://api.github.com/users/neosab",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-06-05T06:05:45Z
|
2021-09-08T07:00:54Z
|
2015-06-21T14:22:55Z
|
NONE
|
resolved
|
An empty chunk in the request body could prematurely signal the end of a chunked
transmission. As a result, the terminating zero-size chunk sent by
'requests' can be interpreted as a bad request by the recipient. I
have used the same logic used by httplib to handle such cases.
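A sketch of the httplib-style framing the description refers to (this is an illustration of the technique, not the patch itself): empty chunks are skipped so the only zero-length chunk on the wire is the final terminator.

```python
# Frame an iterable of byte chunks for chunked transfer encoding.
# Empty chunks are skipped, because a zero-length chunk is the
# end-of-body marker and must appear exactly once, at the end.
def frame_chunks(chunks):
    for chunk in chunks:
        if not chunk:
            continue  # would otherwise read as the terminator
        yield b"%x\r\n" % len(chunk) + chunk + b"\r\n"
    yield b"0\r\n\r\n"
```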
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2631/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2631/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2631.diff",
"html_url": "https://github.com/psf/requests/pull/2631",
"merged_at": "2015-06-21T14:22:55Z",
"patch_url": "https://github.com/psf/requests/pull/2631.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2631"
}
| true |
[
"Resubmitted https://github.com/kennethreitz/requests/pull/2388 on proposed/3.0.0 branch.\n",
":+1: LGTM\n",
"LGTM as well. =) We should note this change somewhere before we merge it and lose it.\n",
"Changes are noted in https://github.com/kennethreitz/requests/pull/2646. Merging. :sparkles: :cake: :sparkles: \n",
"Thanks @neosab \n"
] |