url (string, 50–53) | repository_url (string, 1 class) | labels_url (string, 64–67) | comments_url (string, 59–62) | events_url (string, 57–60) | html_url (string, 38–43) | id (int64, 597k–2.65B) | node_id (string, 18–32) | number (int64, 1–6.83k) | title (string, 1–296) | user (dict) | labels (list, 0–5) | state (string, 2 classes) | locked (bool, 2 classes) | assignee (dict) | assignees (list, 0–4) | milestone (dict) | comments (int64, 0–211) | created_at (string, 20) | updated_at (string, 20) | closed_at (string, 20, nullable) | author_association (string, 3 classes) | active_lock_reason (string, 4 classes) | body (string, 0–65.6k, nullable) | closed_by (dict) | reactions (dict) | timeline_url (string, 59–62) | performed_via_github_app (null) | state_reason (string, 3 classes) | draft (bool, 2 classes) | pull_request (dict) | is_pull_request (bool, 2 classes) | issue_comments (list, 0–30) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/psf/requests/issues/2931
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2931/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2931/comments
|
https://api.github.com/repos/psf/requests/issues/2931/events
|
https://github.com/psf/requests/pull/2931
| 122,521,762 |
MDExOlB1bGxSZXF1ZXN0NTM4NTEyODM=
| 2,931 |
Fix regression from #2844 regarding binary bodies.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-12-16T14:57:00Z
|
2021-09-08T05:01:08Z
|
2015-12-16T15:20:26Z
|
MEMBER
|
resolved
|
Resolves #2930.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2931/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2931/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2931.diff",
"html_url": "https://github.com/psf/requests/pull/2931",
"merged_at": "2015-12-16T15:20:26Z",
"patch_url": "https://github.com/psf/requests/pull/2931.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2931"
}
| true |
[
"@untitaker has confirmed this fix works for them.\n",
":+1: from me.\n",
":100: \n",
"Version bump, please.\n",
"@nfnty we've planned the next release for Monday.\n"
] |
https://api.github.com/repos/psf/requests/issues/2930
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2930/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2930/comments
|
https://api.github.com/repos/psf/requests/issues/2930/events
|
https://github.com/psf/requests/issues/2930
| 122,521,389 |
MDU6SXNzdWUxMjI1MjEzODk=
| 2,930 |
Request with binary payload fails due to calling to_native_string
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/837573?v=4",
"events_url": "https://api.github.com/users/untitaker/events{/privacy}",
"followers_url": "https://api.github.com/users/untitaker/followers",
"following_url": "https://api.github.com/users/untitaker/following{/other_user}",
"gists_url": "https://api.github.com/users/untitaker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/untitaker",
"id": 837573,
"login": "untitaker",
"node_id": "MDQ6VXNlcjgzNzU3Mw==",
"organizations_url": "https://api.github.com/users/untitaker/orgs",
"received_events_url": "https://api.github.com/users/untitaker/received_events",
"repos_url": "https://api.github.com/users/untitaker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/untitaker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/untitaker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/untitaker",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2015-12-16T14:55:08Z
|
2021-09-08T20:00:57Z
|
2015-12-16T15:20:26Z
|
CONTRIBUTOR
|
resolved
|
Introduced with https://github.com/kennethreitz/requests/issues/2844
``` python
import requests
requests.put("http://httpbin.org/put", data=u"ööö".encode("utf-8"))
```
This works with 2.8.1, but not with 2.9.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2930/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2930/timeline
| null |
completed
| null | null | false |
[] |
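The regression described in issue 2930 above can be sketched as follows. This is an illustrative reimplementation, not the actual requests internals: `to_native_string` here mimics the real helper's ASCII coercion, while `prepare_body_buggy` and `prepare_body_fixed` are hypothetical names standing in for the pre-fix and post-fix (#2931) body-preparation paths.

```python
def to_native_string(string, encoding="ascii"):
    # Illustrative version of requests' helper: coerce text to the native
    # str type, decoding bytes with the given (default ASCII) codec.
    if isinstance(string, str):
        return string
    return string.decode(encoding)  # raises UnicodeDecodeError on non-ASCII bytes


def prepare_body_buggy(data):
    # Pre-fix behaviour (the 2.9 regression): every body value is funnelled
    # through to_native_string, so a UTF-8-encoded bytes payload blows up.
    return to_native_string(data)


def prepare_body_fixed(data):
    # Post-fix behaviour: binary bodies pass through untouched; only text
    # is coerced to the native string type.
    if isinstance(data, bytes):
        return data
    return to_native_string(data)
```

With `u"ööö".encode("utf-8")` as the body, the buggy path raises `UnicodeDecodeError` while the fixed path returns the bytes unchanged, matching the reported 2.8.1 vs 2.9 behaviour.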
https://api.github.com/repos/psf/requests/issues/2929
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2929/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2929/comments
|
https://api.github.com/repos/psf/requests/issues/2929/events
|
https://github.com/psf/requests/issues/2929
| 122,504,590 |
MDU6SXNzdWUxMjI1MDQ1OTA=
| 2,929 |
Uploading packages via twine fails
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2492836?v=4",
"events_url": "https://api.github.com/users/nrvnrvn/events{/privacy}",
"followers_url": "https://api.github.com/users/nrvnrvn/followers",
"following_url": "https://api.github.com/users/nrvnrvn/following{/other_user}",
"gists_url": "https://api.github.com/users/nrvnrvn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nrvnrvn",
"id": 2492836,
"login": "nrvnrvn",
"node_id": "MDQ6VXNlcjI0OTI4MzY=",
"organizations_url": "https://api.github.com/users/nrvnrvn/orgs",
"received_events_url": "https://api.github.com/users/nrvnrvn/received_events",
"repos_url": "https://api.github.com/users/nrvnrvn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nrvnrvn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nrvnrvn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nrvnrvn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-12-16T13:35:53Z
|
2021-09-08T20:00:56Z
|
2015-12-16T18:09:42Z
|
NONE
|
resolved
|
Using latest 2.9.0
```
$ twine upload dist/*.whl
Uploading somepkg-15.12.16-py3-none-any.whl
ConnectionError: ('Connection aborted.', BadStatusLine("''",))
$
```
Rolling back to 2.8.1 resolves the issue.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2929/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2929/timeline
| null |
completed
| null | null | false |
[
"@sigmavirus24 You're probably best placed to look at this, but my suspicion is that there's an unexpected interaction between the toolbelt's MultipartEncoder and the fix we put in for calculating the actual length of file-like objects that are not seeked to the beginning. That said, I can't find anything in that.\n",
"Yeah, so this is a bug in requests-toolbelt and requests interaction.\n",
"Closing as the bug is actually over at https://github.com/sigmavirus24/requests-toolbelt/issues/117. Thanks for bringing this to our attention @nicorevin ! :cake: \n"
] |
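The interaction suspected in the comments above — calculating the actual remaining length of a file-like object that is not seeked to the beginning — can be illustrated with a small helper. `remaining_len` is a hypothetical name sketching the idea behind the 2.9-era change, not the actual `super_len` implementation from requests:

```python
import io


def remaining_len(fileobj):
    # Compute how many bytes remain from the current position: seek to the
    # end to find the total size, then restore the original position. A
    # wrapper like the toolbelt's MultipartEncoder, which reports length
    # differently, can interact badly with logic like this.
    current = fileobj.tell()
    end = fileobj.seek(0, io.SEEK_END)
    fileobj.seek(current)
    return end - current
```

For a body where 3 of 8 bytes have already been read, this reports 5 remaining bytes and leaves the stream position untouched.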
https://api.github.com/repos/psf/requests/issues/2928
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2928/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2928/comments
|
https://api.github.com/repos/psf/requests/issues/2928/events
|
https://github.com/psf/requests/pull/2928
| 122,306,527 |
MDExOlB1bGxSZXF1ZXN0NTM3MjQxMTE=
| 2,928 |
make setup.py upload happy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/76932?v=4",
"events_url": "https://api.github.com/users/heni/events{/privacy}",
"followers_url": "https://api.github.com/users/heni/followers",
"following_url": "https://api.github.com/users/heni/following{/other_user}",
"gists_url": "https://api.github.com/users/heni/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/heni",
"id": 76932,
"login": "heni",
"node_id": "MDQ6VXNlcjc2OTMy",
"organizations_url": "https://api.github.com/users/heni/orgs",
"received_events_url": "https://api.github.com/users/heni/received_events",
"repos_url": "https://api.github.com/users/heni/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/heni/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/heni/subscriptions",
"type": "User",
"url": "https://api.github.com/users/heni",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-12-15T16:10:53Z
|
2021-09-08T05:01:09Z
|
2015-12-15T17:22:06Z
|
NONE
|
resolved
|
distutils breaks when we use tuple-based meta values;
the canonical way is to define classifiers as a list, not a tuple.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2928/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2928/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2928.diff",
"html_url": "https://github.com/psf/requests/pull/2928",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2928.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2928"
}
| true |
[
"@heni We've been using a tuple here since [literally the start of the project](https://github.com/kennethreitz/requests/commit/0477018761c67152cdcc0b83d56f27e701e65b9e), which _used_ distutils. I do not believe this can possibly be the problem you're experiencing.\n",
"I want to redistribute this package into private repository of our project and with tuple for classifiers I have broken next chain of commands\n pip3 install --download=. --no-binary=:all: requests\n tar xf requests-2.9.0.tar.gz\n cd requests-2.9.0\n python3 setup.py bdist_wheel upload -r private-repo\n",
"Distutils does not work with wheel, you have to use setuptools.\n",
"Also `setup.py upload` most likely is _not_ performing certificate validation. Use `twine` instead.\n"
] |
https://api.github.com/repos/psf/requests/issues/2927
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2927/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2927/comments
|
https://api.github.com/repos/psf/requests/issues/2927/events
|
https://github.com/psf/requests/pull/2927
| 122,287,925 |
MDExOlB1bGxSZXF1ZXN0NTM3MTIxMTU=
| 2,927 |
Prepare 2.9.0 release.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 4 |
2015-12-15T14:48:11Z
|
2021-09-08T05:01:09Z
|
2015-12-15T15:26:56Z
|
MEMBER
|
resolved
|
This should contain the set of changes required for us to push 2.9.0 out the door.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2927/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2927/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2927.diff",
"html_url": "https://github.com/psf/requests/pull/2927",
"merged_at": "2015-12-15T15:26:56Z",
"patch_url": "https://github.com/psf/requests/pull/2927.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2927"
}
| true |
[
"@sigmavirus24 Can you confirm for me that you're happy with these release notes?\n",
"You also need to bump `__version__` and `__build__`. :)\n",
"@sigmavirus24 I intentionally did not do that, so that we have a single commit entitled `v2.9.0`, rather than a merge commit that hits it. =D\n",
"Got it. Well the rest of this looks excellent to me. Feel free to do that and ping me for the merge. :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2926
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2926/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2926/comments
|
https://api.github.com/repos/psf/requests/issues/2926/events
|
https://github.com/psf/requests/pull/2926
| 121,768,348 |
MDExOlB1bGxSZXF1ZXN0NTM0MjkxMzg=
| 2,926 |
Refactor default params
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/624531?v=4",
"events_url": "https://api.github.com/users/bsamek/events{/privacy}",
"followers_url": "https://api.github.com/users/bsamek/followers",
"following_url": "https://api.github.com/users/bsamek/following{/other_user}",
"gists_url": "https://api.github.com/users/bsamek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bsamek",
"id": 624531,
"login": "bsamek",
"node_id": "MDQ6VXNlcjYyNDUzMQ==",
"organizations_url": "https://api.github.com/users/bsamek/orgs",
"received_events_url": "https://api.github.com/users/bsamek/received_events",
"repos_url": "https://api.github.com/users/bsamek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bsamek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bsamek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bsamek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-12-11T18:48:38Z
|
2021-09-08T05:01:09Z
|
2015-12-11T22:13:53Z
|
CONTRIBUTOR
|
resolved
|
Specified the default argument for params that have a default in the docstring
so that the default is easier to see from the code. Modified the docstring in
api.py to match the docstring in sessions.py.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2926/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2926/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2926.diff",
"html_url": "https://github.com/psf/requests/pull/2926",
"merged_at": "2015-12-11T22:13:53Z",
"patch_url": "https://github.com/psf/requests/pull/2926.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2926"
}
| true |
[
"I also noticed that there may be a problem in test_requests.py. Lines 524-5 reference the httpbin_ca_bundle fixture, but there is no fixture by that name. If I am reading this correctly, this means that the test is passing in None instead of a cert path. Does this make sense?\n",
"Hey @bsamek! Thanks for the PR.\n\nI don't think we want to change the default parameters to the `request` method. We want to be able to discern between something that's a session setting and a request setting which we cannot do if we change that signature. We use [`merge_setting`](https://github.com/bsamek/requests/blob/master/requests/sessions.py#L49..L53) to make that determination and I don't think we want to break this.\n\nPlease revert that as the docstring changes _are_ desirable.\n",
"Reverted the changes to the parameters. Kept the docstring changes.\n\nShould I submit an issue regarding my earlier questions about the test?\n",
"There absolutely is such a fixture: https://github.com/kevin1024/pytest-httpbin/blob/master/pytest_httpbin/plugin.py#L40\n",
"Thanks again @bsamek !\n",
"Ohhhhhhhhh I see. Thanks @sigmavirus24 !\n",
"Anytime @bsamek :) Thanks for the PR!\n"
] |
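The `merge_setting` helper mentioned in the review comments above is how requests distinguishes session-level defaults from per-request overrides. A minimal sketch of that merge, assuming the behaviour described in the discussion (request-level values win, and a request value of `None` removes the session key); this is not the actual requests source:

```python
def merge_setting(request_setting, session_setting):
    # Illustrative session/request merge: request-level values take
    # precedence, and keys explicitly set to None are dropped.
    if session_setting is None:
        return request_setting
    if request_setting is None:
        return session_setting
    # Non-dict settings cannot be merged; the request value simply wins.
    if not (isinstance(session_setting, dict) and isinstance(request_setting, dict)):
        return request_setting
    merged = dict(session_setting)
    merged.update(request_setting)
    return {k: v for k, v in merged.items() if v is not None}
```

Changing the `request` method's default parameters would defeat this: with non-`None` defaults, a merge like this could no longer tell "the caller passed nothing" apart from "the caller passed the default".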
https://api.github.com/repos/psf/requests/issues/2925
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2925/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2925/comments
|
https://api.github.com/repos/psf/requests/issues/2925/events
|
https://github.com/psf/requests/issues/2925
| 121,766,203 |
MDU6SXNzdWUxMjE3NjYyMDM=
| 2,925 |
Deadlock caused by local import in different thread.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/568697?v=4",
"events_url": "https://api.github.com/users/guojh/events{/privacy}",
"followers_url": "https://api.github.com/users/guojh/followers",
"following_url": "https://api.github.com/users/guojh/following{/other_user}",
"gists_url": "https://api.github.com/users/guojh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/guojh",
"id": 568697,
"login": "guojh",
"node_id": "MDQ6VXNlcjU2ODY5Nw==",
"organizations_url": "https://api.github.com/users/guojh/orgs",
"received_events_url": "https://api.github.com/users/guojh/received_events",
"repos_url": "https://api.github.com/users/guojh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/guojh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guojh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/guojh",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
}
] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 16 |
2015-12-11T18:35:38Z
|
2024-05-20T14:35:55Z
|
2024-05-20T14:35:55Z
|
NONE
| null |
requests version 2.8.1, installed from pip.
Code to reproduce:
``` bash
#!/bin/bash
cat > test_module.py <<EOF
import threading
import requests
def make_request():
print('before requests.get')
requests.get('https://github.com/kennethreitz/requests') # <-- program will get stuck here
print('after requests.get')
thread = threading.Thread(target=make_request)
thread.start()
thread.join()
EOF
python -c 'import test_module'
```
Traceback from the stuck point (I modified the requests code to print the stack):
```
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 783, in __bootstrap
self.__bootstrap_inner()
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 810, in __bootstrap_inner
self.run()
File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/threading.py", line 763, in run
self.__target(*self.__args, **self.__kwargs)
File "test_module.py", line 7, in make_request
requests.get('https://github.com/kennethreitz/requests')
File "/private/tmp/requests/v/lib/python2.7/site-packages/requests/api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "/private/tmp/requests/v/lib/python2.7/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/private/tmp/requests/v/lib/python2.7/site-packages/requests/sessions.py", line 454, in request
prep = self.prepare_request(req)
File "/private/tmp/requests/v/lib/python2.7/site-packages/requests/sessions.py", line 375, in prepare_request
auth = get_netrc_auth(request.url)
File "/private/tmp/requests/v/lib/python2.7/site-packages/requests/utils.py", line 74, in get_netrc_auth
traceback.print_stack(); from netrc import netrc, NetrcParseError
```
This appears to be a deadlock on the import lock (imp.acquire_lock, imp.release_lock, imp.lock_held).
Possible duplicate of:
https://github.com/kennethreitz/requests/issues/2649
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 1,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/2925/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2925/timeline
| null |
completed
| null | null | false |
[
"Why are you acquiring the import lock?\n",
"## You've provided none of the information requested in our contributing documentation.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"I edited original post, and added some information.\n",
"> Why are you acquiring the import lock?\n\n@guojh are you acquiring the lock manually in other code or are you positing that the behaviour you're seeing is caused by the import locks?\n",
"@sigmavirus24 No, no manually acquired lock. The bash script I pasted is all.\nIt easy to position the stuck by adding print statements around that line:\nhttps://github.com/kennethreitz/requests/blob/2d91365cba9ddca25ddd28f5c0ecd9497b8a2e24/requests/utils.py#L95\n\nIt seems that import lock is automatically acquired by CPython interpreter during import.\n\nCall stack of CPython 2.7 __import__ should look like this (not verified):\n\nbuiltin___import__\nhttps://github.com/python/cpython/blob/61ee2ece38e609a444575dda0302907fa7a2752f/Python/bltinmodule.c#L36\n\nPyImport_ImportModuleLevel\nhttps://github.com/python/cpython/blob/61ee2ece38e609a444575dda0302907fa7a2752f/Python/import.c#L2287\n\n_PyImport_AcquireLock\nhttps://github.com/python/cpython/blob/61ee2ece38e609a444575dda0302907fa7a2752f/Python/import.c#L292\n(reenterable lock)\n",
"Proof of CPython (2.7) import lock:\n\n``` bash\n#!/bin/bash\ncat > test_module_2.py <<EOF\nimport threading\ndef do_import():\n print('before import')\n import json # <-- program will get stuck here\n print('after import')\nthread = threading.Thread(target=do_import)\nthread.start()\nthread.join()\nEOF\npython -c 'import test_module_2'\n```\n",
"_Oh_, I understand. The problem is that we fire an import not at import time, which can apparently cause a deadlock. It seems to me that this would be a bug in CPython, wouldn't it? If you can deadlock the interpreter, that sounds like a bug in the CPython import logic. \n",
"@Lukasa It seems to have been resolved in CPython 3.3. (per-module import lock instead of global import lock)\nhttps://github.com/python/cpython/blob/d5adb7f65d30afd00921e6c22e9e2b8c323c058d/Doc/whatsnew/3.3.rst#a-finer-grained-import-lock\n",
"Proof of CPython (2.7) implicit import:\n\n``` bash\n#!/bin/bash\ncat > test_module_3.py <<EOF\nimport threading\ndef do_encode():\n print('before import')\n u''.encode('does-not-exist') # <-- program will get stuck here due to implicit import\n print('after import')\nthread = threading.Thread(target=do_encode)\nthread.start()\nthread.join()\nEOF\npython2.7 -c 'import test_module_3'\n```\n\nI think it's not easy to solve this on CPython 2.7.\nRequests needs to prevent all explicit and implicit local import.\n",
"Well, I mean, no: we just need to not do any late imports ourselves. In this case, it just means restructuring the `get_netrc_auth` function so that the import is done at global scope. \n",
"`get_netrc_auth` probably isn't the only such function though and there may be other places where a delayed import is necessary.\n\nThat said, @Lukasa, @guojh's point is that using `str.encode` and `str.decode` will cause this too because they also use delayed imports apparently (the last test module actually also works with a valid encoding that does exist too).\n\nThat said, the other work around is to defer the work in the module such that it doesn't happen at import time.\n",
"Is this bug still around? I think I just ran into this with Python 2.7.12 and requests 2.9.1",
"@julianfnunez I don't believe anyone has written something to fix delayed imports in Requests (which I'm not certain need sweeping changes) but the other problem is that there are other ways to trigger this outside of requests as well. Requests uses stdlib functionality that uses delayed imports and such as well. What makes you think this is affecting you?",
"I'm trying to do something very similar to the code snippet posted in the first comment. requests.get is locking the thread.\r\n\r\nPS: I realize this is not necessarily a problem with requests. I was just wondering if you guys did a workaround since this bug was in \"Planned\".",
"I just found this issue by chance on internet, and thought that it would be good to clarify what the problem really is.\r\n\r\nIn python 2.7 \"an import should not have the side effect of spawning a new thread and then waiting for that thread in any way\" ([source](https://docs.python.org/2.7/library/threading.html#importing-in-threaded-code)). That is why the examples in the previous comments fail, they execute `import blah`, which spawns and waits on a thread, which in turn imports a module -- hence leading to a deadlock. In other words, CPython is not to blame, but the code itself. The solution in all cases is either to run the code using `python blah.py` or removing the call to `thread.join`.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/2924
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2924/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2924/comments
|
https://api.github.com/repos/psf/requests/issues/2924/events
|
https://github.com/psf/requests/issues/2924
| 121,574,861 |
MDU6SXNzdWUxMjE1NzQ4NjE=
| 2,924 |
Connections not recycled when stream=True and body not consumed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1315417?v=4",
"events_url": "https://api.github.com/users/mirosval/events{/privacy}",
"followers_url": "https://api.github.com/users/mirosval/followers",
"following_url": "https://api.github.com/users/mirosval/following{/other_user}",
"gists_url": "https://api.github.com/users/mirosval/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mirosval",
"id": 1315417,
"login": "mirosval",
"node_id": "MDQ6VXNlcjEzMTU0MTc=",
"organizations_url": "https://api.github.com/users/mirosval/orgs",
"received_events_url": "https://api.github.com/users/mirosval/received_events",
"repos_url": "https://api.github.com/users/mirosval/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mirosval/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mirosval/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mirosval",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-12-10T20:53:14Z
|
2021-09-08T20:00:57Z
|
2015-12-10T21:10:21Z
|
NONE
|
resolved
|
Calling `response.close()` does not seem to release the connection back to the pool. I'm only using the headers, or even just `status_code`, so the content doesn't get downloaded. I call `response.close()`, but then the next request opens a new connection.
Is there a reason for this? It seems that the documentation is insufficient at the very least, if this is intended behavior.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2924/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2924/timeline
| null |
completed
| null | null | false |
[
"If you don't consume all the data from the underlying connection, the connection actually _cannot_ be re-used. This is because requests has no way of signalling to the operating system that it doesn't care about the data already received except by closing the socket and throwing the connection away.\n\nDocumenting this limitation is possible, but given that it's a reality of how TCP connections function it's hard to justify I think. \n"
] |
https://api.github.com/repos/psf/requests/issues/2923
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2923/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2923/comments
|
https://api.github.com/repos/psf/requests/issues/2923/events
|
https://github.com/psf/requests/pull/2923
| 121,284,960 |
MDExOlB1bGxSZXF1ZXN0NTMxNDIzODU=
| 2,923 |
Add hint to :param verify.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/624531?v=4",
"events_url": "https://api.github.com/users/bsamek/events{/privacy}",
"followers_url": "https://api.github.com/users/bsamek/followers",
"following_url": "https://api.github.com/users/bsamek/following{/other_user}",
"gists_url": "https://api.github.com/users/bsamek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bsamek",
"id": 624531,
"login": "bsamek",
"node_id": "MDQ6VXNlcjYyNDUzMQ==",
"organizations_url": "https://api.github.com/users/bsamek/orgs",
"received_events_url": "https://api.github.com/users/bsamek/received_events",
"repos_url": "https://api.github.com/users/bsamek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bsamek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bsamek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bsamek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-12-09T16:27:44Z
|
2021-09-08T05:01:10Z
|
2015-12-09T16:33:09Z
|
CONTRIBUTOR
|
resolved
|
It is not clear that :param verify defaults to True. The way the verify
portion of the docstring is written, it looks like it defaults to False and
you have to pass in True if you'd like the SSL cert to be verified, but the
opposite is the case.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2923/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2923/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2923.diff",
"html_url": "https://github.com/psf/requests/pull/2923",
"merged_at": "2015-12-09T16:33:09Z",
"patch_url": "https://github.com/psf/requests/pull/2923.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2923"
}
| true |
[
"Thanks @bsamek! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2922
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2922/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2922/comments
|
https://api.github.com/repos/psf/requests/issues/2922/events
|
https://github.com/psf/requests/issues/2922
| 121,229,016 |
MDU6SXNzdWUxMjEyMjkwMTY=
| 2,922 |
Feature: please add a "status" field to response object which conforms to PEP 0333
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2693414?v=4",
"events_url": "https://api.github.com/users/et304383/events{/privacy}",
"followers_url": "https://api.github.com/users/et304383/followers",
"following_url": "https://api.github.com/users/et304383/following{/other_user}",
"gists_url": "https://api.github.com/users/et304383/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/et304383",
"id": 2693414,
"login": "et304383",
"node_id": "MDQ6VXNlcjI2OTM0MTQ=",
"organizations_url": "https://api.github.com/users/et304383/orgs",
"received_events_url": "https://api.github.com/users/et304383/received_events",
"repos_url": "https://api.github.com/users/et304383/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/et304383/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/et304383/subscriptions",
"type": "User",
"url": "https://api.github.com/users/et304383",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-12-09T12:44:12Z
|
2021-09-08T20:00:58Z
|
2015-12-09T12:47:23Z
|
NONE
|
resolved
|
According to this:
https://www.python.org/dev/peps/pep-0333/#the-start-response-callable
The response object must contain a status field with string values such as "200 OK" or "403 Forbidden". Any framework used to build a Python WSGI-compliant API (such as falcon) will therefore require that the status field be set correctly.
I am deploying a Python app behind WSGI (falcon) and use the requests module to perform requests on behalf of the API consumer (a special use case: signing requests to Amazon Elasticsearch with Signature Version 4, which I did not want to try to write in JavaScript). As a result, I have to take the response object from requests and use the values it contains to construct a WSGI-compliant response.
Currently I have to concatenate the status_code and reason fields from the requests response object in order to construct a PEP 0333 compliant response status. A field added to the requests response object would save this extra step.
This is a "nice to have", not a serious priority.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2922/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2922/timeline
| null |
completed
| null | null | false |
[
"Thanks for this request @eric-tucker!\n\nUnfortunately, Requests is in feature-freeze for the foreseeable future, which means that we're generally not implementing new features unless they represent a really compelling loss of function. Given that all the information you require is present, and that a helper function to build this information is exactly one line long, I'm afraid I don't think this feature is a priority for us at this time.\n\nSorry we can't be more help!\n",
"I understand, but can you elaborate on why you're in a \"feature-freeze\"?\n\nIs the product no longer actively developed?\n",
"Well, [our documentation](http://docs.python-requests.org/en/latest/dev/contributing/#feature-requests) talks about this a little bit, but allow me to elaborate.\n\nRequests is not a project that aims to have every feature that a user can imagine. From our perspective, tools like curl are better suited to that. The reason we don't want to add lots of features is because each feature added complicates the API, and our API is sacred to us. It is the foundation on which Requests is built, and our overwhelming primary concern.\n\nRequests is still actively developed: we will fix bugs as soon as they're reported, and add any glaring feature omissions if they're pointed out to us. But, fundamentally, we believe that Requests already covers 80% of what people need from it.\n\nGenerally speaking, then, we start at a -1 on any feature request. If the feature is both simple to implement outside of requests (like this one), required by relatively few users (also like this one), and has few-or-no pitfalls for unsuspecting users (also like this one), then our overwhelming inclination is to leave that feature out. Features represent an ongoing maintenance burden to the maintainers and a complexity cost to users, who need to navigate a maze of documentation of features they don't care about to find the few that they do. That's just not how we do things here.\n\nFor perspective, our feature freeze has been in place since v1.0.0 was released. Clearly, that has not stopped development of Requests. Don't worry, we're still here still supporting new Python versions and fixing bugs. We're just not willing to become libcurl for Python.\n",
"OK. \n\nAs long as I can be guaranteed that this:\n\n```\nresponse.status = str(return_result.status_code) + \" \" + return_result.reason\n```\n\nwill always give me a PEP 0333 compliant status then I'm fine with things as they are.\n\nThanks!\n",
"That will remain true as long as HTTP/2 is unsupported by Requests.\n\nHowever, in HTTP/2 there are no reason phrases. If we ever support HTTP/2, then `.reason` may return the empty string. Just a warning.\n"
] |
https://api.github.com/repos/psf/requests/issues/2921
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2921/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2921/comments
|
https://api.github.com/repos/psf/requests/issues/2921/events
|
https://github.com/psf/requests/issues/2921
| 121,055,364 |
MDU6SXNzdWUxMjEwNTUzNjQ=
| 2,921 |
Non-descript error: Gevent + Connection Pooling + SSL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/505616?v=4",
"events_url": "https://api.github.com/users/jgoldberg/events{/privacy}",
"followers_url": "https://api.github.com/users/jgoldberg/followers",
"following_url": "https://api.github.com/users/jgoldberg/following{/other_user}",
"gists_url": "https://api.github.com/users/jgoldberg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jgoldberg",
"id": 505616,
"login": "jgoldberg",
"node_id": "MDQ6VXNlcjUwNTYxNg==",
"organizations_url": "https://api.github.com/users/jgoldberg/orgs",
"received_events_url": "https://api.github.com/users/jgoldberg/received_events",
"repos_url": "https://api.github.com/users/jgoldberg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jgoldberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jgoldberg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jgoldberg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-12-08T17:19:07Z
|
2021-09-08T18:00:56Z
|
2016-04-16T00:05:16Z
|
NONE
|
resolved
|
We're getting this error randomly from our production environment. I can't seem to reproduce locally yet. We use requests w/ Gevent, and I'm wondering if it's some kind of race condition where the socket isn't ready to use, but gets handed out by the connection pool. Still investigating, but figured I'd post this anyways. It would be nice to get a better error from pyopenssl.
We're using Requests 2.7.0, installed using pip.
```
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 508, in post
return self.request('POST', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 370, in send
timeout=timeout
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 533, in urlopen
conn = self._get_conn(timeout=pool_timeout)
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 239, in _get_conn
conn.close()
File "/usr/lib/python2.7/httplib.py", line 780, in close
self.sock.close() # close it manually... there may be other refs
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 213, in close
return self.connection.shutdown()
Error: []
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/505616?v=4",
"events_url": "https://api.github.com/users/jgoldberg/events{/privacy}",
"followers_url": "https://api.github.com/users/jgoldberg/followers",
"following_url": "https://api.github.com/users/jgoldberg/following{/other_user}",
"gists_url": "https://api.github.com/users/jgoldberg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jgoldberg",
"id": 505616,
"login": "jgoldberg",
"node_id": "MDQ6VXNlcjUwNTYxNg==",
"organizations_url": "https://api.github.com/users/jgoldberg/orgs",
"received_events_url": "https://api.github.com/users/jgoldberg/received_events",
"repos_url": "https://api.github.com/users/jgoldberg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jgoldberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jgoldberg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jgoldberg",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2921/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2921/timeline
| null |
completed
| null | null | false |
[
"Hi there! Please provide additional information as requested in our [Contributing.md file](https://github.com/kennethreitz/requests/blob/master/CONTRIBUTING.md): specifically, **what version of Requests** you're using and **how you installed it**.\n",
"You've updated your original post to say requests 2.8.1 from pip. I'm afraid I don't believe that's right, because your traceback apparently raises from line 213 of `pyopenssl.py`. As of v2.8.1, [line 213 of `pyopenssl.py` is blank](https://github.com/kennethreitz/requests/blob/18a6b601100db978f3a6e191816456e75bc47e0f/requests/packages/urllib3/contrib/pyopenssl.py#L213).\n",
"You're right. Wrong stacktrace. This one is from 2.8.1.\n\n```\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 508, in post\n :param \\*\\*kwargs: Optional arguments that ``request`` takes.\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 465, in request\n 'allow_redirects': allow_redirects,\n File \"/usr/local/lib/python2.7/dist-packages/requests/sessions.py\", line 573, in send\n start = datetime.utcnow()\n File \"/usr/local/lib/python2.7/dist-packages/requests/adapters.py\", line 370, in send\n timeout=timeout\n File \"/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py\", line 533, in urlopen\n\n File \"/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py\", line 239, in _get_conn\n\n File \"/usr/lib/python2.7/httplib.py\", line 780, in close\n self.sock.close() # close it manually... there may be other refs\n File \"/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py\", line 213, in close\n\nError: []\n```\n",
"That stack trace _cannot_ be right: an empty line can't throw an exception. \n",
"In fact, that stack trace is bizarre: it involves several lines that are not code. Your system seems to be in a very bad way. \n",
"Yeah, don't know why that is.\n\nWe had the same error with Requests 2.7.0. The original stack trace up top is from 2.7.0.\n",
"Yeah when you've fixed your installation and have a stacktrace from 2.8.1. that makes sense, I'll take a look at this. I don't see a reason to do so before that if lines of the stacktrace are missing.\n"
] |
https://api.github.com/repos/psf/requests/issues/2920
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2920/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2920/comments
|
https://api.github.com/repos/psf/requests/issues/2920/events
|
https://github.com/psf/requests/issues/2920
| 120,875,099 |
MDU6SXNzdWUxMjA4NzUwOTk=
| 2,920 |
POST to S3 Bucket Returning 403
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7862740?v=4",
"events_url": "https://api.github.com/users/JKulakofsky/events{/privacy}",
"followers_url": "https://api.github.com/users/JKulakofsky/followers",
"following_url": "https://api.github.com/users/JKulakofsky/following{/other_user}",
"gists_url": "https://api.github.com/users/JKulakofsky/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JKulakofsky",
"id": 7862740,
"login": "JKulakofsky",
"node_id": "MDQ6VXNlcjc4NjI3NDA=",
"organizations_url": "https://api.github.com/users/JKulakofsky/orgs",
"received_events_url": "https://api.github.com/users/JKulakofsky/received_events",
"repos_url": "https://api.github.com/users/JKulakofsky/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JKulakofsky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JKulakofsky/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JKulakofsky",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2015-12-07T21:45:21Z
|
2021-09-08T20:00:58Z
|
2015-12-07T23:40:47Z
|
NONE
|
resolved
|
I am writing a program that will POST a file to a website that is making me go through a few steps to get the file properly uploaded. In the first step, I perform a simple POST which alerts the site that a file will be uploaded and returns a URL to an S3 bucket, along with some more response content. The next step is to POST the file to the URL returned in step one, and this is where things break. An abbreviated version of my code is as follows:
```
url = 'https://myurl.s3.amazonaws.com/'
payload = {all of the content returned from step one, minus the URL}
file = {'file': open('C:\...\file_name.imscc', 'rb')}
r = requests.post(url, data=payload, files=file)
```
At this point the code waits for about 15-20 seconds and then gives back a 403 error. My initial thought was that the file was too large (about 18.58MB) so I tried to use the Multipart Encoder from the Requests Toolbelt. When I ran this code:
```
m = MultipartEncoder(fields={the same values as above, 'file': (files[0], open('C:\Users\jonathan.kulakofsky\Documents\Code\Downloads\\' + files[0], 'rb'))})
```
I got a Traceback Error that ended with `'Connection aborted.', error(10054, 'An existing connection was forcibly closed by the remote host')`.
As far as I can tell, I am passing the proper authentication key back to Amazon with the rest of the fields in the payload. Does anyone have any ideas why I'm getting these errors?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7862740?v=4",
"events_url": "https://api.github.com/users/JKulakofsky/events{/privacy}",
"followers_url": "https://api.github.com/users/JKulakofsky/followers",
"following_url": "https://api.github.com/users/JKulakofsky/following{/other_user}",
"gists_url": "https://api.github.com/users/JKulakofsky/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JKulakofsky",
"id": 7862740,
"login": "JKulakofsky",
"node_id": "MDQ6VXNlcjc4NjI3NDA=",
"organizations_url": "https://api.github.com/users/JKulakofsky/orgs",
"received_events_url": "https://api.github.com/users/JKulakofsky/received_events",
"repos_url": "https://api.github.com/users/JKulakofsky/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JKulakofsky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JKulakofsky/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JKulakofsky",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2920/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2920/timeline
| null |
completed
| null | null | false |
[
"I _suspect_ that Amazon is returning a 4XX status code and then getting mad when we keep sending the data.\n\nDo you need to multipart encode this data, or do you just want to post only the bytes of the file?\n",
"I only need to post the file. I only tried the multipart encoder because the requests documentation has a line about `requests.post` failing for larger files, but doesn't state what qualifies as \"large.\"\n",
"Are you sure? You're definitely still posting the data, which suggests you do need the multipart encoding.\n\nIs there any documentation for this API anywhere?\n",
"I am not sure, I am a fairly new developer. Here is the documentation https://canvas.instructure.com/doc/api/file.file_uploads.html#method.file_uploads.post\n",
"The most interesting requirement here is that the file _must_ be the last element in the multipart form data. Otherwise, the streaming multipart encoder from the toolbelt is exactly the right thing to be using.\n\nTry changing your `fields` argument from a dictionary to a list of two-tuples, ensuring that the file parameter comes last, and don't bother providing any other arguments to the file. That would mean, for example:\n\n``` python\nm = MultipartEncoder(fields=[\n ('key1', 'value1'),\n ('key2', 'value2'),\n ('file', open(<path>, 'rb'))\n])\n```\n\nSee if that works better. Don't forget to set the headers [as in the example](https://toolbelt.readthedocs.org/en/latest/uploading-data.html#streaming-multipart-data-encoder).\n",
"It's certainly taking longer to process now, but it's still giving me a 403. Which has to be an update from the previous error 10054 I was getting when using the multipart encoder.\n",
"It's giving a 403? You're sure?\n",
"The next line in the program is `print r.status_code` and it says 403.\n",
"That's interesting @JKulakofsky. A forbidden means that you're not allowed to access that. Are you sure your constructing your request correctly? Is there a reason you can't show us exactly the code you're using?\n",
"Well, splitting a list along ':' is great until you hit the item in the list that is 'https://......'. I was splitting up the success action redirect URL which was throwing off the rest of the post. I fixed that issue and now I'm getting a 500 error. So... progress?\n",
"@sigmavirus24 The main reason I was avoiding posting the whole code was because it's long and dirty, but here you go:\n\n```\nimport requests\nimport os\nimport re\nfrom requests_toolbelt.multipart.encoder import MultipartEncoder\n\netlmig = '*******'\nauth = {'Authorization': 'Bearer ' + etlmig}\n\nfiles = os.listdir('C:\\Users\\jkulakofsky\\Documents\\Code\\Downloads')\n\nparameters = {'name': files[0]}\nr1 = requests.post('https://......', headers=auth, data=parameters)\nprint r1.status_code # This returns a 200\ncontent1 = r1.content\n\ncontentlist1 = content1.split('{')\nurl1 = contentlist1[1]\nurl2 = url1.split(',')\nurl3 = url2[0]\nurl4 = 'ht.+/'\nurl5 = re.compile(url4)\nurl6 = re.findall(url5,url3)\nurl = url6[0]\ncontentlist2 = contentlist1[2].split('}')\ncontentlist3 = contentlist2[0].split(',')\naws = contentlist3[0].split(':')\naws1 = aws[0]\naws1 = aws1[1:-1]\naws2 = aws[1]\naws2 = aws2[1:-1]\nfil = contentlist3[1].split(':')\nfil1 = fil[0]\nfil1 = fil1[1:-1]\nfil2 = ''\nkey = contentlist3[2].split(':')\nkey1 = key[0]\nkey1 = key1[1:-1]\nkey2 = key[1]\nkey2 = key2[1:-1]\nacl = contentlist3[3].split(':')\nacl1 = acl[0]\nacl1 = acl1[1:-1]\nacl2 = acl[1]\nacl2 = acl2[1:-1]\npol = contentlist3[4].split(':')\npol1 = pol[0]\npol1 = pol1[1:-1]\npol2 = pol[1]\npol2 = pol2[1:-1]\nsig = contentlist3[5].split(':')\nsig1 = sig[0]\nsig1 = sig1[1:-1]\nsig2 = sig[1]\nsig2 = sig2[1:-1]\nsuc = contentlist3[6].split('\"')\nsuc1 = suc[1]\nsuc2 = suc[3]\n\nm = MultipartEncoder(fields=[\n (key1, key2),\n (acl1, acl2),\n (fil1, fil2),\n (aws1, aws2),\n (pol1, pol2),\n (sig1, sig2),\n (suc1, suc2),\n ('file', open('C:\\Users\\jkulakofsky\\Documents\\Code\\Downloads\\\\' + files[0], 'rb'))\n])\nr2 = requests.post(url, data=m, headers={'Content-Type': m.content_type})\nprint r2.status_code # This returns a 500\n```\n",
"It looks like (from the documentation) that you are receiving JSON back. Can you try using `r1.json()` and accessing that like a dictionary instead of trying to parse out the values by hand?\n",
"(Also at this point, I'm very certain this problem isn't in requests and this support request is better suited for [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).)\n",
"OK, I'll move it over there. Thanks for your help, moving the multipart encoder from a dictionary to a list at least solved one of the issues. When I eventually get a solution working I'll post it here and close the thread.\n",
"I think when you start using `r1.json()` you'll be able to wrap this up quickly. StackOverflow will help you write this code though, which is not what we're here for. We're here to fix bugs in requests.\n",
"@JKulakofsky instead of using split across a url string, why not use the module urlparse (i think the relevant method is urlparse.urlsplit)?\n",
"@ueg1990 you're misunderstanding the problem.\n"
] |
https://api.github.com/repos/psf/requests/issues/2919
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2919/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2919/comments
|
https://api.github.com/repos/psf/requests/issues/2919/events
|
https://github.com/psf/requests/pull/2919
| 120,694,413 |
MDExOlB1bGxSZXF1ZXN0NTI4MDQwNDA=
| 2,919 |
[DRAFT] WIP in finding a solution to data encoding check
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/954858?v=4",
"events_url": "https://api.github.com/users/ArcTanSusan/events{/privacy}",
"followers_url": "https://api.github.com/users/ArcTanSusan/followers",
"following_url": "https://api.github.com/users/ArcTanSusan/following{/other_user}",
"gists_url": "https://api.github.com/users/ArcTanSusan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ArcTanSusan",
"id": 954858,
"login": "ArcTanSusan",
"node_id": "MDQ6VXNlcjk1NDg1OA==",
"organizations_url": "https://api.github.com/users/ArcTanSusan/orgs",
"received_events_url": "https://api.github.com/users/ArcTanSusan/received_events",
"repos_url": "https://api.github.com/users/ArcTanSusan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ArcTanSusan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArcTanSusan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ArcTanSusan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-12-07T04:10:39Z
|
2021-09-08T04:01:09Z
|
2016-04-06T19:13:48Z
|
CONTRIBUTOR
|
resolved
|
Use existing method `to_native_string()` as a check.
This is a first attempt to resolve https://github.com/kennethreitz/requests/issues/2638, so marking this PR as [DRAFT]. I'd appreciate feedback on this.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2919/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2919/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2919.diff",
"html_url": "https://github.com/psf/requests/pull/2919",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2919.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2919"
}
| true |
[
"Yeah, so how we approach this is basically dependent on whether we think it's acceptable for `to_native_string()` to go in `encode_params`. I personally think that's the best place for it, because it allows us to avoid duplicating the large amount of knowledge `encode_params` has about how this variable is structured. However, IIRC @sigmavirus24 originally objected to that idea so I'd like to hear from him.\n",
"This should be its own function that we can test thoroughly because I think there's a lot of room to do this wrong.\n\nIf we have similar checking in a lot of places, make it one function that we run often and that we test well. Our `_encode_*` methods are too large as it is and barely tested at all\n"
] |
https://api.github.com/repos/psf/requests/issues/2918
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2918/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2918/comments
|
https://api.github.com/repos/psf/requests/issues/2918/events
|
https://github.com/psf/requests/issues/2918
| 120,598,566 |
MDU6SXNzdWUxMjA1OTg1NjY=
| 2,918 |
Wrong content type on request.head() only
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6280042?v=4",
"events_url": "https://api.github.com/users/bestofothers/events{/privacy}",
"followers_url": "https://api.github.com/users/bestofothers/followers",
"following_url": "https://api.github.com/users/bestofothers/following{/other_user}",
"gists_url": "https://api.github.com/users/bestofothers/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bestofothers",
"id": 6280042,
"login": "bestofothers",
"node_id": "MDQ6VXNlcjYyODAwNDI=",
"organizations_url": "https://api.github.com/users/bestofothers/orgs",
"received_events_url": "https://api.github.com/users/bestofothers/received_events",
"repos_url": "https://api.github.com/users/bestofothers/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bestofothers/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bestofothers/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bestofothers",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-12-06T00:42:32Z
|
2021-09-08T20:00:59Z
|
2015-12-06T02:16:01Z
|
NONE
|
resolved
|
Using requests.head() I want to get the content-type of a page before proceeding. Unfortunately for me, the request header content-type returns text/html. But after completing the request using requests.get(),
the content type is application/octet-stream.
I suspected the web server, but using urllib2 with a head request it returns application/octet-stream.
Here is my code and the exact url to reproduce the same error:
``` py
>>> import requests
>>> u = "http://holo-pack.com.tw/download.phpx?soft=ea70"
>>> r = requests.head(u)
>>> r.headers['content-type']
'text/html'
```
With requests.get() the header is "application/octet-stream", as with urllib2, as seen here: https://filippo.io/send-a-head-request-in-python/
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2918/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2918/timeline
| null |
completed
| null | null | false |
[
"Please review the [CONTRIBUTING](https://github.com/kennethreitz/requests/blob/master/CONTRIBUTING.md#good-bug-reports) file that GitHub prompted you to review prior to filing this bug.\n\nYou've not told us what version of requests you're using or how you've installed it.\n",
"So in spite of not having the information here, I noticed that the blog post registers a redirect handler. It looks like your head request gets redirected. As such you need to pass `allow_redirects=True` to a `HEAD` request to make sure you follow them too with requests. When you follow it all the way to the bottom you get the content-type you expect.\n"
] |
https://api.github.com/repos/psf/requests/issues/2917
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2917/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2917/comments
|
https://api.github.com/repos/psf/requests/issues/2917/events
|
https://github.com/psf/requests/pull/2917
| 120,569,330 |
MDExOlB1bGxSZXF1ZXN0NTI3NTgwMjM=
| 2,917 |
requests/auth: Handle an empty 'qop' attribute in an Authenticate challenge
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2288718?v=4",
"events_url": "https://api.github.com/users/matt-jordan/events{/privacy}",
"followers_url": "https://api.github.com/users/matt-jordan/followers",
"following_url": "https://api.github.com/users/matt-jordan/following{/other_user}",
"gists_url": "https://api.github.com/users/matt-jordan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/matt-jordan",
"id": 2288718,
"login": "matt-jordan",
"node_id": "MDQ6VXNlcjIyODg3MTg=",
"organizations_url": "https://api.github.com/users/matt-jordan/orgs",
"received_events_url": "https://api.github.com/users/matt-jordan/received_events",
"repos_url": "https://api.github.com/users/matt-jordan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/matt-jordan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matt-jordan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/matt-jordan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-12-05T16:21:42Z
|
2021-09-08T05:01:10Z
|
2015-12-06T04:48:52Z
|
CONTRIBUTOR
|
resolved
|
Some malfunctioning HTTP servers may return a qop directive with no token, as
opposed to correctly omitting the qop directive completely. For example:
```
header: WWW-Authenticate: Digest realm="foobar_api_auth", qop="",
nonce="a12059eaaad0b86ece8f62f04cbafed6", algorithm="MD5",
stale="false"
```
Prior to this patch, requests would respond with a 'None' Authorization header.
While the server is certainly incorrect, this patch updates requests to be
more tolerant to this kind of shenaniganry. If we receive an empty string for
the value of the qop attribute, or some tokens that are not recognized (that
is, not 'auth' or 'auth-int'), we instead treat that as if the qop attribute
was simply not provided.
Closes #2916
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2917/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2917/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2917.diff",
"html_url": "https://github.com/psf/requests/pull/2917",
"merged_at": "2015-12-06T04:48:52Z",
"patch_url": "https://github.com/psf/requests/pull/2917.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2917"
}
| true |
[
"Thanks for this!\n\nI think this change is more aggressive than we need actually. Is there any reason you didn't take the simpler path and just change the conditional to look for falsy values of `qop`?\n",
"Just got a bit trigger happy on trying to handle unrecognized values in the `qop` attribute. A much simpler PR is now up. :smile: \n",
"\\o/\n\nGiven that `None` is falsy, I think we can simplify this a little bit further to simply check `if not qop`.\n",
"Good point. Simpler patch incoming.\n",
"I pulled this down and ran the tests locally. All seems well on this end.\n"
] |
https://api.github.com/repos/psf/requests/issues/2916
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2916/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2916/comments
|
https://api.github.com/repos/psf/requests/issues/2916/events
|
https://github.com/psf/requests/issues/2916
| 120,522,857 |
MDU6SXNzdWUxMjA1MjI4NTc=
| 2,916 |
HTTPDigestAuth: Empty qop header in server challenge results in Authorization: None
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2288718?v=4",
"events_url": "https://api.github.com/users/matt-jordan/events{/privacy}",
"followers_url": "https://api.github.com/users/matt-jordan/followers",
"following_url": "https://api.github.com/users/matt-jordan/following{/other_user}",
"gists_url": "https://api.github.com/users/matt-jordan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/matt-jordan",
"id": 2288718,
"login": "matt-jordan",
"node_id": "MDQ6VXNlcjIyODg3MTg=",
"organizations_url": "https://api.github.com/users/matt-jordan/orgs",
"received_events_url": "https://api.github.com/users/matt-jordan/received_events",
"repos_url": "https://api.github.com/users/matt-jordan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/matt-jordan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/matt-jordan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/matt-jordan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-12-05T02:22:21Z
|
2021-09-08T20:00:59Z
|
2015-12-06T04:48:52Z
|
CONTRIBUTOR
|
resolved
|
This is a bit of a weird one, and I'm fairly certain the server is misbehaving slightly, but since I have less control over that:
When issuing a request with the following:
```
response = requests.post(url,
auth=HTTPDigestAuth('admin', 'foobar'))
```
The server returns back a challenge with the following `WWW-Authenticate` header:
```
reply: 'HTTP/1.1 401 Unauthorized\r\n'
header: Server: nginx
header: Date: Sat, 05 Dec 2015 02:08:43 GMT
header: Content-Type: text/html; charset=utf-8
header: Content-Length: 188
header: Connection: keep-alive
header: WWW-Authenticate: Digest realm="foobar_api_auth", qop="", nonce="a12059eaaad0b86ece8f62f04cbafed6", algorithm="MD5", stale="false"
```
Note the `qop` header with an empty value. That doesn't appear to be valid per RFC 2617:
> ```
> qop-value = "auth" | "auth-int" | token
> ```
That being said, RFC 2617 does specify that unrecognised values MUST be ignored. Instead, it looks like requests drops the entire authorization header:
```
if qop is None:
respdig = KD(HA1, "%s:%s" % (nonce, HA2))
elif qop == 'auth' or 'auth' in qop.split(','):
noncebit = "%s:%s:%s:%s:%s" % (
nonce, ncvalue, cnonce, 'auth', HA2
)
respdig = KD(HA1, noncebit)
else:
# XXX handle auth-int.
return None
```
This results in the following being sent back to the challenge:
```
send: 'POST /json HTTP/1.1\r\nHost: foo.com\r\nContent-Length: 94\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nUser-Agent: python-requests/2.8.1\r\nConnection: keep-alive\r\nContent-Type: application/json\r\nAuthorization: None\r\n\r\n{"data": "more}'
```
I'd be more than happy to work on a PR for this issue, but would like to confirm that a PR that updates the handling of `qop` to be tolerant to unrecognized tokens/characters would be acceptable. Arguably, the server should be omitting the `qop` header completely, but RFC compliance in requests would probably make it more tolerant to misbehaving servers like this one.
Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2916/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2916/timeline
| null |
completed
| null | null | false |
[
"Yeah, so the server is wrong here. However, I think we can get to a place where we roughly DTRT here, by assuming that if `qop` is any falsy value then we should enter the first block. That'll cover this case as well.\n\nI'd accept a PR that makes this change.\n",
"Awesome! I'll have something up this weekend.\n\nI did notice that httpbin doesn't really allow much control over the qop value - it will pretty much force it to be either `auth` or `auth-int`:\n\n```\[email protected]('/digest-auth/<qop>/<user>/<passwd>')\ndef digest_auth(qop=None, user='user', passwd='passwd'):\n \"\"\"Prompts the user for authorization using HTTP Digest auth\"\"\"\n if qop not in ('auth', 'auth-int'):\n qop = None\n```\n\nI'd be happy to write some unit tests that cover the PR, but I'd probably have to make httpbin more flexible as well. I'm kind of concerned about doing that - any unit tests that are written for requests that require a particular updated version of httpbin are bound to be brittle.\n\nStill, if you think it'd be useful, I can put up a PR for httpbin as well.\n",
"Yeah, for now we'll consider the tests a todo: we'll spin back to them when I spend time making more investment on our test suite. \n",
"Cool. If you'd ever like some help with that, let me know. I do some work on test suites from time to time :smile: \n"
] |
https://api.github.com/repos/psf/requests/issues/2915
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2915/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2915/comments
|
https://api.github.com/repos/psf/requests/issues/2915/events
|
https://github.com/psf/requests/pull/2915
| 120,354,789 |
MDExOlB1bGxSZXF1ZXN0NTI2NDEyOTQ=
| 2,915 |
Switch to alabaster sphinx theme in the docs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/954858?v=4",
"events_url": "https://api.github.com/users/ArcTanSusan/events{/privacy}",
"followers_url": "https://api.github.com/users/ArcTanSusan/followers",
"following_url": "https://api.github.com/users/ArcTanSusan/following{/other_user}",
"gists_url": "https://api.github.com/users/ArcTanSusan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ArcTanSusan",
"id": 954858,
"login": "ArcTanSusan",
"node_id": "MDQ6VXNlcjk1NDg1OA==",
"organizations_url": "https://api.github.com/users/ArcTanSusan/orgs",
"received_events_url": "https://api.github.com/users/ArcTanSusan/received_events",
"repos_url": "https://api.github.com/users/ArcTanSusan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ArcTanSusan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ArcTanSusan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ArcTanSusan",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 20 |
2015-12-04T08:22:53Z
|
2021-09-08T05:01:04Z
|
2016-01-15T23:35:03Z
|
CONTRIBUTOR
|
resolved
|
Feedback is welcome! :grinning:
## Before with @kennethreitz's sphinx theme

## After with alabaster sphinx theme

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2915/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2915/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2915.diff",
"html_url": "https://github.com/psf/requests/pull/2915",
"merged_at": "2016-01-15T23:35:03Z",
"patch_url": "https://github.com/psf/requests/pull/2915.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2915"
}
| true |
[
"\\o/ You are a :star: @onceuponatimeforever!\n\nFor posterity, this resolves #2779. Based on the quick screenshot provided above it looks like the only thing that gets lost here is that the Gumroad button isn't rendering out correctly.\n\n@kennethreitz, you'll need to review this because the look of the site is an important part of your personal branding, but FWIW I'm :+1: and totally delighted about it.\n",
"Ok, so @onceuponatimeforever, Kenneth's concern about ad-tracking codes is well-founded, they _are_ missing. You can check this by opening up the index.html on the current website and on the newly rendered one, then using the browser debugging tools to inspect the top-level `<body>` tag. Both documents contain the `<div class=\"related\">`, `<div class=\"document\">`, and `<div class=\"footer\">` content tags, which is great, but then the Alabaster version is missing a few things:\n- Gumroad JS file: \n \n ``` html\n <script type=\"text/javascript\" src=\"https://gumroad.com/js/gumroad.js\"></script>\n ```\n- Flattr JS:\n \n ``` html\n <script type=\"text/javascript\">\n /* <![CDATA[ */\n (function() {\n var s = document.createElement('script'), t = document.getElementsByTagName('script')[0];\n s.type = 'text/javascript';\n s.async = true;\n s.src = 'http://api.flattr.com/js/0.6/load.js?mode=auto';\n t.parentNode.insertBefore(s, t);\n })();\n /* ]]> */\n </script>\n ```\n- CloudFront JS:\n \n ``` html\n <script type=\"text/javascript\">\n setTimeout(function(){var a=document.createElement(\"script\");\n var b=document.getElementsByTagName(\"script\")[0];\n a.src=document.location.protocol+\"//dnn506yrbagrg.cloudfront.net/pages/scripts/0013/7219.js?\"+Math.floor(new Date().getTime()/3600000);\n a.async=true;a.type=\"text/javascript\";b.parentNode.insertBefore(a,b)}, 1);\n </script>\n ```\n- HelloBar:\n \n ``` html\n <script type=\"text/javascript\">\n new HelloBar(36402,48802);\n </script>\n ```\n- Google Analytics:\n \n ``` html\n <script type=\"text/javascript\">\n \n var _gaq = _gaq || [];\n _gaq.push(['_setAccount', 'UA-8742933-11']);\n _gaq.push(['_setDomainName', 'none']);\n _gaq.push(['_setAllowLinker', true]);\n _gaq.push(['_trackPageview']);\n \n (function() {\n var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;\n ga.src = ('https:' == document.location.protocol ? 
'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';\n var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);\n })();\n \n </script>\n ```\n- Gauges\n \n ``` html\n <script type=\"text/javascript\">\n (function() {\n var t = document.createElement('script');\n t.type = 'text/javascript';\n t.async = true;\n t.id = 'gauges-tracker';\n t.setAttribute('data-site-id',\n '4ddc27f6613f5d186d000007');\n t.src = '//secure.gaug.es/track.js';\n var s = document.getElementsByTagName('script')[0];\n s.parentNode.insertBefore(t, s);\n })();\n </script>\n ```\n- Perfect Audience\n \n ``` html\n <script type=\"text/javascript\">\n (function() {\n window._pa = window._pa || {};\n _pa.productId = \"requests-docs\";\n var pa = document.createElement('script'); pa.type = 'text/javascript'; pa.async = true;\n pa.src = ('https:' == document.location.protocol ? 'https:' : 'http:') + \"//tag.perfectaudience.com/serve/5226171f87bc6890da0000a0.js\";\n var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(pa, s);\n })();\n </script>\n ```\n\nThese all need to get re-inserted back in. They previously all lived in the footer, and it should be possible to put them back into the Alabaster footer in some way.\n",
"@Lukasa: Thanks for spotting this problem! I've added back the pre-existing ad trackers in `/_templates/layout.html` (Click on \"Files Changed\" link at the top). [Reference](https://stackoverflow.com/questions/5585250/how-can-i-add-a-custom-footer-to-sphinx-documentation-restructuredtext)\n",
"Thanks @onceuponatimeforever! This looks good to me.\n\nI'm going to sit on it to confirm that @kennethreitz is happy with it. (I'm confident he will be.)\n",
"I'm also very much :+1: on this (although I meant to say this sooner).\n",
"This looks fantastic! The only thing that needs changing is removing the \"Powered by Sphinx and Alabaster\" text at the bottom. \n\nYou can do this with `show_powered_by=False`\n",
"We should also setup the GitHub banner! \n\n`github_banner=True`\n\n(requires that you set `github_user` and `github_repo`)\n",
"I'll likely make a few more tweaks once this is up :) we may need some `!important` css to fix the sizing of the turtle/logo so everything is centered underneath it again. I also prefer the white-colored warnings of the existing theme, but these are all things that can be fixed once we merge, I think :)\n",
"My 2₵, the red warning boxes are better since they're intended to standout to users.\n",
"I knew you were going to say that ;)\n\nI'd just like to find a way to make them a little less jarring.\n",
"> I knew you were going to say that ;)\n\n=P \n\n> I'd just like to find a way to make them a little less jarring.\n\nSure, that I can agree with.\n",
"@kennethreitz so do you want to fix these concerns up first or merge and then fix these up?\n",
"The `show_powered_by` is already added in the sphinx code. And so is the github banner. ~~I'll set the `'github_user': 'kennethreitz','github_repo': 'requests',` to display the github banner shortly.~~ \n\nUpdate: OK, screenshot updated!\n",
"@kennethreitz thoughts?\n",
"I would prefer to have the positioning/sizing of the turtle/logo fixed before we deploy this, if it's possible ;)\n",
"but other than that, :thumbsup: :thumbsup: :thumbsup:\n",
"I don't see the turtle logo being different than before, no css changes have been added on it in the new theme.\n",
"@onceuponatimeforever I think what @kennethreitz wants is for the column beneath it to be aligned with the `R` in `Requests` under the turtle logo. (At least that's roughly what it looks like where it was from the before picture)\n",
"I think a `margin-left: -20px;` or so would do it, on the logo. \n",
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2914
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2914/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2914/comments
|
https://api.github.com/repos/psf/requests/issues/2914/events
|
https://github.com/psf/requests/issues/2914
| 119,928,744 |
MDU6SXNzdWUxMTk5Mjg3NDQ=
| 2,914 |
Upgrade urllib3 to overcome Error: [('SSL routines', 'SSL3_WRITE_PENDING', 'bad write retry')]
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/182938?v=4",
"events_url": "https://api.github.com/users/nkhalasi/events{/privacy}",
"followers_url": "https://api.github.com/users/nkhalasi/followers",
"following_url": "https://api.github.com/users/nkhalasi/following{/other_user}",
"gists_url": "https://api.github.com/users/nkhalasi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nkhalasi",
"id": 182938,
"login": "nkhalasi",
"node_id": "MDQ6VXNlcjE4MjkzOA==",
"organizations_url": "https://api.github.com/users/nkhalasi/orgs",
"received_events_url": "https://api.github.com/users/nkhalasi/received_events",
"repos_url": "https://api.github.com/users/nkhalasi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nkhalasi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nkhalasi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nkhalasi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-12-02T12:13:48Z
|
2021-09-08T20:00:57Z
|
2015-12-02T12:16:59Z
|
NONE
|
resolved
|
We upgraded our production systems to requests==2.8.1 today and ran into <pre> File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen
body=body, headers=headers)
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 353, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python2.7/httplib.py", line 979, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python2.7/httplib.py", line 1013, in _send_request
self.endheaders(body)
File "/usr/lib/python2.7/httplib.py", line 975, in endheaders
self._send_output(message_body)
File "/usr/lib/python2.7/httplib.py", line 835, in _send_output
self.send(msg)
File "/usr/lib/python2.7/httplib.py", line 811, in send
self.sock.sendall(data)
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 220, in sendall
sent = self._send_until_done(data[total_sent:total_sent+SSL_WRITE_BLOCKSIZE])
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 206, in _send_until_done
return self.connection.send(data)
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1271, in send
self._raise_ssl_error(self._ssl, result)
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/OpenSSL/SSL.py", line 1187, in _raise_ssl_error
_raise_current_error()
File "/home/xxxxxx/penv/local/lib/python2.7/site-packages/OpenSSL/_util.py", line 48, in exception_from_error_queue
raise exception_type(errors)
Error: [('SSL routines', 'SSL3_WRITE_PENDING', 'bad write retry')]</pre>
Digging into the error led us to https://www.bountysource.com/issues/27417596-ssl3_write_pending-error-from-urllib3-contrib-pyopenssl-sendall and then to https://github.com/shazow/urllib3/pull/719
The fix is on the master branch of urllib3, so we are wondering whether there is a possibility of upgrading urllib3 to the latest version.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2914/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2914/timeline
| null |
completed
| null | null | false |
[
"Per our [release policy](http://docs.python-requests.org/en/latest/community/release-process/), we upgrade urllib3 on major releases, and only to active releases of urllib3. There has been no release of urllib3 since the last release of requests, and until such a release happens we cannot bring in urllib3. Note that \"upgrading urllib3 to the latest version\" will not help because the latest release does not contain that fix either.\n\nBefore you open a similar issue on urllib3, please note that I am also a core developer on urllib3. We are working towards releasing a new version _soon_, but I have no concrete timeline for when that will be. Please be patient. =)\n",
"Sure I understand. We will be watchful on urllib3 release as well. Until we will figure out a workaround.\n",
"FYI, Urllib 1.13 has been released\n",
"As has requests 2.9.0.\n",
"That's great! :-) Thank you \n"
] |
https://api.github.com/repos/psf/requests/issues/2913
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2913/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2913/comments
|
https://api.github.com/repos/psf/requests/issues/2913/events
|
https://github.com/psf/requests/issues/2913
| 119,841,152 |
MDU6SXNzdWUxMTk4NDExNTI=
| 2,913 |
If socket is none requests started creating a new connection
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13664257?v=4",
"events_url": "https://api.github.com/users/vkosuri/events{/privacy}",
"followers_url": "https://api.github.com/users/vkosuri/followers",
"following_url": "https://api.github.com/users/vkosuri/following{/other_user}",
"gists_url": "https://api.github.com/users/vkosuri/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vkosuri",
"id": 13664257,
"login": "vkosuri",
"node_id": "MDQ6VXNlcjEzNjY0MjU3",
"organizations_url": "https://api.github.com/users/vkosuri/orgs",
"received_events_url": "https://api.github.com/users/vkosuri/received_events",
"repos_url": "https://api.github.com/users/vkosuri/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vkosuri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vkosuri/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vkosuri",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-12-02T00:37:21Z
|
2016-04-20T13:24:46Z
|
2015-12-02T08:08:58Z
|
NONE
| null |
Hi,
I am constantly seeing "Resetting dropped connection" for every URL.
Could you please let me know why I am getting this and how I can avoid such an error message?
Here is a Python snippet:
``` python
def __init__(self, base_url, username, password):
    self.session = requests.Session()
    self.session.auth = (username, password)
    adapter = requests.adapters.HTTPAdapter(pool_connections=1, pool_maxsize=100, pool_block=True, max_retries=0)
    self.session.mount('http://', adapter)
    self.base_url = base_url
    start_url = base_url + 'index.html'
    _req = self.session.get(start_url)
    self.cookie = _req.cookies
    print self.cookie

def get_entry(self, url):
    print self.cookie
    _req = self.session.get(url, cookies=self.cookie)
    print _req.headers
    return _req

def post_entry(self, url, body):
    header = {"Content-type": "application/json"}
    print self.cookie
    _req = self.session.post(url=url, data=body, cookies=self.cookie, headers=header)
    print _req.headers
    print _req.text
    return _req
```
Thanks
Malli
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2913/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2913/timeline
| null |
completed
| null | null | false |
[
"I further debugged and I came to the socket information has, If socket is none, requests started creating a new connection\n\n``` Python\ndef is_connection_dropped(conn): # Platform-specific\n \"\"\"\n Returns True if the connection is dropped and should be closed.\n :param conn:\n :class:`httplib.HTTPConnection` object.\n Note: For platforms like AppEngine, this will always return ``False`` to\n let the platform handle connection recycling transparently for us.\n \"\"\"\n sock = getattr(conn, 'sock', False)\n if sock is False: # Platform-specific: AppEngine\n return False\n if sock is None: # Connection already closed (such as by httplib).\n return True\n```\n\nHowever the cookie information is common across all the connection\n\nIs socket information must be False every time?\n\nMy HTTP debug information\n\n``` HTTP\n<RequestsCookieJar[<Cookie session=4D51754C for 10.11.18.11/>]> \n\nconnection <requests.packages.urllib3.connection.HTTPConnection object at 0x032FD5F0> \n{'_HTTPConnection__state': 'Idle', '_buffer': [], '_create_connection': <function create_connection at 0x02ABF130>, '_tunnel_headers': {}, '_tunnel_host': None, 'sock': None, 'port': 80, 'strict': True, 'host': '10.11.18.11', '_tunnel_port': None, 'timeout': None, 'socket_options': [(6, 1, 1)], 'source_address': None, '_HTTPConnection__response': None, '_method': 'GET'} \n\nsend: 'GET test/one/data HTTP/1.1\\r\\nHost: 10.11.18.11\\r\\nAccept-Encoding: gzip, deflate\\r\\nAccept: */*\\r\\nUser-Agent: python-requests/2.8.1\\r\\nConnection: keep-alive\\r\\nCookie: session=4D51754C\\r\\nAuthorization: Basic QURNSU46UEFTU1dPUkQ=\\r\\n\\r\\n' reply: 'HTTP/1.0 200 OK\\r\\n' header: Server: xxx TinyServer header: MIME-version: 1.0 header: Cache-Control: no-store, no-cache; header: Pragma: no-cache; header: Set-Cookie: session=4D51754C; path=/; header: Content-Type: application/json header: Expires: -1 header: Content-Length: 2 {'Content-Length': '2', 'Set-Cookie': 'session=4D51754C; path=/;', 'Expires': 
'-1', 'Server': 'ADTRAN TinyServer', 'Pragma': 'no-cache;', 'Cache-Control': 'no-store, no-cache;', 'Content-Type': 'application/json', 'MIME-version': '1.0'}\n```\n\nThanks\nMalli\n",
"Thanks for this @vkosuri!\n\nThis is expected behaviour. Requests attempts to \"connection pool\", which means that it attempts to reuse sockets as frequently as possible. Sometimes, requests will come to try to re-use a socket and find that the underlying TCP connection got closed while it's away. When that happens, it'll reset the connection. We mention this in the logs because it's potentially important to debug very subtle issues.\n\nThe important things to know are:\n- this is a log message, not a warning or error message\n- it's logged at INFO level, which is fairly low\n\nIf you want to make the message go away, simply set the log level higher than INFO or disable logging for the `requests.packages.urllib3` logger.\n",
"@Lukasa How to disable Retrying warning messages?\n\n```\n[ WARN ] Retrying (Retry(total=9, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x7fe2c71ad150>: Failed to establish a new connection: [Errno 111] Connection refused',)'\n```\n\nI am getting thousands of Warning messages in my testcase, due to this effect the log file size was increasing drastically. Is there any possibility to remove/disable warning message when NewConnectionError ?\n",
"@vkosuri That warning is raised by urllib3. If you want to disable that message, either handle retries yourself (stop passing the parameter into urllib3), or raise the log level for urllib3, or disable logging from urllib3 entirely. The logger you want to disable is the one at `requests.packages.urllib3.connectionpool`.\n",
"Thank you @Lukasa I am using [Robotframework-Requests](https://github.com/bulkan/robotframework-requests). I have added fallowing lines code to [Requests.py](https://github.com/bulkan/robotframework-requests/blob/master/src/RequestsLibrary/RequestsKeywords.py)\n\nStill it is not working? Am i missed anything? Could you please suggest me?\n\n``` Python\n# Disable warnings\nlogging.basicConfig() # you need to initialize logging, otherwise you will not see anything from requests\nlogging.getLogger().setLevel(logging.DEBUG)\nrequests_log = logging.getLogger(\"requests\")\nrequests_log.setLevel(logging.DEBUG)\nrequests_log.propagate = True\n```\n",
"That has _enabled_ logging, not disabled it. You need to set the level higher than that (e.g. to logging.ERROR).\n",
"Thanks @Lukasa It's working\n\n``` Python\nlogging.basicConfig() # you need to initialize logging, otherwise you will not see anything from requests\nlogging.getLogger().setLevel(logging.ERROR)\nrequests_log = logging.getLogger(\"requests\")\nrequests_log.setLevel(logging.ERROR)\nrequests_log.propagate = True\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/2912
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2912/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2912/comments
|
https://api.github.com/repos/psf/requests/issues/2912/events
|
https://github.com/psf/requests/issues/2912
| 119,840,302 |
MDU6SXNzdWUxMTk4NDAzMDI=
| 2,912 |
Question mark eaten from request URL if no params specified
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/80741?v=4",
"events_url": "https://api.github.com/users/mloskot/events{/privacy}",
"followers_url": "https://api.github.com/users/mloskot/followers",
"following_url": "https://api.github.com/users/mloskot/following{/other_user}",
"gists_url": "https://api.github.com/users/mloskot/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mloskot",
"id": 80741,
"login": "mloskot",
"node_id": "MDQ6VXNlcjgwNzQx",
"organizations_url": "https://api.github.com/users/mloskot/orgs",
"received_events_url": "https://api.github.com/users/mloskot/received_events",
"repos_url": "https://api.github.com/users/mloskot/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mloskot/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mloskot/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mloskot",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 22 |
2015-12-02T00:30:13Z
|
2017-10-21T12:30:07Z
|
2017-05-11T22:36:57Z
|
NONE
| null |
Here is a URL with a question mark included, but without any parameters:
```
url='http://server.com/api/item.json?'
```
That URL (w/ the question mark) controls the server response in such a way that it allows requesting certain metadata about an item, instead of the representation of the item itself:
```
import urllib3
http = urllib3.PoolManager()
r = http.request('GET', url)
print (r.data)
{ "ItemMetadata" : { ... }}
```
The above works well.
Requests, however, eats or skips the question mark if the `params` argument is missing/`None`/`{}`, so my request is answered with the resource representation instead of the metadata:
```
from requests import get
r = get(url, stream=True)
print(r.text)
{ "Item" : { ... }}
```
That seems to indicate a bug in URL parsing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/80741?v=4",
"events_url": "https://api.github.com/users/mloskot/events{/privacy}",
"followers_url": "https://api.github.com/users/mloskot/followers",
"following_url": "https://api.github.com/users/mloskot/following{/other_user}",
"gists_url": "https://api.github.com/users/mloskot/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mloskot",
"id": 80741,
"login": "mloskot",
"node_id": "MDQ6VXNlcjgwNzQx",
"organizations_url": "https://api.github.com/users/mloskot/orgs",
"received_events_url": "https://api.github.com/users/mloskot/received_events",
"repos_url": "https://api.github.com/users/mloskot/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mloskot/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mloskot/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mloskot",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2912/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2912/timeline
| null |
completed
| null | null | false |
[
"This is not a bug in URL parsing: quite the opposite.\n\nRequests does its best to normalise URLs where it is safe to do so. This is necessary to ensure that various stages of URL building work appropriately, and that we don't accidentally end up with invalid headers (e.g. multiple question marks or none at all when parameters are present). A trailing question mark with no parameters is superfluous (and arguably somewhat invalid), so requests kindly strips it off for you. This is entirely intentional.\n\nFor what it's worth, my opinion is that this API is poorly designed. It relies on the idea that no middle-box will rewrite that URL to remove the empty query part, which is a risk. Generally speaking, the better API would be to have a sub-resource or a proper query string (e.g. `/api/item.json?metadata=true`, or `/api/item/metadata.json`). This API design is _super_ fragile, and I guarantee that it'll be prone to breaking in mysterious ways. I doubt we're the only framework that strips it. In fact, anything that relies on Python's standard `urlsplit` function will probably do exactly what we do and ignore the query part.\n\n**However**, both browsers and curl will, if requested, send the empty query string. Having chatted to @bagder I don't think this is a deliberate decision on the part of those entities, so much as it just happens to fall out of the way they handle query strings.\n\nGiven that we have a duty to be better than the Python standard library, that means I think we should come up with a way to make it possible, at least when using the PreparedRequest API. Sadly, `urlsplit` makes this _really_ hard for us, because it doesn't have a difference in the query portion of the URL for `http://http2bin.org/get' and`http://http2bin.org/get?`. 
That makes our lives really tricky because I'd rather not hand roll URL parsing if I can possibly avoid it.\n\nI think the only way we can fix this is to change the library we use to parse URLs to one that can safely inform us of the difference between these two cases. @sigmavirus24, do you think your URL parsing library would do better here?\n\nI think this is a reasonable request, but we can't get to it before 3.0.0 because moving to a new URL parsing library will probably break a whole lot of stuff.\n",
"> This is not a bug in URL parsing: quite the opposite.\n\nI would appreciate clarification that would help me to understand the rules behind the URL parsers behaviour at large.\n\nIs that URL parsers interpret the following from [RFC 1738](https://tools.ietf.org/html/rfc1738), section 3.3, that the \"?\" (question mark) is _not_ part of the query string?\n\n```\nhttp://<host>:<port>/<path>?<searchpart>\n```\n\nDoes also the grammar in [RFC 3986](https://tools.ietf.org/html/rfc3986), Appendix A., indicate the \"?\" is _not_ part of the actual query string?\n\n```\nURI = scheme \":\" hier-part [ \"?\" query ] [ \"#\" fragment ]\n```\n\nIf so, for URL like `http://server.com/item.json?`, while applying the suggested (de|re)composition algorithm, i.e.:\n\n```\n if defined(query) then\n append \"?\" to result;\n append query to result;\n endif;\n```\n\nparsers rightly assume the query component as _undefined_, hence superfluous.\n\nIs my understanding correct?\n",
"Your understanding is correct, at least as far as the _standard library_ URL parser goes. That is not necessarily _generally_ true of URL parsers, which is why I asked @sigmavirus24 if his URL parser has the same behaviour or not. In particular, it's not clear to me that an absent query part is semantically identical to an _empty_ query part: Appendix A of RFC 3986 does appear to allow zero-length query strings.\n\nAgain, I think this is _strictly_ acceptable, but it's likely to be extremely fragile.\n",
"@Lukasa I see. Yes, I'm concerned about the _standard library_ URL parser, but also curious what is a common practice in other parsers. So, thanks for the clarifications so far.\n\nFrom your earlier response:\n\n> This API design is super fragile, and I guarantee that it'll be prone to breaking in mysterious ways.\n\nThat is exactly what worries me, so I'm trying to understand what are the _super fragile_ traits exactly and what are the ways the API may fails.\n",
"Sure. So, my reason for thinking it's fragile is mostly because this is clearly a grey area of the URL specification. URLs and URL normalisation are fraught topics, and as a result it's unwise to build an API that assumes that two semantically equivalent URLs with different representations will be transmitted without potentially being translated between those representations.\n\nWhat's not clear to me is whether the zero-length query string is one of them. I think this is because it's simply in a grey area: some entities treat it as different to no query string at all, others do not. For that reason it's _fragile_, not _wrong_: while it may work in many cases, I think it's likely that it'd be a source of subtle bugs.\n",
"@Lukasa I think we strip it but I can imagine a way to avoid that.\n",
"@Lukasa Thanks very much. I sympathise with your critique.\n",
"FYI,\n\nOut of curiosity, not to prove my point, I've probed a bit the landscape of HTTP client libraries.\n\nI've tried my target question-mark-powered API with popular clients for\n- Python (http.client, requests, urllib, urllib3)\n- Node (request, shred, unirest, urllib)\n- C# (WebRequest, WebClient, HttpClient, EasyHttp, RestSharp)\n- Go (http/net)\n- D (std.net.curl - powered by libcurl)\n\nHere is complete project https://github.com/mloskot/http-url-test\n\nThe result is: **4** of those libraries behave like the _Python Requests_:\n- Python: Requests\n- Node: [shred](https://www.npmjs.com/package/shred)\n- Go:\n - golang/go#13488\n - parnurzeal/gorequest#65\n\nIt makes a wild statistics of ~30% clients affected by a fragile question-mark-powered resource API.\n",
"I received some pointers to further details in RFC 3986 (see [Is question mark in URL part of query string?](http://stackoverflow.com/a/34104065/151641)). In particular, [6.2.3. Scheme-Based Normalization](http://tools.ietf.org/html/rfc3986#section-6.2.3) which somewhat applies to this issue and the \"question mark\" handling. Quoting:\n\n> Normalization should not remove delimiters when their associated component is empty unless licensed to do so by the scheme specification. For example, the URI http://example.com/? cannot be assumed to be equivalent to any of the examples above.\n\nWhere \"examples above\" refers to:\n\n> http://example.com\n> http://example.com/\n> http://example.com:/\n> http://example.com:80/\n",
"So I agree that we should be persisting the query part. The problem is that our supporting libraries don't make it clear whether there was no query part or just an empty one. \n",
"So [rfc3986](/sigmavirus24/rfc3986) now supports this use case as of 0.3.1.\n",
"FYI, here is an example of API similar to the one in my issue above:\n\nhttps://confluence.ucop.edu/display/Curation/ARK\n\n> a brief metadata record if you append a single question mark to the ARK\n> (...)\n> Adding '??' to the end should return a policy statement.\n",
"@mloskot Sadly, lots of people make bad decisions. =(\n",
"@mloskot constantly posting information here will not get this fixed any faster. You're merely spamming 781 people (- the people who have already unsubscribed).\n",
"@Lukasa Yes. That example surprised me and I thought I will share it as an example of bad design..\n\n@sigmavirus24 \n\n> constantly posting information here will not get this fixed any faster. \n\nYou are misreading my intention - I'm not trying to press anything at all.\nI'm simply sharing some further details which I believe are relevant to this issue.\nI'm sorry if I caused any annoyance.\n",
"Workaround: you could probably percent-encode the question mark.\n",
"I'd just like some way to remove the url that requests is inserting into my url. I never asked requests to insert a '?' into my url...\n",
"@internaught What code do you have that causes requests to insert a question-mark?\n",
"@Lukasa I'm simply doing this:\n\n``` python\nreturn requests.get(\n 'https://my-{}-api-url.com/api/0.1'.format(\n random.randrange(1, 9)),\n '/thing/my_thing/{}'.format(other_thing), headers=self.headers\n )\n```\n\nThanks for the quick reply!\n",
"@internaught You have a comma between the two parts of your URL. That means your second string is being passed to the `params` parameter of the requests.get call, which means requests assumes it's a query string. Replace the comma with a + and you'll be fine.\n",
"@Lukasa thanks for the help, working great now. Simple mistake, thanks!\n",
"I'll leave this here for the next guy bitten by this...\r\nThe mini-circuits ZTM boxes use a bare ? to query the state. https://www.minicircuits.com/softwaredownload/ztm_rcm.html\r\n\r\nI was able to work around the behavior of requests by putting a space after the question mark. "
] |
https://api.github.com/repos/psf/requests/issues/2911
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2911/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2911/comments
|
https://api.github.com/repos/psf/requests/issues/2911/events
|
https://github.com/psf/requests/issues/2911
| 119,794,371 |
MDU6SXNzdWUxMTk3OTQzNzE=
| 2,911 |
Can't use session proxy in its request for HTTPS protocol
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3279314?v=4",
"events_url": "https://api.github.com/users/FabriceSh44/events{/privacy}",
"followers_url": "https://api.github.com/users/FabriceSh44/followers",
"following_url": "https://api.github.com/users/FabriceSh44/following{/other_user}",
"gists_url": "https://api.github.com/users/FabriceSh44/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/FabriceSh44",
"id": 3279314,
"login": "FabriceSh44",
"node_id": "MDQ6VXNlcjMyNzkzMTQ=",
"organizations_url": "https://api.github.com/users/FabriceSh44/orgs",
"received_events_url": "https://api.github.com/users/FabriceSh44/received_events",
"repos_url": "https://api.github.com/users/FabriceSh44/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/FabriceSh44/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/FabriceSh44/subscriptions",
"type": "User",
"url": "https://api.github.com/users/FabriceSh44",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 39 |
2015-12-01T19:55:01Z
|
2021-09-08T20:01:01Z
|
2015-12-02T21:38:31Z
|
NONE
|
resolved
|
I've been struggling with my company proxy to make an https request.
import requests
from requests.auth import HTTPProxyAuth
proxy_string = 'http://user:password@url_proxt:port_proxy'
s = requests.Session()
s.proxies = {"http": proxy_string , "https": proxy_string}
s.auth = HTTPProxyAuth(user,password)
r = s.get('http://www.google.com') # OK
print(r.text)
r = s.get('https://www.google.com',proxies={"http": proxy_string , "https": proxy_string}) #OK
print(r.text)
r = s.get('https://www.google.com') # KO
print(r.text)
When KO, I have the following exception :
HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 407 Proxy Authentication Required',)))
I looked online but didn't find someone having this specific issue with HTTPS.
Thank you for your time
Description of issue here :
http://stackoverflow.com/questions/34025964/python-requests-api-using-proxy-for-https-request-get-407-proxy-authentication-r
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2911/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2911/timeline
| null |
completed
| null | null | false |
[
"Have you tried the same request without setting `s.auth`?\n",
"Just did. Same result : KO\n",
"Hmm. I wonder if this is connection pooling related. @Bl4ckc4t, are you comfortable with wireshark?\n",
"I would like to avoid that. Being in a corporate environment and not knowing Wireshark that well, I'm afraid to transmit info of my company that will most probably get me fired. Is there any other way? \nWould it be possible for example to transmit by default in the code the session proxies setup like I did in example 2 to the session.get() requests ?\n",
"@Bl4ckc4t What I'm worried about is that this may be interacting with our connection re-use logic. This seems the most likely cause of the problem.\n",
"Can you tell me what you are looking for in the wireshark capture and I will try to get it for you without the dump - if you think it's possible.\n",
"I'm interested to see, in the case of the second request, whether a new TCP connection and CONNECT request are made or whether we re-use the old one. If a new one is made, I want to see if it has the Proxy-Authorization header in place.\n",
"On the second request, an new tcp connection is made (I guessed that because it uses another port). Inside the Hypertext Transfer Protocol, the proxy authorization header is correctly set. I looked at the third one and it doesn't contain it.\n",
"@Bl4ckc4t Does the third one use a new TCP connection, or the same one?\n",
"New one.\n",
"Ok, so that's interesting. What version of requests are you using and where did you get it?\n",
"requests==2.8.1 on Windows.\nDon't remember installing it specifically so I guess I got it either from python 3.4 installation or with a pip install that had this module as dependency.\n",
"So, I think the issue here is that we're not correct re-applying the proxy headers in this case.\n",
"My real issue is that I'm using a library using this mechanism and failing to request. I can't change s.get(url) to s.get(url,proxy). Do you see any workaround where I would change the session object state in order to force proxy usage at every request of this session ? If not, I will wait for the fix. \n\nIn any case, thank you very much for the time you spend on this issue.\n",
"Wait, @Bl4ckc4t, my understanding is that you _are_ using the `Session` each time, we're just not correctly attaching the headers. You can actually temporarily fix this problem by not using a `Session`.\n\nI'm trying to get an exact reproduction of this problem on my own system. Right now, I'm getting connections that correctly re-use the established tunnel, which is not quite right.\n",
"I'm actually using this library : https://pypi.python.org/pypi/jira\nJIRA contructor create a session and following request on its object only use get method.\nI'm not able (at least I don't know how) to change the implementation, switching from Session to Request or set the proxy at each session request.\n",
"@Bl4ckc4t Can you check whether the response to the second request sends the `Connection` header, and if so, to what value? I'm trying to work out why the connection is going away.\n",
"So far, I'm unable to reproduce this: the new TCP connections correctly have the Proxy-Authorization header set.\n",
"Here is the overview of the dump.\n\n\"Protocol\",\"No.\",\"Info\"\n\"HTTP\",\"106\",\"Continuation or non-HTTP traffic\"\n\"TCP\",\"163\",\"55638 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"166\",\"55638 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"HTTP\",\"167\",\"GET http://www.google.com/ HTTP/1.1 \"\n\"TCP\",\"172\",\"55638 > http-alt [ACK] Seq=215 Ack=4081 Win=64860 Len=0\"\n\"TCP\",\"177\",\"55638 > http-alt [ACK] Seq=215 Ack=8220 Win=64860 Len=0\"\n\"TCP\",\"178\",\"53650 > http-alt [ACK] Seq=1 Ack=61 Win=64380 Len=0\"\n,\"TCP\",\"506\",\"[TCP Keep-Alive] 55636 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=1\"\n,\"TCP\",\"607\",\"[TCP Keep-Alive] 55636 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=1\"\n,\"TCP\",\"624\",\"55651 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n,\"TCP\",\"626\",\"55651 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n,\"HTTP\",\"627\",\"GET http://www.google.com/ HTTP/1.1 \"\n,\"TCP\",\"634\",\"55651 > http-alt [ACK] Seq=509 Ack=5489 Win=64860 Len=0\"\n,\"TCP\",\"637\",\"55651 > http-alt [ACK] Seq=509 Ack=7652 Win=64860 Len=0\"\n,\"HTTP\",\"798\",\"Continuation or non-HTTP traffic\"\n,\"HTTP\",\"799\",\"Continuation or non-HTTP traffic\"\n,\"TCP\",\"803\",\"53650 > http-alt [ACK] Seq=296 Ack=337 Win=64104 Len=0\"\n,\"TCP\",\"819\",\"[TCP Keep-Alive] 55636 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=1\"\n,\"TCP\",\"1015\",\"55653 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n,\"TCP\",\"1017\",\"55653 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n,\"TCP\",\"1018\",\"[TCP segment of a reassembled PDU]\"\n,\"HTTP\",\"1021\",\"CONNECT www.google.com:443 HTTP/1.0 \"\n,\"TCP\",\"1024\",\"55653 > http-alt [ACK] Seq=40 Ack=1125 Win=63737 Len=0\"\n,\"TCP\",\"1372\",\"[TCP Keep-Alive] 55636 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=1\"\n\nBreakpoint before request 2 : packet number 178\nBreakpoint before request 3 : packet number 637\n\nNo proxy auth on 1021\nProxy auth on 167 and 627\n\ni filtered on my ip and ip dest.\n\nWhich packet number do you want me to check connection header?\n",
"Hang on, hang on.\n\nI see two HTTP requests here, and one HTTPS, but your code above makes two HTTPS requests and one HTTP. Are you sure this behaviour is right? Did the first request get redirected? (e.g. what's the value of r.history for the first request)\n",
"Right, sorry, I messed up my test when rebuild it (lost it by mistake), let me make you a new one.\n",
"So new one :\n\n\"Protocol\",\"No.\",\"Info\"\n\"TCP\",\"9496\",\"62863 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"9498\",\"62863 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"HTTP\",\"9499\",\"GET http://ctldl.windowsupdate.com/msdownload/update/v3/static/trustedr/en/disallowedcertstl.cab?248b33e67353bb0c HTTP/1.1 \"\n\"TCP\",\"9502\",\"62863 > http-alt [ACK] Seq=238 Ack=1209 Win=63653 Len=0\"\n\"TCP\",\"9503\",\"62863 > http-alt [FIN, ACK] Seq=238 Ack=1209 Win=63653 Len=0\"\n\"TCP\",\"9504\",\"62864 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"9507\",\"62864 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"TCP\",\"9508\",\"[TCP segment of a reassembled PDU]\"\n\"TCP\",\"9509\",\"[TCP segment of a reassembled PDU]\"\n\"TCP\",\"9512\",\"[TCP segment of a reassembled PDU]\"\n\"TCP\",\"9513\",\"[TCP segment of a reassembled PDU]\"\n\"TCP\",\"9514\",\"[TCP segment of a reassembled PDU]\"\n\"HTTP\",\"9515\",\"GET http://ctldl.windowsupdate.com/msdownload/update/v3/static/trustedr/en/disallowedcertstl.cab?248b33e67353bb0c HTTP/1.1 \"\n\"TCP\",\"9624\",\"[TCP ACKed lost segment] 62864 > http-alt [FIN, ACK] Seq=7643 Ack=957 Win=63905 Len=0\"\n\"TCP\",\"16269\",\"62865 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"16271\",\"62865 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"HTTP\",\"16272\",\"GET http://www.google.com/ HTTP/1.1 \"\n\"TCP\",\"16743\",\"62865 > http-alt [ACK] Seq=215 Ack=4081 Win=64860 Len=0\"\n\"TCP\",\"16748\",\"62865 > http-alt [ACK] Seq=215 Ack=8224 Win=64860 Len=0\"\n\"TCP\",\"52594\",\"62866 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"52596\",\"62866 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"TCP\",\"52597\",\"[TCP segment of a reassembled PDU]\"\n\"HTTP\",\"52606\",\"CONNECT www.google.com:443 HTTP/1.0 \"\n\"TCP\",\"53070\",\"62866 > http-alt [ACK] Seq=89 Ack=40 Win=64821 Len=0\"\n\"TLSv1.2\",\"54122\",\"Client Hello\"\n\"TCP\",\"54128\",\"62866 > http-alt [ACK] Seq=606 Ack=3541 Win=64860 Len=0\"\n\"TLSv1.2\",\"54129\",\"Client Key Exchange, Change Cipher Spec, Encrypted Handshake Message\"\n\"TLSv1.2\",\"54131\",\"Application Data\"\n\"TCP\",\"54338\",\"62866 > http-alt [ACK] Seq=1247 Ack=11763 Win=64860 Len=0\"\n\"HTTP\",\"66776\",\"Continuation or non-HTTP traffic\"\n\"TCP\",\"82001\",\"62868 > http-alt [SYN] Seq=0 Win=8192 Len=0 MSS=1460 WS=8 SACK_PERM=1\"\n\"TCP\",\"82003\",\"62868 > http-alt [ACK] Seq=1 Ack=1 Win=64860 Len=0\"\n\"TCP\",\"82004\",\"[TCP segment of a reassembled PDU]\"\n\"HTTP\",\"82312\",\"CONNECT www.google.com:443 HTTP/1.0 \"\n\"TCP\",\"82365\",\"62868 > http-alt [ACK] Seq=40 Ack=1125 Win=63737 Len=0\"\n\nBreakpoint before request 2 : packet number 16748\nBreakpoint before request 3 : packet number 54338\n",
"And, again, to clarify: the third request for google.com is the one that has no Proxy-Authorization header?\n",
"3rd request , packet 82312 - no Proxy Authorization Header\npackets 52606 and 16272 have it .\n",
"hmm. That's interesting: it doesn't look like the connection is being closed, but it's not available to the pool either.\n\nCan you verify something for me: can you right-click on each of the new SYNs that contain requests and hit \"follow TCP stream\"? Just check that the Proxy-Authorization header didn't come in a later packet, because it does on my machine.\n",
"1st SYN TCP Stream :\nGET http://www.google.com/ HTTP/1.1\nHost: www.google.com\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: _/_\nProxy-Authorization: [Id1]\nUser-Agent: python-requests/2.8.1\n\nHTTP/1.1 200 OK\nHeaders data\n\nBINARY\n\n---\n\n2nd SYN TCP Stream :\nCONNECT www.google.com:443 HTTP/1.0\nProxy-Authorization: [Id1]\nHTTP/1.1 200 Connection established\n\nBINARY\n\n---\n\n3rd SYN TCP Stream:\nCONNECT www.google.com:443 HTTP/1.0\nHTTP/1.1 407 Proxy Authentication Required\n\nHeader on authentification\n\nHTML code showing our access denied page\n",
"Hmm. Right now I'm totally short on exactly why this is happening. The TCP stream for the second response: does it get terminated? (RST or FIN packets)\n",
"Don't see anything like that. \n",
"So, my question is why there's a second connection at all. If the connection is still up, there's no reason for us to have thrown it away as far as I can tell. I'm extremely perplexed as to why the connection is not being re-used. With the proxy I have on my local machine (Charles Proxy), we quite happily re-use that same TCP connection, and if we _don't_ re-use it (because it got torn down) we create a new one with the new headers.\n\nFor some insane reason, one part of requests believes that the old connection is still being used, and another part believes that it's not, and I'm not sure why yet.\n\nI'm going to take a quick look at Python 3.4's `http.client` module to see if I can find anything in there that would cause this.\n",
"I cannot see anything that immediately suggests that this problem would occur. Only if a second call to `set_tunnel` was made, somehow losing the headers, would that occur. Unfortunately, without a reproduction scenario that I can reach it's likely to be quite tricky to trace this problem.\n\nIt would be interesting if you could confirm that we never call `http.client.HTTPConnection.set_tunnel` twice on a connection (e.g. by adjusting your local copy to assert that `self._tunnel_host` is always `None` when called. That would be a start.\n"
] |
https://api.github.com/repos/psf/requests/issues/2910
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2910/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2910/comments
|
https://api.github.com/repos/psf/requests/issues/2910/events
|
https://github.com/psf/requests/issues/2910
| 119,755,535 |
MDU6SXNzdWUxMTk3NTU1MzU=
| 2,910 |
Expired SSL certificates don't raise an error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/62745?v=4",
"events_url": "https://api.github.com/users/spookylukey/events{/privacy}",
"followers_url": "https://api.github.com/users/spookylukey/followers",
"following_url": "https://api.github.com/users/spookylukey/following{/other_user}",
"gists_url": "https://api.github.com/users/spookylukey/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/spookylukey",
"id": 62745,
"login": "spookylukey",
"node_id": "MDQ6VXNlcjYyNzQ1",
"organizations_url": "https://api.github.com/users/spookylukey/orgs",
"received_events_url": "https://api.github.com/users/spookylukey/received_events",
"repos_url": "https://api.github.com/users/spookylukey/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/spookylukey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/spookylukey/subscriptions",
"type": "User",
"url": "https://api.github.com/users/spookylukey",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-12-01T16:32:34Z
|
2021-09-08T20:01:02Z
|
2015-12-01T19:26:59Z
|
NONE
|
resolved
|
```
>>> requests.get('https://expired.badssl.com/', verify=True)
<Response [200]>
```
I'm expecting an SSL exception of some kind here, like it does for https://wrong.host.badssl.com/
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2910/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2910/timeline
| null |
completed
| null | null | false |
[
"Please have a read of our [contributing file](https://github.com/kennethreitz/requests/blob/master/CONTRIBUTING.md), which GitHub already asked you to do. Please then provide the information requested within: particularly, tell us **what version of Requests you're using** and **how you installed it**. Please also let me know what OS you're using, whether you have `pyopenssl`, `pyasn1`, `ndg-httpsclient`, or `certifi` installed, and what version of OpenSSL you are using.\n\nThe reason I ask all this is because I cannot reproduce this on my machine: my machine correctly fails to validate that certificate.\n",
"This is most recent requests, 2.8.1, with Python 2.7\n\nIt was installed as per the installation instructions here:\n\nhttps://github.com/kennethreitz/requests#installation\nOR\nhttp://docs.python-requests.org/en/latest/user/install/#install\n\ninto a virtualenv, with nothing additional. I assumed from the installation instructions that nothing additional was necessary, and even looking through the docs (e.g. http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification ) I'm struggling to work out what else is necessary for correct SSL validation. I installed certifi and that made no difference.\n\nLooking further into urllib3 docs (https://urllib3.readthedocs.org/en/latest/security.html#openssl-pyopenssl ), if I do \"pip install pyopenssl ndg-httpsclient pyasn1\", then I get the expected failure.\n\nI tried it on two machines - Linux Mint 17 and Ubuntu 12.04 - so I assumed this was not just a fluke with a bad setup, sorry for not providing enough information.\n",
"Aha, ok. So, `badssl.com` uses virtual hosting to provide multiple sub domains on one server. To do this in TLS requires an extension called SNI: Server Name Indication. Because the versions of Linux you are using are relatively old, they do not have the capability to do SNI using the standard library alone. Requests can use the three extra packages, if they're available, to do SNI: this is why adding them fixed the problem. \n\nUnfortunately, there's no way to tell the difference between a failure because of a lack of SNI and a failure because of a bad certificate. Confusingly, `badssl.com` serves a _valid_ wildcard certificate as its default, which means that the subdomains all mysteriously succeed.\n\nRegardless, requests does the best it can out of the box. For a better experience, try Python 3.3+ or 2.7.9+, which can do SNI with the standard library alone. \n",
"I really think this shouldn't be closed - surely a docs fix is needed? I followed the installation instructions, and yet had an installation that was subtly but importantly broken.\n",
"@spookylukey what do you want the installation instructions to say? \"Make sure you use an operating system that ships versions of Python and OpenSSL that have these features or satisfy these minimum requirements\"?\n",
"@sigmavirus24 - something like that, preferably with some information about the additional packages that are necessary, or under what (known) conditions you'll need to do more work. For a library that is explicitly \"HTTP for humans\", and has \"Browser-style SSL Verification\" high up on its list of features, this seems fairly important. The choice to rely on Python/OS for some of its functionality is a choice made by this library, and the user shouldn't be expected to know all the details.\n",
"the python module ssl has a flag, ssl.HAS_SNI, that can be used to warn\nduring installation/runtime.\n\nOn 2015-12-01 20:25, Ian Cordasco wrote:\n\n> @spookylukey https://github.com/spookylukey what do you want the\n> installation instructions to say? \"Make sure you use an operating\n> system that ships versions of Python and OpenSSL that have these\n> features or satisfy these minimum requirements\"?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2910#issuecomment-161085173.\n",
"Warning during installation is not possible: wheels don't run code at install time, and Python packaging strongly prefers them. \n\nWe can warn at runtime, but I'm nervous about adding too many more warnings: users have complained bitterly about the ones we already have, that serve a security purpose. \n\n@spookylukey Note that the specific failure mode you encountered here was extremely uncommon, specifically because it was a _success_ mode. `badssl.com` served you a _valid_ certificate for the request you made! This is the rarest kind of website: one that uses SNI to deliberately _break_ your connection. Arguably, `badssl.com` needs a feature request to ensure that connections without SNI _fail_ outright!\n\nRegardless, for the _common_ failure mode we have a [section of our FAQ](http://docs.python-requests.org/en/latest/community/faq/). When it comes to adding additional information to our install instructions, anything that requires manual user intervention is likely to be a non-starter. The most effective course would be a warning if SNI is not available on the platform, which I'm open to adding to urllib3.\n"
] |
https://api.github.com/repos/psf/requests/issues/2909
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2909/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2909/comments
|
https://api.github.com/repos/psf/requests/issues/2909/events
|
https://github.com/psf/requests/pull/2909
| 119,697,263 |
MDExOlB1bGxSZXF1ZXN0NTIyNDYxMzM=
| 2,909 |
Fix typos
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/141546?v=4",
"events_url": "https://api.github.com/users/jwilk/events{/privacy}",
"followers_url": "https://api.github.com/users/jwilk/followers",
"following_url": "https://api.github.com/users/jwilk/following{/other_user}",
"gists_url": "https://api.github.com/users/jwilk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jwilk",
"id": 141546,
"login": "jwilk",
"node_id": "MDQ6VXNlcjE0MTU0Ng==",
"organizations_url": "https://api.github.com/users/jwilk/orgs",
"received_events_url": "https://api.github.com/users/jwilk/received_events",
"repos_url": "https://api.github.com/users/jwilk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jwilk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jwilk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jwilk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-12-01T11:26:59Z
|
2021-09-08T04:01:15Z
|
2015-12-01T11:28:06Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2909/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2909/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2909.diff",
"html_url": "https://github.com/psf/requests/pull/2909",
"merged_at": "2015-12-01T11:28:06Z",
"patch_url": "https://github.com/psf/requests/pull/2909.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2909"
}
| true |
[
"Thanks @jwilk! :sparkles: :cake: :sparkles:\n"
] |
|
https://api.github.com/repos/psf/requests/issues/2908
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2908/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2908/comments
|
https://api.github.com/repos/psf/requests/issues/2908/events
|
https://github.com/psf/requests/issues/2908
| 119,489,960 |
MDU6SXNzdWUxMTk0ODk5NjA=
| 2,908 |
set_default_verify_paths not implemented for SSLContext class
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1087183?v=4",
"events_url": "https://api.github.com/users/rjschwei/events{/privacy}",
"followers_url": "https://api.github.com/users/rjschwei/followers",
"following_url": "https://api.github.com/users/rjschwei/following{/other_user}",
"gists_url": "https://api.github.com/users/rjschwei/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rjschwei",
"id": 1087183,
"login": "rjschwei",
"node_id": "MDQ6VXNlcjEwODcxODM=",
"organizations_url": "https://api.github.com/users/rjschwei/orgs",
"received_events_url": "https://api.github.com/users/rjschwei/received_events",
"repos_url": "https://api.github.com/users/rjschwei/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rjschwei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rjschwei/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rjschwei",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-11-30T13:18:38Z
|
2021-09-08T20:01:04Z
|
2015-11-30T13:34:56Z
|
NONE
|
resolved
|
It is possible to end up in a situation where set_default_verify_paths() is called on the SSLContext but the SSLContext instance is an instance of the implementation of SSLContext implemented in requests/packages/urllib3/util/ssl_.py This implementation does not provide the set_default_verify_paths() method. Thus resulting in a traceback for the user as follows:
response = requests.get(url)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 69, in get
return request('get', url, params=params, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, *_send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, *_kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 367, in send
timeout=timeout
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen
body=body, headers=headers)
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request
self._validate_conn(conn)
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 782, in _validate_conn
conn.connect()
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/connection.py", line 250, in connect
ssl_version=resolved_ssl_version)
File "/usr/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py", line 282, in ssl_wrap_socket
context.set_default_verify_paths()
AttributeError: 'SSLContext' object has no attribute 'set_default_verify_paths'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2908/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2908/timeline
| null |
completed
| null | null | false |
[
"@rjschwei Where did you install requests from? urllib3 (and by extension requests) have _never_ called `set_default_verify_paths` in any version we shipped, at least according to a quick search of our git history. The only place we've ever called it is inside the PyOpenSSL contrib module, which of course could not produce the traceback you've provided.\n",
"Sorry, I took the liberty of glancing at your profile and noted that you're a SUSE developer. This breakage is caused by a patch introduced by [the SUSE packaging process](https://build.opensuse.org/package/view_file/Cloud:OpenStack:Master/python-requests/no-default-cacert.patch?expand=1). You'll probably want to chase up with whomever is packaging Requests for SUSE (we don't have a relationship with that person at this time).\n",
"Thanks for the quick response, was just chasing that avenue, sorry for not doing my homework before filing the issue.\n",
"Thanks for the apology @rjschwei, it happens to all of us. =) No harm done, and I hope you get the problem resolved!\n"
] |
https://api.github.com/repos/psf/requests/issues/2907
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2907/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2907/comments
|
https://api.github.com/repos/psf/requests/issues/2907/events
|
https://github.com/psf/requests/issues/2907
| 119,447,209 |
MDU6SXNzdWUxMTk0NDcyMDk=
| 2,907 |
Support arguments whose values are None
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10276811?v=4",
"events_url": "https://api.github.com/users/rohithpr/events{/privacy}",
"followers_url": "https://api.github.com/users/rohithpr/followers",
"following_url": "https://api.github.com/users/rohithpr/following{/other_user}",
"gists_url": "https://api.github.com/users/rohithpr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rohithpr",
"id": 10276811,
"login": "rohithpr",
"node_id": "MDQ6VXNlcjEwMjc2ODEx",
"organizations_url": "https://api.github.com/users/rohithpr/orgs",
"received_events_url": "https://api.github.com/users/rohithpr/received_events",
"repos_url": "https://api.github.com/users/rohithpr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rohithpr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rohithpr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rohithpr",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2015-11-30T09:06:32Z
|
2021-09-08T20:01:03Z
|
2015-11-30T14:15:41Z
|
NONE
|
resolved
|
The docs say: `Note that any dictionary key whose value is None will not be added to the URL’s query string.`
Is there a chance that this could be controlled by the user to include None values as well? Perhaps with the help of a flag.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2907/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2907/timeline
| null |
completed
| null | null | false |
[
"@rohithpr What would a `None` value be turned into?\n",
"@Lukasa - `null` perhaps?\n\n```\n>>> json.dumps({'val':None})\n'{\"val\": null}'\n```\n",
"@rohithpr The only place we turn dictionaries into JSON is the `json=` keyword argument, and that is not put into the URL's query string. Query strings in URLs are not JSON, they're urlencoded key-value pairs.\n",
"I should ask: what is your _actual_ problem? What are you trying to achieve that you believe you cannot with requests?\n",
"No, I didn't mean to say that it should follow JSON. I just said that None could be converted to null along the lines of JSON.\n",
"So again, I'm inclined to ask about what exactly you're trying to do. My reading of this issue is that you want the `None` value turned into _something_, but you don't care what it is. That's an unusual request: ordinarily I'd expect someone to have a specific service they're talking to that makes it natural to represent the data in a certain way (namely, to have `None` in there somewhere).\n\nWhat's the reasoning here?\n",
"I'd like to pass a `null` value for some arguments in a PUT (or any other) request.\n\n```\ndata = { 'p1': 'hello', 'p2': None}\nrequests.put(url, data)\n```\n\nOn doing this, the server gets only 'p1' and not 'p2'.\n\nIn the project that I'm working on key:null is used to delete a key from a record while updating the database. I don't have control over the server side code and I'm unable to pass null values using requests. (The ORM used on the server side is Vogels for DynamoDB)\n",
"If you want `null` that's something you should control by looping over the parameters you are passing and swapping `None` for `'null'`. This is not something requests is going to change.\n\n``` py\ndef convert_None_to_null(params):\n new_params = {}\n for (key, value) in params.items():\n if value is None:\n value = 'null'\n new_params[key] = value\n return new_params\n```\n",
"I see.\n\nSo, the use of `null` in this fashion is specific to your server: it is not _generally_ true about the URL query string. Requests tries to restrict itself to the subset of behaviour that is universally understood in the query string. Implementation-specific variances have to handle that themselves: otherwise, requests will grow a million different feature flags that subtly change its behaviour in all sorts of weird ways. This kind of combinatorial expansion of the API and the possible configurations of requests would make it a nightmare to use, maintain, and debug.\n\n@sigmavirus24 has provided you a code snippet that should do what you're looking for (with some small edits). I hope you find that helpful!\n",
"@Lukasa @sigmavirus24 - Thanks! :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2906
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2906/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2906/comments
|
https://api.github.com/repos/psf/requests/issues/2906/events
|
https://github.com/psf/requests/issues/2906
| 119,266,350 |
MDU6SXNzdWUxMTkyNjYzNTA=
| 2,906 |
requests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:646)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2891235?v=4",
"events_url": "https://api.github.com/users/caizixian/events{/privacy}",
"followers_url": "https://api.github.com/users/caizixian/followers",
"following_url": "https://api.github.com/users/caizixian/following{/other_user}",
"gists_url": "https://api.github.com/users/caizixian/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/caizixian",
"id": 2891235,
"login": "caizixian",
"node_id": "MDQ6VXNlcjI4OTEyMzU=",
"organizations_url": "https://api.github.com/users/caizixian/orgs",
"received_events_url": "https://api.github.com/users/caizixian/received_events",
"repos_url": "https://api.github.com/users/caizixian/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/caizixian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caizixian/subscriptions",
"type": "User",
"url": "https://api.github.com/users/caizixian",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-11-28T02:45:30Z
|
2021-09-08T12:00:53Z
|
2015-11-28T08:05:55Z
|
NONE
|
resolved
|
Mac OS X 10.11.1
Python 3.5.0 from https://www.python.org/downloads/
requests (2.8.1)
``` python
>>> import ssl
>>> ssl.OPENSSL_VERSION
'OpenSSL 0.9.8zg 14 July 2015'
```
URL https://api.telegram.org/
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2906/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2906/timeline
| null |
completed
| null | null | false |
[
"That version of OpenSSL is absolutely _ancient_. I suspect it's the cause of your problem.\n\nI have OS X 10.11.2 Beta, Python 3.5.0 linked against a newer version of OpenSSL (1.0.2d), and the same version of requests, and I can reach that URL fine. I highly encourage you to use a newer version of OpenSSL.\n\nYou can get this by running `pip install -U requests[security]`.\n",
"@Lukasa So maybe I need to install a newer version of OpenSSL using homebrew and use it to replace the old OpenSSL shipped with OS.\n",
"@caizixian It's an option, but if you `pip install -U requests[security]` that will grab PyOpenSSL, which includes its own copy of OpenSSL on OS X.\n",
"@Lukasa I had following errors when installing requests[security], both related to cffi.\n1. `c/_cffi_backend.c:14:10: fatal error: 'ffi.h' file not found`\n \n so I `brew install pkg-config libffi`, which solved this problem.\n2. After that, I tried to install requests[security] again, and I got `Failed building wheel for cffi` . \n \n Despite this error, pip said that all things were installed. So I checked `ssl.OPENSSL_VERSION` again, still the same.\n\nFYI, my Xcode is 7.1.1\n",
"`Failed building wheel for cffi` is fine, you don't mind. You shouldn't check `ssl.OPENSSL_VERSION` because that's the standard library and doesn't get changed by the packages you just installed. Instead, you should try running `python -c \"from cryptography.hazmat.backends.openssl.backend import backend;print(backend.openssl_version_text())\"`. That should print the newer OpenSSL version, and that's the one that will be used by requests.\n",
"```\npip install cryptography\npip install pyopenssl\npip install requests[security]\n```\n\nProblem solved.\n\n```\nPython 3.5.1 (v3.5.1:37a07cee5969, Dec 5 2015, 21:12:44) \n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> from cryptography.hazmat.backends.openssl.backend import backend\n>>> print(backend.openssl_version_text())\nOpenSSL 1.0.2e 3 Dec 2015\n```\n",
"caizixian, you saved my life!!!!!\r\nThank you so much"
] |
https://api.github.com/repos/psf/requests/issues/2905
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2905/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2905/comments
|
https://api.github.com/repos/psf/requests/issues/2905/events
|
https://github.com/psf/requests/issues/2905
| 119,206,445 |
MDU6SXNzdWUxMTkyMDY0NDU=
| 2,905 |
response.request.headers does not contain the Hosts Header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16048186?v=4",
"events_url": "https://api.github.com/users/GreenBattery/events{/privacy}",
"followers_url": "https://api.github.com/users/GreenBattery/followers",
"following_url": "https://api.github.com/users/GreenBattery/following{/other_user}",
"gists_url": "https://api.github.com/users/GreenBattery/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/GreenBattery",
"id": 16048186,
"login": "GreenBattery",
"node_id": "MDQ6VXNlcjE2MDQ4MTg2",
"organizations_url": "https://api.github.com/users/GreenBattery/orgs",
"received_events_url": "https://api.github.com/users/GreenBattery/received_events",
"repos_url": "https://api.github.com/users/GreenBattery/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/GreenBattery/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GreenBattery/subscriptions",
"type": "User",
"url": "https://api.github.com/users/GreenBattery",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-27T14:42:16Z
|
2021-09-08T20:01:05Z
|
2015-11-27T14:57:17Z
|
NONE
|
resolved
|
if you iterate over the headers of the response objects request, you do not get the complete set. The Host Header is missing.
``` python
sending_headers = {"Accept-Encoding": "identity"}
method = "GET"
url = "http://192.168.205.65/poster.php"
resp = requests.request(method, url, headers = sending_headers)
rh = resp.request.headers #original request might be lost in redirects?
for header in rh:
print("%s: %s" % (header, rh[header]))
```
Observe the output:
```
Accept-Encoding: identity
Accept: */*
User-Agent: python-requests/2.8.1
Connection: keep-alive
```
However, on the wire, you will see:
```
GET /poster.php HTTP/1.1
Host: 192.168.205.65
Accept-Encoding: identity
User-Agent: python-requests/2.8.1
Connection: keep-alive
Accept: */*
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2905/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2905/timeline
| null |
completed
| null | null | false |
[
"This is because, ordinarily, requests does not set the `Host` header at all: `httplib` does. We generally don't get in the way of that because httplib wants to set it.\n\nIn general, requests does not (and indeed cannot) claim that the headers variables (on both requests and responses) represent the full set of headers that were on the wire. Some or all may be added or removed by lower layers of the stack, and requests cannot control that.\n"
] |
https://api.github.com/repos/psf/requests/issues/2904
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2904/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2904/comments
|
https://api.github.com/repos/psf/requests/issues/2904/events
|
https://github.com/psf/requests/issues/2904
| 119,110,699 |
MDU6SXNzdWUxMTkxMTA2OTk=
| 2,904 |
Feature Request: Format reponse as HTTP Response Message Format
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2363105?v=4",
"events_url": "https://api.github.com/users/CrazyPython/events{/privacy}",
"followers_url": "https://api.github.com/users/CrazyPython/followers",
"following_url": "https://api.github.com/users/CrazyPython/following{/other_user}",
"gists_url": "https://api.github.com/users/CrazyPython/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CrazyPython",
"id": 2363105,
"login": "CrazyPython",
"node_id": "MDQ6VXNlcjIzNjMxMDU=",
"organizations_url": "https://api.github.com/users/CrazyPython/orgs",
"received_events_url": "https://api.github.com/users/CrazyPython/received_events",
"repos_url": "https://api.github.com/users/CrazyPython/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CrazyPython/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CrazyPython/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CrazyPython",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-26T23:36:56Z
|
2021-09-08T20:01:05Z
|
2015-11-27T04:51:37Z
|
NONE
|
resolved
|
http://www.tcpipguide.com/free/t_HTTPResponseMessageFormat.htm
A helper method to format a response as shown in the URL above would be useful.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2904/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2904/timeline
| null |
completed
| null | null | false |
[
"Requests is currently in a feature freeze. As such we aren't going to be implementing this. That said, if you wanted to implement this yourself, you would probably get 90% of the way there with [dump_response](https://toolbelt.readthedocs.org/en/latest/dumputils.html#requests_toolbelt.utils.dump.dump_response) from the [toolbelt](/sigmavirus24/requests-toolbelt).\n"
] |
https://api.github.com/repos/psf/requests/issues/2903
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2903/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2903/comments
|
https://api.github.com/repos/psf/requests/issues/2903/events
|
https://github.com/psf/requests/pull/2903
| 119,106,123 |
MDExOlB1bGxSZXF1ZXN0NTE5NDgxMTI=
| 2,903 |
Support SSL_CERT_FILE and SSL_CERT_DIR env vars
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4051912?v=4",
"events_url": "https://api.github.com/users/thoger/events{/privacy}",
"followers_url": "https://api.github.com/users/thoger/followers",
"following_url": "https://api.github.com/users/thoger/following{/other_user}",
"gists_url": "https://api.github.com/users/thoger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/thoger",
"id": 4051912,
"login": "thoger",
"node_id": "MDQ6VXNlcjQwNTE5MTI=",
"organizations_url": "https://api.github.com/users/thoger/orgs",
"received_events_url": "https://api.github.com/users/thoger/received_events",
"repos_url": "https://api.github.com/users/thoger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/thoger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thoger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/thoger",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
}
] |
closed
| true | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 21 |
2015-11-26T22:15:14Z
|
2021-09-04T00:06:35Z
|
2017-02-10T17:44:57Z
|
NONE
|
resolved
|
A proposed fix for issue #2899.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/2903/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2903/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2903.diff",
"html_url": "https://github.com/psf/requests/pull/2903",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2903.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2903"
}
| true |
[
"At a certain point this will get clearer if we use a for loop, but for now I think we're still ok. Thanks @thoger!\n",
"Hey, this would be really helpful for me. Can you merge this, please?\n",
"@standaloneSA This has been tagged for the 3.0.0 release. =)\n",
"All the +1s. You rock @Lukasa thanks! \n",
"@Lukasa why is this being delayed until 3.x? It doesn't appear to be break any backwards compatibility to me. \n",
"I'm hitting merge in a few hours unless I hear a good reason not to. \n",
"Because it can unexpectedly break previously working code: code that previously ignored these environment variables now pays attention to them, which can cause logic that used to work fine to now fail.\n\nGenerally speaking my view on environment variables is that if they aren't namespaced for us we should assume that some systems _already have these set_, and that they're set in a way that does not behave well for us.\n",
"Ah, I see. I do think that's a bit overzealous, and tend to consider a \"breaking change\" as removing/modifying existing behavior, not adding new behavior (e.g. considering this a \"new feature\"). \n\nBut, I see your reasoning—especially given the implicitness of environment variables. That being said, I think this is being overly-precautious and is not something I'd do personally, but I'm fine with doing so if you feel strongly about it. \n",
"I think I'm just as borderline about it as you are: while the previous post portrayed confidence, it was mostly me playing devils advocate.\n\nHowever, I don't think it hurts us at all to be a bit more cautious at this stage. We're downloaded more than 6 million times per month from PyPI alone, we underpin some of the most important software infrastructure in the world, and we run on a pretty severe deficit of maintainer time. My inclination is that when we're adding a new feature, if it's borderline whether it should be semver-major or semver-minor, we should play it safe and call it semver-major. Especially when there's a major release on the horizon (I am going to push for us to release 3.0.0 at PyCon).\n",
"@Lukasa ah, think that'll be during the sprints? If so, I might try to stay for them then (was planning on not attending sprints this year).\n",
"Should be do-able in one day of the sprints I reckon, so you don't need to hang around much. I'll also be there during tutorial-time if that works better for you.\n",
"It would be great to be able to walk around all PyCon telling people about v3.0.0. \n\nMaybe I could finally do stickers this year. I've always wanted to do that. \n",
"idk, I feel like stickers are going out of style. I'm probably wrong, I just haven't gone to many conferences lately (and I haven't accepted stickers handed to me in years). \n",
"I've also wanted to do t-shirts in years past, but that always felt like too much overhead. I love the idea of a company sponsoring a drop-shipped t-shirt to every contributor.\n\nPerhaps not a tshirt, but some other object. Branded quartz crystal shards!\n\nIf I had all the time in the world :)\n",
"So, I think the `ssl` module does a lot of the work for you. Take this example where I have set `SSL_CERT_FILE` and `SSL_CERT_DIR` to some dummy values to demonstrate the effect.\n\n```\n$ SSL_CERT_FILE=/bin/bash SSL_CERT_DIR=/bin ipython\nPython 2.7.11 |Continuum Analytics, Inc.| (default, Dec 6 2015, 18:57:58) \nType \"copyright\", \"credits\" or \"license\" for more information.\n\nIPython 4.2.0 -- An enhanced Interactive Python.\n? -> Introduction and overview of IPython's features.\n%quickref -> Quick reference.\nhelp -> Python's own help system.\nobject? -> Details about 'object', use 'object??' for extra details.\n\nIn [1]: import ssl\n\nIn [2]: ssl.get_default_verify_paths()\nOut[2]: DefaultVerifyPaths(cafile='/bin/bash', capath='/bin', openssl_cafile_env='SSL_CERT_FILE', openssl_cafile='/zopt/conda2/envs/nanshenv/ssl/cert.pem', openssl_capath_env='SSL_CERT_DIR', openssl_capath='/zopt/conda2/envs/nanshenv/ssl/certs')\n```\n",
"@jakirkham The problem is that that call can also accidentally have some unexpected results from there. Ideally we'd come up with something a bit more comprehensive. Especially as that, for example, does not work on Python 2.6 (no such method), older Python 2.7s (pre 2.7.9), or Python 3.3.\n",
"Let's merge this into the 3 branch. ",
"merged into proposed/3.0.0 branch! ",
"✨🍰✨",
"The problem with this solution is that it only looks in ONE of those locations for a matching cert. What is really needed is to look in A, then B, then C,... For instance look in a CA bundle, if match not found then continue by looking in a hash dir.\r\n",
"> The problem with this solution is that it only looks in ONE of those locations for a matching cert. What is really needed is to look in A, then B, then C,... For instance look in a CA bundle, if match not found then continue by looking in a hash dir.\r\n\r\nYou can also consider the ability to avoid the use of system CA bundle when specifying SSL_CERT_FILE or SSL_CERT_DIR to be a feature - you often do not need to trust all the CAs in the system bundle for a specific use case and only trust one or few that you know you really need to trust.\r\n\r\nAdditionally, having SSL_CERT_* environment variables have different meaning in requests than they have for OpenSSL or Python stdlib would be error prone."
] |
https://api.github.com/repos/psf/requests/issues/2902
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2902/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2902/comments
|
https://api.github.com/repos/psf/requests/issues/2902/events
|
https://github.com/psf/requests/pull/2902
| 119,055,909 |
MDExOlB1bGxSZXF1ZXN0NTE5MTkxMTU=
| 2,902 |
Include a link to my GitHub profile.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10311?v=4",
"events_url": "https://api.github.com/users/jparise/events{/privacy}",
"followers_url": "https://api.github.com/users/jparise/followers",
"following_url": "https://api.github.com/users/jparise/following{/other_user}",
"gists_url": "https://api.github.com/users/jparise/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jparise",
"id": 10311,
"login": "jparise",
"node_id": "MDQ6VXNlcjEwMzEx",
"organizations_url": "https://api.github.com/users/jparise/orgs",
"received_events_url": "https://api.github.com/users/jparise/received_events",
"repos_url": "https://api.github.com/users/jparise/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jparise/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jparise/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jparise",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2015-11-26T14:35:23Z
|
2021-09-08T05:01:11Z
|
2015-11-26T14:57:10Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2902/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2902/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2902.diff",
"html_url": "https://github.com/psf/requests/pull/2902",
"merged_at": "2015-11-26T14:57:10Z",
"patch_url": "https://github.com/psf/requests/pull/2902.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2902"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/2901
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2901/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2901/comments
|
https://api.github.com/repos/psf/requests/issues/2901/events
|
https://github.com/psf/requests/issues/2901
| 119,043,831 |
MDU6SXNzdWUxMTkwNDM4MzE=
| 2,901 |
Does iter_content sometimes emit empty chunks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/189580?v=4",
"events_url": "https://api.github.com/users/LinusU/events{/privacy}",
"followers_url": "https://api.github.com/users/LinusU/followers",
"following_url": "https://api.github.com/users/LinusU/following{/other_user}",
"gists_url": "https://api.github.com/users/LinusU/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/LinusU",
"id": 189580,
"login": "LinusU",
"node_id": "MDQ6VXNlcjE4OTU4MA==",
"organizations_url": "https://api.github.com/users/LinusU/orgs",
"received_events_url": "https://api.github.com/users/LinusU/received_events",
"repos_url": "https://api.github.com/users/LinusU/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/LinusU/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LinusU/subscriptions",
"type": "User",
"url": "https://api.github.com/users/LinusU",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-11-26T13:24:21Z
|
2021-09-08T20:01:06Z
|
2015-11-26T13:34:48Z
|
NONE
|
resolved
|
I've seen a lot of code on stackoverflow that guards the loop-body with an `if chunk:` when using `iter_content`. Is this really necessary, I don't see it anywhere in the documentation.
If it is, could we fix it? Or if not, document it.
``` python
for chunk in r.iter_content():
if chunk: # <- why this?
output.write(chunk)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2901/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2901/timeline
| null |
completed
| null | null | false |
[
"`iter_content` should not emit empty chunks. Historically it _could_, however with the current logic that should not be possible. If anyone encounters this problem I'd consider it a bug.\n",
"Perfect, that's what I'd hoped for! Thanks\n"
] |
https://api.github.com/repos/psf/requests/issues/2900
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2900/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2900/comments
|
https://api.github.com/repos/psf/requests/issues/2900/events
|
https://github.com/psf/requests/issues/2900
| 119,042,802 |
MDU6SXNzdWUxMTkwNDI4MDI=
| 2,900 |
Feature requests: iter_chunks([max_size])
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/189580?v=4",
"events_url": "https://api.github.com/users/LinusU/events{/privacy}",
"followers_url": "https://api.github.com/users/LinusU/followers",
"following_url": "https://api.github.com/users/LinusU/following{/other_user}",
"gists_url": "https://api.github.com/users/LinusU/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/LinusU",
"id": 189580,
"login": "LinusU",
"node_id": "MDQ6VXNlcjE4OTU4MA==",
"organizations_url": "https://api.github.com/users/LinusU/orgs",
"received_events_url": "https://api.github.com/users/LinusU/received_events",
"repos_url": "https://api.github.com/users/LinusU/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/LinusU/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LinusU/subscriptions",
"type": "User",
"url": "https://api.github.com/users/LinusU",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2015-11-26T13:18:16Z
|
2021-09-08T20:01:06Z
|
2015-11-26T13:20:34Z
|
NONE
|
resolved
|
I would love to have a function that would iterate over the chunks that are received, as they are received on the socket. It would work like [`socket.recv()`](https://docs.python.org/2/library/socket.html#socket.socket.recv) works in the python standard library.
This would provide an easy way to consume the stream as efficiently as possible. It would be awesome if we could update `__iter__` to use this function as well, instead of using `iter_content` with a fixed length of 128.
This has been discussed to some extent in #844 but that issue got closed because of inactivity. I opened this to be a more focused issue. If we feel that this is a good approach I could hopefully help implement it as well.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2900/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2900/timeline
| null |
completed
| null | null | false |
[
"This already works. Use `iter_content(None)`, as [discussed in the documentation](http://docs.python-requests.org/en/latest/user/advanced/#chunk-encoded-requests).\n",
"That's awesome \\o/\n\nIs there any reason why `__iter__` on `Response` doesn't use it?\n\nWould it be possible to add it to the documentation somewhere other than under the discussion about chunked _up_loading? Maybe under [Raw response content](http://docs.python-requests.org/en/latest/user/quickstart/#raw-response-content). I think that's why I didn't find it.\n\nThank you for the quick response!\n",
"I'd happily accept a pull request that adds a similar stanza to that portion of the docs. =)\n\nAs to why we didn't change `__iter__`, I should point out another subtlety of the way `iter_content` works. `iter_content` returns _up to_ the amount passed to the generator, but will return less if it receives a smaller chunk. 128 was chosen a _long_ time ago as a reasonable maximum chunk size in that context.\n\nOne way or another, you should usually set a maximum size there. Arguably we should set it away from 128, but for now I don't think it's unreasonable to leave it as it. We may change it in 3.0.0, though.\n",
"Are you sure that `iter_content` returns _up to_? I've never seen it return anything other than exactly the chunk_size (except for the last chunk). I'm even specifically trying to observe just that behaviour but it always returned the entire buffer.\n\nWhen I take a stack trace during `r.iter_content(None)`, what I see is that it is on `_safe_read` in Python's builtin `httplib.py`, which calls `read` until the number of bytes have been received. That means that it doesn't do what I had hoped...\n\nSpeaking of, this has actually been sitting here for quite some time now. Shouldn't it have given me some chunks? It works with smaller `chunk_size`, but I wanted to use `None` to get the chunk as soon as any data is available:\n\n\n",
"Here is the stack trace that I observed:\n\n``` text\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/site-packages/requests/models.py\", line 657, in generate\n for chunk in self.raw.stream(chunk_size, decode_content=True):\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 326, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 278, in read\n data = self._fp.read()\n File \"/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 596, in read\n s = self._safe_read(self.length)\n File \"/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 703, in _safe_read\n chunk = self.fp.read(min(amt, MAXAMOUNT))\n File \"/usr/local/Cellar/python/2.7.10_2/Frameworks/Python.framework/Versions/2.7/lib/python2.7/socket.py\", line 384, in read\n data = self._sock.recv(left)\nKeyboardInterrupt\n```\n",
"What version of requests are you using?\n",
"`2.8.1` with python `2.7.10`\n",
"Then the problem is that the website you contacted is not actually doing chunked encoding. In this context, it will attempt to up to the maximum.\n",
"Hmm, okay, maybe I was a bit unclear but I didn't mean this in relation to chunked encoding. Chunked encoding is at the application level, but I wanted it on a packet level.\n\nWith that I mean that as soon as the first packet of data has arrived at my computer, I want to process that chunk of data. If several packets arrive while I'm doing something else, I don't mind getting a larger chunk.\n\nThis is the default behaviour of the `recv` function on the socket in python. It would ensure processing of data in the most effective manner possible.\n",
"@LinusU Unfortunately, `httplib` (upon which requests builds) does not expose this functionality. It converts the socket into a buffered file-like object which has a blocking `read` method, rather than a socket-like `recv` method. You could _in principle_ reach down into the socket below httplib, but in practice I think that will only rarely work because `httplib` itself uses the blocking read logic to get the headers, which means there may be information inside the `httplib` buffer you'd need to grab.\n",
"Hmm, that is too bad :(\n\nThank you for all your help though, stellar support :+1: \n",
"My pleasure, I'm sorry we can't be more helpful here!\n"
] |
https://api.github.com/repos/psf/requests/issues/2899
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2899/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2899/comments
|
https://api.github.com/repos/psf/requests/issues/2899/events
|
https://github.com/psf/requests/issues/2899
| 119,003,692 |
MDU6SXNzdWUxMTkwMDM2OTI=
| 2,899 |
requests: Support SSL_CERT_FILE environment variable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4051912?v=4",
"events_url": "https://api.github.com/users/thoger/events{/privacy}",
"followers_url": "https://api.github.com/users/thoger/followers",
"following_url": "https://api.github.com/users/thoger/following{/other_user}",
"gists_url": "https://api.github.com/users/thoger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/thoger",
"id": 4051912,
"login": "thoger",
"node_id": "MDQ6VXNlcjQwNTE5MTI=",
"organizations_url": "https://api.github.com/users/thoger/orgs",
"received_events_url": "https://api.github.com/users/thoger/received_events",
"repos_url": "https://api.github.com/users/thoger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/thoger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thoger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/thoger",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 22 |
2015-11-26T09:25:04Z
|
2021-09-08T09:00:44Z
|
2017-05-29T20:08:52Z
|
NONE
|
resolved
|
Requests allows setting the path to a CA certificate bundle that should be used instead of the default system one, via the following environment variables: `REQUESTS_CA_BUNDLE` and `CURL_CA_BUNDLE`.
https://github.com/kennethreitz/requests/blob/v2.8.1/requests/sessions.py#L621
Please make it also check `SSL_CERT_FILE` environment variable in the same way. This variable is recognized by OpenSSL and is documented in PEP 476 as a way to make Python standard library https clients use non-default CA bundle file:
https://www.python.org/dev/peps/pep-0476/#trust-database
It sounds reasonable to have a single environment variable that can be used for all Python scripts regardless of whether they are using `requests` or standard library `httplib` / `urllib*`.
In terms of the fix, a simple `or os.environ.get('SSL_CERT_FILE')` should do the trick.
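The proposed fallback chain can be sketched as follows; `pick_ca_bundle` is a hypothetical helper illustrating the lookup order, not requests' actual session code:

```python
import os

def pick_ca_bundle(explicit=None):
    # Sketch of the lookup order the issue proposes: an explicit value
    # wins, then requests' own variables, then OpenSSL's SSL_CERT_FILE
    # as a final fallback shared with the standard library.
    return (explicit
            or os.environ.get('REQUESTS_CA_BUNDLE')
            or os.environ.get('CURL_CA_BUNDLE')
            or os.environ.get('SSL_CERT_FILE'))
```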
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 6,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 4,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 10,
"url": "https://api.github.com/repos/psf/requests/issues/2899/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2899/timeline
| null |
completed
| null | null | false |
[
"This seems reasonable enough to me. I'd be happy to accept a patch that does this. We should also accept `SSL_CERT_DIR`.\n",
"This seems harmless enough.\n",
"Is this still an issue ? If so I would like to give it a shot.\n",
"@Keops92 The work has been done, sadly: see #2903.\n",
"Thanks ! \n",
"What about adding a new keyword (I don't know, let's say verify_native ) and if set to True, requests would respect SSL_CERT_FILE and SSL_CERT_DIR as well as try to use platform-native certificate storage? This shouldn't break backwards compatibility.\n",
"The Requests team are _strongly_ opposed to adding new keyword arguments, especially if they are redundant with, overlap with, or can be set to contradictory values with a pre-existing keyword argument.\n",
"That makes sense.\n\nIs there a plan to support system certificate storage in the next compatibility-breaking release?\n",
"@luv Currently the expectation is that a future urllib3 release will allow the passing of custom SSLContext objects to urllib3. That will allow construction of adapters that let you use the system certificate store on Linux. For OS X and Windows we have the beginnings of some code to do it, but it's a lot of very subtle work that needs a lot of bugs ironed out.\n",
"Thanks for answer.\n\nActually, are you sure you need custom SSLContext to use system certificate store on Unix/Linux? It should be enough to pass correct path to \"verify\" keyword and you are sorted, even with the current version of requests.\n\nI realize it must be really PITA working on supporting OS X, Windows, Unix and Linux system certificate store transparently. \n\nBtw. from your previous comment it was not clear if you plan to limit this functionality to OS X and Windows only (and leave it up to the end-user to support unix and linux system certificates with custom SSLContext) or if you plan to support system certificates on Linux (and *BSD and friends :) ) as well.\n\nThanks!\n",
"> It should be enough to pass correct path to \"verify\" keyword and you are sorted, even with the current version of requests.\n\nI wish, but it is not. There is no consistent \"correct\" path that works across all Unices. There is no consistent \"correct\" path that works across all Linux distributions. There is not even a guarantee that there's a consistent \"correct\" path that works across all _versions_ of the same distribution. The reason the `SSLContext` approach is preferred is simply that it allows you to defer that choice to the distribution-provided OpenSSL, which is hopefully correctly configured.\n\n> Btw. from your previous comment it was not clear if you plan to limit this functionality to OS X and Windows only (and leave it up to the end-user to support unix and linux system certificates with custom SSLContext) or if you plan to support system certificates on Linux (and *BSD and friends :) ) as well.\n\nThe ideal end goal will be that there's a urllib3 flag you can hit to change its verification logic, and that requests will make it available in some way. That will work for all platforms.\n",
"I see. Would you mind providing an example where curl's approach to finding the \"correct path\" does not work https://github.com/curl/curl/blob/master/acinclude.m4#L2627 ?\n\n> The reason the SSLContext approach is preferred is simply that it allows you to defer that choice to the distribution-provided OpenSSL, which is hopefully correctly configured.\n\nOh ... if this is going to depend on functions like X509_get_default_cert_dir - it might be, in the real world, more problematic than hard-coded paths :( ... http://lists.freebsd.org/pipermail/freebsd-python/2014-December/007809.html\n",
"> Would you mind providing an example where curl's approach to finding the \"correct path\" does not work\n\nWell, for one, in the FreeBSD post you just linked, which has the certs at `/usr/local/etc/ssl/cert.pem` and `/usr/local/share/certs/ca-root-nss.crt`, neither of which are in curl's list:\n\n```\n/etc/ssl/certs/ca-certificates.crt\n/etc/pki/tls/certs/ca-bundle.crt\n/usr/share/ssl/certs/ca-bundle.crt\n/usr/local/share/certs/ca-root.crt\n/etc/ssl/cert.pem\n```\n\nThe general concerns here are the following set:\n1. Requests does not want to have any quick-fire way of enabling this function unless it works across _all_ platforms we support uniformly. Right now, we cannot do that. This means we do not want to advertise an API for it. Any easy-access API that only works on some systems is a _bad_ API in this case, because it contributes to user confusion.\n2. However, we are willing for users who have more specialised needs to overrule us. That is part of the reason the Transport Adapter API layer exists: it allows users to configure urllib3 directly in a way that enables them to support use-cases they know will work in their more general scenario. That includes this particular use-case.\n3. Ultimately, there is no cross-platform way to locate SSL certificates, and if Unices ship OpenSSLs that do not _actually_ point at the correct certificate locations then there is _no_ automated solution available to us.\n\nIt's the wild west out here, with people dumping certs wherever the hell they want, and we need a system that doesn't mysteriously fail as much as possible.\n\nIf you believe the curl approach works for your software, then by all means, you may implement it in your own software. You could even create a third-party module like [certifi](https://pypi.python.org/pypi/certifi) that implements a similar API and also implements curl's logic.\n\nHowever, it's my belief that curl's logic is somewhat fragile and only works because it allows configuration overrides in the build scripts, which most distros take advantage of. Those same distros make equivalent changes in requests when shipped on their platform, so I see no reason to want to emulate curl in this case.\n",
"sounds good\n",
"@Lukasa `/etc/ssl/cert.pem` is now by default linked by the `ca_root_nss` port.\n",
"I have a more practical question: I am a Mac OS user which is forced to use custom CA certificates. Importing them in KeyChain works well for ALL browsers and even for `curl`. Even if I am unfortunate enough to have to install the alternative curl via brew I do still have the option to use `SSL_CERT_FILE` environment variable in order to workaround the fact that that one fails to load certs from the keychain.\n\nNow, the big problem is how to convince `requests` to work? And when I say convince I mean that I need a solution that does NOT require me to patch any library that may be using requests.\n",
"Can't you use REQUESTS_CA_BUNDLE environment variable?\n\nhttp://docs.python-requests.org/en/master/user/advanced/#ssl-cert-verification\n",
"As @luv suggested, that will work.\n",
"@luv Thanks, I can confirm that if I do `export REQUESTS_CA_BUNDLE=$SSL_CERT_FILE` i would be able to use requests the same way as other libraries. I dont really know why requests needs another environment variable instead of re-using the same one but I guess there were some reasons for that.\n\nWhile I did this I observed something that looks like a bug: `requests.certs.where()` returns the same location even when you define `REQUESTS_CA_BUNDLE`, even if the library is using the ones from `REQUESTS_CA_BUNDLE`. Should I open a bug for this or is not really a bug?\n",
"> I dont really know why requests needs another environment variable instead of re-using the same one but I guess there were some reasons for that.\n\nStrictly, `SSL_CERT_FILE` is an OpenSSL environment variable. Requests doesn't use it because it does not ever check for it being set, and never trusts that OpenSSL will locate certs. In principle it could watch for this variable, though: the patch would be small. \n\n> Should I open a bug for this or is not really a bug?\n\nIt's not a bug. `requests.certs.where` is an implementation detail of Requests and doesn't claim to tell you where the cert bundle Requests will actually use is. This is partly because it can't: the location of the bundle can be changed by properties of the `Session` and arguments to the various request methods. All `where()` ever does is point to the _on-disk_ location of the Requests bundled certificates. \n",
"Can we close this? #2903 has already been merged into the `proposed/3.0.0` branch.",
"@hobarrera thanks!"
] |
https://api.github.com/repos/psf/requests/issues/2898
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2898/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2898/comments
|
https://api.github.com/repos/psf/requests/issues/2898/events
|
https://github.com/psf/requests/pull/2898
| 118,736,560 |
MDExOlB1bGxSZXF1ZXN0NTE3NDAwMDA=
| 2,898 |
Update adapters.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/90185?v=4",
"events_url": "https://api.github.com/users/lndbrg/events{/privacy}",
"followers_url": "https://api.github.com/users/lndbrg/followers",
"following_url": "https://api.github.com/users/lndbrg/following{/other_user}",
"gists_url": "https://api.github.com/users/lndbrg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lndbrg",
"id": 90185,
"login": "lndbrg",
"node_id": "MDQ6VXNlcjkwMTg1",
"organizations_url": "https://api.github.com/users/lndbrg/orgs",
"received_events_url": "https://api.github.com/users/lndbrg/received_events",
"repos_url": "https://api.github.com/users/lndbrg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lndbrg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lndbrg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lndbrg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-25T01:02:47Z
|
2021-09-08T05:01:11Z
|
2015-11-25T02:44:51Z
|
CONTRIBUTOR
|
resolved
|
Remove duplicate word.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2898/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2898/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2898.diff",
"html_url": "https://github.com/psf/requests/pull/2898",
"merged_at": "2015-11-25T02:44:51Z",
"patch_url": "https://github.com/psf/requests/pull/2898.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2898"
}
| true |
[
"Thanks @lndbrg!\n"
] |
https://api.github.com/repos/psf/requests/issues/2897
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2897/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2897/comments
|
https://api.github.com/repos/psf/requests/issues/2897/events
|
https://github.com/psf/requests/pull/2897
| 118,724,897 |
MDExOlB1bGxSZXF1ZXN0NTE3MzIwMzk=
| 2,897 |
Test socket server
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/715372?v=4",
"events_url": "https://api.github.com/users/BraulioVM/events{/privacy}",
"followers_url": "https://api.github.com/users/BraulioVM/followers",
"following_url": "https://api.github.com/users/BraulioVM/following{/other_user}",
"gists_url": "https://api.github.com/users/BraulioVM/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/BraulioVM",
"id": 715372,
"login": "BraulioVM",
"node_id": "MDQ6VXNlcjcxNTM3Mg==",
"organizations_url": "https://api.github.com/users/BraulioVM/orgs",
"received_events_url": "https://api.github.com/users/BraulioVM/received_events",
"repos_url": "https://api.github.com/users/BraulioVM/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/BraulioVM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BraulioVM/subscriptions",
"type": "User",
"url": "https://api.github.com/users/BraulioVM",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 44 |
2015-11-24T23:20:45Z
|
2021-09-08T04:01:07Z
|
2016-04-11T20:29:02Z
|
CONTRIBUTOR
|
resolved
|
Fixes #2866.
### What is this?
This is an implementation of a basic socket server that is able to run concurrently with the main thread. This might be useful for testing our 'Transfer-Encoding: chunked' functionality as pytest-httpbin isn't able to handle this kind of encoding.
### How does it work?
You provide the Server constructor with a handler that will be called once the socket server is listening. This handler will be able to accept clients and send and receive any info it wants to. The main thread will be able to query the server by creating a normal socket.
### Syntax proposed
I think that using a Context-Manager is a great way of making it easy to use the Server.
``` python
with Server(handler) as (host, port):
sock = socket.socket()
sock.connect((host, port))
...
```
### Doubts
- urllib3 tries to use an IPv6 socket and falls back to IPv4 when it can't. Should I try doing the same thing?
- I made a new directory for dummyserver, but don't know whether that was the right thing to do.
- I am not sure whether I'm testing my code properly. Code is being tested in very different ways in `test_requests.py`
### Roadmap
- Handle exceptions in Server.
- Add features such as `basic_response_handler` in urllib3's implementation.
- Implement more tests
- Implement 'Transfer-Encoding: chunked' tests using the dummy server.
Is this a good idea? Or should I stop doing this right now?
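A minimal sketch of the proposed context-manager API, assuming the handler receives the listening socket; this `Server`/`echo_handler` pair is illustrative only and may differ from the PR's actual implementation:

```python
import socket
import threading

class Server(object):
    """Sketch of a test server run on a background thread.

    Entering the context binds to an ephemeral port, starts the
    handler thread, and yields (host, port) for the client side.
    """
    def __init__(self, handler):
        self.handler = handler

    def __enter__(self):
        self.sock = socket.socket()
        self.sock.bind(('127.0.0.1', 0))  # let the OS pick a free port
        self.sock.listen(1)
        self.thread = threading.Thread(target=self.handler,
                                       args=(self.sock,))
        self.thread.start()
        return self.sock.getsockname()  # (host, port)

    def __exit__(self, *exc_info):
        self.thread.join()
        self.sock.close()

def echo_handler(server_sock):
    # Accept one client and echo back whatever it sends.
    conn, _ = server_sock.accept()
    conn.sendall(conn.recv(1024))
    conn.close()

with Server(echo_handler) as (host, port):
    client = socket.socket()
    client.connect((host, port))
    client.sendall(b"ping")
    echoed = client.recv(1024)
    client.close()
```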
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2897/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2897/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2897.diff",
"html_url": "https://github.com/psf/requests/pull/2897",
"merged_at": "2016-04-11T20:29:02Z",
"patch_url": "https://github.com/psf/requests/pull/2897.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2897"
}
| true |
[
"Can we perhaps not call it dummyserver? Maybe something more accurate, e.g., testserver?\n",
"Calling it a testserver feels like a good idea. As to the approach, this is definitely going in the right general direction, at least as far as I'm concerned. Obviously you need to replace the test that's currently there with one that actually tests requests itself, but that's fine.\n\nWe also need to be on the lookout for flaky tests here. Threaded tests can get tricky.\n",
"Ok, I'll call it testserver.\n\nThe test I added is there just for testing the server. It might be weird to add tests for a testing utility but I think it is necessary as it could be broken in many non-trivial ways. If you don't like having requests tests and testserver tests in the same file (which actually looks bad) we could do the following things:\n- Having a separate file for testing testserver\n- Extracting testserver as a separate package and developing it with its tests independently from requests\n- Not testing it at all\n",
"I think I broke the CI server with tests that hang forever. I don't know whether I should do anything to fix that\n",
"Yup, that seems right. They don't consistently hang, but they do sometimes hang. Are you able to run the tests locally @BraulioVM?\n",
"PS @sigmavirus24 I don't seem to be able to stop the builds: can you do it?\n",
"I cannot stop the tests.\n",
"Well...that's not ideal.\n",
"@Lukasa I ran the tests locally on python2.7 and know why they hang. I have run the tests for the last version both on 2.7 and 3.4, and they are fine. Is there any easy way to replicate the build environment of the CI server? \n\nI am sorry for breaking it, I never thought there wasn't going to be a timeout or something like that.\n",
"Ok, I already have a failing test for chunked uploads (in python3, the 'Transfer-Encoding' header is not being set and python2.7 doesn't even let me use a generator as `data`)\n",
"Do you like the overall direction of this PR? Do you think I have gone too far? As I am new with this project I'm not sure whether this is the way to go\n",
"I think in general it's ok, but I need to spend some time playing with it tomorrow. \n",
"@BraulioVM So, that test doesn't fail for me, it passes, but more importantly it does run!\n\nHaving played with this quickly on my local machine, this seems generally good: the test runs, the structure of the function mostly seems to be good, so I'm happy to have this as the basis of a set of more specific tests. I would like us to add some more helpers, but we don't need to do that right now: we can definitely tackle it later.\n\nWell done @BraulioVM! Did you have more work you wanted to do before you proposed this for merging?\n",
"Thank you @Lukasa! I am happy this is finally going to be useful. \n\nI don't mind adding those helpers you talk about, but as I am not totally sure about how this server will be used I don't know what helpers should be implemented. I guess some logic for parsing the chunked content must be implemented in order to fully test the library, however, I don't think that logic should be built into TestServer.\n\nThere's no more work I was planning on doing. I was hoping you would tell some other things you wanted this PR to have (such as those helpers).\n",
"So, some helpers that might be useful:\n- something that lets you easily return a complete response from a HTTP/1.1 string.\n- something that returns 200 OK but saves the request off to be examined\n\nIdeally we'd want these with tests that use them, to confirm they work.\n",
"Ok! Sure, everything I implement will be tested. \n\nJust for clarification, when you say \"return a complete response from a HTTP/1.1 string\" you mean the _string_ contains the http headers and the body of the response and not just its body, right? Or is it just the body?\n",
"That's correct, headers and body. =)\n",
"Ok, done!\n\nI added an attribute to the TestServer class called `handler_results`. It is a list which stores the return values of the request handler. This allows the server instance returned by `text_response_server` to store the requests by returning their content.\n\nI have added tests for those two features as well.\n",
"Anything else I should do in order to get this merged?\n",
"Not right now @BraulioVM, we're pretty quiet over the holiday period so we probably won't find time to focus on reviewing this until the new year.\n",
"Ok, that's perfect!\n\nHappy holydays!\n",
"I have just finished my exams and have time to work on this again if it is heading the right direction. I have read https://github.com/kennethreitz/requests/issues/2866 and believe you want something similar to this, but you may prefer a different implementation anyway.\n\nShould I keep working on this or look for another issue to work on?\n",
"I believe this is generally going in the right direction, but now that @kennethreitz is more active he may want to see if this is something he wants. @sigmavirus24 and I are definitely :+1: on this for improving our testing, but it's up to Kenneth.\n",
"Ok, I'll wait to see what Kenneth thinks of this\n",
"I have no opinion :)\n",
"Ok, I'll give this a review tomorrow and see where we are.\n",
"Ok @BraulioVM I've left some notes in the diff. =)\n\nUnfortunately, while you've been away we moved the tests to a `test` subdirectory, so you'll need to move this change onto that directory structure. Sorry!\n",
"Ok, I have had several doubts while merging with master:\n- I have put `TestTestServer` test cases inside `test_util.py` because that's the file I've found best for it. Should I change it? I had to import requests in order to test the server, but I believe you did not import it on purpose.\n- Is there any better place for `test_chunked_upload`?\n",
"Ok, we're making some good progress here @BraulioVM! We'll have a few more rounds of code review while we get this stuff settled, and right now our Jenkins runs are hanging (I'll investigate why).\n\nDid you lose `test_chunked_upload` entirely? It doesn't seem to be in the diff at all: commit f17ef753d2c1f4db0d7f5aec51261da1db20d611 just removed it, without moving it _to_ anywhere.\n",
"I'm sorry for needing this much reviewing. I hope that I can get the code right from the beginning with my next PR.\n"
] |
https://api.github.com/repos/psf/requests/issues/2896
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2896/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2896/comments
|
https://api.github.com/repos/psf/requests/issues/2896/events
|
https://github.com/psf/requests/pull/2896
| 118,459,467 |
MDExOlB1bGxSZXF1ZXN0NTE1NzE2NTQ=
| 2,896 |
Set 'Transfer-Encoding: chunked' if data is a file with length 0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/715372?v=4",
"events_url": "https://api.github.com/users/BraulioVM/events{/privacy}",
"followers_url": "https://api.github.com/users/BraulioVM/followers",
"following_url": "https://api.github.com/users/BraulioVM/following{/other_user}",
"gists_url": "https://api.github.com/users/BraulioVM/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/BraulioVM",
"id": 715372,
"login": "BraulioVM",
"node_id": "MDQ6VXNlcjcxNTM3Mg==",
"organizations_url": "https://api.github.com/users/BraulioVM/orgs",
"received_events_url": "https://api.github.com/users/BraulioVM/received_events",
"repos_url": "https://api.github.com/users/BraulioVM/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/BraulioVM/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BraulioVM/subscriptions",
"type": "User",
"url": "https://api.github.com/users/BraulioVM",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-11-23T19:55:37Z
|
2021-09-08T05:01:11Z
|
2015-12-02T14:35:41Z
|
CONTRIBUTOR
|
resolved
|
Fixes #2215
This is the first time I have contributed to the project, so I am not at all sure I put that code where it belongs. I know I have to add tests for this, but I wanted to show you this beforehand.
Did I do this right?
Thank you
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2896/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2896/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2896.diff",
"html_url": "https://github.com/psf/requests/pull/2896",
"merged_at": "2015-12-02T14:35:41Z",
"patch_url": "https://github.com/psf/requests/pull/2896.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2896"
}
| true |
[
"@BraulioVM This is a good first pass! I have a draft of this fix sitting around on my laptop, so when I get home I'll show you how I approached it and let you take the ideas you like best. \n",
"@Lukasa Thanks!\n\nI have just seen #2866, which won't allow me to implement the tests I want, so I might as well think of a solution for that issue.\n",
"So I think this change is not quite right. What we want is a simple change around line 437. Right now we test `length` for whether it is `None`: we should just broaden that to say `if length`. That way, we'll fall through to chunked encoding for zero-length files.\n",
"I see. What I did didn't make any sense because file objects are _streams_. However, this change won't just affect files, it will affect any stream with length 0. Things will keep working, but I thought we just wanted to set 'Transfer-Encoding: chunked' if `data` was a file object with `super_len(data) == 0`.\n",
"@BraulioVM That's actually intentional: sending a chunked zero-length stream is fine, there won't be a problem there. That way, the code is clearer. =)\n\nSo, this change looks good to me, but I'll let @sigmavirus24 review it.\n",
"Is there anything else I should do regarding this PR?\n",
"@BraulioVM Nope, just waiting for @sigmavirus24 to get enough time to swing by and take a look at it. =)\n",
"Woops! Sorry. I lost track of this with recent goings on IRL. This looks great to me! Thanks @BraulioVM \n"
] |
https://api.github.com/repos/psf/requests/issues/2895
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2895/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2895/comments
|
https://api.github.com/repos/psf/requests/issues/2895/events
|
https://github.com/psf/requests/pull/2895
| 118,021,604 |
MDExOlB1bGxSZXF1ZXN0NTEzNDU4Nzc=
| 2,895 |
Add basic CONTRIBUTING.md file
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-11-20T11:30:20Z
|
2021-09-08T05:01:12Z
|
2015-11-20T14:14:59Z
|
MEMBER
|
resolved
|
It's beginning to get frustrating to have to keep dealing with questions on this issue tracker. Let's see if we can dissuade them.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2895/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2895/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2895.diff",
"html_url": "https://github.com/psf/requests/pull/2895",
"merged_at": "2015-11-20T14:14:59Z",
"patch_url": "https://github.com/psf/requests/pull/2895.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2895"
}
| true |
[
"That looks great!\n",
"One addition to what makes a good bug report for us.\n",
"Updated. =)\n",
"I'm not confident people will read this before filing a bug, but we can at least point them to it when closing a bug now.\n",
"More importantly, because GitHub prompts people to look at it first we can be sure that anyone who doesn't do what this says definitely didn't care.\n"
] |
https://api.github.com/repos/psf/requests/issues/2894
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2894/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2894/comments
|
https://api.github.com/repos/psf/requests/issues/2894/events
|
https://github.com/psf/requests/issues/2894
| 118,018,074 |
MDU6SXNzdWUxMTgwMTgwNzQ=
| 2,894 |
How can I detect encoding after I streamed only few pieces of response?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/621269?v=4",
"events_url": "https://api.github.com/users/adibalcan/events{/privacy}",
"followers_url": "https://api.github.com/users/adibalcan/followers",
"following_url": "https://api.github.com/users/adibalcan/following{/other_user}",
"gists_url": "https://api.github.com/users/adibalcan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/adibalcan",
"id": 621269,
"login": "adibalcan",
"node_id": "MDQ6VXNlcjYyMTI2OQ==",
"organizations_url": "https://api.github.com/users/adibalcan/orgs",
"received_events_url": "https://api.github.com/users/adibalcan/received_events",
"repos_url": "https://api.github.com/users/adibalcan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/adibalcan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/adibalcan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/adibalcan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-11-20T11:12:27Z
|
2021-09-08T20:01:06Z
|
2015-11-20T11:38:46Z
|
NONE
|
resolved
|
```
size = 0
start = time.time()
for chunk in r.iter_content(1024):
    if time.time() - start > max_time:
        r.reason = 'timeout reached'
        raise ValueError(r.reason)
    size += len(chunk)
    if size > max_size:
        r.reason = 'response too large'
        raise ValueError(r.reason)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2894/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2894/timeline
| null |
completed
| null | null | false |
[
"When you say \"encoding\", what kind of encoding do you mean?\n",
"After stream ends (finish, or I interrupt), I want to use r.content (decoded response). Who can I do this?\n",
"If you use `iter_content` manually, it does not save into `r.content`. You should simply save off the data directly yourself.\n",
"Okay but I want to use request decode system for my saved data to obtain a result similar to r.text\n",
"The easiest way to emulate it is to add `chardet` to your list of dependencies, and then do:\n\n``` python\nr = requests.get(url)\nif r.encoding:\n text = content.decode(r.encoding)\nelse:\n encoding = chardet.detect(content)['encoding']\n text = content.decode(encoding)\n```\n\nThat'll get you text decoded exactly as requests does it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2893
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2893/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2893/comments
|
https://api.github.com/repos/psf/requests/issues/2893/events
|
https://github.com/psf/requests/issues/2893
| 118,008,797 |
MDU6SXNzdWUxMTgwMDg3OTc=
| 2,893 |
Ability to use requests as a web server
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/105927?v=4",
"events_url": "https://api.github.com/users/valerianrossigneux/events{/privacy}",
"followers_url": "https://api.github.com/users/valerianrossigneux/followers",
"following_url": "https://api.github.com/users/valerianrossigneux/following{/other_user}",
"gists_url": "https://api.github.com/users/valerianrossigneux/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/valerianrossigneux",
"id": 105927,
"login": "valerianrossigneux",
"node_id": "MDQ6VXNlcjEwNTkyNw==",
"organizations_url": "https://api.github.com/users/valerianrossigneux/orgs",
"received_events_url": "https://api.github.com/users/valerianrossigneux/received_events",
"repos_url": "https://api.github.com/users/valerianrossigneux/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/valerianrossigneux/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/valerianrossigneux/subscriptions",
"type": "User",
"url": "https://api.github.com/users/valerianrossigneux",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-11-20T10:15:36Z
|
2021-09-08T20:01:07Z
|
2015-11-20T10:35:42Z
|
NONE
|
resolved
|
Hi, I was wondering if a wrapper existed around requests so that it could be used as a web server?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2893/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2893/timeline
| null |
completed
| null | null | false |
[
"@cyberval When you say \"used as a web server\", what does that mean exactly? What would requests do?\n",
"Sorry it's not very clear. I want to access requests through a web server rather than the CLI, something like:\nhttp://mydomain/requests?get=http://foo.com&authuser=a&authpass=b\n\nBecause of thrid party tool limitation that can only perform a GET or POST request and can't execute a script...\n",
"I'm afraid I'm not aware of any such project, however it should be an easy enough thing to write.\n"
] |
https://api.github.com/repos/psf/requests/issues/2892
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2892/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2892/comments
|
https://api.github.com/repos/psf/requests/issues/2892/events
|
https://github.com/psf/requests/issues/2892
| 117,994,464 |
MDU6SXNzdWUxMTc5OTQ0NjQ=
| 2,892 |
capath support in requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7211278?v=4",
"events_url": "https://api.github.com/users/mariayh1/events{/privacy}",
"followers_url": "https://api.github.com/users/mariayh1/followers",
"following_url": "https://api.github.com/users/mariayh1/following{/other_user}",
"gists_url": "https://api.github.com/users/mariayh1/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mariayh1",
"id": 7211278,
"login": "mariayh1",
"node_id": "MDQ6VXNlcjcyMTEyNzg=",
"organizations_url": "https://api.github.com/users/mariayh1/orgs",
"received_events_url": "https://api.github.com/users/mariayh1/received_events",
"repos_url": "https://api.github.com/users/mariayh1/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mariayh1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mariayh1/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mariayh1",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-20T08:47:06Z
|
2021-09-08T20:01:07Z
|
2015-11-20T08:48:37Z
|
NONE
|
resolved
|
We are currently using pycurl, but we would like to migrate to requests in order to be independent of NSS.
However, we could see that we can only provide a ca_bundle instead of a capath. In urllib2.urlopen you can provide cafile and capath options, however that is not the case in the requests module.
Also we tried to use urllib2, but this only supports GET and POST requests, we are also interested in HEAD, PUT and DELETE. That is why we would like to use requests.
Could you let me know if there are plans to support capath in the requests module?
Thanks a lot,
Maria
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2892/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2892/timeline
| null |
completed
| null | null | false |
[
"Support for directories of CAs is in the current master, and will be released as part of the 2.9.0 release. See also #2858 and #2659.\n"
] |
https://api.github.com/repos/psf/requests/issues/2891
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2891/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2891/comments
|
https://api.github.com/repos/psf/requests/issues/2891/events
|
https://github.com/psf/requests/pull/2891
| 117,885,102 |
MDExOlB1bGxSZXF1ZXN0NTEyNjYzMzI=
| 2,891 |
Specify minimum pyOpenSSL version
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/790805?v=4",
"events_url": "https://api.github.com/users/lorenzph/events{/privacy}",
"followers_url": "https://api.github.com/users/lorenzph/followers",
"following_url": "https://api.github.com/users/lorenzph/following{/other_user}",
"gists_url": "https://api.github.com/users/lorenzph/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lorenzph",
"id": 790805,
"login": "lorenzph",
"node_id": "MDQ6VXNlcjc5MDgwNQ==",
"organizations_url": "https://api.github.com/users/lorenzph/orgs",
"received_events_url": "https://api.github.com/users/lorenzph/received_events",
"repos_url": "https://api.github.com/users/lorenzph/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lorenzph/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lorenzph/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lorenzph",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-11-19T19:09:27Z
|
2021-09-08T05:01:13Z
|
2015-11-19T19:11:34Z
|
CONTRIBUTOR
|
resolved
|
urllib3 requires "set_tlsext_host_name" which was only added in
pyOpenSSL 0.13. As some distributions (e.g. Ubuntu 12.04) still ship an
older version enforce the correct minimum version during installation.
I will submit a similar pull request to the urllib3 project.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2891/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2891/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2891.diff",
"html_url": "https://github.com/psf/requests/pull/2891",
"merged_at": "2015-11-19T19:11:34Z",
"patch_url": "https://github.com/psf/requests/pull/2891.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2891"
}
| true |
[
"It's mind-boggling to me that anyone is shipping a PyOpenSSL that old. Amusingly, the reason we need set_tlsext_host_name is to enable SNI, which is literally the point of using PyOpenSSL with requests, so any earlier version of PyOpenSSL wouldn't do what we need it to do anyway. What a world we live in. Thanks for spotting this!\n",
"_reserves comments on Long Term Support versions of distributions harming user security in the interested of perceived stability_\n"
] |
https://api.github.com/repos/psf/requests/issues/2890
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2890/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2890/comments
|
https://api.github.com/repos/psf/requests/issues/2890/events
|
https://github.com/psf/requests/issues/2890
| 117,683,020 |
MDU6SXNzdWUxMTc2ODMwMjA=
| 2,890 |
Deleted cookies should be included in response.cookies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1221480?v=4",
"events_url": "https://api.github.com/users/Bekt/events{/privacy}",
"followers_url": "https://api.github.com/users/Bekt/followers",
"following_url": "https://api.github.com/users/Bekt/following{/other_user}",
"gists_url": "https://api.github.com/users/Bekt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Bekt",
"id": 1221480,
"login": "Bekt",
"node_id": "MDQ6VXNlcjEyMjE0ODA=",
"organizations_url": "https://api.github.com/users/Bekt/orgs",
"received_events_url": "https://api.github.com/users/Bekt/received_events",
"repos_url": "https://api.github.com/users/Bekt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Bekt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Bekt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Bekt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-11-18T21:16:22Z
|
2021-09-08T21:00:41Z
|
2015-11-19T19:04:46Z
|
NONE
|
resolved
|
``` py
s = requests.Session()
r = s.get('https://httpbin.org/cookies/set?k2=v2&k1=v1', allow_redirects=False)
r.headers['set-cookie']
# 'k2=v2; Path=/, k1=v1; Path=/'
r.cookies.keys()
# ['k1', 'k2']
r = s.get('https://httpbin.org/cookies/delete?k1', allow_redirects=False)
r.headers['set-cookie']
# 'k1=; Expires=Thu, 01-Jan-1970 00:00:00 GMT; Max-Age=0; Path=/'
r.cookies.keys()
# []
```
This is pretty frustrating when one is trying to build a custom response object off of a `requests.Response`.
Expected behavior is that `r.cookies` should include the deleted (empty-value) cookie from the response.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1221480?v=4",
"events_url": "https://api.github.com/users/Bekt/events{/privacy}",
"followers_url": "https://api.github.com/users/Bekt/followers",
"following_url": "https://api.github.com/users/Bekt/following{/other_user}",
"gists_url": "https://api.github.com/users/Bekt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Bekt",
"id": 1221480,
"login": "Bekt",
"node_id": "MDQ6VXNlcjEyMjE0ODA=",
"organizations_url": "https://api.github.com/users/Bekt/orgs",
"received_events_url": "https://api.github.com/users/Bekt/received_events",
"repos_url": "https://api.github.com/users/Bekt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Bekt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Bekt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Bekt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2890/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2890/timeline
| null |
completed
| null | null | false |
[
"Sorry, it's not clear to me: _why_ should deleted cookies be included in `r.cookies`?\n",
"Well, why _shouldn't_ they be?\n\nSay, you have an endpoint that logs out a user. A log-out usually clears out browser cookies by specifying an empty key-value `set-cookie` header (like in the example above). If I'm using `requests` as a middleman/reverse-proxy between the browser and the log-out endpoint, it's my duty to pass down everything the log-out endpoint returns. If the deleted cookie is not passed down, the user won't be actually logged out.\n\nI am manually parsing the headers because `CaseInsensitiveDict` (type of `Response.headers`) does not support multiple values for the same key (which is a whole different problem). It is completely normal to have multiple `set-cookie` keys in the header, but `Response.headers` messes up in these kind of situations. A better approach is to use something like [`MultiDict`](https://github.com/mitsuhiko/werkzeug/blob/285e300f7f5806253000f13b36e87ef10eccfae7/werkzeug/datastructures.py#L326) to store response headers. Again, this is a different issue.\n",
"So, `response.cookies` provides an indication of the cookies that are validly set after the request has completed. When we update a cookie with an expiry date in the past, that means the cookie is now unset, and should not validly be in the `CookieJar`. This logic is actually enforced by the `CookieJar` class itself and is not something we normally control.\n\nIf you're using requests as part of a middleman/reverse proxy, you can achieve exactly the effect you want by passing the `Set-Cookie` header on, unmolested. As a reverse proxy, you actually have little-to-no reason to care about the cookies at all, they're strictly a _user-agent_ concern. The only reason a reverse-proxy would care about cookies is to cache-invalidate for responses that contain `Vary: Cookie` headers.\n\nWhat we need to decide is whether there is a compelling use-case in providing access to cookies that were _unset_ in a response, beyond simply querying `Set-Cookie` for them.\n\nAs to your claim that `CaseInsensitiveDict` does not support multiple values for the same key, that's true of the class itself but untrue of the way it's used. Headers, before they get to requests, go via urllib3. urllib3 uses its own class for holding headers, the `HTTPHeaderDict`. That dictionary coalesces together multiple header values by comma-separating their values, which means that by the time the CaseInsensitiveDict gets hold of them any repeated values received from the server are coalesced together. This is totally valid _except_ for the `Set-Cookie` header, unfortunately, which is the specific problem you're encountering, and is a known limitation of this approach in requests. However, in your case you have no need to handle header parsing yourself: simply use `response.raw.headers`.\n",
"> This logic is actually enforced by the CookieJar class itself and is not something we normally control.\n\nDidn't know this, thanks.\n\n> If you're using requests as part of a middleman/reverse proxy, you can achieve exactly the effect you want by passing the Set-Cookie header on, unmolested.\n\nThis is exactly how I started off, but started getting weird errors because there is only one `set-cookie` header in the response. For example, I had the following problem:\n\n```\n...\n>>> r.headers['set-cookie']\nfoo=123; Expires=Thu, 17-Nov-2016 22:34:13 GMT; Path=/, \nbar=456; Expires=Wed, 18-Nov-2015 22:44:13 GMT; HttpOnly; Path=/\n```\n\nThe browser, however, gets confused and sets the expiration date for `foo` as _18-Nov-201**5** 22:44:13_ instead of _17-Nov-201**6** 22:34:13_. So in this case, both cookies had the same expiration date.\n\nWell, if `HTTPHeaderDict` is handling `Set-Cookie` like the way it is, then the underlying issue might be with urrlib3 or the browser. What do you think?\n",
"Screenshots:\n\n\n\n\n",
"Why not something like:\n\n``` py\nheaders = r.raw.headers\nfor key in headers.keys():\n for value in headers.getlist(key):\n # pass on (key, value) as a header-name: header-value pair\n```\n",
"@Bekt The problem with the header dict is that `Set-Cookie` _cannot_ be simply joined by commas in the general case, because its values _contain_ commas. This is some truly terrible design, but we're stuck with it now. This is why you need to use @sigmavirus24's approach.\n",
"Ah I had not looked into the `HTTPHeaderDict` API, I just assumed it was similar to `CaseInsensitiveDict`. Thanks for the suggestions guys! I probably would not have thought about looking into `raw` or `raw.headers` at all.\n\nHere's what I ended up doing in case anybody is reading this:\n\n``` py\nr = requests.get|post|...\n\nres = MyResponse(headers=r.headers.items())\nres.headers.pop('set-cookie', None)\nfor c in r.raw.headers.getlist('set-cookie'):\n res.headers.add('set-cookie', c)\n```\n"
] |
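The thread above recommends reading repeated `Set-Cookie` values from `response.raw.headers` (urllib3's `HTTPHeaderDict`) instead of the comma-coalesced `CaseInsensitiveDict`. A minimal stdlib sketch of why per-line parsing is necessary — the two cookie lines are taken from the thread's own example; in a real proxy the list would come from `response.raw.headers.getlist('set-cookie')`:

```python
from http.cookies import SimpleCookie

# Each received Set-Cookie line, exactly as urllib3's
# r.raw.headers.getlist('set-cookie') would return them.
set_cookie_lines = [
    "foo=123; Expires=Thu, 17-Nov-2016 22:34:13 GMT; Path=/",
    "bar=456; Expires=Wed, 18-Nov-2015 22:44:13 GMT; HttpOnly; Path=/",
]

# Parsing each line independently keeps both cookies with their own
# attributes. A comma-joined value cannot be split safely, because the
# Expires dates themselves contain commas.
jar = SimpleCookie()
for line in set_cookie_lines:
    jar.load(line)
```

When forwarding, emit one `Set-Cookie` header per list entry, as the final comment's `res.headers.add('set-cookie', c)` loop does.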
https://api.github.com/repos/psf/requests/issues/2889
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2889/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2889/comments
|
https://api.github.com/repos/psf/requests/issues/2889/events
|
https://github.com/psf/requests/issues/2889
| 117,662,679 |
MDU6SXNzdWUxMTc2NjI2Nzk=
| 2,889 |
Cannot Upload a large package on Windows 7 32 bit OS, with requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15913535?v=4",
"events_url": "https://api.github.com/users/srujanayreddy/events{/privacy}",
"followers_url": "https://api.github.com/users/srujanayreddy/followers",
"following_url": "https://api.github.com/users/srujanayreddy/following{/other_user}",
"gists_url": "https://api.github.com/users/srujanayreddy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/srujanayreddy",
"id": 15913535,
"login": "srujanayreddy",
"node_id": "MDQ6VXNlcjE1OTEzNTM1",
"organizations_url": "https://api.github.com/users/srujanayreddy/orgs",
"received_events_url": "https://api.github.com/users/srujanayreddy/received_events",
"repos_url": "https://api.github.com/users/srujanayreddy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/srujanayreddy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/srujanayreddy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/srujanayreddy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2015-11-18T19:30:43Z
|
2021-09-08T21:00:41Z
|
2015-11-18T19:33:56Z
|
NONE
|
resolved
|
Cannot upload from a Windows 7 32-bit OS. It works fine on Windows 7 64-bit with both 32-bit and 64-bit Python.
I am using Python 3.4.3 with the latest requests release.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2889/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2889/timeline
| null |
completed
| null | null | false |
[
"This is likely a duplicate of https://github.com/kennethreitz/requests/issues/2510 (in the sense that you're reading the exception improperly) and you've given us no detail on what you're code does or what the _full_ traceback is.\n\nPlease post questions to [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests) if you intend to ask questions.\n",
"Sorry I couldn't provide you more details earlier.\n\nThe error I get is\n\nTraceback (most recent call last):\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 376, in _make_request\n httplib_response = conn.getresponse(buffering=True)\nTypeError: getresponse() got an unexpected keyword argument 'buffering'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 559, in urlopen\n body=body, headers=headers)\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 378, in _make_request\n httplib_response = conn.getresponse()\n File \"C:\\Python34\\lib\\http\\client.py\", line 1171, in getresponse\n response.begin()\n File \"C:\\Python34\\lib\\http\\client.py\", line 351, in begin\n version, status, reason = self._read_status()\n File \"C:\\Python34\\lib\\http\\client.py\", line 313, in _read_status\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\n File \"C:\\Python34\\lib\\socket.py\", line 374, in readinto\n return self._sock.recv_into(b)\nConnectionResetError: [WinError 10054] An existing connection was forcibly close\nd by the remote host\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"C:\\Python34\\lib\\site-packages\\requests\\adapters.py\", line 370, in send\n timeout=timeout\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 609, in urlopen\n _stacktrace=sys.exc_info()[2])\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\util\\retry.py\",\nline 245, in increment\n raise six.reraise(type(error), error, _stacktrace)\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\packages\\six.py\"\n, line 309, in reraise\n raise value.with_traceback(tb)\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 559, in urlopen\n body=body, headers=headers)\n File \"C:\\Python34\\lib\\site-packages\\requests\\packages\\urllib3\\connectionpool.p\ny\", line 378, in _make_request\n httplib_response = conn.getresponse()\n File \"C:\\Python34\\lib\\http\\client.py\", line 1171, in getresponse\n response.begin()\n File \"C:\\Python34\\lib\\http\\client.py\", line 351, in begin\n version, status, reason = self._read_status()\n File \"C:\\Python34\\lib\\http\\client.py\", line 313, in _read_status\n line = str(self.fp.readline(_MAXLINE + 1), \"iso-8859-1\")\n File \"C:\\Python34\\lib\\socket.py\", line 374, in readinto\n return self._sock.recv_into(b)\nrequests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', Conn\nectionResetError(10054, 'An existing connection was forcibly closed by the remot\ne host', None, 10054, None))\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"Upgrader.py\", line 12, in <module>\n rdst = requests.post(urldst, files={'1_19_0_developer.exe': resp.content})\n File \"C:\\Python34\\lib\\site-packages\\requests\\api.py\", line 109, in post\n return request('post', url, data=data, json=json, *_kwargs)\n File \"C:\\Python34\\lib\\site-packages\\requests\\api.py\", line 50, in request\n response = session.request(method=method, url=url, *_kwargs)\n File \"C:\\Python34\\lib\\site-packages\\requests\\sessions.py\", line 468, in reques\nt\n resp = self.send(prep, *_send_kwargs)\n File \"C:\\Python34\\lib\\site-packages\\requests\\sessions.py\", line 576, in send\n r = adapter.send(request, *_kwargs)\n File \"C:\\Python34\\lib\\site-packages\\requests\\adapters.py\", line 412, in send\n raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', ConnectionResetErro\nr(10054, 'An existing connection was forcibly closed by the remote host', None,\n10054, None))\n",
"The code is\n\nimport requests\nfrom requests_file import FileAdapter\n\ns = requests.Session()\ns.mount('file://', FileAdapter())\nresp = s.get('file:///local_package_name')\nurldst = 'Upload URL'\nrdst = requests.post(urldst, files={'filename': resp.content})\nprint(rdst)\n\nThis code works fine on a Windows7 64 bit OS, but returns the error as described in Windows7 32 bit OS.\n",
"So the actual problem here is that the remote peer is closing the connection. It appears not to like the request you're making on your 32-bit machine. My suspicion here is that you're sending an incomplete request from resp.content.\n\nOut of interest, why are you tripping this through a file adapter instead of just sending that file directly? \n",
"Thanks for your reply.\nI am using File Adapter to Upload a package from a local drive.(local_package_name is an input path for the local package on a C Drive).\n\nI couldn't upload the file directly, without using FileAdapter.\nI get the following error, if I upload file directly.\n\nTraceback (most recent call last):\n File \"Upgrader.py\", line 12, in <module>\n resp = s.get('file:///C:/pythonws/1_9_0.exe')\n File \"C:\\Python341\\lib\\site-packages\\requests\\sessions.py\", line 480, in get\n return self.request('GET', url, *_kwargs)\n File \"C:\\Python341\\lib\\site-packages\\requests\\sessions.py\", line 468, in reque\nst\n resp = self.send(prep, *_send_kwargs)\n File \"C:\\Python341\\lib\\site-packages\\requests\\sessions.py\", line 570, in send\n adapter = self.get_adapter(url=request.url)\n File \"C:\\Python341\\lib\\site-packages\\requests\\sessions.py\", line 644, in get_a\ndapter\n raise InvalidSchema(\"No connection adapters were found for '%s'\" % url)\nrequests.exceptions.InvalidSchema: No connection adapters were found for 'file:/\n//C:/pythonws/1_9_0.exe'\n\nPlease suggest about the changes that should be made to resp.content to make it work on a 32 bit Windows 7 OS.\n\nThanks.\n",
"Also, I can upload small packages using the provided code on a 32 bit Windows 7 OS.\nThe only problem is with uploading large packages.\n",
"Are you uploading over HTTPS or HTTP?\n",
"Uploading over HTTP.\nThe code works on the same setup, but using 64 bit Windows OS and also smaller files in 32 bit Windows OS.\n",
"If you're uploading over HTTP, can you use something like Wireshark to capture the headers on each machine and put them here? Feel free to remove the Authorization header if it's present.\n"
] |
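As the maintainers suggest in the thread above, the local file can be attached to the upload directly rather than round-tripped through a `file://` adapter into `resp.content`. A hedged sketch — the endpoint URL and field name are placeholders, the filename is taken from the reporter's traceback, and `prepare()` is used here only so the multipart body can be inspected without any network traffic; a real upload would pass an open file object so requests can stream it:

```python
import io

import requests

# Stand-in for open('1_19_0_developer.exe', 'rb'); using BytesIO keeps
# the sketch self-contained and offline.
package = io.BytesIO(b"\x4d\x5a fake binary payload")

req = requests.Request(
    "POST",
    "http://upload.invalid/packages",  # placeholder endpoint
    files={"package": ("1_19_0_developer.exe", package)},
)
prep = req.prepare()  # builds the multipart/form-data body in memory
```

In practice you would simply call `requests.post(urldst, files={'package': open(path, 'rb')})` and let requests handle the encoding.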
https://api.github.com/repos/psf/requests/issues/2888
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2888/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2888/comments
|
https://api.github.com/repos/psf/requests/issues/2888/events
|
https://github.com/psf/requests/issues/2888
| 117,593,602 |
MDU6SXNzdWUxMTc1OTM2MDI=
| 2,888 |
I try 10000 concurrent tcp requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/546738?v=4",
"events_url": "https://api.github.com/users/berkantaydin/events{/privacy}",
"followers_url": "https://api.github.com/users/berkantaydin/followers",
"following_url": "https://api.github.com/users/berkantaydin/following{/other_user}",
"gists_url": "https://api.github.com/users/berkantaydin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/berkantaydin",
"id": 546738,
"login": "berkantaydin",
"node_id": "MDQ6VXNlcjU0NjczOA==",
"organizations_url": "https://api.github.com/users/berkantaydin/orgs",
"received_events_url": "https://api.github.com/users/berkantaydin/received_events",
"repos_url": "https://api.github.com/users/berkantaydin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/berkantaydin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/berkantaydin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/berkantaydin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-18T14:06:11Z
|
2021-09-08T21:00:41Z
|
2015-11-18T14:08:20Z
|
NONE
|
resolved
|
I tried 10,000 concurrent TCP requests but got a "max retries exceeded" error. How can I solve this? Is it related to the TCP pool size?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2888/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2888/timeline
| null |
completed
| null | null | false |
[
"Questions and support requests belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n"
] |
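For reference, the pool-size question raised in the issue above is normally addressed at the transport-adapter level rather than in requests' top-level API. A hedged sketch of enlarging the connection pool on a `Session` — the numbers are purely illustrative, not a recommendation for 10,000 concurrent connections:

```python
import requests
from requests.adapters import HTTPAdapter

session = requests.Session()

# One cached pool per host, each allowing more simultaneous connections
# than the default of 10; retry failed connections a few times.
adapter = HTTPAdapter(pool_connections=100, pool_maxsize=500, max_retries=3)
session.mount("http://", adapter)
session.mount("https://", adapter)
```

Requests made through `session` now share the enlarged pools; without `pool_block=True`, connections beyond `pool_maxsize` are still created but discarded instead of reused.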
https://api.github.com/repos/psf/requests/issues/2887
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2887/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2887/comments
|
https://api.github.com/repos/psf/requests/issues/2887/events
|
https://github.com/psf/requests/issues/2887
| 117,473,777 |
MDU6SXNzdWUxMTc0NzM3Nzc=
| 2,887 |
On requests 2.8.0+ chunked requests can cause an unhandled urllib3.exceptions.NewConnectionError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-11-17T23:15:58Z
|
2021-09-08T18:00:58Z
|
2015-11-18T09:10:33Z
|
CONTRIBUTOR
|
resolved
|
When we perform a chunked request, we cause a call to `HTTPConnection._new_conn` through `HTTPConnection.connect` which can raise an instance of `urllib3.exceptions.NewConnectionError`.
Code to reproduce
``` py
import requests
requests.post('https://is.invalid', data=(str(i) for i in range(2)))
```
cc @stevelle
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2887/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2887/timeline
| null |
completed
| null | null | false |
[
"Yup, this looks wrong. Lemme get a patch up and running.\n",
"Hmm. Interestingly, I can't reproduce this on the current master.\n",
"Yup, so this appears to be fixed in master but I can't work out why. I'll close until someone else can verify that I'm losing my mind.\n",
"Huh, so this raises the proper exception now. @stevelle and I specifically tested this on 2.8.1 and were seeing the exception from urllib3 bubble up. On master, I see the (expected) `requests.exceptions.ConnectionError` (which wraps the `NewConnectionError` from urllib3).\n\nThis is especially baffling because the exception was coming from the line where `endheaders` is called on `low_conn` and the difference between 2.8.1 and HEAD is:\n\n``` diff\ndiff --git a/requests/adapters.py b/requests/adapters.py\nindex 7682db0..cecf6c2 100644\n--- a/requests/adapters.py\n+++ b/requests/adapters.py\n@@ -8,6 +8,7 @@ This module contains the transport adapters that Requests uses to define\n and maintain connections.\n \"\"\"\n\n+import os.path\n import socket\n\n from .models import Response\n@@ -185,10 +186,15 @@ class HTTPAdapter(BaseAdapter):\n raise Exception(\"Could not find a suitable SSL CA certificate bundle.\")\n\n conn.cert_reqs = 'CERT_REQUIRED'\n- conn.ca_certs = cert_loc\n+\n+ if not os.path.isdir(cert_loc):\n+ conn.ca_certs = cert_loc\n+ else:\n+ conn.ca_cert_dir = cert_loc\n else:\n conn.cert_reqs = 'CERT_NONE'\n conn.ca_certs = None\n+ conn.ca_cert_dir = None\n\n if cert:\n if not isinstance(cert, basestring):\n@@ -394,7 +400,15 @@ class HTTPAdapter(BaseAdapter):\n low_conn.send(b'\\r\\n')\n low_conn.send(b'0\\r\\n\\r\\n')\n\n- r = low_conn.getresponse()\n+ # Receive the response from the server\n+ try:\n+ # For Python 2.7+ versions, use buffering of HTTP\n+ # responses\n+ r = low_conn.getresponse(buffering=True)\n+ except TypeError:\n+ # For compatibility with Python 2.6 versions and back\n+ r = low_conn.getresponse()\n+\n resp = HTTPResponse.from_httplib(\n r,\n pool=conn,\n```\n\nIn other words, the exception should be raised before we get to this try/except logic.\n\nI can't imagine this would be system specific, but I'll toy around with the server that @stevelle and I were testing on yesterday to see if I can reproduce this on master there. (I'm using OSX right now, we were using Ubuntu 14.04 inside a virtualenv yesterday.)\n",
"This issue seems to be happening on requests 2.9.1. You can check the [requests_issue_2887.py](https://gist.github.com/suoto/2c3e9fe51b38208dfddd4144c0ae80cb) gist, it has the relevant code that seems to reproduce this issue every time I run.\n\nAt least it seems the same issue to me, I'm sorry if it's a different issue.\n\nFor the record, the shell output is as follows\n\n```\nBottle v0.12.9 server starting up (using WaitressServer())...\nListening on http://localhost:50000/\nHit Ctrl-C to quit.\n\nserving on http://127.0.0.1:50000\nTraceback (most recent call last):\n File \"/home/souto/dummy.py\", line 39, in <module>\n main()\n File \"/home/souto/dummy.py\", line 33, in main\n requests.post('http://localhost:50000/foobar')\n File \"/home/souto/.local/lib/python2.7/site-packages/requests/api.py\", line 107, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"/home/souto/.local/lib/python2.7/site-packages/requests/api.py\", line 53, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/home/souto/.local/lib/python2.7/site-packages/requests/sessions.py\", line 468, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/souto/.local/lib/python2.7/site-packages/requests/sessions.py\", line 576, in send\n r = adapter.send(request, **kwargs)\n File \"/home/souto/.local/lib/python2.7/site-packages/requests/adapters.py\", line 437, in send\n raise ConnectionError(e, request=request)\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=50000): Max retries exceeded with url: /foobar (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x1a991d0>: Failed to establish a new connection: [Errno 111] Connection refused',))\n```\n",
"@suoto It's not the same bug. =)\n\nYour bug is a simple timing issue: the requests.post call executes before the server has started running, which means it isn't yet listening on the socket. If you add a `sleep(1)` call before the call to `requests.post()` you'll find it works fine. =)\n",
"Yup, that's true. What I found it's weird is the fact I was seeing `requests.packages.urllib3.exceptions.NewConnectionError` exception instead of `requests.exceptions.ConnectionError` on the gist and on a project I'm working on. I'm so sure of this that I actually added the exception from `urllib3` to this project.\n\nAnyway, sorry about that. I'll let you know if I have a code that reproduces this. :confounded: :confused: \n",
"=) The most likely bet is just that the requests in question there was older.\n"
] |
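The chunked code path in the report above is triggered whenever the request body is a generator, because requests cannot compute a `Content-Length` for it. A sketch demonstrating this without any network traffic — `prepare()` alone reveals the header, and the host is the deliberately unresolvable one from the original reproduction:

```python
import requests

# A body with no __len__ (a generator) forces chunked transfer encoding.
body = (str(i) for i in range(2))
req = requests.Request("POST", "https://is.invalid/", data=body)
prep = req.prepare()
# prep.headers now carries Transfer-Encoding: chunked and no Content-Length.
```

Actually sending this request to an unreachable host should raise `requests.exceptions.ConnectionError` (wrapping urllib3's `NewConnectionError`), which is the behavior the thread confirms on master.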
https://api.github.com/repos/psf/requests/issues/2886
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2886/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2886/comments
|
https://api.github.com/repos/psf/requests/issues/2886/events
|
https://github.com/psf/requests/issues/2886
| 117,375,358 |
MDU6SXNzdWUxMTczNzUzNTg=
| 2,886 |
Failure while formatting an error for incorrect SSL CA file
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/45051?v=4",
"events_url": "https://api.github.com/users/traut/events{/privacy}",
"followers_url": "https://api.github.com/users/traut/followers",
"following_url": "https://api.github.com/users/traut/following{/other_user}",
"gists_url": "https://api.github.com/users/traut/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/traut",
"id": 45051,
"login": "traut",
"node_id": "MDQ6VXNlcjQ1MDUx",
"organizations_url": "https://api.github.com/users/traut/orgs",
"received_events_url": "https://api.github.com/users/traut/received_events",
"repos_url": "https://api.github.com/users/traut/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/traut/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/traut/subscriptions",
"type": "User",
"url": "https://api.github.com/users/traut",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-11-17T15:13:14Z
|
2021-09-08T21:00:42Z
|
2015-11-17T15:24:28Z
|
NONE
|
resolved
|
Requests fails while trying to format the exception raised for a non-existent CA file.
Code to reproduce:
```
import requests
requests.get('https://google.com', verify='/some/some')
```
results in:
```
File "/Users/traut/.p/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen
body=body, headers=headers)
File "/Users/traut/.p/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 348, in _make_request
self._raise_timeout(err=e, url=url, timeout_value=conn.timeout)
File "/Users/traut/.p/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 318, in _raise_timeout
if 'timed out' in str(err) or 'did not complete (read)' in str(err): # Python 2.6
TypeError: __str__ returned non-string (type Error)
```
Versions:
Python 2.7.10
Requests 2.8.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2886/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2886/timeline
| null |
completed
| null | null | false |
[
"This is definitely a duplicate but I don't have the time to find the exact bug right now. The fix is already merged in urllib3 iirc but @Lukasa can correct me.\n",
"Yup, this is a duplicate of shazow/urllib3#556.\n",
"cool. thanks for looking into it\n"
] |
https://api.github.com/repos/psf/requests/issues/2885
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2885/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2885/comments
|
https://api.github.com/repos/psf/requests/issues/2885/events
|
https://github.com/psf/requests/issues/2885
| 117,320,647 |
MDU6SXNzdWUxMTczMjA2NDc=
| 2,885 |
sending post request data nested object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9910819?v=4",
"events_url": "https://api.github.com/users/wiryonolau/events{/privacy}",
"followers_url": "https://api.github.com/users/wiryonolau/followers",
"following_url": "https://api.github.com/users/wiryonolau/following{/other_user}",
"gists_url": "https://api.github.com/users/wiryonolau/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/wiryonolau",
"id": 9910819,
"login": "wiryonolau",
"node_id": "MDQ6VXNlcjk5MTA4MTk=",
"organizations_url": "https://api.github.com/users/wiryonolau/orgs",
"received_events_url": "https://api.github.com/users/wiryonolau/received_events",
"repos_url": "https://api.github.com/users/wiryonolau/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/wiryonolau/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wiryonolau/subscriptions",
"type": "User",
"url": "https://api.github.com/users/wiryonolau",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2015-11-17T10:00:37Z
|
2019-04-18T12:26:01Z
|
2015-11-17T10:13:49Z
|
NONE
|
resolved
|
Unable to send POST data from a nested object; some of the data is truncated.
``` python
payload = {
"config": {
"data" : "test"
}
}
r = request.post('http://httpbin.org/post', data=payload)
print(r.text)
{
"args": {},
"data": "",
"files": {},
"form": {
"config": "data"
},
"headers": {
"Accept": "*/*",
"Accept-Encoding": "gzip, deflate",
"Content-Length": "11",
"Content-Type": "application/x-www-form-urlencoded",
"Host": "httpbin.org",
"User-Agent": "python-requests/2.8.1"
},
"json": null,
"origin": "xxx.xxx.xxx.xxx",
"url": "http://httpbin.org/post"
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2885/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2885/timeline
| null |
completed
| null | null | false |
[
"What are you expecting the result to be here?\n",
"the post data that send become config : data instead of the full object\n",
"I'm sorry, I wasn't clear enough.\n\nWhen you use the `data` keyword, we encode the data using HTTP form-encoding. This is not capable of representing nested data structures, only flat ones. So my question is: what are you trying to achieve, exactly? Because what you're asking requests to do does not make any sense.\n",
"If I urlencode the payload it appear like this\n\n```\nconfig=%7B%27data%27%3A+%27test%27%7\n```\n\nwhich is what should I get on the server\n",
"So you literally want to represent the config as a urlencoded JSON string?\n",
"Ah sorry shouldn't use urlencode, in php they have http_build_query that convert the payload become like this \n\n```\nconfig%5Bdata%5D=test\n```\n",
"@Wiryono That does not match what the urlencoded payload you copied decodes to. In fact, your urlencoded payload corresponds to `config={'data': 'test'}`, which is a JSON-serialized representation of the nested data structure. To get that, you'd want to use:\n\n``` python\nimport json\n\npayload = {\n \"config\": json.dumps({\n \"data\" : \"test\"\n })\n}\n\nr = request.post('http://httpbin.org/post', data=payload)\nprint(r.text)\n```\n\nThat will work appropriately.\n",
"Ok, so to represent that in requests you want to use:\n\n``` python\npayload = {\n \"config[data]\": 'test',\n}\n```\n\nThere is no consistent, standardised way to represent nested data in form-encoding, and PHP just happily invented one. Requests doesn't guess what you want here, you need to come up with a conversion function yourself I'm afraid.\n",
"See also https://github.com/sigmavirus24/requests-toolbelt/issues/45 (which I hope to have time to tackle soon).\n",
"The solution is non-obvious, but all you have to do is change the `data` argument to `json`:\r\n\r\n```python\r\nr = requests.post('http://httpbin.org/post', json=payload)\r\n```\r\n\r\nIt's not clear to me why the two arguments exist when they seem to be functionally identical. At least, a warning should be given to the user when trying to send a nested dictionary with the `data` argument (see #5058).",
"They're not functionally identical in the slightest, otherwise both would just work. Please read the documentation."
] |
https://api.github.com/repos/psf/requests/issues/2884
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2884/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2884/comments
|
https://api.github.com/repos/psf/requests/issues/2884/events
|
https://github.com/psf/requests/issues/2884
| 117,081,445 |
MDU6SXNzdWUxMTcwODE0NDU=
| 2,884 |
SSLError: [Errno 185090050] _ssl.c:336: error:0B084002:x509 certificate routines:X509_load_cert_crl_file:system lib
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15173749?v=4",
"events_url": "https://api.github.com/users/AraHaan/events{/privacy}",
"followers_url": "https://api.github.com/users/AraHaan/followers",
"following_url": "https://api.github.com/users/AraHaan/following{/other_user}",
"gists_url": "https://api.github.com/users/AraHaan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/AraHaan",
"id": 15173749,
"login": "AraHaan",
"node_id": "MDQ6VXNlcjE1MTczNzQ5",
"organizations_url": "https://api.github.com/users/AraHaan/orgs",
"received_events_url": "https://api.github.com/users/AraHaan/received_events",
"repos_url": "https://api.github.com/users/AraHaan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/AraHaan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AraHaan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/AraHaan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-11-16T09:21:47Z
|
2021-09-08T20:01:03Z
|
2015-11-19T15:46:14Z
|
NONE
|
resolved
|
I am seeing this when compiling with requests in Python 2.7.10 with py2exe.
Everything goes just fine till I test the exe it generates.
My setup.py I used to create it is this.
``` Python
from distutils.core import setup
import py2exe
setup(console=['Decorater.py'])
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2884/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2884/timeline
| null |
completed
| null | null | false |
[
"This is almost certainly a py2exe problem. It seems like py2exe is not handling the ssl module correctly. \n",
"I think that missing 3 dependencies could cause this as well although it still happens even when I just solved the \"Insecure Platform Warning\".\n",
"looking closer it seems to be similar to the one like this that is closed.\n\nhere is some Code that I do not know would work in the module that I make executable.\n[personal login data to the account the bot uses stripped from code.]\n\nhttps://i.gyazo.com/20b51bd6ac81a8f613f047146b910287.png\nseems like the below does not work fully atm. But that is when I do not run it with py2exe. Testing it now.\n\n``` Python\nimport os\nimport sys\nfrom requests.certs import where\n\nwhere()\nos.environ['REQUESTS_CA_BUNDLE'] = os.path.join(os.path.dirname(sys.executable), \"cacert.pem\")\n```\n",
"Fixed it with the code above but only works in py2exe. Closing this.\n",
"The only issues in py2exe are the Platform ones which I have not found out a fix on.\nAlthough I tried the deps on the other issue that was closed.\n\nhttps://github.com/kennethreitz/requests/issues/2837\n",
"@AraHaan this is absolutely not a bug in requests. Please look into using [data files](http://www.py2exe.org/index.cgi/data_files) with py2exe and [StackOverflow](https://stackoverflow.com) for further help with py2exe.\n",
"I noticed that when I find the warnings in the files it says from that other issue when I comment them out I do not see them anymore. Now it works just fine.\n"
] |
https://api.github.com/repos/psf/requests/issues/2883
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2883/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2883/comments
|
https://api.github.com/repos/psf/requests/issues/2883/events
|
https://github.com/psf/requests/issues/2883
| 116,984,180 |
MDU6SXNzdWUxMTY5ODQxODA=
| 2,883 |
Posting files with unicode file names
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6841988?v=4",
"events_url": "https://api.github.com/users/DeepSpace2/events{/privacy}",
"followers_url": "https://api.github.com/users/DeepSpace2/followers",
"following_url": "https://api.github.com/users/DeepSpace2/following{/other_user}",
"gists_url": "https://api.github.com/users/DeepSpace2/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DeepSpace2",
"id": 6841988,
"login": "DeepSpace2",
"node_id": "MDQ6VXNlcjY4NDE5ODg=",
"organizations_url": "https://api.github.com/users/DeepSpace2/orgs",
"received_events_url": "https://api.github.com/users/DeepSpace2/received_events",
"repos_url": "https://api.github.com/users/DeepSpace2/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DeepSpace2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeepSpace2/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DeepSpace2",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-11-15T09:59:43Z
|
2021-09-08T21:00:42Z
|
2015-11-15T10:29:24Z
|
NONE
|
resolved
|
Hi all. I'm having issues with posting files with UTF-8 file names.
I've thoroughly read these 2 issues:
https://github.com/kennethreitz/requests/issues/2505
https://github.com/mitsuhiko/flask/issues/1384
Please also refer to my question on SO:
http://stackoverflow.com/questions/33717690/python-requests-post-with-unicode-filenames
Unlike the first issue I linked to, my filename doesn't end with '\n'.
I noticed that the file ends up in request.form and not in request.files (just as described in the issues I linked to), which must be because of the '_' in 'Content-Disposition: form-data; name="file0"; filename_=utf-8''%D7%90.txt' just after 'filename'.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2883/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2883/timeline
| null |
completed
| null | null | false |
[
"Sorry, you've said you're \"having trouble\", but haven't actually specified what's happening that you aren't expecting. Could you elaborate?\n",
"I'm sorry it was not clear but the problem is that the server doesn't\nreceive the file (request.files is empty).\nOn Nov 15, 2015 12:23, \"Cory Benfield\" [email protected] wrote:\n\n> Sorry, you've said you're \"having trouble\", but haven't actually specified\n> what's happening that you aren't expecting. Could you elaborate?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2883#issuecomment-156799188\n> .\n",
"So, I believe this is fundamentally the same problem as shown as #2313. We're encoding the file using RFC 2231, which is a more-than-10-year-old specification, and your server appears not to be following it. To resolve this problem, either set the filename to a bytestring (or one that does not contain unicode data) or get the server fixed. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/2882
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2882/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2882/comments
|
https://api.github.com/repos/psf/requests/issues/2882/events
|
https://github.com/psf/requests/pull/2882
| 116,908,035 |
MDExOlB1bGxSZXF1ZXN0NTA3MTg2MDY=
| 2,882 |
Please add a Session.close()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14113854?v=4",
"events_url": "https://api.github.com/users/fzzo/events{/privacy}",
"followers_url": "https://api.github.com/users/fzzo/followers",
"following_url": "https://api.github.com/users/fzzo/following{/other_user}",
"gists_url": "https://api.github.com/users/fzzo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fzzo",
"id": 14113854,
"login": "fzzo",
"node_id": "MDQ6VXNlcjE0MTEzODU0",
"organizations_url": "https://api.github.com/users/fzzo/orgs",
"received_events_url": "https://api.github.com/users/fzzo/received_events",
"repos_url": "https://api.github.com/users/fzzo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fzzo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fzzo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fzzo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-14T07:28:04Z
|
2021-09-08T05:01:13Z
|
2015-11-14T07:31:51Z
|
NONE
|
resolved
|
Is there a reason why the .close() has yet to be implemented, the equivalent of the .close() from the original requests library? To ensure the session is closed, without leaving any open/dead/hanging/zombie sockets, would a Session.close() be the way to avoid having socket nightmares on a multi-threaded crawler utilizing requesocks.Session()?
Trying to do:
```
sesh = requesocks.Session()
# do stuff
sesh.close()
```
From requests:
```
def close(self):
"""Closes all adapters and as such the session"""
for v in self.adapters.values():
v.close()
```
Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2882/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2882/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2882.diff",
"html_url": "https://github.com/psf/requests/pull/2882",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2882.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2882"
}
| true |
[
"Uh...this is the repository for the original requests library. We _do_ have a close method. \n"
] |
https://api.github.com/repos/psf/requests/issues/2881
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2881/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2881/comments
|
https://api.github.com/repos/psf/requests/issues/2881/events
|
https://github.com/psf/requests/issues/2881
| 116,875,853 |
MDU6SXNzdWUxMTY4NzU4NTM=
| 2,881 |
No Session.close() ?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14113854?v=4",
"events_url": "https://api.github.com/users/fzzo/events{/privacy}",
"followers_url": "https://api.github.com/users/fzzo/followers",
"following_url": "https://api.github.com/users/fzzo/following{/other_user}",
"gists_url": "https://api.github.com/users/fzzo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fzzo",
"id": 14113854,
"login": "fzzo",
"node_id": "MDQ6VXNlcjE0MTEzODU0",
"organizations_url": "https://api.github.com/users/fzzo/orgs",
"received_events_url": "https://api.github.com/users/fzzo/received_events",
"repos_url": "https://api.github.com/users/fzzo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fzzo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fzzo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fzzo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2015-11-13T23:36:49Z
|
2021-09-08T21:00:43Z
|
2015-11-14T07:21:01Z
|
NONE
|
resolved
|
Oops! Intended to post this on requesocks, closing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14113854?v=4",
"events_url": "https://api.github.com/users/fzzo/events{/privacy}",
"followers_url": "https://api.github.com/users/fzzo/followers",
"following_url": "https://api.github.com/users/fzzo/following{/other_user}",
"gists_url": "https://api.github.com/users/fzzo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fzzo",
"id": 14113854,
"login": "fzzo",
"node_id": "MDQ6VXNlcjE0MTEzODU0",
"organizations_url": "https://api.github.com/users/fzzo/orgs",
"received_events_url": "https://api.github.com/users/fzzo/received_events",
"repos_url": "https://api.github.com/users/fzzo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fzzo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fzzo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fzzo",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2881/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2881/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2880
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2880/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2880/comments
|
https://api.github.com/repos/psf/requests/issues/2880/events
|
https://github.com/psf/requests/pull/2880
| 116,642,479 |
MDExOlB1bGxSZXF1ZXN0NTA1NjkyODA=
| 2,880 |
cache Response.text
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/118963?v=4",
"events_url": "https://api.github.com/users/j0hnsmith/events{/privacy}",
"followers_url": "https://api.github.com/users/j0hnsmith/followers",
"following_url": "https://api.github.com/users/j0hnsmith/following{/other_user}",
"gists_url": "https://api.github.com/users/j0hnsmith/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/j0hnsmith",
"id": 118963,
"login": "j0hnsmith",
"node_id": "MDQ6VXNlcjExODk2Mw==",
"organizations_url": "https://api.github.com/users/j0hnsmith/orgs",
"received_events_url": "https://api.github.com/users/j0hnsmith/received_events",
"repos_url": "https://api.github.com/users/j0hnsmith/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/j0hnsmith/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/j0hnsmith/subscriptions",
"type": "User",
"url": "https://api.github.com/users/j0hnsmith",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-12T21:19:12Z
|
2021-09-08T05:01:13Z
|
2015-11-12T23:05:17Z
|
NONE
|
resolved
|
`Response.text` is a pure function, so it seems pointless to do the same conversions in repeated calls.
Cached
```
In [4]: %timeit any(word in resp.text for word in words)
The slowest run took 6.20 times longer than the fastest. This could mean that an intermediate result is being cached
100000 loops, best of 3: 17.4 µs per loop
```
Not cached
```
In [5]: %timeit any(word in resp.text_old for word in words)
10000 loops, best of 3: 64.4 µs per loop
```
`len(words) == 3` in these examples, in a real world use case it could be much longer.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2880/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2880/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2880.diff",
"html_url": "https://github.com/psf/requests/pull/2880",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2880.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2880"
}
| true |
[
"Thanks for this! However, `Response.text` is _not_ a pure function, and cannot be safely cached. This is actually explicitly called out [in the docs](http://docs.python-requests.org/en/latest/user/quickstart/#response-content):\n\n> If you change the encoding, Requests will use the new value of `r.encoding` whenever you call `r.text`.\n\nThis is intentional and desirable. For this reason, we want to prevent text from being cached. \n\nSorry we can't accept your contribution! =(\n"
] |
https://api.github.com/repos/psf/requests/issues/2879
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2879/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2879/comments
|
https://api.github.com/repos/psf/requests/issues/2879/events
|
https://github.com/psf/requests/issues/2879
| 116,582,947 |
MDU6SXNzdWUxMTY1ODI5NDc=
| 2,879 |
SSL error when using Requests 2.8.1 and SHA-1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13770313?v=4",
"events_url": "https://api.github.com/users/datarup/events{/privacy}",
"followers_url": "https://api.github.com/users/datarup/followers",
"following_url": "https://api.github.com/users/datarup/following{/other_user}",
"gists_url": "https://api.github.com/users/datarup/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/datarup",
"id": 13770313,
"login": "datarup",
"node_id": "MDQ6VXNlcjEzNzcwMzEz",
"organizations_url": "https://api.github.com/users/datarup/orgs",
"received_events_url": "https://api.github.com/users/datarup/received_events",
"repos_url": "https://api.github.com/users/datarup/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/datarup/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/datarup/subscriptions",
"type": "User",
"url": "https://api.github.com/users/datarup",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 15 |
2015-11-12T16:16:26Z
|
2021-09-08T21:00:43Z
|
2015-11-13T16:02:56Z
|
NONE
|
resolved
|
I'm trying to connect to a site which uses TLS 1.2 using the requests package in Python 3.5. In Chrome I get the warning that the certificate chain for the website contains at least one certificate that was signed using SHA-1.
I get the following error:
```
Traceback (most recent call last):
File "<ipython-input-51-e43bc1030cea>", line 1, in <module>
f = s.get(url)
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 480, in get
return self.request('GET', url, **kwargs)
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "C:\Anaconda3\lib\site-packages\requests\adapters.py", line 433, in send
raise SSLError(e, request=request)
SSLError: EOF occurred in violation of protocol (_ssl.c:646)
```
Requests version is 2.8.1 and Python version is Python 3.5.0 :: Anaconda 2.4.0 (64-bit)
Is there a fix for this issue, or has anyone come across something similar? I have tried connecting to the site by forcing the TLS version after this [blog](https://lukasa.co.uk/2013/01/Choosing_SSL_Version_In_Requests/) from one of the contributors to the Requests package, but I still get the same error. Upgrading certs on the site is not an option now.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2879/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2879/timeline
| null |
completed
| null | null | false |
[
"EOF in violation of protocol suggests that the remote peer has rejected the connection for some reason and, rather than terminate it cleanly, has decided to tear the connection down. Do you have PyOpenSSL installed?\n",
"But I'm able to hit the same server from a browser or PostMan. There's no authentication for the site too. \nNo, I don't have PyOpenSSL let me try researching that.\n",
"@datarup No need. Can you run this from your Python shell?\n\n``` python\nimport ssl\nprint ssl.OPENSSL_VERSION\n```\n",
"OpenSSL 1.0.2d 9 Jul 2015\n",
"Alright, fab. Can you share the URL with me, either in this thread or privately? I'd like to see if I can verify the problem on my end.\n",
"I'm sorry, but the site is hosted on our intranet and you will not be able to hit it externally.\n",
"In that case, do you feel comfortable sending me a packet capture from your machine as you attempt this connection? You'd need to use a tool like Wireshark to capture it.\n",
"Okay, I will PM you the capture.\n",
"@Lukasa The capture was not possible due to IT restrictions on Wireshark. Anyway I have tried curl and it works. the site has 'SSL connection using TLS_RSA_WITH_RC4_128_SHA' and I tried the suggestion [on Stack Overflow](http://stackoverflow.com/questions/32650984/why-does-python-requests-ignore-the-verify-parameter) but still get the same error\n",
"Update: Running it as a prepared request in a session is working:\n\n```\nfrom requests import Request, Session\ns = Session()\nreq = Request('GET', url)\nprepped = req.prepare()\nresp = s.send(prepped)\ns.close()\n```\n\nI was using this, which has issues with EOF error:\n\n```\nimport requests\nrequests.get(url)\n```\n\nHave I missed some documentation and the second approach was not quite right or should that too work?\n\nThanks!\n",
"These should be essentially identical. I wonder...when you use the prepared request, can you show me information about the response?\n",
"Response headers:\n\n```\n{'Content-Type': 'application/json;charset=UTF-8', 'ETag': '7db2c493cc7a80f2fc0fff9cea09e59a', 'Access-Control-Allow-Headers': 'origin, x-requested-with, content-type,gm-clientidentifier,gm-clientkey', 'Transfer-Encoding': 'chunked', 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'PUT, GET, POST, DELETE, OPTIONS', 'Date': 'Thu, 12 Nov 2015 19:30:31 GMT', 'X-Powered-By': 'Servlet/2.5 JSP/2.1'}\n```\n\nStatus: 200 OK\nApparent-encoding: ascii\nis_redirect: False\nContent: {Is a valid json}\n\nLet me know if you need any other information from response\n",
"So, the key reason the prepared request would be different is because you didn't give it a chance to merge your environment settings. Can you check for me, please, the following environment variables: `REQUESTS_CA_BUNDLE`, `CURL_CA_BUNDLE`, `HTTPS_PROXY`, `HTTP_PROXY`, `NO_PROXY`,\n",
"Here are the variables:\n\n```\nNO_PROXY: localhost,127.0.0.1,::1,[::1]\nHTTP_PROXY: http://user:pass@proxy:port\nHTTPS_PROXY: https://user:pass@proxy:port\n```\n\nThe proxy connection is valid. REQUESTS_CA_BUNDLE and CURL_CA_BUNDLE are not present. \n",
"Yeah, the proxy connection is valid, but requests doesn't support speaking to proxy using https. That's where your problem is coming from: we're not doing what that proxy wants us to do.\n\nYou can either change your HTTPS_PROXY variable to be \"http://user:pass@proxy:port\", or you can set proxies={'https': 'http://user:pass@proxy:port'} to enable requests to function correctly.\n"
] |
https://api.github.com/repos/psf/requests/issues/2878
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2878/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2878/comments
|
https://api.github.com/repos/psf/requests/issues/2878/events
|
https://github.com/psf/requests/pull/2878
| 116,552,991 |
MDExOlB1bGxSZXF1ZXN0NTA1MTMwNjc=
| 2,878 |
Remove redundant json import.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9250239?v=4",
"events_url": "https://api.github.com/users/kiddten/events{/privacy}",
"followers_url": "https://api.github.com/users/kiddten/followers",
"following_url": "https://api.github.com/users/kiddten/following{/other_user}",
"gists_url": "https://api.github.com/users/kiddten/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kiddten",
"id": 9250239,
"login": "kiddten",
"node_id": "MDQ6VXNlcjkyNTAyMzk=",
"organizations_url": "https://api.github.com/users/kiddten/orgs",
"received_events_url": "https://api.github.com/users/kiddten/received_events",
"repos_url": "https://api.github.com/users/kiddten/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kiddten/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kiddten/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kiddten",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-12T13:55:57Z
|
2021-09-08T05:01:14Z
|
2015-11-12T13:56:55Z
|
CONTRIBUTOR
|
resolved
|
related to https://github.com/kennethreitz/requests/issues/2877
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2878/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2878/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2878.diff",
"html_url": "https://github.com/psf/requests/pull/2878",
"merged_at": "2015-11-12T13:56:55Z",
"patch_url": "https://github.com/psf/requests/pull/2878.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2878"
}
| true |
[
"\\o/ Thanks so much! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2877
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2877/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2877/comments
|
https://api.github.com/repos/psf/requests/issues/2877/events
|
https://github.com/psf/requests/issues/2877
| 116,547,548 |
MDU6SXNzdWUxMTY1NDc1NDg=
| 2,877 |
Redundant import in docs?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9250239?v=4",
"events_url": "https://api.github.com/users/kiddten/events{/privacy}",
"followers_url": "https://api.github.com/users/kiddten/followers",
"following_url": "https://api.github.com/users/kiddten/following{/other_user}",
"gists_url": "https://api.github.com/users/kiddten/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kiddten",
"id": 9250239,
"login": "kiddten",
"node_id": "MDQ6VXNlcjkyNTAyMzk=",
"organizations_url": "https://api.github.com/users/kiddten/orgs",
"received_events_url": "https://api.github.com/users/kiddten/received_events",
"repos_url": "https://api.github.com/users/kiddten/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kiddten/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kiddten/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kiddten",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-11-12T13:21:00Z
|
2021-09-08T21:00:43Z
|
2015-11-12T14:11:11Z
|
CONTRIBUTOR
|
resolved
|
I am talking about this doc.
https://github.com/kennethreitz/requests/blob/master/docs/user/quickstart.rst
Here:
``` python
>>> import json
>>> url = 'https://api.github.com/some/endpoint'
>>> payload = {'some': 'data'}
>>> r = requests.post(url, json=payload)
```
json parameter is optional.
Is there any reason to import json module?
Thank you.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2877/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2877/timeline
| null |
completed
| null | null | false |
[
"@kiddick You're entirely right: that import of the `json` module is from an old version of the documentation. Would you like to open a pull request to remove it?\n",
"@Lukasa Thanks for reply. Yep I'll open pull request!\n",
"Thanks for fixing this @kiddick \n"
] |
https://api.github.com/repos/psf/requests/issues/2876
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2876/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2876/comments
|
https://api.github.com/repos/psf/requests/issues/2876/events
|
https://github.com/psf/requests/issues/2876
| 116,447,926 |
MDU6SXNzdWUxMTY0NDc5MjY=
| 2,876 |
Exception messages
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1687238?v=4",
"events_url": "https://api.github.com/users/KenKundert/events{/privacy}",
"followers_url": "https://api.github.com/users/KenKundert/followers",
"following_url": "https://api.github.com/users/KenKundert/following{/other_user}",
"gists_url": "https://api.github.com/users/KenKundert/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/KenKundert",
"id": 1687238,
"login": "KenKundert",
"node_id": "MDQ6VXNlcjE2ODcyMzg=",
"organizations_url": "https://api.github.com/users/KenKundert/orgs",
"received_events_url": "https://api.github.com/users/KenKundert/received_events",
"repos_url": "https://api.github.com/users/KenKundert/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/KenKundert/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KenKundert/subscriptions",
"type": "User",
"url": "https://api.github.com/users/KenKundert",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 15 |
2015-11-11T23:55:59Z
|
2021-09-08T07:00:41Z
|
2017-07-30T00:22:54Z
|
NONE
|
resolved
|
As a user I would like it to be easy to generate simple helpful messages upon an exception. A common way this is done is to simply cast the exception to a string. However, with requests, the result is often something you don't want to show an end user. For example:
``` python
try:
downloaded = requests.get(url)
except (requests.Timeout) as err:
print(str(err))
```
Results in the following message to the user:
```
HTTPSConnectionPool(host='cal.example.com', port=443): Max retries exceeded with url: /ken/ken.ics/00832974-ffb3-42ea-ba3e-84ba3c0a30f6.ics (Caused by ConnectTimeoutError(<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fd4644ef400>, 'Connection to cal.example.com timed out. (connect timeout=0.1)'))
```
There is useful information in this message, but it is not easily user accessible and is rather intimidating for end users. The information is probably available in the exception itself, but it is not clear how to get it. Also, it seems like accessing it would likely be different for each type of exception, which greatly increases the complexity of catching and reporting exceptions.
What I would expect is something like:
```
Connection to cal.example.com timed out.
```
It would be very helpful if there were an easy way to generate user-friendly error messages from requests exceptions. If there is such a way, I have not been able to find it. Thus, I suggest it be added to the otherwise excellent introduction to requests. If there is not such a way, I would like to suggest that it be added.
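
One pattern that addresses this without any change to requests itself is a small mapping from exception types to end-user phrasing. The sketch below is hypothetical (`friendly_error` and the message texts are not part of requests); it matches on the class name only to keep the example dependency-free — real code would use `isinstance()` against the classes in `requests.exceptions`:

``` python
def friendly_error(exc, url):
    """Map an exception raised during a request to a short end-user message.

    Matching on the class name keeps this sketch free of dependencies;
    in real code, prefer isinstance() checks against requests.exceptions.
    """
    templates = {
        'ConnectTimeout': 'Connection to {url} timed out.',
        'ReadTimeout': '{url} took too long to respond.',
        'Timeout': 'Connection to {url} timed out.',
        'ConnectionError': 'Could not connect to {url}.',
    }
    # Fall back to a generic message for exception types not listed above.
    template = templates.get(type(exc).__name__, 'Request to {url} failed.')
    return template.format(url=url)
```

With something like this in place, every `except` clause can simply `print(friendly_error(err, url))` instead of dissecting exception-specific internals.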
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/2876/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2876/timeline
| null |
completed
| null | null | false |
[
"So a little introspection shows this:\n\n``` py\n>>> import requests\n>>> try:\n... r = requests.get('https://httpbin.org/delay/5', timeout=2)\n... except requests.Timeout as terr:\n... pass\n...\n>>> terr\nReadTimeout(ReadTimeoutError(\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out. (read timeout=2)\",),)\n>>> terr.message\nReadTimeoutError(\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out. (read timeout=2)\",)\n>>> terr.message.message\n\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out. (read timeout=2)\"\n>>> str(terr)\n\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out. (read timeout=2)\"\n```\n\nThe best that can be done is to call `str(terr)` to get the lowest level error message from urllib3. Alternatively, if you always know it's a Timeout in that case, you can do this:\n\n``` py\ntry:\n r = requests.get(url)\nexcept requests.Timeout:\n print('Request to {} timed out.'.format(url))\n```\n",
"The appeal of requests is its simplicity and ease of use. However it seems that exceptions appear to be, well, the exception. It would be nice if there were some easy way of getting an error message suitable for the end user that would work for all of the various exceptions generated by requests.\n",
"@KenKundert That's sadly a bit of a tricky business. Because of the nature of Python code the total range of exceptions that may be thrown out of requests code is actually very nearly unbounded: our dependencies may throw a really quite dramatic range.\n\nWhile Timeout errors potentially have nice user-friendly messages, not all the others do, or they represent a painfully large range of problems (SSLError is a particular culprit here, due to the fact that there's only one exception used and it uses OpenSSL's extremely inscrutable error messages to generate its text.\n\nI think the best we could do is enhance the API documentation to indicate how to get good error messages, or to indicate what the error messages could/should be for each type of exception.\n",
"@KenKundert I think there's a disconnect here. requests is meant to be powerful and simple for _developers_ (of varying experience with HTTP). Developers tend to prefer more detailed information when something fails than they prefer vague end-user friendly messages. We give developers what they need to debug failure cases. It's up to those developers to make something end-user friendly.\n\nFailure scenarios also aren't uniform across all servers, applications, etc. The developer creating something for the end-user should strive to anticipate as many of those failure cases as possible and handle them in a way that appropriately and accurately describes the problem to those users.\n",
"But don't developer develop for users? I agree that during debugging the more detailed messages might be helpful, but in the end the developers should want to create programs that are easily used. Even when I develop programs for my own use, when I am actually using them I prefer error messages that I don't have to dissect.\n",
"We don't know who developers using requests are developing for. They can be developing for a wide range of other people. praw, github3.py, twython, etc. are all geared towards developers. httpie, python-keystoneclient, etc. are all geared towards some mix of developers and non-developers. Other projects use requests in other ways and expose errors much differently. Our only concern is providing the person directly using requests with as much information as we can.\n\nHow you provide users (whether you're the sole user or not) of your code with informative error messages is up to you.\n",
"I agree that the information should be available, but currently it is very difficult to access. Presumably it is accessible through the exception object itself, but how to access it is undocumented and seems to differ from exception to exception. The only thing I reliably have access to is the exception cast to a string, and there if I want to use it I have to parse the string. Preferably the components of that string would be available through attributes or methods of the exception.\n\nIn my experience, casting the exception to a string generally results is a reasonably easy to read summary of the error in a form that could be passed to the end user, and the components of that message are also available so that a tailored response is possible. For example, an IOError contains filename, strerror, errno, etc. to make it possible to roll your own message or craft a response.\n\nThe problem I am having with requests at the moment is that I cannot produce an error message suitable for end users. Casting the exception to a string produces something too complex, and I cannot find the things I would need to use to create an informative message. I am reduced to an error message that simply parrots the name of the exception:\n\n``` python\n try: \n payload = requests.get(url)\n except (requests.Timeout) as err:\n print('%s: connection timed out.' % url)\n except requests.ConnectionError as err:\n print('%s: connection error' % url)\n ...\n```\n\nMy fear is that not only is the information I need not documented, but getting it is complicated and highly exception specific. This seems to be what @Lukasa is suggesting. Then, to produce good responses to exceptions I will have to understand all of those exceptions in detail and craft code that handles all the cases that could occur. And of course, most users would have to do the same thing, meaning that this code will be written over and over. 
Perhaps someday someone will create a requests-like package for interpreting the exceptions produced by requests. I guess I am suggesting that it would be good to build that into requests itself, or it it already exists, then give some examples of how to use it in the documentation.\n",
"For the Certbot project, we have a similar need: https://github.com/certbot/certbot/issues/4535. Specifically, when retries are enabled, all errors seem to get wrapped in \"ConnectionError: HTTPSConnectionPool(host='acme-v01.api.letsencrypt.org', port=443): Max retries exceeded with url: /directory (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f020eee70d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))\". We can unwrap the error layers ourselves, but I think this is probably a common problem, and end-users would be better served by a library that provide more readable errors by default. And, having attempted to manually unwrap the error layers in our code, it is not immediately obvious how to do so.\r\n\r\nI would also argue, in this specific case, that even a developer usually doesn't consider \"Max retries exceeded\" to be the core error, but the \"Name or service not known.\" Reducing some of the layering in the string format would be useful. Note that I'm not proposing to change the actual exceptions thrown, since that would break existing code.",
"@jsha So, I think it'd be instructive to get an idea of what you're proposing here. Should we be removing the text \"Max retries exceeded\"? What do we do if retries *are* enabled, given that they will necessarily hide some errors?",
"Yep! My ideal error would be more like:\r\n\r\n> ConnectionError: For https://acme-v01.api.letsencrypt.org/directory: Failed to establish a new connection: [Errno -2] Name or service not known\r\n\r\nThat involves stripping out a few layers of errors. I realize that this may hide errors: For instance, the first try might result in a connection refused, and the second try might result in a timeout. I think that's fine. For most error conditions, they will be the same, and in situations where they are different, it's usually non-deterministic.\r\n\r\nIf we wanted to really emphasize the retrying aspect, it would be reasonable to append \"(tried N times)\" after the message, but I think that is not super important.",
"I'd presume you would only want this change for the string representation of the error? Changing the internals so that the MaxRetryError is gone would be tricky and sketchy from a backward-compact perspective. ",
"Yep, agreed that changing the internals would likely break a lot of programs. It would be sufficient for our use case to just change the string representation.",
"Ok, so the likely thing to want to do is to override the `__str__` implementation of some of the relevant Requests exceptions, probably in the base class if possible, to spot `MaxRetryError` and strip it out. I'll happily review any PR that makes this change. :grin:",
"Working through an implementation now; will reference here.",
"Proposed starting to implementation: https://github.com/kennethreitz/requests/pull/4047"
] |
https://api.github.com/repos/psf/requests/issues/2875
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2875/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2875/comments
|
https://api.github.com/repos/psf/requests/issues/2875/events
|
https://github.com/psf/requests/issues/2875
| 116,433,178 |
MDU6SXNzdWUxMTY0MzMxNzg=
| 2,875 |
Failed to establish a new connection: [Errno 11001] getaddrinfo failed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6819843?v=4",
"events_url": "https://api.github.com/users/patjones80/events{/privacy}",
"followers_url": "https://api.github.com/users/patjones80/followers",
"following_url": "https://api.github.com/users/patjones80/following{/other_user}",
"gists_url": "https://api.github.com/users/patjones80/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/patjones80",
"id": 6819843,
"login": "patjones80",
"node_id": "MDQ6VXNlcjY4MTk4NDM=",
"organizations_url": "https://api.github.com/users/patjones80/orgs",
"received_events_url": "https://api.github.com/users/patjones80/received_events",
"repos_url": "https://api.github.com/users/patjones80/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/patjones80/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/patjones80/subscriptions",
"type": "User",
"url": "https://api.github.com/users/patjones80",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2015-11-11T22:16:44Z
|
2017-12-06T02:03:04Z
|
2015-11-12T14:11:49Z
|
NONE
| null |
Hello all. I'm using requests 2.8.1 as part of a Flask 0.10.1 installation with Python 3.4 on Windows 8.1, and getting the following traceback upon attempting to run my code:
```
File "c:\Python34\atweather\lib\site-packages\requests-2.8.1-py3.4.egg\requests\api.py", line 69, in get return request('get', url, params=params, **kwargs)
File "c:\Python34\atweather\lib\site-packages\requests-2.8.1-py3.4.egg\requests\api.py", line 50, in request response = session.request(method=method, url=url, **kwargs)
File "c:\Python34\atweather\lib\site-packages\requests-2.8.1-py3.4.egg\requests\sessions.py", line 468, in request resp = self.send(prep, **send_kwargs)
File "c:\Python34\atweather\lib\site-packages\requests-2.8.1-py3.4.egg\requests\sessions.py", line 576, in send r = adapter.send(request, **kwargs)
File "c:\Python34\atweather\lib\site-packages\requests-2.8.1-py3.4.egg\requests\adapters.py", line 423, in send raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='proxy.server', port=3128): Max retries exceeded with url: http://forecast.weather.gov/MapClick.php?lon=-84.2476&lat=34.5627&unit=0&lg=english&FcstType=text&TextType=1 (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x036F3830>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',)))
```
It's not an issue with the url that I'm passing in, as the url works fine when I enter it directly into a browser tab - and moreover the same issue crops up with any url, for instance http://google.com.
My website is very simple and I'm just attempting to run it locally to do some development on it. I'm fairly new to using Flask and web programming in general, so if this seems more like something that I'm doing wrong with my Flask environment, directory structure, etc. then I apologize. Nothing has really changed about my project since the last time I did local development on it except for the versions of various packages that Flask uses.
Also...I chopped off everything in the traceback prior to when it started throwing "requests" stuff at me, but if you think any of what I excluded could be helpful then let me know. Thank you.
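
When requests unexpectedly routes through a proxy like this, the usual suspects are either a `proxies=` argument left in the code or proxy environment variables. A quick standard-library check of the environment side (the helper name `environment_proxies` is made up for this sketch; requests consults the same variables when `trust_env` is enabled):

``` python
import urllib.request

def environment_proxies():
    # urllib.request.getproxies() collects HTTP_PROXY / HTTPS_PROXY / etc.
    # from the environment. If this comes back empty but requests still
    # uses a proxy, look for an explicit proxies= argument in the code.
    return urllib.request.getproxies()
```

If the dict is empty, the proxy must be coming from the call site itself rather than the environment.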
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2875/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2875/timeline
| null |
completed
| null | null | false |
[
"Hey @patjones80,\n\nCan you try doing\n\n``` py\n>>> import socket\n>>> s = socket.socket()\n>>> s.connect(('www.google.com', 80))\n```\n\nTo see what happens?\n",
"Oh, I just noticed this snippet from your issue:\n\n```\nHTTPConnectionPool(host='proxy.server', port=3128)\n```\n\nDo you have proxies properly configured?\n",
"I'm not sure to be honest. I can say for sure that I have never done anything with them to begin with. \n",
"@patjones80 It seems like you have some environment variables set up here. In your deployed environment, can you run `echo \"$HTTP_SERVER\"` for me please?\n\nRequests will normally look for environment variables that tell processes to use proxies. It seems like you have one set here.\n",
"HTTP_SERVER is not set.\n",
"Oh I'm sorry, I had clearly lost my mind. `echo \"$HTTP_PROXY\"` please. =)\n",
"No problem! That isn't set either, which I checked both at a DOS command line and from within IDLE.\n",
"Hmmmm.\n\nCan you try opening a Python shell and running this?\n\n``` python\n>>> import requests\n>>> r = requests.get('http://http2bin.org/get')\n```\n",
"The problem was in my views.py file. The line \n\n```\npage = requests.get(url, proxies = {'http':'http://proxy.server:3128'})\n```\n\nshould just be\n\n```\npage = requests.get(url)\n```\n\nI think that proxy 3128 is necessary for the server (pythonanywhere) that hosts my website, and I just forgot to remove it from my local code. Thanks!\n",
"Thanks for updating us @patjones80 \n",
"@patjones80 could you please tell me which views.py? I am new to this and I am getting a similar error MaxRetryError: HTTPConnectionPool(host='w.x.y.z', port=5000):\n",
"@wjdan94 I'm not sure that I understand your question. If your problem is the same as mine, then your port number is either incorrect or just doesn't need to be specified. In my case, it did not need to be specified since I was working in my local development environment.\n",
"your system is running behind a proxy which is not allowing it. Try to set the settings of proxy to that of google's",
"after running _echo \"$HTTP_SERVER\"_ I got _\"$HTTP_SERVER\"_",
"@5ai92 My system does run behind a proxy which is XX-net. How can I fix that",
"Hi,\r\n@patjones80 \r\nDoes this views.py file exist in request library folder? Could you please let me know where to change the settings to bypass proxy as I am also getting proxy issues .\r\n"
] |
https://api.github.com/repos/psf/requests/issues/2874
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2874/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2874/comments
|
https://api.github.com/repos/psf/requests/issues/2874/events
|
https://github.com/psf/requests/issues/2874
| 116,346,023 |
MDU6SXNzdWUxMTYzNDYwMjM=
| 2,874 |
urllib3.HTTPResponse.stream would confuse gzip content originally not encoded utf8
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/741544?v=4",
"events_url": "https://api.github.com/users/yupbank/events{/privacy}",
"followers_url": "https://api.github.com/users/yupbank/followers",
"following_url": "https://api.github.com/users/yupbank/following{/other_user}",
"gists_url": "https://api.github.com/users/yupbank/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/yupbank",
"id": 741544,
"login": "yupbank",
"node_id": "MDQ6VXNlcjc0MTU0NA==",
"organizations_url": "https://api.github.com/users/yupbank/orgs",
"received_events_url": "https://api.github.com/users/yupbank/received_events",
"repos_url": "https://api.github.com/users/yupbank/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/yupbank/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yupbank/subscriptions",
"type": "User",
"url": "https://api.github.com/users/yupbank",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-11-11T14:34:13Z
|
2021-09-08T21:00:44Z
|
2015-11-11T14:49:18Z
|
NONE
|
resolved
|
Hi, I was using requests to fetch a gzipped web page that is originally encoded in GBK.
requests tries its best to decode gzipped pages, but since the original encoding is not UTF-8, the result is garbled and hard to debug :(
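
The workaround that surfaced in the discussion below boils down to decompressing the raw body and decoding it with the page's real charset. A standard-library sketch (the function name is made up; `wbits = MAX_WBITS | 32` tells zlib to auto-detect a gzip or zlib header):

``` python
import zlib

def decompress_and_decode(raw_body, encoding='gbk'):
    # Auto-detect the gzip/zlib header, then decode with the page's real
    # charset instead of the ISO-8859-1 fallback that .text would apply
    # when the server sends no charset.
    return zlib.decompress(raw_body, zlib.MAX_WBITS | 32).decode(encoding)
```

Equivalently, since requests already decompresses the body transparently, `response.content.decode('gbk')` gives the same result without touching the raw stream.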
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2874/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2874/timeline
| null |
completed
| null | null | false |
[
"@yupbank Sorry, I didn't fully understand this message. Can you post some code that demonstrates the problem?\n",
"basically \ni figured the correct way:\n\n```\nzlib.decompress(requests.get('http://zhongyanghuachengzs.fang.com/', stream=True).raw.read(), \n zlib.MAX_WBITS|32).decode('gbk'))\n```\n\nas the page is encoded in gbk and gziped.\n\nalso due to the fact that requests would auto decompress gzip, this would give weird results.\n\n```\nrequests.get('http://zhongyanghuachengzs.fang.com/')\n```\n",
"I don't believe this is a urllib3 bug: nothing unexpected is happening here.\n\nAs you've correctly spotted, requests is transparently decompressing the gzip. This is working perfectly, nothing unexpected is happening. However, I suspect you're then using `response.text`, which is trying to decode the response body. Because there's no header explaining what encoding was used, requests falls back on the spec which claims that text/html should be served in ISO-8859-1. This is wrong for this page, and so it breaks the rendering.\n\nAll you need to do is:\n\n```\nrequests.get('http://zhongyanghuachengzs.fang.com/').content.decode('gbk')\n```\n\nThat'll work perfectly. =)\n",
"thanks a lot, i should follow the document carefully :) \n"
] |
https://api.github.com/repos/psf/requests/issues/2873
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2873/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2873/comments
|
https://api.github.com/repos/psf/requests/issues/2873/events
|
https://github.com/psf/requests/pull/2873
| 116,256,838 |
MDExOlB1bGxSZXF1ZXN0NTAzNDM5Njg=
| 2,873 |
Fix super_len for partially read files
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 1 |
2015-11-11T03:22:53Z
|
2021-09-08T05:01:14Z
|
2015-11-11T07:52:47Z
|
CONTRIBUTOR
|
resolved
|
See #2872
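
The idea behind the fix, sketched under the assumption that the hang in #2872 came from announcing a partially read file's full size (this is an illustration, not the actual `super_len` implementation): the usable body length is the total size minus the current position.

``` python
import os

def remaining_len(fileobj):
    # A partially read file should contribute only the bytes after the
    # current position; announcing the full size makes the server wait
    # for data that will never be sent.
    current = fileobj.tell()
    fileobj.seek(0, os.SEEK_END)   # jump to the end to learn the total size
    total = fileobj.tell()
    fileobj.seek(current)          # restore the caller's position
    return total - current
```

For a fresh file this equals the full size, so fully read and unread files behave as before.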
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2873/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2873/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2873.diff",
"html_url": "https://github.com/psf/requests/pull/2873",
"merged_at": "2015-11-11T07:52:47Z",
"patch_url": "https://github.com/psf/requests/pull/2873.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2873"
}
| true |
[
"LGTM. :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2872
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2872/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2872/comments
|
https://api.github.com/repos/psf/requests/issues/2872/events
|
https://github.com/psf/requests/issues/2872
| 116,211,092 |
MDU6SXNzdWUxMTYyMTEwOTI=
| 2,872 |
Post request hangs in certain cases when body is a StringIO
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4674965?v=4",
"events_url": "https://api.github.com/users/braincore/events{/privacy}",
"followers_url": "https://api.github.com/users/braincore/followers",
"following_url": "https://api.github.com/users/braincore/following{/other_user}",
"gists_url": "https://api.github.com/users/braincore/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/braincore",
"id": 4674965,
"login": "braincore",
"node_id": "MDQ6VXNlcjQ2NzQ5NjU=",
"organizations_url": "https://api.github.com/users/braincore/orgs",
"received_events_url": "https://api.github.com/users/braincore/received_events",
"repos_url": "https://api.github.com/users/braincore/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/braincore/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/braincore/subscriptions",
"type": "User",
"url": "https://api.github.com/users/braincore",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2015-11-10T21:56:24Z
|
2021-09-08T21:00:44Z
|
2015-11-11T07:52:49Z
|
NONE
|
resolved
|
This is related to a report for the [Dropbox Python SDK](https://github.com/dropbox/dropbox-sdk-python/issues/27).
The following hangs:
```
from StringIO import StringIO
s = StringIO()
s.write('hello') # This is seeked to the end
requests.post('http://www.google.com', data=s) # Hangs: A success would be a 405 error
```
After a cursory look, it looks like the request isn't fully formed so the server doesn't attempt to send a response which leaves the client hanging.
If we call `s.seek(0)`, this works. A bit more counterintuitively, this also works:
```
requests.post('http://www.google.com', data=StringIO())
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2872/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2872/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report. This comes because we calculate the length of the StringIO, but don't account for where it is seeked to. \n\nAs I see it we have two options: we can either always `seek(0)` before sending a body, or we can use `tell()` to adjust the content length. \n\nOn balance, I think the first is probably better: we already seek(0) in some cases (e.g. auth handlers), so this would be consistent. The downside is that it makes it harder for someone to upload a partial file from disk, though I suspect that's a minor use-case. \n\n@sigmavirus24 thoughts?\n",
"This isn't our fault. If you do this:\n\n```\nimport StringIO\n\ns = StringIO.StringIO()\ns.write('foobarbogus')\nprint(s.read())\n```\n\nYou will get nothing. By writing you've moved the fake pointer. We send what we can. That said, we're sending a content-length of whatever is in the buffer because we look at `s.len` so we're telling the server we're sending something with 11 bytes but sending nothing, and that's why the server hangs.\n\nThis is how `StringIO` instances work and misusing them is not a bug in requests because partial file uploads are _very_ desirable. Especially if you're trying to split a file up into multiple requests (e.g., uploading very large objects to S3 or an OpenStack Swift service).\n\nThere's only so far I will go in helping a user not shoot themselves in the foot. The line I draw is in doing the wrong thing for users who actually know what they're doing.\n\n> we already seek(0) in some cases (e.g. auth handlers), so this would be consistent.\n\nI've always argued that that behaviour is wrong too. I still don't like it.\n",
"I see now that this applies to file objects as well which makes sense. I would favor using the current location of the stream by inspecting `tell()`. That to me is part of the benefit and contract of using a file-like object.\n\nIt will be more burdensome for developers to get around your forced `seek(0)` than for developers to add a `seek(0)` themselves when needed. If the latter case were more common, I could see an ease-of-use argument. Given that there hasn't been a reported issue before, this scenario is probably rare and reserved for more advanced usage.\n",
"@sigmavirus24: Are you opposed to setting the `Content-Length` to `s.len - s.tell()`?\n\nTo be clear, my expectation was that the post request would have an empty body precisely for the reason that you state. The surprise was that the request hung, ie. the `Content-Length` is set to the full size, rather than what's remaining.\n",
"No I'm not opposed to that, but that is not necessarily applicable to every case where you have `s.len`. `super_len`'s logic would need to change drastically. You basically want [`super_len`](https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L50) to look like this:\n\n``` py\ndef super_len(o):\n total_len = 0\n current_position = 0\n if hasattr(o, '__len__'):\n total_len = len(o)\n\n elif hasattr(o, 'len'):\n total_len = o.len\n\n elif hasattr(o, 'fileno'):\n try:\n fileno = o.fileno()\n except io.UnsupportedOperation:\n pass\n else:\n total_len = os.fstat(fileno).st_size\n\n # Having used fstat to determine the file length, we need to\n # confirm that this file was opened up in binary mode.\n if 'b' not in o.mode:\n warnings.warn((\n \"Requests has determined the content-length for this \"\n \"request using the binary size of the file: however, the \"\n \"file has been opened in text mode (i.e. without the 'b' \"\n \"flag in the mode). This may lead to an incorrect \"\n \"content-length. In Requests 3.0, support will be removed \"\n \"for files in text mode.\"),\n FileModeWarning\n )\n\n if hasattr(o, 'tell'):\n current_position = o.tell()\n\n return max(0, total_len - current_position)\n\n if hasattr(o, 'getvalue'):\n # e.g. BytesIO, cStringIO.StringIO\n return len(o.getvalue())\n```\n\nThat said, I've not tested that.\n",
"As @braincore stated, it's more the fact that the request will hang due to the mismatch between the `Content-Length` and the actual content that is being sent. The misuse of the `StringIO` API was the reason this was discovered, and not problem to be solved.\n\n> It will be more burdensome for developers to get around your forced seek(0) than for developers to add a seek(0) themselves when needed.\n\nI couldn't agree more, especially due to the rather simple fact that the forced seek would make explicit partial file uploads more difficult than they need to be as mentioned by @sigmavirus24.\n",
"Feel free to checkout #2873. That should fix this for all y'all.\n",
"Awesome. Looks great! Thanks again.\n",
"Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/2871
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2871/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2871/comments
|
https://api.github.com/repos/psf/requests/issues/2871/events
|
https://github.com/psf/requests/issues/2871
| 116,122,060 |
MDU6SXNzdWUxMTYxMjIwNjA=
| 2,871 |
GitHub release page lists the "latest release" as v2.5.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/501702?v=4",
"events_url": "https://api.github.com/users/edmorley/events{/privacy}",
"followers_url": "https://api.github.com/users/edmorley/followers",
"following_url": "https://api.github.com/users/edmorley/following{/other_user}",
"gists_url": "https://api.github.com/users/edmorley/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/edmorley",
"id": 501702,
"login": "edmorley",
"node_id": "MDQ6VXNlcjUwMTcwMg==",
"organizations_url": "https://api.github.com/users/edmorley/orgs",
"received_events_url": "https://api.github.com/users/edmorley/received_events",
"repos_url": "https://api.github.com/users/edmorley/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/edmorley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/edmorley/subscriptions",
"type": "User",
"url": "https://api.github.com/users/edmorley",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2015-11-10T14:54:43Z
|
2021-09-08T21:00:45Z
|
2015-11-10T15:53:30Z
|
CONTRIBUTOR
|
resolved
|
On https://github.com/kennethreitz/requests/releases , the "latest release" label is next to v2.5.1, whereas the newest release is v2.8.1, which is hidden under the "Show 8 newer tags" link.
If newer release entries are missing because it's a hassle to transpose the release notes into each, perhaps just linking to the relevant entry on http://docs.python-requests.org/en/latest/community/updates/ would be sufficient? (Though it seems the permalinks on that page are just generic, eg `#id1` and not `#v2-8-1` etc). Or even leaving the notes section blank would be fine.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2871/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2871/timeline
| null |
completed
| null | null | false |
[
"Yeah, it's mostly annoying to transpose the notes. I'll do it now anyway. =)\n",
"@Lukasa let's just remove them. This is a hassle and we've had .... 4 bugs about this? It's worthless and it doesn't provide real value.\n",
"Removing them works for me; at least GitHub then defaults to just showing the tags, which will always be up to date :-)\n",
"Releases deleted.\n",
":heart: @sigmavirus24 \n\nThanks for the report @edmorley!\n",
"Many thanks :-)\n"
] |
https://api.github.com/repos/psf/requests/issues/2870
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2870/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2870/comments
|
https://api.github.com/repos/psf/requests/issues/2870/events
|
https://github.com/psf/requests/issues/2870
| 116,100,502 |
MDU6SXNzdWUxMTYxMDA1MDI=
| 2,870 |
Unbundling logic still breaks downstream.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 29 |
2015-11-10T12:49:34Z
|
2021-09-08T07:00:37Z
|
2017-07-30T14:02:09Z
|
MEMBER
|
resolved
|
Originally reported by @fasaxc. Interested parties: @sigmavirus24 @dcramer @eriol @ralphbean @warsaw
We added support for a new form of unbundling logic in #2567. Unfortunately, this seems not to work properly. Specifically, the presence of requests in a system makes it impossible to catch urllib3 exceptions correctly: see [this Stack Overflow question](https://stackoverflow.com/questions/33516164/why-cant-i-catch-this-python-exception) for more.
This problem seems to specifically affect the urllib3 sub-packages, not urllib3 itself. One proposed change is to add every urllib3 sub-package that gets imported to the list of modules requests explicitly imports in this stub file. This is pretty gross, but might work. Does anyone have other suggestions?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2870/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2870/timeline
| null |
completed
| null | null | false |
[
"The issue I'm seeing is that, after importing first `urllib3.exceptions` then `requests.packages.urllib3.exceptions`\n\n```\nsys.modules[\"urllib3\"].exceptions is not sys.modules[\"urllib3.exceptions\"]\nsys.modules[\"urllib3\"].exceptions is sys.modules[\"requests.packages.urllib3.exceptions\"]\n```\n\nImporting `requests.packages.urllib3.exceptions` results in the exceptions module being loaded a second time because patching `sys.modules[\"requests.packages.urllib3\"]` doesn't also patch `sys.modules[\"requests.packages.urllib3.exceptions\"]`, so the import mechanism loads a second copy. Then, it assigns\n\n```\nsys.modules[\"requests.packages.urllib3\"].exceptions = sys.modules[\"requests.packages.urllib3.exceptions\"]\n```\n\nbut `sys.modules[\"requests.packages.urllib3\"]` is an alias for `sys.modules[\"urllib3\"]`, so it clobbers `sys.modules[\"urllib3\"].exceptions`.\n",
"So we definitely broke this. On the other hand, I don't really understand the Python import logic well enough to effectively fix this problem. Thoughts?\n",
"See also https://github.com/kennethreitz/requests/issues/2867\n\nThat SO question was about 2.6.0 I thought. Is this still a problem with 2.8.x?\n",
"Oh that's a good point, did this get backported into the distros?\n",
"So 2.6.x has the old VendorAlias logic from pip. 2.7.0 had nothing but some distros (Debian specifically, more specifically @warsaw) backported the patch from @untitaker that was released in 2.8.0. So version information is **very** important here.\n",
"My recollection is that @fasaxc was using 2.7.0 from Ubuntu 14.04 trusty-liberty, but I may be mistaken.\n",
"Ubuntu may have pulled in Debian's patch from @warsaw. In that case, this is still relevant. (Alternatively, Ubuntu may have continued using the VendorAlias from 2.6.x and we should check that.)\n",
"If it turns out that Debian actually does have the new logic, I'd say it's a good time to think about actually giving up on vendoring urllib3.\n\nOn 10 November 2015 16:20:13 CET, Ian Cordasco [email protected] wrote:\n\n> Ubuntu may have pulled in Debian's patch from @warsaw. In that case,\n> this is still relevant. (Alternatively, Ubuntu may have continued using\n> the VendorAlias from 2.6.x and we should check that.)\n> \n> ---\n> \n> Reply to this email directly or view it on GitHub:\n> https://github.com/kennethreitz/requests/issues/2870#issuecomment-155449214\n\n## \n\nSent from my phone. Please excuse my brevity.\n",
"@untitaker I can understand that, but I don't believe we can really discuss doing that unless we get the whole team together, which realistically means PyCon 2016.\n",
"On Nov 10, 2015, at 07:20 AM, Ian Cordasco wrote:\n\n> Ubuntu may have pulled in Debian's patch from @warsaw. In that case, this is\n> still relevant. (Alternatively, Ubuntu may have continued using the\n> VendorAlias from 2.6.x and we should check that.)\n\nUbuntu tracks the Debian version. There are no deltas, except for some\nadditional fixes backported to the Trusty version.\n\n% rmadison -u debian requests\nrequests | 0.12.1-1~bpo60+1 | squeeze-backports | source\nrequests | 0.12.1-1+deb7u1 | oldstable | source\nrequests | 2.0.0-1~bpo70+2 | wheezy-backports | source\nrequests | 2.4.3-6 | stable | source\nrequests | 2.4.3-6 | stable-kfreebsd | source\nrequests | 2.8.1-1 | testing | source\nrequests | 2.8.1-1 | unstable | source\n\n% rmadison -u ubuntu requests\n requests | 0.8.2-1 | precise/universe | source\n requests | 2.2.1-1 | trusty | source\n requests | 2.2.1-1ubuntu0.2 | trusty-security | source\n requests | 2.2.1-1ubuntu0.3 | trusty-updates | source\n requests | 2.4.3-6 | vivid | source\n requests | 2.7.0-3 | wily | source\n requests | 2.7.0-3 | xenial | source\n requests | 2.8.1-1 | xenial-proposed | source\n",
"`trusty-liberty` is 14.04's OpenStack Liberty Archive then?\n",
"I believe so, yes.\n",
"On Nov 10, 2015, at 06:24 AM, Cory Benfield wrote:\n\n> Oh that's a good point, did this get backported into the distros?\n\nDon't forget, @eriol is really doing most of excellent work on this package\nfor Debian!\n\nOur 2.8.1-1 removed the devendorizing patch we were carrying separately, so\nthese days we are really only carrying a few deltas from upstream:\n\nhttps://anonscm.debian.org/cgit/python-modules/packages/requests.git/tree/debian/patches\n\nOf course, we prefer to reduce the differences between the Debian version and\nupstream.\n",
"Hmm. Can we attempt to discover whether this problem exists in Debian's released packages for 2.8.x?\n",
"Lukasa, who do you mean exactly when you say \"the whole team\"? Isn't Kenneth some sort of BDFL, so his decision would suffice?\n\nOn 10 November 2015 16:51:56 CET, Barry Warsaw [email protected] wrote:\n\n> On Nov 10, 2015, at 06:24 AM, Cory Benfield wrote:\n> \n> > Oh that's a good point, did this get backported into the distros?\n> \n> Don't forget, @eriol is really doing most of excellent work on this\n> package\n> for Debian!\n> \n> Our 2.8.1-1 removed the devendorizing patch we were carrying\n> separately, so\n> these days we are really only carrying a few deltas from upstream:\n> \n> https://anonscm.debian.org/cgit/python-modules/packages/requests.git/tree/debian/patches\n> \n> Of course, we prefer to reduce the differences between the Debian\n> version and\n> upstream.\n> \n> ---\n> \n> Reply to this email directly or view it on GitHub:\n> https://github.com/kennethreitz/requests/issues/2870#issuecomment-155460186\n\n## \n\nSent from my phone. Please excuse my brevity.\n",
"@untitaker He is, correct. However, of the three maintainers he's the one _opposed_ to unbundling.\n\nIan and I are both unanimous that we'd like to unbundle. However, we have a duty of care over this library, and that includes not changing things that Kenneth cares a great deal about without his explicit say-so.\n\nI don't want to have a detailed discussion with Kenneth about this remotely, because this issue is close to all our hearts and we should discuss it in an environment that makes it easy for us to treat each other like humans, with respect and kindness. Those two qualities are often lost online, and that would be a tragedy: our working relationships are more important than that.\n",
"@Lukasa I'm sad to confirm but yes, the problem actually exists in Debian testing (with the new import machinery).\nI tried the same setup as https://github.com/kennethreitz/requests/issues/2867 using only system packages (so I took python-etcd from sid) and I can confirm it.\nI looked at the python-etcd code, but the problem was already explained by @fasaxc and is about exceptions.\n\nOn Debian this is what I got:\n\n``` python\nimport requests\nimport urllib3\n\nfrom urllib3.exceptions import MaxRetryError\n\nprint(urllib3.exceptions.MaxRetryError is MaxRetryError) # False\n```\n\nI'm really sorry I did not notice this when we discussed https://github.com/kennethreitz/requests/pull/2567.\n\nI'm not an import logic expert either, but your proposal seems like the only way to fix this without unvendoring. Maybe we can ask @brettcannon if he can take a look at this: I can volunteer to recap the whole story so far.\n\nAlso, I just want to say thanks to you, @sigmavirus24 and @kennethreitz because although you could just mark this as wontfix, we started working together talking without forgetting that we are all human beings.\nSo, thank you.\n",
"@eriol Thanks for investigating. =) And thanks for working with us on this. Again, I can't reiterate enough, it's really important to us to get to a place where everyone's getting a good experience. We're not there yet, but we'll keep trying.\n\nI'd be delighted to see if someone who knows the import machinery really well can propose a better solution to this problem. If you're willing to do the legwork on ramping up @brettcannon that would be even better @eriol.\n\nOtherwise, we should aim to take that boring manual step or see if we can avoid the problem in requests in some other way.\n",
"> Otherwise, we should aim to take that boring manual step or see if we can avoid the problem in requests in some other way.\n\nWe should do this as a stop-gap solution for now. It will benefit the users and it's trivial for now. Meanwhile we should investigate longer-term solutions to this.\n",
"I'm traveling this week but I can try and take a look next week.\n",
"Basically @fasaxc is right in his analysis of the problem. The tricky bit is that this is all influenced by how you actually import something. You have to realize each of the following statements use slightly different logic to get you what you want:\n\n``` python\nimport urllib3.exceptions; HTTPError = urllib3.exceptions.HTTPError\nfrom urllib3 import exceptions; HTTPError = urllib3.exceptions.HTTPError\nfrom urllib3.exceptions import HTTPError\n```\n\nIf you use the last approach then that will definitely trip you up in this instance because import will do an explicit import for `urllib3.exceptions` _and then_ look for the `HTTPError` attribute. This means that since you are not patching `urllib3.exceptions` in `sys.modules` you end up having `from requests.packages.urllib3 import HTTPError` needing to import `requests.packages.urllib3` directly and since that isn't in `sys.modules` it leads to a fresh import. Now you have two separate modules with two separate classes and thus they won't succeed in a `issubclass()` check in an `except` clause.\n\nNow to solve this, @Lukasa was right and you can just patch every module in `urllib3`. Another option is a custom importer that handles just this case, but I don't know if you want such a magical solution. A third option is to insert an object into `sys.modules` who uses `__getattribute__` to direct to the proper module and handle all magical tweaks to `sys.modules`. And lastly, the simplest solution is either let Debian handle their own desire to monkeypatch vendored code or stop vendoring stuff entirely.\n",
"Thanks so much @brettcannon. :cake: is owed. =)\n\nOk folks, given the set of solutions proposed, does anyone have preferences?\n",
"Thank you @brettcannon!\n",
"So I'm enumerating the options and asking questions to make sure I understand correctly.\n\n> you can just patch every module in `urllib3`\n\nAs in, ensuring that `sys.modules` has both `urllib3.submodule` and `requests.packages.urllib3.submodule`? I think this is the least magic way and given that @Lukasa and I are both core developers of urllib3, we will catch any new submodules that need to be added to requests' patching logic. I'm most strongly in favor of this one.\n\n> Another option is a custom importer that handles just this case, but I don't know if you want such a magical solution.\n\nWe had that option and it broke many a thing. I'm not in favor of going back down that route.\n\n> insert an object into `sys.modules` who uses `__getattribute__` to direct to the proper module and handle all magical tweaks to `sys.modules`. \n\nHm. This sounds like something in the vein of option 2 but a little less magic-y. It still feels like some abuse of the module system and like it might cause us problems. If we come up with something like this, I would appreciate it if @fasaxc and @eriol would commit to testing this with python-etcd and making sure things still aren't broken before shipping a release with it.\n\n> And lastly, the simplest solution is either let Debian handle their own desire to monkeypatch vendored code or stop vendoring stuff entirely.\n\nBreaking this into two sub-suggestions:\n- \"let Debian handle their own desire to monkey patch vendored code\" No. I don't want @eriol to have to figure that out. Vendoring is a position of this project. As much as I dislike downstreams unvendoring things, it isn't the individual maintainer's fault or decision and I don't want to push this work onto their backs.\n- \"stop vendoring stuff entirely\" Many people constantly badger (or even, at times, yell) this at us (I know you're not Brett) but it simply just won't happen like that (if it happens at all).\n\n---\n\nMy 2¢:\n\nWe can take a short term solution of the first option for a `2.8.2` release and get that updated patch to @eriol, @warsaw, @ralphbean, etc. so distros can fix their packages while taking a look at the feasibility and reliability of option 3. I still prefer the first option to the third, but the third may have benefits I'm not seeing due to the bad taste left in my mouth by the `VendorAlias` (a.k.a., option 2).\n",
"@brettcannon many thanks!\n\nI'm also in favour of the less magic solution.\nI will be happy to test before the release python-etcd and all the dependants of requests that can be affected by this: using codesearch.debian.net I noticed that also cinder and proliantutils import urllib3's exceptions using `requests.packages.urllib3`, so I will check also them.\n\n@sigmavirus24 I really appreciate your words, thanks!\n",
"On Nov 17, 2015, at 06:14 AM, Ian Cordasco wrote:\n\n> As in, ensuring that `sys.modules` has both `urllib3.submodule` and\n> `requests.packages.urllib3.submodule`? I think this is the least magic way\n> and given that @Lukasa and I are both core developers of urllib3, we will\n> catch any new submodules that need to be added to requests' patching\n> logic. I'm most strongly in favor of this one.\n\nIt's mildly disconcerting that a library would fiddle with another library's\nsys.modules namespace, but I agree that this is probably the least magical\n(and thus most likely to work) way. Maybe we can convince @brettcannon to\nbuild a nicer foolproof <wink> module alias system for 3.6. :)\n\n> We can take a short term solution of the first option for a `2.8.2` release\n> and get that updated patch to @eriol, @warsaw, @ralphbean, etc. so distros\n> can fix their packages while taking a look at the feasibility and reliability\n> of option 3. I still prefer the first option to the third, but the third may\n> have benefits I'm not seeing due to the bad taste left in my mouth by the\n> `VendorAlias` (a.k.a., option 2).\n\nYep, I suspect that #1 may be the best approach, and that #3 would be\ndifficult to debug if things go south. It may be that no solution is perfect,\ngiven Python's current import semantics and implementation, in which case\ndoing the best you can with the least magic (i.e. most discoverable and\ndebuggable) would suck less.\n\nThanks!\n",
"I created http://bugs.python.org/issue25649 for the aliasing idea that @warsaw had (I wouldn't count on it getting a blessed solution, but you never know :) .\n\nAnd just to toss in my opinion, the explicit aliasing is the safest, most compatible way to do things (it's still not fool-proof since `__name__`, the spec object, etc. will still be the other name, but it's better than nothing). Getting a custom importer right is hard and using a non-module object is actually done in the wild, but some people make assumptions as to the kind of objects they get back from an import so it really depends on your user base whether it will work or not.\n",
"With the recent unbundling, I'm closing this as it should no longer be an issue.",
"Just a quick note to say also that I was able to drop all the Debian specific patches: https://tracker.debian.org/news/860709."
] |
https://api.github.com/repos/psf/requests/issues/2869
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2869/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2869/comments
|
https://api.github.com/repos/psf/requests/issues/2869/events
|
https://github.com/psf/requests/pull/2869
| 116,090,030 |
MDExOlB1bGxSZXF1ZXN0NTAyNDM3MDA=
| 2,869 |
unnest deep loop
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1508133?v=4",
"events_url": "https://api.github.com/users/miroli/events{/privacy}",
"followers_url": "https://api.github.com/users/miroli/followers",
"following_url": "https://api.github.com/users/miroli/following{/other_user}",
"gists_url": "https://api.github.com/users/miroli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/miroli",
"id": 1508133,
"login": "miroli",
"node_id": "MDQ6VXNlcjE1MDgxMzM=",
"organizations_url": "https://api.github.com/users/miroli/orgs",
"received_events_url": "https://api.github.com/users/miroli/received_events",
"repos_url": "https://api.github.com/users/miroli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/miroli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/miroli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/miroli",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-11-10T11:40:58Z
|
2021-09-08T05:01:14Z
|
2015-11-10T13:21:01Z
|
CONTRIBUTOR
|
resolved
|
Flat is better than nested. Instead of nesting `if` statements, check for the opposite early in the loop and `continue` if the opposite is the case.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2869/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2869/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2869.diff",
"html_url": "https://github.com/psf/requests/pull/2869",
"merged_at": "2015-11-10T13:21:01Z",
"patch_url": "https://github.com/psf/requests/pull/2869.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2869"
}
| true |
[
"Good idea @vienno!\n\nIf we're going to un-nest this, let's un-nest it the whole way. Want to break out the next nested couplet as well and use the same logic?\n",
"How about now @Lukasa ? I rewrote the third `if` statement as well, for consistency.\n",
"Thanks, much better! I have one small note, inline on the diff. =)\n",
"Of course! It's been updated.\n",
"Looks good to me @vienno! Would you like to add yourself to AUTHORS.rst?\n",
"Sure!\n",
"Thanks so much @vienno! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2868
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2868/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2868/comments
|
https://api.github.com/repos/psf/requests/issues/2868/events
|
https://github.com/psf/requests/issues/2868
| 115,786,189 |
MDU6SXNzdWUxMTU3ODYxODk=
| 2,868 |
Can't modify cookies from auth plugin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/416057?v=4",
"events_url": "https://api.github.com/users/jamielennox/events{/privacy}",
"followers_url": "https://api.github.com/users/jamielennox/followers",
"following_url": "https://api.github.com/users/jamielennox/following{/other_user}",
"gists_url": "https://api.github.com/users/jamielennox/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jamielennox",
"id": 416057,
"login": "jamielennox",
"node_id": "MDQ6VXNlcjQxNjA1Nw==",
"organizations_url": "https://api.github.com/users/jamielennox/orgs",
"received_events_url": "https://api.github.com/users/jamielennox/received_events",
"repos_url": "https://api.github.com/users/jamielennox/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jamielennox/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamielennox/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jamielennox",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-11-09T03:50:40Z
|
2021-09-08T15:00:53Z
|
2016-09-06T00:09:19Z
|
NONE
|
resolved
|
HI,
I'm writing an auth plugin in which I want to manipulate the cookies set by a response. Specifically in this case I need to use cookies for the auth exchange but i don't want them added to the session. When the response hooks and similar are finished the response is returned, the history examined and cookies merged into the cookie jar using the _original_response.msg httpmessage that urllib3 sets. This means there is no public way for me to manipulate the headers to prevent these cookies being added to the session.
From a quick examination my guess to address this would be to regenerate the httpmessage header list from the response.headers.
My solution for now is going to be not include the intermediate requests in the history so that the cookies don't get picked up. I'm also going to look into a cookie policy that simply denies everything.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2868/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2868/timeline
| null |
completed
| null | null | false |
[
"I'm sorry, it's not clear what exactly the bug is here. What function would you like that you don't have?\n",
"If I know @jamielennox, they're working on SAML auth plugin. Part of (some of) the SAML authentication flow(s) includes cookies being sent that aren't meant to be persisted on the session.\n\nI think what @jamielennox wants is for those cookies that are meant to be part of an intermediate step in authentication not to be persisted. After a series of redirects, we process everything and update the Session cookie jar with the cookies from individual responses in the history. I suspect @jamielennox was removing the cookies individually from the intermediate responses hoping that would prevent them from being persisted on the Session but that isn't working because we don't use those attributes when extracting cookies to a session cookie jar. (Further, this is likely a plugin for OpenStack.)\n\nI don't think there is necessarily a bug here. We're doing the right thing for the 99% case.\n\n@jamielennox I suspect the cookie names are predictable and you could have a CookieJar that ignores those cookie values (along the lines of a far more selective ForgetfulCookieJar which @ceaess is adding to the requests-toolbelt).\n",
    "Right, sorry i was trying to make the issue a bit more generic than my case. Requests is absolutely doing the right thing in 99% of cases. We have the issue that we are in some cases reusing sessions between different authenticated flows. This hasn't been an issue in the past because we are doing our own token based authentication against REST apis so we just haven't had to contend with cookies. \n\nEssentially my case is that i don't want these cookies that are part of the auth flow to end up in the general session as then they are available across user flows.\n\nMy hope here was that i could develop a generic SAML plugin that we could contribute back to requests but then subclass it from an openstack perspective to remove those cookies. However there was no way that i could strip these cookies. If i cleared the cookie jar or removed all the Set-Cookie headers from response.raw.getheaders or response.headers the cookies were still persisted because the CookieJar is looking directly at the _original_message (https://github.com/kennethreitz/requests/blob/dd0f164f8ed8e6f936c8af4442f6ddfa4d302fd5/requests/cookies.py#L122) which is a private variable that i cannot (should not) modify from the plugin. \n\nI was intending to say on that initial report that i don't know if my case is worth supporting. And i'm going to look into either a disable all cookie jar policy or just doing my own no-op cookie jar for our case. I just thought i would let you know it was a problem i hit because i doubt anyone has seen it before.\n\nI will have a look into ForgetfulCookieJar as well. \n\nThanks\n",
"Hmm. Just to clarify my own understanding, the core problem here is that you are acting as a middleman between some set of users on one side and a set of REST-accessible resources on the other. You are handling requests on behalf of some set of users _n_ with some set of sessions _m_ where _m_<_n_, which means that each session may have _n_/_m_ authenticated users running through it. The problem you have there is that you don't want your auth cookies to end up in the Session, lest they accidentally extend their auth scope to one of the other users. Is that about right?\n\nAssuming it is (I'm aware our timezone overlap is bad, so I'm going to make that assumption about your answer to give more information: shout if my assumption is wrong), the actual problem you have is cookies in general. Sessions keep all cookies in one place, which means there's no way of keeping cookies that semantically \"belong\" to different users apart. This is clearly a problem for the SAML auth flow cookies, but is also a wider problem: if you were using a REST API that happened to set cookies, you'd be unable to keep those isolated either.\n\nFor that reason, you either want one Session per user, or you want to disallow cookies altogether. The first is more generally correct at the cost of being _substantially_ more resource intensive, the other potentially has a weird failure mode (refusing to save a cookie that a service needs) but avoids the resource intensity.\n",
"@jamielennox I'm currently writing tests for the ForgetfulCookieJar and I'm confident it should be available in requests-toolbelt in the next week or so.\n",
"@Lukasa Yes that's all correct, it's just that the SAML case is the first time i've run across cookies being used at all in OpenStack. I need to look into a way of managing cookies seperate to what the requests.Session is doing. For almost all cases what i need now is just drop all cookies.\n",
"@jamielennox so you can lift the couple lines of code that constitute the [ForgetfulCookieJar](https://github.com/sigmavirus24/requests-toolbelt/blob/master/requests_toolbelt/cookies/forgetful.py#L5) or you can wait for/help with requests-toolbelt 0.5 so you can add it to OpenStack.\n"
] |
https://api.github.com/repos/psf/requests/issues/2867
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2867/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2867/comments
|
https://api.github.com/repos/psf/requests/issues/2867/events
|
https://github.com/psf/requests/issues/2867
| 115,776,920 |
MDU6SXNzdWUxMTU3NzY5MjA=
| 2,867 |
How to deal with using requests and urllib3 together
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12321?v=4",
"events_url": "https://api.github.com/users/sjmh/events{/privacy}",
"followers_url": "https://api.github.com/users/sjmh/followers",
"following_url": "https://api.github.com/users/sjmh/following{/other_user}",
"gists_url": "https://api.github.com/users/sjmh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sjmh",
"id": 12321,
"login": "sjmh",
"node_id": "MDQ6VXNlcjEyMzIx",
"organizations_url": "https://api.github.com/users/sjmh/orgs",
"received_events_url": "https://api.github.com/users/sjmh/received_events",
"repos_url": "https://api.github.com/users/sjmh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sjmh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sjmh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sjmh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2015-11-09T01:37:31Z
|
2021-09-08T21:00:46Z
|
2015-11-09T03:09:21Z
|
NONE
|
resolved
|
Ran into an interesting issue in the past week while testing things out with SaltStack. ( SaltStack is a remote execution engine / cfg mgmt tool ). SaltStack uses requests internally.
However, one of the 'modules' you can use is python-etcd. python-etcd uses urllib3. But, because of request's wrapping of urllib3 exceptions, some of the python-etcd code bombs because it tries to catch urllib3 exceptions via fully-qualified names.
An example of this happening can be easily demo'd with:
```
#!/usr/bin/python
#import requests
import etcd
c = etcd.Client('127.0.0.1', 4001)
c.read("/blah", wait=True, timeout=2)
```
Without Requests, you get etcd.EtcdConnectionFailed. With it, you get urllib3.exceptions.ReadTimeoutError. python-etcd tries to catch urllib3 exceptions, but because of request's wrapping, it fails to catch them.
Is there a way to work around this properly ( besides going to the python-etcd author and asking him to change everything to use requests instead of urllib3 )?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2867/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2867/timeline
| null |
completed
| null | null | false |
[
"How are you installing requests and urllib3?\n",
"Via RPM ( both from RHEL's official repos for RHEL6 )\n\n```\n[root@alpha site-packages]# rpm -qa | grep urllib3\npython-urllib3-1.10.2-1.el6.noarch\n[root@alpha site-packages]# rpm -qa | grep requests\npython-requests-2.6.0-3.el6.noarch\n```\n",
"So this is a version of requests that had some magic handling for downstream redistributors that remove parts of requests. This should be fixed in newer versions of requests but I have no control over RHEL's repos.\n",
"@sigmavirus24 No problem - the pip versions should be good though?\n",
"@sjmh should be!\n",
"@sigmavirus24 Alrighty - thanks, just confirmed those versions work! Do you know if there's any way to get around the issue with those RPM versions? If not, is there a minimum version that this issue would be fixed in ( Looks like newest from pip is 2.8.1 ) or is this a general thing for all downstream redistributors in any version?\n\nThanks for the help.\n",
"I forget in which version we fixed it upstream but I'm tempted to say 2.7.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/2866
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2866/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2866/comments
|
https://api.github.com/repos/psf/requests/issues/2866/events
|
https://github.com/psf/requests/issues/2866
| 115,669,814 |
MDU6SXNzdWUxMTU2Njk4MTQ=
| 2,866 |
Better testing for chunked uploads.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 15 |
2015-11-07T14:29:59Z
|
2021-09-08T18:00:59Z
|
2016-04-11T20:29:02Z
|
MEMBER
|
resolved
|
We have a whole code branch that does chunked uploads that is not and has never been tested. That's a problem, because sometimes it breaks and we don't notice (#2861). I'd like to add more testing for chunked uploads.
However, we can't use httpbin to do this testing. This is because the WSGI spec staunchly refuses to let applications see chunked transfer encoding at the app layer. For this reason, to test chunked transfer encoding requires a new style of testing, one that doesn't involve running a WSGI server but instead running something that can see the bytes _as they hit the wire_. Basically, something like the socket level tests of urllib3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2866/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2866/timeline
| null |
completed
| null | null | false |
[
"One possibility would be adapting urllib3's solution to our case. This new server we'd make doesn't need to support the whole HTTP protocol, because we just need it for the chunked transfer encoding thing, right? \n",
"I think we wouldn't actually use a proper server at all, just a pair of sockets.\n",
    "Right, ok. \n\nBoth #2860 and this issue come from the same problem: pytest-httpbin's server doesn't seem to support the whole spectrum of our testing needs properly (not saying it is bad software, it's great). We could solve both issues at the same time just by making our socket server able to listen to _parallel_ requests. The problem is that it would be necessary to hardcode the response headers in the threaded's auth case, which is far from optimal.\n",
"Even that is overdesigning it. All we need is to be able to receive a request and send a response. Parallel requests aren't a concern, just one is totally fine. We need a listening socket that can send a pre-canned 200 response when a request is received then read all the data. That's literally all we need to do.\n",
"Yes, I understand. My point was that we could solve #2860 too by making this server able to listen to parallel requests. It's ok though if you think it would be better to address those two problems separately.\n",
"Oh, I see. Frankly, I think we need to write a better test for #2860, because the approach we had been using (brute force running a lot of threads) is not really the most effective way to test the function.\n",
"Ok!\n\nWould it be ok if I start a PR trying to adapt `SocketDummyServer` and `SocketDummyServerTestCase` from urllib3? \n",
"You are welcome to, though it'll probably require quite a bit of work.\n",
"May it make sense to get this into urllib3 and also get rid of tornado there?\n",
"@ml31415 Given that we're going to try to move the chunked encoding logic to urllib3, that's probably true. However, the time investment here is still useful, because this kind of focused socket-based testing is better than using httpbin. =)\n",
"Agreed, that it makes sense to have this kind of testing in requests. But the testserver itself may be interesting for urllib3 too. So maybe put the testserver there, and requests just reuses it for testing? Instead of duplicating code around.\n",
"urllib3 already has its own equivalent of this. =) That's where we stole the idea from. However, it's a little bit nicer to avoid importing urllib3's test framework, because generally speaking testing functionality is not covered by versioning constraints, and can change at any time. Having that change under our feet without warning would be...annoying.\n",
"If I didn't overlook anything, urllib3 uses tornado for a bunch of testing (besides the dummyserver). This is quite some overhead imho, and could be done with a much simpler - own - test server implementation, e.g. extending the dummyserver, or adding some features here.\n",
"@ml31415 If urllib3 switches away from Tornado we'd like to switch to a socket-level test approach as you saw with the chunked encoding tests. That's something we're interested in doing, but it's hard to find the time to do the migration. If you're volunteering, then go for it!\n",
"Hmm, just had a closer look on it. Moving a bunch of the dummyserver tests to the socketserver (e.g. the retry tests), and replacing tornado with some pimped SimpleHTTPServer for the remaining tests seems quite possible... anyways, I guess that's another discussion then.\n"
] |
https://api.github.com/repos/psf/requests/issues/2865
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2865/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2865/comments
|
https://api.github.com/repos/psf/requests/issues/2865/events
|
https://github.com/psf/requests/issues/2865
| 115,634,401 |
MDU6SXNzdWUxMTU2MzQ0MDE=
| 2,865 |
Can't pass boolean parameter in body of POST request when using multipart encoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/604138?v=4",
"events_url": "https://api.github.com/users/Aerlinger/events{/privacy}",
"followers_url": "https://api.github.com/users/Aerlinger/followers",
"following_url": "https://api.github.com/users/Aerlinger/following{/other_user}",
"gists_url": "https://api.github.com/users/Aerlinger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Aerlinger",
"id": 604138,
"login": "Aerlinger",
"node_id": "MDQ6VXNlcjYwNDEzOA==",
"organizations_url": "https://api.github.com/users/Aerlinger/orgs",
"received_events_url": "https://api.github.com/users/Aerlinger/received_events",
"repos_url": "https://api.github.com/users/Aerlinger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Aerlinger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Aerlinger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Aerlinger",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-11-07T05:05:10Z
|
2021-09-08T21:00:46Z
|
2015-11-07T07:25:27Z
|
NONE
|
resolved
|
Here's a short example of what I'm experiencing:
``` python
params = {
"someBoolean": True
}
response = requests.post(endpoint, data=params, files={'input_file', open(fileobj, 'rb')})
```
The issue is that requests encodes the `someBoolean` parameter as the string value `"True"` rather than the boolean literal `true` value.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2865/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2865/timeline
| null |
completed
| null | null | false |
[
"The `data` argument does not encode as JSON: it encodes as form-encoded data. This has no `true` literal. If you want to encode json, you'll want to use the `json` argument, but you cannot combine that argument with the `files` argument. Instead, try this:\n\n``` python\nimport json\nresponse = requests.post(endpoint, data=json.dumps(params), files={'input_file', open(fileobj, 'rb')})\n```\n",
"@Lukasa that won't work. We expect `data` to be a dictionary (or list of tuples with length two) and `files` to be the same. That said, without a description of what the endpoint you're talking to is actually expecting, we can't provide more help than we already have.\n\nThis is probably a problem better suited for [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n"
] |
https://api.github.com/repos/psf/requests/issues/2864
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2864/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2864/comments
|
https://api.github.com/repos/psf/requests/issues/2864/events
|
https://github.com/psf/requests/pull/2864
| 115,587,756 |
MDExOlB1bGxSZXF1ZXN0NTAwMDM5MDY=
| 2,864 |
Assure session is closed on exception.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1945440?v=4",
"events_url": "https://api.github.com/users/aartur/events{/privacy}",
"followers_url": "https://api.github.com/users/aartur/followers",
"following_url": "https://api.github.com/users/aartur/following{/other_user}",
"gists_url": "https://api.github.com/users/aartur/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aartur",
"id": 1945440,
"login": "aartur",
"node_id": "MDQ6VXNlcjE5NDU0NDA=",
"organizations_url": "https://api.github.com/users/aartur/orgs",
"received_events_url": "https://api.github.com/users/aartur/received_events",
"repos_url": "https://api.github.com/users/aartur/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aartur/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aartur/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aartur",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-11-06T20:59:03Z
|
2021-09-08T05:01:15Z
|
2015-11-06T21:02:31Z
|
CONTRIBUTOR
|
resolved
|
As I noticed in #2862 session is not closed on exception when using high-level API (not sessions). Here's a fix for it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2864/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2864/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2864.diff",
"html_url": "https://github.com/psf/requests/pull/2864",
"merged_at": "2015-11-06T21:02:30Z",
"patch_url": "https://github.com/psf/requests/pull/2864.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2864"
}
| true |
[
"Looks good to me! Want to add yourself to AUTHORS?\n",
"s/AUTHORS/CONTRIBUTORS/\n",
"It's too small, next time ;).\n",
"Ok! :sparkles: :cake: :sparkles:\n",
"Thanks @aartur \n"
] |
https://api.github.com/repos/psf/requests/issues/2863
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2863/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2863/comments
|
https://api.github.com/repos/psf/requests/issues/2863/events
|
https://github.com/psf/requests/issues/2863
| 115,547,977 |
MDU6SXNzdWUxMTU1NDc5Nzc=
| 2,863 |
Multiple requests on a single Session with different client certs behave incorrectly
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/772?v=4",
"events_url": "https://api.github.com/users/alex/events{/privacy}",
"followers_url": "https://api.github.com/users/alex/followers",
"following_url": "https://api.github.com/users/alex/following{/other_user}",
"gists_url": "https://api.github.com/users/alex/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alex",
"id": 772,
"login": "alex",
"node_id": "MDQ6VXNlcjc3Mg==",
"organizations_url": "https://api.github.com/users/alex/orgs",
"received_events_url": "https://api.github.com/users/alex/received_events",
"repos_url": "https://api.github.com/users/alex/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alex",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 18 |
2015-11-06T17:10:13Z
|
2021-09-08T08:00:34Z
|
2017-07-25T22:34:55Z
|
MEMBER
|
resolved
|
Consider the following code:
``` py
import requests
s = requests.session()
s.get("https://host", cert="path/to/cert1.pem")
s.get("https://host", cert="path/to/cert2.pem")
```
`requests` will silently reuse the same connection for both requests, which is incorrect.
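
A minimal workaround sketch (an editor's illustration, not part of the report): until pooling keys on TLS settings, using a fresh `Session` per client cert guarantees that a connection authenticated with one cert is never reused for a request that should present a different one. The helper name below is hypothetical.

``` py
import requests


def get_with_cert(url, cert, **kwargs):
    # Hypothetical helper: open a fresh Session (and thus a fresh
    # connection pool) per client cert, so a pooled connection
    # negotiated with one cert can never serve a request that
    # should present another.
    with requests.Session() as session:
        return session.get(url, cert=cert, **kwargs)
```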
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2863/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2863/timeline
| null |
completed
| null | null | false |
[
"Yeah, this is not good. We have a similar problem if a user switches the 'verify' flag as well, so we need to fix that.\n",
"So, @shazow, I think really we need to adjust the connection pooling logic in urllib3 a bit. Keying connections off the URL isn't really enough: for HTTPS connections we need to key them off the various connection state parameters as well (cert_reqs, ca_certs, ca_cert_dir, client cert parameters). That's pretty gross because it's a major API change.\n\nThoughts?\n",
"Hm I think we had this discussion before somewhere and the conclusion is that we need to avoid exposing per-request cert configs?\n",
"Maybe we need to make the PoolManager keying more configurable in general, though?\n",
"I think I'm hitting the same problem where I want to change the auth between requests, but it looks like the first set of credentials gets used for all requests.\n",
"Yup, this is a known issue. It's on my list to fix, but don't expect the fix to be soon. =)\n",
"is there a workaround?\n\nOn Wed, Nov 25, 2015 at 10:33 AM, Cory Benfield [email protected]\nwrote:\n\n> Yup, this is a known issue. It's on my list to fix, but don't expect the\n> fix to be soon. =)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2863#issuecomment-159642250\n> .\n",
"The best workaround at this time is to avoid using a Session or, if you are using one of the non-connection-pooling features of the Session, to run a TransportAdapter that never pools connections.\n",
"I'm using the Session to track cookies. I'll look into alternate\nTransportAdapters, thanks.\n\nOn Wed, Nov 25, 2015 at 10:37 AM, Cory Benfield [email protected]\nwrote:\n\n> The best workaround at this time is to avoid using a Session or, if you\n> are using one of the non-connection-pooling features of the Session, to run\n> a TransportAdapter that never pools connections.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2863#issuecomment-159643426\n> .\n",
"Would it be possible to add this limitation to the docs? I lost several\nhours of work debugging this today.\nEl 25 nov. 2015 10:39, \"Chris AtLee\" [email protected] escribió:\n\n> I'm using the Session to track cookies. I'll look into alternate\n> TransportAdapters, thanks.\n> \n> On Wed, Nov 25, 2015 at 10:37 AM, Cory Benfield [email protected]\n> wrote:\n> \n> > The best workaround at this time is to avoid using a Session or, if you\n> > are using one of the non-connection-pooling features of the Session, to run\n> > a TransportAdapter that never pools connections.\n> > \n> > —\n> > Reply to this email directly or view it on GitHub\n> > https://github.com/kennethreitz/requests/issues/2863#issuecomment-159643426\n> > .\n",
"We could do, but I think it'd be better long-term if we made a move to fix it instead.\n",
"@catlee My gift to you is the first part of the work required for this: shazow/urllib3#751.\n",
"Hi all. I would be very excited if this particular issue got resolved and would be more than willing to contribute my time to make that happen. I've spent a bit of time poking around in both urllib3 and requests, but I'm still pretty new to both libraries.\n\nIs the plan to get https://github.com/shazow/urllib3/pull/751 into a satisfactory state and then refactor the `requests.adapters.HTTPAdapter` to use the new API? It seems like the urllib3 pull request is stalled because of design uncertainty, so I'm not sure there's a lot I can do there (especially since I don't really know what I'm doing yet). Is there anything else I can do to help?\n",
"Yeah, essentially that's the plan. Last I checked, I think @shazow and I were unclear on exactly the direction that change should take.\n",
"I just ran into this and struggled at some length. Here's a quick doc change to prevent others from suffering the same fate: https://github.com/kennethreitz/requests/pull/3200\n",
"This should be fixed by #3109. Once that's done, we can close this out.\n",
"Should this be closed now?",
"Yep, it looks like Jeremy's kw work solved this. Thanks for pinging @alex!"
] |
https://api.github.com/repos/psf/requests/issues/2862
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2862/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2862/comments
|
https://api.github.com/repos/psf/requests/issues/2862/events
|
https://github.com/psf/requests/issues/2862
| 115,507,408 |
MDU6SXNzdWUxMTU1MDc0MDg=
| 2,862 |
Big memory leaks on PyPy (simplest usage)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1945440?v=4",
"events_url": "https://api.github.com/users/aartur/events{/privacy}",
"followers_url": "https://api.github.com/users/aartur/followers",
"following_url": "https://api.github.com/users/aartur/following{/other_user}",
"gists_url": "https://api.github.com/users/aartur/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aartur",
"id": 1945440,
"login": "aartur",
"node_id": "MDQ6VXNlcjE5NDU0NDA=",
"organizations_url": "https://api.github.com/users/aartur/orgs",
"received_events_url": "https://api.github.com/users/aartur/received_events",
"repos_url": "https://api.github.com/users/aartur/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aartur/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aartur/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aartur",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2015-11-06T13:44:32Z
|
2021-09-08T21:00:47Z
|
2015-11-06T15:51:06Z
|
CONTRIBUTOR
|
resolved
|
The simplest usage of `requests` causes big memory leaks on PyPy (2.6.1, 4.0.0) and newest requests (2.8.1). The following script:
``` py
import requests
import time
import sys


def main():
    while True:
        r = requests.get('https://www.bing.com/')
        time.sleep(0.1)
        sys.stderr.write('.')
        sys.stderr.flush()


if __name__ == '__main__':
    main()
```
causes memory usage to grow by about 1 MB per second.
What does not help: creating and closing a `Session` for each request. What seems to help: using a global `Session`.
I have seen issues about it like https://github.com/kennethreitz/requests/issues/1685 but I'm not sure it was fixed and if it's regression.
Can connection pooling and keep-alive be disabled as a workaround? It seems like keep-alive is hardcoded now.
I don't see a leak on CPython, which probably means the bug is caused by relying on reference-counting side effects combined with incorrect cleanup code.
BTW, this line: https://github.com/kennethreitz/requests/blob/HEAD/requests/api.py#L54 should be wrapped around `try ... finally` (or `with ...`), unless session.request can't ever throw an exception.
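
One workaround sketch for the keep-alive question above (an editor's assumption, not something the report verified): asking the server to close the connection after each response keeps sockets from accumulating in the pool.

``` py
import requests


def fetch_once(url):
    # Sending "Connection: close" asks the server to tear down the
    # socket after the response, so nothing lingers in the pool.
    return requests.get(url, headers={'Connection': 'close'})
```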
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2862/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2862/timeline
| null |
completed
| null | null | false |
[
"This is quite puzzling. I ran the script for a long time, then broke into pdb.\n\n``` python\n(Pdb) objgraph.show_most_common_types(limit=20)\nstr 18275\nfunction 5946\nint 5812\ncode 4013\ntuple 3548\nbuiltin-code 2646\ndict 880\nunicode 766\ntype 719\nbuiltin_function 686\ngetset_descriptor 447\nlist 434\nmodule 243\nstaticmethod 189\nset 184\nproperty 173\nmethod 143\nweakref 129\nclassobj 116\nlong 113\n(Pdb) objgraph.show_growth(limit=20)\nlist 435 +1\n(Pdb) requests.get('https://www.bing.com/')\n<Response [200]>\n(Pdb) objgraph.show_growth(limit=20)\n(Pdb)\n```\n\nPut another way, the simple act of calling requests.get does not seem to leak memory. This can be verified by removing the assignment to `r`: the leak is still present. I'm not really seeing any requests objects in memory at this point, which begins to suggest that this may be a PyPy problem, but I'll keep digging and see if I can shrink the repro code.\n",
"Alright, I can remove the leak by switching to `http://` instead of `https://`. Suggests this leak is either in PyPy's `ssl` module or in our (probably urllib3's) use of that module.\n",
"Well, not totally removed: I get a _much_ slower increase in memory usage. Like, 1/100 of what it was before.\n",
"This may actually be worse than it seems. objgraph's top 20 types are basically identical in the `http` and `https` cases, which may mean that the leak is actually in the C interface of PyPy's SSL module. I really hope it's not, so I'll try to construct a test case that disproves that hypothesis.\n",
"Yup, I can reproduce a consistent memory leak with just this code:\n\n``` python\nimport gc\nimport time\nimport socket\nimport ssl\nimport sys\n\ndef main():\n while True:\n ctx = ssl.create_default_context()\n s = socket.create_connection(('www.bing.com', 443))\n s = ctx.wrap_socket(s, server_hostname='www.bing.com')\n s.close()\n\n time.sleep(0.1)\n sys.stderr.write('.')\n sys.stderr.flush()\n gc.collect()\n\nif __name__ == '__main__':\n main()\n```\n",
"That makes this a PyPy bug, not a requests bug. There's no requests code in the above, but the problem still occurs.\n",
"And this does not leak in CPython 2.7.10. I need to update my PyPy version briefly to test on the most recent, but I'm pretty confident that this is a PyPy bug.\n",
"This issue can now be followed as [PyPy issue #2183](https://bitbucket.org/pypy/pypy/issues/2183/pypy-400-ssl-module-appears-to-leak-memory). Thanks for the report @aartur!\n"
] |
https://api.github.com/repos/psf/requests/issues/2861
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2861/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2861/comments
|
https://api.github.com/repos/psf/requests/issues/2861/events
|
https://github.com/psf/requests/pull/2861
| 115,349,215 |
MDExOlB1bGxSZXF1ZXN0NDk4NjQ3NTE=
| 2,861 |
Fix breakage introduced by 8f591682
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2015-11-05T18:55:57Z
|
2021-09-08T06:00:47Z
|
2015-11-07T17:50:50Z
|
MEMBER
|
resolved
|
Turns out we broke chunked uploads in 8f591682e6a901baf0cc8a0393b5930252f0318a. I discovered this while investigating #2215.
While we were here, I added a test that would at least have caught errors as egregious as this one.
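
For context, a short sketch of how a chunked upload is triggered (the commented-out URL is illustrative): passing a generator, which has no known length, as `data` makes requests fall back to chunked transfer encoding.

``` py
import requests


def body_chunks():
    # A generator body has no length, so requests uses
    # Transfer-Encoding: chunked for the request body.
    yield b'first chunk'
    yield b'second chunk'


# Illustrative only -- performs a network call:
# requests.post('http://httpbin.org/post', data=body_chunks())
```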
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2861/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2861/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2861.diff",
"html_url": "https://github.com/psf/requests/pull/2861",
"merged_at": "2015-11-07T17:50:50Z",
"patch_url": "https://github.com/psf/requests/pull/2861.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2861"
}
| true |
[
"To note: this breakage did not make it into any shipped version of requests, so we don't need to schedule a patch release for this.\n",
"Why is the build failing?\n",
"Good question. The test behaves fine on my\nMac, but it's a tricky one because chunked encoding and WSGI don't necessarily play nice. \n\n> On 5 Nov 2015, at 19:02, Ian Cordasco [email protected] wrote:\n> \n> Why is the build failing?\n> \n> —\n> Reply to this email directly or view it on GitHub.\n",
"Hmm, so this failure is intermittent on my machine.\n",
"(my machine == Linux box I found)\n",
"I wonder if this is actually a problem with the WSGI server being used by pytest-httpbin. Specifically, I think it might be related to the speed of the server.\n",
"Specifically, I think what's happening here is that the chunked-encoding header is being stripped, and so the server is sending the 200 right away and then killing the connection. This works _sometimes_ because the socket sends happen before the connection is closed, but sometimes the server closes the connection before we can get there.\n\nThis is pretty annoying, because it means it's likely very hard to find a good test-case here.\n",
"More importantly, it means we can't really have tests _at all_ for chunked transfer encoding.\n",
"I honestly don't know what to do about this. To get a test that confirms we don't break this we need either to change the WSGI server used by pytest-httpbin (the only options are ones that are _really_ heavyweight and probably not suitable) or start writing new tests that don't use a httpbin at all.\n\nThe second is plenty do-able, it just requires some py.test test fixtures and some socket work. But it makes me a bit nervous given @kennethreitz's attitude to changing our test layout in the past: I'd want to move all the tests to a `test/` subdirectory so that I can add the necessary `conf.py` file, and also start writing new tests that explicitly use sockets rather than a web server.\n\n@kennethreitz, do you have any objection to me adding some more thorough testing around here? The reality is that our chunked upload logic is _entirely_ untested, and has been since it was originally written. That is a recipe for accidental breakage, and it makes me really nervous.\n",
"Ok, so I pushed a new commit that doesn't contain the test. This should pass, and the fix _is_ right, but I don't want us to lose track of the need for better testing here so I'm going to open a separate issue for that.\n",
"@sigmavirus24 This should be good to merge.\n",
"@Lukasa I'm :+1: but I think we should work on fixing our test suite. If Kenneth comes along and decimates it again, then we at least had test coverage for some period of time there.\n"
] |
https://api.github.com/repos/psf/requests/issues/2860
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2860/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2860/comments
|
https://api.github.com/repos/psf/requests/issues/2860/events
|
https://github.com/psf/requests/issues/2860
| 115,309,255 |
MDU6SXNzdWUxMTUzMDkyNTU=
| 2,860 |
Replace missing test for threaded HTTP digest auth.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] | null | 1 |
2015-11-05T15:34:22Z
|
2024-05-19T18:37:30Z
|
2024-05-19T18:37:30Z
|
MEMBER
| null |
This test was removed as part of #2859, but we should aim to put it back in some form.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2860/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2860/timeline
| null |
completed
| null | null | false |
[
"Should this now be closed with the recent progress? #2897\n"
] |
https://api.github.com/repos/psf/requests/issues/2859
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2859/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2859/comments
|
https://api.github.com/repos/psf/requests/issues/2859/events
|
https://github.com/psf/requests/pull/2859
| 115,286,845 |
MDExOlB1bGxSZXF1ZXN0NDk4MjU1NzM=
| 2,859 |
Move test suite offline using pytest-httpbin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2015-11-05T13:48:32Z
|
2021-09-08T06:00:48Z
|
2015-11-05T15:37:22Z
|
MEMBER
|
resolved
|
Resolves #2184, and should allow us to move back to Travis for our testing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2859/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2859/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2859.diff",
"html_url": "https://github.com/psf/requests/pull/2859",
"merged_at": "2015-11-05T15:37:22Z",
"patch_url": "https://github.com/psf/requests/pull/2859.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2859"
}
| true |
[
"There _ought_ to be a lot of improvements to doing this. Tests should be a lot closer to reproducible, and the runtime should improve greatly. This is still not really how I want this test suite to look, but it's a _lot_ closer.\n",
"One request, then I'll press the big green button. :)\n\nAlternatively, we can add a bug to track the work of adding that test.\n",
"I think it's better to add a bug than a note in the code: we'll just lose the note. ;)\n",
"Thanks @Lukasa!\n"
] |
https://api.github.com/repos/psf/requests/issues/2858
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2858/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2858/comments
|
https://api.github.com/repos/psf/requests/issues/2858/events
|
https://github.com/psf/requests/pull/2858
| 115,282,668 |
MDExOlB1bGxSZXF1ZXN0NDk4MjI4MjA=
| 2,858 |
Add support for a directory of CAs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2016-01-05T17:14:26Z",
"closed_issues": 2,
"created_at": "2015-10-12T10:32:11Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/26",
"id": 1350559,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/26/labels",
"node_id": "MDk6TWlsZXN0b25lMTM1MDU1OQ==",
"number": 26,
"open_issues": 0,
"state": "closed",
"title": "2.9.0",
"updated_at": "2016-01-05T17:14:26Z",
"url": "https://api.github.com/repos/psf/requests/milestones/26"
}
| 0 |
2015-11-05T13:22:14Z
|
2021-09-08T06:00:47Z
|
2015-11-07T21:39:41Z
|
MEMBER
|
resolved
|
Resolves #2659.
This would have to go in 2.9.0.
Once again, our test suite isn't really in a place that lets us effectively test this.
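
Usage sketch under the new behaviour (the directory path is made up): `verify` may point at a directory of CA certificates, prepared with OpenSSL's `c_rehash`, instead of a single bundle file.

``` py
import requests


def get_with_ca_dir(url, ca_dir='/etc/ssl/my-cas'):
    # ``verify`` accepts a c_rehash-processed directory of CA certs
    # as well as a single PEM bundle; the default path here is
    # purely illustrative.
    return requests.get(url, verify=ca_dir)
```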
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2858/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2858/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2858.diff",
"html_url": "https://github.com/psf/requests/pull/2858",
"merged_at": "2015-11-07T21:39:41Z",
"patch_url": "https://github.com/psf/requests/pull/2858.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2858"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2857
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2857/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2857/comments
|
https://api.github.com/repos/psf/requests/issues/2857/events
|
https://github.com/psf/requests/issues/2857
| 115,210,530 |
MDU6SXNzdWUxMTUyMTA1MzA=
| 2,857 |
Expose expires attribute of cookie in class RequestsCookieJar
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/748271?v=4",
"events_url": "https://api.github.com/users/ammsa/events{/privacy}",
"followers_url": "https://api.github.com/users/ammsa/followers",
"following_url": "https://api.github.com/users/ammsa/following{/other_user}",
"gists_url": "https://api.github.com/users/ammsa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ammsa",
"id": 748271,
"login": "ammsa",
"node_id": "MDQ6VXNlcjc0ODI3MQ==",
"organizations_url": "https://api.github.com/users/ammsa/orgs",
"received_events_url": "https://api.github.com/users/ammsa/received_events",
"repos_url": "https://api.github.com/users/ammsa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ammsa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ammsa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ammsa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-05T05:04:18Z
|
2021-09-08T21:00:45Z
|
2015-11-09T09:25:49Z
|
NONE
|
resolved
|
Right now, the `RequestsCookieJar` class methods only expose the `name` and `value` attributes of its cookies. It would be useful to have a method that checks whether any of the cookies in a `RequestsCookieJar` have expired: an `is_expired()` method would return `True` if any cookies are expired, otherwise `False`.
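Because `RequestsCookieJar` subclasses the standard library's `CookieJar`, the requested check can be done by iterating the jar. A minimal stdlib-only sketch (using Python 3's `http.cookiejar`, `cookielib` on Python 2; the cookie attribute values are illustrative):

``` python
import time
from http.cookiejar import Cookie, CookieJar

# Build a cookie whose expiry timestamp is already in the past.
expired = Cookie(
    version=0, name="session", value="abc", port=None, port_specified=False,
    domain="example.com", domain_specified=True, domain_initial_dot=False,
    path="/", path_specified=True, secure=False,
    expires=int(time.time()) - 3600,  # one hour ago
    discard=False, comment=None, comment_url=None, rest={}, rfc2109=False,
)

jar = CookieJar()
jar.set_cookie(expired)

# RequestsCookieJar iterates the same way, yielding Cookie objects,
# so the same expression works on session.cookies.
any_expired = any(cookie.is_expired() for cookie in jar)
print(any_expired)
```

The same one-liner applied to `session.cookies` gives the `is_expired()` behavior the issue asks for.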
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2857/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2857/timeline
| null |
completed
| null | null | false |
[
"The `RequestsCookieJar` is a subclass of the standard library's [`cookiejar`](https://docs.python.org/2/library/cookielib.html#cookiejar-and-filecookiejar-objects), which means that it behaves exactly like that class. In particular, you can get the actual cookies themselves by iterating over it. This means the method you want is one line of code:\n\n``` python\nany_expired = any(cookie.is_expired() for cookie in session.cookies)\n```\n\nWe're generally reluctant to add methods that are as simple as that one, given that there are a number of other constructions users might want (a method to say whether _all_ the cookies have expired, for example). Restricting the surface area of our APIs is extremely important to us.\n\nOne final thing: what's your use case? From where I'm sat it's hard to see why this would be useful: the cookie jar will remove expired cookies when they would have gotten used anyway, so I'm not entirely sure what advantage you receive by making this change.\n"
] |
https://api.github.com/repos/psf/requests/issues/2856
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2856/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2856/comments
|
https://api.github.com/repos/psf/requests/issues/2856/events
|
https://github.com/psf/requests/issues/2856
| 114,923,674 |
MDU6SXNzdWUxMTQ5MjM2NzQ=
| 2,856 |
Session level default timeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2048526?v=4",
"events_url": "https://api.github.com/users/bodenr/events{/privacy}",
"followers_url": "https://api.github.com/users/bodenr/followers",
"following_url": "https://api.github.com/users/bodenr/following{/other_user}",
"gists_url": "https://api.github.com/users/bodenr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bodenr",
"id": 2048526,
"login": "bodenr",
"node_id": "MDQ6VXNlcjIwNDg1MjY=",
"organizations_url": "https://api.github.com/users/bodenr/orgs",
"received_events_url": "https://api.github.com/users/bodenr/received_events",
"repos_url": "https://api.github.com/users/bodenr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bodenr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bodenr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bodenr",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-03T22:20:15Z
|
2021-09-08T21:00:51Z
|
2015-11-03T22:31:14Z
|
NONE
|
resolved
|
Best I can tell, there is currently no way to set a default timeout at the Session level. That is, I want to create a session and give it a default timeout which is used on any request() for that session when no timeout is given in the kwargs of the request invocation.
Example:
``` python
session = requests.Session()
session.default_timeout = 9
...
# default timeout of 9 is used
session.get('http://thehost/good-stuff')
# passed value of 3 is used
session.get('http://thehost/good-stuff', timeout=3)
```
This would be very helpful for consumers wanting to pool request sessions above the requests lib layer using common kwargs for all requests in the session(s).
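As the closing comment notes, this can be layered on top of requests by subclassing `Session` yourself (pip does something similar). A minimal sketch of that pattern; `TimeoutSession` and the default of 9 are illustrative names, not a requests API:

``` python
import requests

class TimeoutSession(requests.Session):
    """Session that falls back to a default timeout when none is passed."""

    def __init__(self, default_timeout=9):
        super().__init__()
        self.default_timeout = default_timeout

    def request(self, method, url, **kwargs):
        # Only fill in the timeout if the caller did not supply one.
        kwargs.setdefault("timeout", self.default_timeout)
        return super().request(method, url, **kwargs)

session = TimeoutSession(default_timeout=9)
# session.get('http://thehost/good-stuff')             -> uses timeout=9
# session.get('http://thehost/good-stuff', timeout=3)  -> uses timeout=3
```

Every per-request verb (`get`, `put`, ...) funnels through `Session.request`, so overriding that one method covers them all.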
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/2856/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2856/timeline
| null |
completed
| null | null | false |
[
"Hi @bodenr thanks for using requests and stopping by with this feature request.\n\nThe library strives to make very specific divides in what is appropriate to add where. Timeout values being specified on a session do not make very much sense to the project although it's plausible for you to add some code to do this (like [pip](/pypa/pip)) has. \n\nIn the future please search for closed and open issues discussing your topics. A quick search immediately brought up https://github.com/kennethreitz/requests/issues/2011 which was closed over a year ago. I'm sure there are others, but you can find them too with a quick search.\n"
] |
https://api.github.com/repos/psf/requests/issues/2855
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2855/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2855/comments
|
https://api.github.com/repos/psf/requests/issues/2855/events
|
https://github.com/psf/requests/pull/2855
| 114,922,051 |
MDExOlB1bGxSZXF1ZXN0NDk2MTk3NDc=
| 2,855 |
Added Travis support with tox
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2029180?v=4",
"events_url": "https://api.github.com/users/ZuluPro/events{/privacy}",
"followers_url": "https://api.github.com/users/ZuluPro/followers",
"following_url": "https://api.github.com/users/ZuluPro/following{/other_user}",
"gists_url": "https://api.github.com/users/ZuluPro/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ZuluPro",
"id": 2029180,
"login": "ZuluPro",
"node_id": "MDQ6VXNlcjIwMjkxODA=",
"organizations_url": "https://api.github.com/users/ZuluPro/orgs",
"received_events_url": "https://api.github.com/users/ZuluPro/received_events",
"repos_url": "https://api.github.com/users/ZuluPro/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ZuluPro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZuluPro/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ZuluPro",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-11-03T22:10:40Z
|
2021-09-08T06:00:48Z
|
2015-11-03T22:16:50Z
|
NONE
|
resolved
|
- Added `.travis.yml`
I think it is useful for every contributor to have an online testing tool.
I added the Travis badge to `README.rst`.
Maybe you want to add Coveralls support in the future?
- Added `tox.ini`
I implemented tox to simplify running the tests across Python versions.
Maybe you want to add documentation tests in the future?
There are two errors (which don't come from my commit) and the build is failing:
```
____________ RequestsTestCase.test_BASICAUTH_TUPLE_HTTP_200_OK_GET _____________
self = <test_requests.RequestsTestCase testMethod=test_BASICAUTH_TUPLE_HTTP_200_OK_GET>
def test_BASICAUTH_TUPLE_HTTP_200_OK_GET(self):
auth = ('user', 'pass')
url = httpbin('basic-auth', 'user', 'pass')
r = requests.get(url, auth=auth)
assert r.status_code == 200
r = requests.get(url)
> assert r.status_code == 401
E AssertionError: assert 200 == 401
E + where 200 = <Response [200]>.status_code
test_requests.py:317: AssertionError
_________ RequestsTestCase.test_auth_is_stripped_on_redirect_off_host __________
self = <test_requests.RequestsTestCase testMethod=test_auth_is_stripped_on_redirect_off_host>
def test_auth_is_stripped_on_redirect_off_host(self):
r = requests.get(
httpbin('redirect-to'),
params={'url': 'http://www.google.co.uk'},
auth=('user', 'pass'),
)
assert r.history[0].request.headers['Authorization']
> assert not r.request.headers.get('Authorization', '')
E AssertionError: assert not 'Basic dXNlcjpwYXNz'
E + where 'Basic dXNlcjpwYXNz' = <bound method CaseInsensitiveDict.get of {'Authorization': 'Basic dXNlcjpwYXNz...ing': 'gzip, deflate', 'User-Agent': 'python-requests/2.8.1', 'Accept': '*/*'}>('Authorization', '')
E + where <bound method CaseInsensitiveDict.get of {'Authorization': 'Basic dXNlcjpwYXNz...ing': 'gzip, deflate', 'User-Agent': 'python-requests/2.8.1', 'Accept': '*/*'}> = {'Authorization': 'Basic dXNlcjpwYXNz', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'User-Agent': 'python-requests/2.8.1', 'Accept': '*/*'}.get
E + where {'Authorization': 'Basic dXNlcjpwYXNz', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'User-Agent': 'python-requests/2.8.1', 'Accept': '*/*'} = <PreparedRequest [GET]>.headers
E + where <PreparedRequest [GET]> = <Response [200]>.request
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2029180?v=4",
"events_url": "https://api.github.com/users/ZuluPro/events{/privacy}",
"followers_url": "https://api.github.com/users/ZuluPro/followers",
"following_url": "https://api.github.com/users/ZuluPro/following{/other_user}",
"gists_url": "https://api.github.com/users/ZuluPro/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ZuluPro",
"id": 2029180,
"login": "ZuluPro",
"node_id": "MDQ6VXNlcjIwMjkxODA=",
"organizations_url": "https://api.github.com/users/ZuluPro/orgs",
"received_events_url": "https://api.github.com/users/ZuluPro/received_events",
"repos_url": "https://api.github.com/users/ZuluPro/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ZuluPro/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ZuluPro/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ZuluPro",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2855/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2855/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2855.diff",
"html_url": "https://github.com/psf/requests/pull/2855",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2855.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2855"
}
| true |
[
"Thanks @ZuluPro but we already use an external Jenkins server that we maintain. Travis proved to be unstable for our needs.\n\nCheers!\n"
] |
https://api.github.com/repos/psf/requests/issues/2854
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2854/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2854/comments
|
https://api.github.com/repos/psf/requests/issues/2854/events
|
https://github.com/psf/requests/issues/2854
| 114,087,648 |
MDU6SXNzdWUxMTQwODc2NDg=
| 2,854 |
Bad default parameters to iter_content
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1865411?v=4",
"events_url": "https://api.github.com/users/gibiansky/events{/privacy}",
"followers_url": "https://api.github.com/users/gibiansky/followers",
"following_url": "https://api.github.com/users/gibiansky/following{/other_user}",
"gists_url": "https://api.github.com/users/gibiansky/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gibiansky",
"id": 1865411,
"login": "gibiansky",
"node_id": "MDQ6VXNlcjE4NjU0MTE=",
"organizations_url": "https://api.github.com/users/gibiansky/orgs",
"received_events_url": "https://api.github.com/users/gibiansky/received_events",
"repos_url": "https://api.github.com/users/gibiansky/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gibiansky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gibiansky/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gibiansky",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-10-29T16:20:18Z
|
2021-09-08T21:00:52Z
|
2015-10-29T22:03:44Z
|
NONE
|
resolved
|
Apologies if this is a duplicate or not a bug – I ran into this recently and figured I should report it.
`iter_content` takes a `chunk_size` argument, which defaults to 1. Various code snippets on the internet call `response.iter_content()` with no arguments. This is incredibly inefficient, because it uses a buffer size of 1 _byte_. I think `requests` should either a) set `chunk_size` to something more reasonable, like 2048 or 4096, or b) make `chunk_size` a required argument. (Solution a is more palatable since it's backwards compatible.)
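The cost of the 1-byte default is easy to see without touching the network. This sketch reimplements the chunking loop against an in-memory body; `iter_chunks` is a simplified, hypothetical stand-in for `Response.iter_content`, not requests' actual implementation:

``` python
import io

def iter_chunks(body, chunk_size):
    """Yield successive chunks from a file-like body (illustrative
    stand-in for Response.iter_content)."""
    while True:
        chunk = body.read(chunk_size)
        if not chunk:
            return
        yield chunk

data = b"x" * 8192

# chunk_size=1 performs one read per byte of the body.
print(sum(1 for _ in iter_chunks(io.BytesIO(data), 1)))     # 8192 reads
# chunk_size=4096 covers the same body in a handful of reads.
print(sum(1 for _ in iter_chunks(io.BytesIO(data), 4096)))  # 2 reads
```

Against a real socket, each of those reads is a system call (plus per-iteration Python overhead), which is why the 1-byte default is so costly.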
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2854/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2854/timeline
| null |
completed
| null | null | false |
[
"See https://github.com/kennethreitz/requests/issues/844 please\n",
"Yeah, this is an outstanding problem that we haven't gotten around to resolving yet. However, it's a dupe of #844.\n"
] |
https://api.github.com/repos/psf/requests/issues/2853
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2853/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2853/comments
|
https://api.github.com/repos/psf/requests/issues/2853/events
|
https://github.com/psf/requests/issues/2853
| 113,921,079 |
MDU6SXNzdWUxMTM5MjEwNzk=
| 2,853 |
Problem with verbose logging
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6306068?v=4",
"events_url": "https://api.github.com/users/troy-perkins/events{/privacy}",
"followers_url": "https://api.github.com/users/troy-perkins/followers",
"following_url": "https://api.github.com/users/troy-perkins/following{/other_user}",
"gists_url": "https://api.github.com/users/troy-perkins/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/troy-perkins",
"id": 6306068,
"login": "troy-perkins",
"node_id": "MDQ6VXNlcjYzMDYwNjg=",
"organizations_url": "https://api.github.com/users/troy-perkins/orgs",
"received_events_url": "https://api.github.com/users/troy-perkins/received_events",
"repos_url": "https://api.github.com/users/troy-perkins/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/troy-perkins/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/troy-perkins/subscriptions",
"type": "User",
"url": "https://api.github.com/users/troy-perkins",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-10-28T21:01:24Z
|
2021-09-08T21:00:52Z
|
2015-10-28T21:03:45Z
|
NONE
|
resolved
|
I am trying to enable verbose logging with my request string and I am getting the following error:
"request() got an unexpected keyword argument config"
I have tried searching but have not found a fix for this, and I followed these directions:

    >>> my_config = {'verbose': sys.stderr}
    >>> requests.get('http://httpbin.org/headers', config=my_config)
    2011-08-17T03:04:23.380175   GET   http://httpbin.org/headers
    <Response [200]>

Thanks
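The `config` kwarg shown above was removed in requests 1.0.0; the documented replacement is the standard `logging` module, plus `http.client`'s debug switch for wire-level output. A minimal sketch (the vendored logger name `requests.packages.urllib3` matches requests of this era and is an assumption for other versions):

``` python
import logging
import http.client  # httplib on Python 2

# Print full request/response lines from the underlying HTTP connection.
http.client.HTTPConnection.debuglevel = 1

# Route urllib3's log messages (redirects, connection-pool activity) to stderr.
logging.basicConfig(level=logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True

# requests.get('http://httpbin.org/headers') would now log verbosely.
```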
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2853/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2853/timeline
| null |
completed
| null | null | false |
[
"Please consult our documentation: The section immediately above [the Licensing section](http://docs.python-requests.org/en/latest/api/?highlight=log#licensing) explains how to enable verbose logging in versions of requests 1.0.0 and greater.\n",
"Also in the future, please ask all questions on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). This is a defect tracker, not a support forum.\n",
"Thanks and noted for future questions!\n"
] |
https://api.github.com/repos/psf/requests/issues/2852
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2852/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2852/comments
|
https://api.github.com/repos/psf/requests/issues/2852/events
|
https://github.com/psf/requests/issues/2852
| 113,901,630 |
MDU6SXNzdWUxMTM5MDE2MzA=
| 2,852 |
OpenSSL.SSL.SysCallError: (10054, 'WSAECONNRESET')
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/325813?v=4",
"events_url": "https://api.github.com/users/steveoh/events{/privacy}",
"followers_url": "https://api.github.com/users/steveoh/followers",
"following_url": "https://api.github.com/users/steveoh/following{/other_user}",
"gists_url": "https://api.github.com/users/steveoh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/steveoh",
"id": 325813,
"login": "steveoh",
"node_id": "MDQ6VXNlcjMyNTgxMw==",
"organizations_url": "https://api.github.com/users/steveoh/orgs",
"received_events_url": "https://api.github.com/users/steveoh/received_events",
"repos_url": "https://api.github.com/users/steveoh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/steveoh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/steveoh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/steveoh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2015-10-28T19:16:24Z
|
2021-09-08T21:00:52Z
|
2015-10-29T00:17:41Z
|
NONE
|
resolved
|
This error is being thrown after making around 1,500 successful requests (of 3,200 total) to an API.
The API returns a 204 No Content; I check for that status code and add the request to another list to retry later.
What are the possible reasons that `r = requests.put("https://my.api/update")` might throw this error?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2852/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2852/timeline
| null |
completed
| null | null | false |
[
"I suspect this will be fixed when https://github.com/shazow/urllib3/pull/729#issuecomment-151929787 is merged and urllib3 is updated for the next requests release.\n",
"```\nTraceback (most recent call last):\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\api.py\", line 122, in put\n return request('put', url, data=data, **kwargs)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\sessions.py\", line 468, in request\n resp = self.send(prep, **send_kwargs)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\sessions.py\", line 576, in send\n r = adapter.send(request, **kwargs)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\adapters.py\", line 370, in send\n timeout=timeout\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\packages\\urllib3\\connectionpool.py\", line 559, in urlopen\n body=body, headers=headers)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\packages\\urllib3\\connectionpool.py\", line 376, in _make_request\n httplib_response = conn.getresponse(buffering=True)\n File \"C:\\Python27\\ArcGIS10.3\\lib\\httplib.py\", line 1067, in getresponse\n response.begin()\n File \"C:\\Python27\\ArcGIS10.3\\lib\\httplib.py\", line 409, in begin\n version, status, reason = self._read_status()\n File \"C:\\Python27\\ArcGIS10.3\\lib\\httplib.py\", line 365, in _read_status\n line = self.fp.readline(_MAXLINE + 1)\n File \"C:\\Python27\\ArcGIS10.3\\lib\\socket.py\", line 476, in readline\n data = self._sock.recv(self._rbufsize)\n File \"..\\requests-2.8.1-py2.7.egg\\requests\\packages\\urllib3\\contrib\\pyopenssl.py\", line 179, in recv\n data = self.connection.recv(*args, **kwargs)\n File \"..\\OpenSSL\\SSL.py\", line 1320, in recv\n self._raise_ssl_error(self._ssl, result)\n File \"..\\OpenSSL\\SSL.py\", line 1178, in _raise_ssl_error\n raise SysCallError(errno, errorcode.get(errno))\nSysCallError: (10054, 'WSAECONNRESET')\n```\n",
"This is actually a new bug, it's not the same as shazow/urllib3#729. urllib3 should be wrapping `SysCallError` in this case. @steveoh, can you raise this issue on urllib3?\n",
"So @Lukasa, how does that little change affect us here? How will requests act differently with the wrapped error?\n",
"@steveoh urllib3 should now see this as a standard socket error, and so wrap it, and requests will then wrap _that_. I'd expect a standard requests error to be seen.\n",
"@Lukasa Any idea why the socket is throwing the error in the first place?\n",
"@steveoh Sure. The error is clear if you happen to speak Windows error codes (and I do). ;)\n\nThe error code WSAECONNRESET means that the TCP connection was forcefully closed by the remote peer. This means something about this request the server doesn't like. It'd be interesting if you checked what the state of your system was when it happened so you could see what the request was.\n",
"It happens during a long request. My suspicion is a timeout is reached.\n",
"A server should not have a timeout like that: generally speaking, if the connection is still active there's no need to time it out.\n",
"@Lukasa could it be that there's a load balancer terminating TLS and that's timing out the connection?\n",
"Certainly possible.\n"
] |
https://api.github.com/repos/psf/requests/issues/2851
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2851/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2851/comments
|
https://api.github.com/repos/psf/requests/issues/2851/events
|
https://github.com/psf/requests/issues/2851
| 113,559,868 |
MDU6SXNzdWUxMTM1NTk4Njg=
| 2,851 |
Documentation Problem?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7996484?v=4",
"events_url": "https://api.github.com/users/djchou/events{/privacy}",
"followers_url": "https://api.github.com/users/djchou/followers",
"following_url": "https://api.github.com/users/djchou/following{/other_user}",
"gists_url": "https://api.github.com/users/djchou/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/djchou",
"id": 7996484,
"login": "djchou",
"node_id": "MDQ6VXNlcjc5OTY0ODQ=",
"organizations_url": "https://api.github.com/users/djchou/orgs",
"received_events_url": "https://api.github.com/users/djchou/received_events",
"repos_url": "https://api.github.com/users/djchou/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/djchou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/djchou/subscriptions",
"type": "User",
"url": "https://api.github.com/users/djchou",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-10-27T11:13:53Z
|
2021-09-08T21:00:53Z
|
2015-10-27T12:53:10Z
|
NONE
|
resolved
|
When I run the following code from the requests documentation, I get a 401 response:

    import requests
    r = requests.get('https://api.github.com', auth=('user', 'pass'))

Given that this is the quick-start code for requests, is there something that needs to be updated?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2851/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2851/timeline
| null |
completed
| null | null | false |
[
"@djchou In this context, `user` and `pass` are placeholder strings to be replaced by your own Github username and password.\n",
"Yes, I replaced them with my user pass and still had the same result. \n\n> On Oct 27, 2015, at 5:53 AM, Cory Benfield [email protected] wrote:\n> \n> @djchou In this context, user and pass are placeholder strings to be replaced by your own Github username and password.\n> \n> —\n> Reply to this email directly or view it on GitHub.\n",
"@djchou What is the result of:\n\n``` python\nimport requests\nr = requests.get('https://api.github.com', auth=('user', 'pass'))\nprint(r.content)\n```\n",
"ah, its the two factor auth that is causing it to fail.\n\n'{\"message\":\"Must specify two-factor authentication OTP\ncode.\",\"documentation_url\":\"https://developer.github.com/v3/auth#working-with-two-factor-authentication\"}'\n\nOn Tue, Oct 27, 2015 at 6:07 AM, Cory Benfield [email protected]\nwrote:\n\n> @djchou https://github.com/djchou What is the result of:\n> \n> import requests\n> r = requests.get('https://api.github.com', auth=('user', 'pass'))print(r.content)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2851#issuecomment-151489049\n> .\n",
"Yeah, that's what I thought too.\n\nI don't think it's worth gumming up the docs with this: it's extraneous detail that won't help much.\n"
] |
https://api.github.com/repos/psf/requests/issues/2850
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2850/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2850/comments
|
https://api.github.com/repos/psf/requests/issues/2850/events
|
https://github.com/psf/requests/issues/2850
| 113,456,222 |
MDU6SXNzdWUxMTM0NTYyMjI=
| 2,850 |
Connection reset by peer problem.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/459869?v=4",
"events_url": "https://api.github.com/users/pcsysen/events{/privacy}",
"followers_url": "https://api.github.com/users/pcsysen/followers",
"following_url": "https://api.github.com/users/pcsysen/following{/other_user}",
"gists_url": "https://api.github.com/users/pcsysen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pcsysen",
"id": 459869,
"login": "pcsysen",
"node_id": "MDQ6VXNlcjQ1OTg2OQ==",
"organizations_url": "https://api.github.com/users/pcsysen/orgs",
"received_events_url": "https://api.github.com/users/pcsysen/received_events",
"repos_url": "https://api.github.com/users/pcsysen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pcsysen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pcsysen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pcsysen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 20 |
2015-10-26T21:34:56Z
|
2021-09-08T17:05:20Z
|
2015-11-09T09:26:04Z
|
NONE
|
resolved
|
Hi, we are using slumber to set up a smoke test against our server (Django + uWSGI). We started to see this "connection reset by peer" error after we added the `Content-Length` response header. It only happens occasionally; it never happens when we don't include `Content-Length` in the response header, at least in our experience. I see some similar questions, but they all seem to be related to SSL. We are just using HTTP. Could anyone help? Thanks.
url: POST to http://127.0.0.1:8080/api/v1....
```
Traceback (most recent call last):
File "./smoke_test_driver.py", line 936,
_pre_run_apis(host, cpe_mac, api_prefix, webpa_gateway, current_environment)
File "./smoke_test_driver.py", line 790, in _pre_run_apis
cpe.device_class.put({'xpc_model_class': device_class})
File "/usr/local/lib/python2.7/site-packages/slumber/__init__.py", line 175, in put
resp = self._request("PUT", data=data, files=files, params=kwargs)
File "/usr/local/lib/python2.7/site-packages/slumber/__init__.py", line 97, in _request
resp = self._store["session"].request(method, url, data=data, params=params, files=files, headers=headers)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 465, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 573, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 415, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(104, 'Connection reset by peer'))
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2850/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2850/timeline
| null |
completed
| null | null | false |
[
"Python will calculate the content-length for you. It's very likely your content-length is incorrect and that's causing your issue. Without more details about how to reproduce this, version information, or example code causing this, I'm very tempted to close this outright.\n",
"in our settings.py, we have\n\n```\nMIDDLEWARE_CLASSES = (\n 'shared.audit_logging.middlewares.AuditLoggingMiddleware', # other middleware\n 'django.middleware.common.CommonMiddleware', # other middleware\n 'shared.middlewares.http.ContentLengthMiddleware', # <== the middleware for content length\n}\n```\n\nin http.py, the codes that put the content-length back to header is like this\n\n```\nclass ContentLengthMiddleware(object):\n def process_response(self, request, response):\n if not response.streaming and not response.has_header('Content-Length'):\n response['Content-Length'] = str(len(response.content)) if response.content else '0'\n return response\n```\n",
"@pcsysen Can you also print the headers on the response, as observed by requests?\n",
"@Lukasa I am sorry that I don't understand requests well enough. Could you show me how to do it? Thanks.\n",
"@pcsysen Sure:\n\n``` python\nr = requests.get(url)\nprint r.headers\n```\n",
"@Lukasa Are you talking about the normal result? When it is working it is like this\n\n```\nPython 2.7.8 (default, Sep 30 2014, 22:34:31) \n[GCC 4.4.7 20120313 (Red Hat 4.4.7-4)] on linux2\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> r = requests.get('http://localhost:8080/api/v1/version')\n>>> print r.headers\n{'content-length': '728', 'content-type': 'application/json', 'allow': 'GET, HEAD, OPTIONS'}\n```\n\nI don't know if it tells anything. I was thinking maybe you want to see the header when the error happens, which I don't know and I failed to learn it from the example you posted.\n",
"Hang on, did you say the above is the output _without_ your `ContentLengthMiddleware`?\n",
"WITH ContentLengthMiddleware.\n",
"Hmm. With `ConnectionResetByPeer`, I think uWSGI or your server will be at fault. Does uWSGI provide any output in these error cases?\n",
"No info at the uwsgi log.\n",
"Hmm. Can you provide some example code that reproduces the problem on your machine? I'll suggest edits to it that will give me some debugging information.\n",
"The thing is, it is a bit hard to reproduce. The error does not always happen.\n",
"@pcsysen Can you guarantee it'll happen _eventually_ if you put it in a `while True` loop?\n",
"I can't guarantee anything as I don't know why it happens. I have seen it happens 2 times today. I put it on a semi-loop. That is, I am also trying to debugging it myself by adding some debug printout. Every now and then, I stopped the loop and make some changes. The error did not happen within the last 2 hours.\n",
"@pcsysen is `response.content` always bytes? If not is it possible that the requests where you saw this had multi-byte characters in them? (To be clear, I'm fairly certain this isn't caused by requests at this point.)\n",
"@sigmavirus24 I don't quite understand what you mean by \"always bytes\". The response should be all ASCII characters.\n",
"@pcsysen It _should_ always be ASCII, but _is_ it?\n",
"@pcsysen Ping =)\n",
"Closing due to inactivity.\n",
"For anyone else having a 104 error, I seem to have fixed my problem by omitting gcc system includes.\nI had used the output of `clang-3.6 -v -E -x c++ -` as my `-isystem` paths in `.ycm_extra_conf.py`, which output these two lines along with everything else:\n\n```\n/usr/bin/../lib/gcc/x86_64-linux-gnu/4.8/../../../../include/c++/4.8\n/usr/bin/../lib/gcc/x86_64-linux-gnu/4.8/include\n```\n\nIt seems there are files in those paths whose content cause the 104 errors in Ycm, which is not surprising since there are lots of gcc intrinsics being declared.\n"
] |
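The maintainers' question above — whether `response.content` is always bytes — points at the usual Content-Length pitfall: `len()` of a text string counts characters, but Content-Length must count the bytes actually sent on the wire. A minimal sketch of the mismatch (hypothetical example body; not the reporter's actual data):

```python
# len() of text counts characters; Content-Length must count bytes.
body = u"\u4e2d\u6587\u5185\u5bb9"  # four non-ASCII (CJK) characters

char_count = len(body)                  # 4 characters
byte_count = len(body.encode("utf-8"))  # 12 bytes over the wire

print(char_count, byte_count)  # 4 12
```

If a middleware declares the character count, the client reads fewer bytes than the server intends to send, which can surface as an abrupt connection reset.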
https://api.github.com/repos/psf/requests/issues/2849
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2849/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2849/comments
|
https://api.github.com/repos/psf/requests/issues/2849/events
|
https://github.com/psf/requests/pull/2849
| 113,384,052 |
MDExOlB1bGxSZXF1ZXN0NDg3NTY0MDE=
| 2,849 |
Add 511 Network Authentication Required status code
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1740816?v=4",
"events_url": "https://api.github.com/users/annp89/events{/privacy}",
"followers_url": "https://api.github.com/users/annp89/followers",
"following_url": "https://api.github.com/users/annp89/following{/other_user}",
"gists_url": "https://api.github.com/users/annp89/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/annp89",
"id": 1740816,
"login": "annp89",
"node_id": "MDQ6VXNlcjE3NDA4MTY=",
"organizations_url": "https://api.github.com/users/annp89/orgs",
"received_events_url": "https://api.github.com/users/annp89/received_events",
"repos_url": "https://api.github.com/users/annp89/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/annp89/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/annp89/subscriptions",
"type": "User",
"url": "https://api.github.com/users/annp89",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-26T15:43:45Z
|
2021-09-08T06:00:49Z
|
2015-10-26T15:58:01Z
|
NONE
|
resolved
|
Addresses issue: https://github.com/kennethreitz/requests/issues/2847
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2849/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2849/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2849.diff",
"html_url": "https://github.com/psf/requests/pull/2849",
"merged_at": "2015-10-26T15:58:01Z",
"patch_url": "https://github.com/psf/requests/pull/2849.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2849"
}
| true |
[
"Thanks @annp89 !\n"
] |
https://api.github.com/repos/psf/requests/issues/2848
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2848/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2848/comments
|
https://api.github.com/repos/psf/requests/issues/2848/events
|
https://github.com/psf/requests/issues/2848
| 113,382,287 |
MDU6SXNzdWUxMTMzODIyODc=
| 2,848 |
urllib3 SSL errors in 2.8.x
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/188962?v=4",
"events_url": "https://api.github.com/users/flupke/events{/privacy}",
"followers_url": "https://api.github.com/users/flupke/followers",
"following_url": "https://api.github.com/users/flupke/following{/other_user}",
"gists_url": "https://api.github.com/users/flupke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/flupke",
"id": 188962,
"login": "flupke",
"node_id": "MDQ6VXNlcjE4ODk2Mg==",
"organizations_url": "https://api.github.com/users/flupke/orgs",
"received_events_url": "https://api.github.com/users/flupke/received_events",
"repos_url": "https://api.github.com/users/flupke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/flupke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/flupke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/flupke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-26T15:35:15Z
|
2021-09-08T21:00:53Z
|
2015-10-26T20:48:37Z
|
NONE
|
resolved
|
We recently tried to upgrade to requests 2.8.1 and were affected by this urllib3 issue: https://github.com/shazow/urllib3/issues/717
Downgrading to 2.7.0 resolves it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2848/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2848/timeline
| null |
completed
| null | null | false |
[
"@flupke The urllib3 and requests projects work together closely: we're entirely aware of the bug that you're encountering, but we won't be able to fix it until a new urllib3 release comes out.\n"
] |
https://api.github.com/repos/psf/requests/issues/2847
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2847/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2847/comments
|
https://api.github.com/repos/psf/requests/issues/2847/events
|
https://github.com/psf/requests/issues/2847
| 113,300,514 |
MDU6SXNzdWUxMTMzMDA1MTQ=
| 2,847 |
Add 511 Network Authentication Required status code for completeness
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1740816?v=4",
"events_url": "https://api.github.com/users/annp89/events{/privacy}",
"followers_url": "https://api.github.com/users/annp89/followers",
"following_url": "https://api.github.com/users/annp89/following{/other_user}",
"gists_url": "https://api.github.com/users/annp89/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/annp89",
"id": 1740816,
"login": "annp89",
"node_id": "MDQ6VXNlcjE3NDA4MTY=",
"organizations_url": "https://api.github.com/users/annp89/orgs",
"received_events_url": "https://api.github.com/users/annp89/received_events",
"repos_url": "https://api.github.com/users/annp89/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/annp89/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/annp89/subscriptions",
"type": "User",
"url": "https://api.github.com/users/annp89",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-10-26T07:37:48Z
|
2021-09-08T21:00:53Z
|
2015-10-26T15:58:14Z
|
NONE
|
resolved
|
Based on http://tools.ietf.org/html/rfc6585, it looks like all the other status codes mentioned in that document are included except for 511. Any specific reason/concern?
I'd be happy to open a pull request with the following additional line if it looks reasonable to you.
```
511: ('network_authentication_required', 'network_auth', 'network_authentication'),
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2847/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2847/timeline
| null |
completed
| null | null | false |
[
"Please do @annp89. I think it was just an error code that we happened to miss. I don't think there was any reason or concern we had with adding it.\n\nCheers!\n",
"Awesome! Will do shortly :)\n",
"Closed by #2849 \n"
] |
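For illustration, an alias tuple like the one proposed above is typically flipped into an attribute-style lookup. This is a hedged sketch of that pattern, not the actual `requests.status_codes` source:

```python
# Sketch: turn a code -> aliases table into an alias -> code lookup
# (assumed structure; see requests.status_codes for the real module).
_codes = {
    511: ('network_authentication_required', 'network_auth',
          'network_authentication'),
}

alias_to_code = {
    alias: code for code, aliases in _codes.items() for alias in aliases
}

print(alias_to_code['network_auth'])  # 511
```

Each alias in the tuple resolves to the same numeric code, which is why adding one dictionary entry is enough to support all three spellings.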
https://api.github.com/repos/psf/requests/issues/2846
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2846/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2846/comments
|
https://api.github.com/repos/psf/requests/issues/2846/events
|
https://github.com/psf/requests/issues/2846
| 113,267,959 |
MDU6SXNzdWUxMTMyNjc5NTk=
| 2,846 |
HTTP PROXY password contains '#' failed to connect
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3996876?v=4",
"events_url": "https://api.github.com/users/bonfy/events{/privacy}",
"followers_url": "https://api.github.com/users/bonfy/followers",
"following_url": "https://api.github.com/users/bonfy/following{/other_user}",
"gists_url": "https://api.github.com/users/bonfy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bonfy",
"id": 3996876,
"login": "bonfy",
"node_id": "MDQ6VXNlcjM5OTY4NzY=",
"organizations_url": "https://api.github.com/users/bonfy/orgs",
"received_events_url": "https://api.github.com/users/bonfy/received_events",
"repos_url": "https://api.github.com/users/bonfy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bonfy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bonfy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bonfy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-10-26T01:15:04Z
|
2021-09-08T21:00:54Z
|
2015-10-26T02:01:59Z
|
NONE
|
resolved
|
``` python
import requests
PROXIES = {
'http':'http://{usr}:{pwd}@proxy:port'.format(usr = 'XXX',pwd = 'XXX#XXX'),
}
url = 'http://www.sina.com'
r = requests.get(url, proxies = PROXIES)
print r.text
```
with error :
Traceback (most recent call last):
File "autohome.py", line 11, in <module>
r = requests.get(url, proxies = PROXIES)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\api.py",
line 55, in get
return request('get', url, *_kwargs)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\api.py",
line 44, in request
return session.request(method=method, url=url, *_kwargs)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\sessions
.py", line 383, in request
resp = self.send(prep, *_send_kwargs)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\sessions
.py", line 486, in send
r = adapter.send(request, *_kwargs)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\adapters
.py", line 305, in send
conn = self.get_connection(request.url, proxies)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\adapters
.py", line 215, in get_connection
block=self._pool_block)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\packages
\urllib3\poolmanager.py", line 258, in proxy_from_url
return ProxyManager(proxy_url=url, **kw)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\packages
\urllib3\poolmanager.py", line 207, in __init__
proxy = parse_url(proxy_url)
File "C:\Python27\lib\site-packages\requests-2.2.1-py2.7.egg\requests\packages
\urllib3\util.py", line 397, in parse_url
raise LocationParseError("Failed to parse: %s" % url)
requests.packages.urllib3.exceptions.LocationParseError: Failed to parse: Failed
to parse: XXX:XXX
The part of the password after the '#' is lost, and the client cannot connect to the proxy server.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2846/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2846/timeline
| null |
completed
| null | null | false |
[
"Since the proxy URL needs to be a valid URL, you'll need to URL encode any special characters in the credentials yourself before you put them in an URL - otherwise, the `#` will be interpreted as a fragment identifier, in this case by `urllib3`'s [`parse_url()`](https://github.com/kennethreitz/requests/blob/v2.2.1/requests/packages/urllib3/util.py#L335-L417).\n\nTry `XXX%23XXX` for your password (`%23` is the URL encoded form of `#`). Or programmatically:\n\n```\nimport urllib\nencoded_pwd = urllib.quote_plus(pwd)\n```\n",
"```\nimport urllib\nencoded_pwd = urllib.quote_plus(pwd)\n```\n\nit worked! Thank you\n"
] |
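The fix described in the comments generalizes to any reserved character in proxy credentials. A small Python 3 sketch (Python 2 users would use `urllib.quote_plus` as shown above; `proxy.example.com` and the credentials are placeholders):

```python
from urllib.parse import quote

def build_proxy_url(user, password, host, port):
    # Percent-encode the credentials so reserved characters like '#'
    # or '@' are not misparsed as URL delimiters downstream.
    return "http://{u}:{p}@{h}:{port}".format(
        u=quote(user, safe=""),
        p=quote(password, safe=""),
        h=host,
        port=port,
    )

print(build_proxy_url("alice", "se#cret", "proxy.example.com", 8080))
# http://alice:se%23cret@proxy.example.com:8080
```

With the `#` encoded as `%23`, urllib3's `parse_url()` sees the full userinfo component instead of treating everything after `#` as a fragment.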
https://api.github.com/repos/psf/requests/issues/2845
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2845/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2845/comments
|
https://api.github.com/repos/psf/requests/issues/2845/events
|
https://github.com/psf/requests/pull/2845
| 113,009,220 |
MDExOlB1bGxSZXF1ZXN0NDg1NzM2MTE=
| 2,845 |
Fix issue #2844
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1894766?v=4",
"events_url": "https://api.github.com/users/akhomchenko/events{/privacy}",
"followers_url": "https://api.github.com/users/akhomchenko/followers",
"following_url": "https://api.github.com/users/akhomchenko/following{/other_user}",
"gists_url": "https://api.github.com/users/akhomchenko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akhomchenko",
"id": 1894766,
"login": "akhomchenko",
"node_id": "MDQ6VXNlcjE4OTQ3NjY=",
"organizations_url": "https://api.github.com/users/akhomchenko/orgs",
"received_events_url": "https://api.github.com/users/akhomchenko/received_events",
"repos_url": "https://api.github.com/users/akhomchenko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akhomchenko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akhomchenko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akhomchenko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-23T12:23:19Z
|
2021-09-08T06:00:50Z
|
2015-10-23T12:24:36Z
|
CONTRIBUTOR
|
resolved
|
Closes https://github.com/kennethreitz/requests/issues/2844
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2845/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2845/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2845.diff",
"html_url": "https://github.com/psf/requests/pull/2845",
"merged_at": "2015-10-23T12:24:36Z",
"patch_url": "https://github.com/psf/requests/pull/2845.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2845"
}
| true |
[
"\\o/ Looks good to me! :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2844
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2844/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2844/comments
|
https://api.github.com/repos/psf/requests/issues/2844/events
|
https://github.com/psf/requests/issues/2844
| 113,001,924 |
MDU6SXNzdWUxMTMwMDE5MjQ=
| 2,844 |
_encode_params return bytes that break urlencode
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1894766?v=4",
"events_url": "https://api.github.com/users/akhomchenko/events{/privacy}",
"followers_url": "https://api.github.com/users/akhomchenko/followers",
"following_url": "https://api.github.com/users/akhomchenko/following{/other_user}",
"gists_url": "https://api.github.com/users/akhomchenko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akhomchenko",
"id": 1894766,
"login": "akhomchenko",
"node_id": "MDQ6VXNlcjE4OTQ3NjY=",
"organizations_url": "https://api.github.com/users/akhomchenko/orgs",
"received_events_url": "https://api.github.com/users/akhomchenko/received_events",
"repos_url": "https://api.github.com/users/akhomchenko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akhomchenko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akhomchenko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akhomchenko",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] |
{
"closed_at": "2016-01-05T17:14:26Z",
"closed_issues": 2,
"created_at": "2015-10-12T10:32:11Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/26",
"id": 1350559,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/26/labels",
"node_id": "MDk6TWlsZXN0b25lMTM1MDU1OQ==",
"number": 26,
"open_issues": 0,
"state": "closed",
"title": "2.9.0",
"updated_at": "2016-01-05T17:14:26Z",
"url": "https://api.github.com/repos/psf/requests/milestones/26"
}
| 8 |
2015-10-23T11:27:22Z
|
2021-09-08T20:00:55Z
|
2015-10-23T12:24:36Z
|
CONTRIBUTOR
|
resolved
|
Simple example:
```
import requests
r = requests.get('http://python.org', params=b'test=foo')
print(r.status_code)
```
Trace:
```
Traceback (most recent call last):
... skip ...
url = requote_uri(urlunparse([scheme, netloc, path, None, query, fragment]))
File "C:\Python35\lib\urllib\parse.py", line 383, in urlunparse
_coerce_args(*components))
File "C:\Python35\lib\urllib\parse.py", line 111, in _coerce_args
raise TypeError("Cannot mix str and non-str arguments")
TypeError: Cannot mix str and non-str arguments
```
I know this is a bad use case, but the API allows it.
Should we do something about it?
P.S. Python 3.4/3.5 on win 8.1 x64
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2844/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2844/timeline
| null |
completed
| null | null | false |
[
"I think what we need to do here is to change `encode_params` to make sure that it calls `to_native_string` on its argument before returning it. This seems like a bug we can fix pretty swiftly, and is well suited to a new contributor, so I'll mark this as `Contributor Friendly`.\n",
"This fix seems completely broken. It prevents me from uploading binary data through PUT.\n",
"@untitaker please open a new issue.\n",
"See #2030\n",
"Could you release a quick 2.9.1 with this fix plz ?\n\nI was waiting the 2.9.0 for the REQUEST_CA_BUNDLE env var but cannot use it for the moment due to this bug ;-(\n",
"@touilleMan We're planning to release 2.9.1 on Monday.\n",
"Awesome, you guys rock ! ^^\n",
"Looking forward to the new release. This bug breaks everything, since we use Chinese...\n"
] |
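The fix merged in #2845 normalizes the encoded parameters before they reach `urlunparse`, so bytes and str are never mixed. Roughly — this `to_native_string` is a simplified stand-in for requests' internal helper, not its exact implementation:

```python
def to_native_string(value, encoding="ascii"):
    # Simplified stand-in for requests' internal helper: on Python 3,
    # decode bytes to str so urlunparse never mixes str and non-str.
    if isinstance(value, bytes):
        return value.decode(encoding)
    return value

query = to_native_string(b"test=foo")
print(type(query).__name__, query)  # str test=foo
```

After this normalization, `params=b'test=foo'` yields a plain `str` query component, and the `TypeError: Cannot mix str and non-str arguments` in the traceback above no longer occurs.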
https://api.github.com/repos/psf/requests/issues/2843
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2843/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2843/comments
|
https://api.github.com/repos/psf/requests/issues/2843/events
|
https://github.com/psf/requests/pull/2843
| 112,988,165 |
MDExOlB1bGxSZXF1ZXN0NDg1NjA0MjU=
| 2,843 |
Timeout used by `resolve_redirects` generator can now depend on URL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/368123?v=4",
"events_url": "https://api.github.com/users/tristan0x/events{/privacy}",
"followers_url": "https://api.github.com/users/tristan0x/followers",
"following_url": "https://api.github.com/users/tristan0x/following{/other_user}",
"gists_url": "https://api.github.com/users/tristan0x/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tristan0x",
"id": 368123,
"login": "tristan0x",
"node_id": "MDQ6VXNlcjM2ODEyMw==",
"organizations_url": "https://api.github.com/users/tristan0x/orgs",
"received_events_url": "https://api.github.com/users/tristan0x/received_events",
"repos_url": "https://api.github.com/users/tristan0x/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tristan0x/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tristan0x/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tristan0x",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2015-10-23T09:48:32Z
|
2021-09-08T06:00:51Z
|
2015-10-23T10:24:13Z
|
NONE
|
resolved
|
`timeout` parameter given to the `Session::resolve_redirects` member method
can be a `callable` object taking the URL in parameter.
It allows you to specify different timeout values for URLs encountered
by the generator.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2843/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2843/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2843.diff",
"html_url": "https://github.com/psf/requests/pull/2843",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2843.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2843"
}
| true |
[
"How is it expected that you'd use this? By setting `allow_redirects=False` and calling `resolve_redirects` manually?\n",
"Yes exactly!\n",
"Ok. =)\n\nSo, generally speaking, we don't allow the parameters to `resolve_redirects` to be different to those that you can pass to `requests.*`: it just leads to too much confusion. In this case, it should be safe enough to step through `resolve_redirects` one URL at a time and check the `Location` header as you go, using the information there to determine what timeout you'd like to apply.\n\nGiven that we are in feature freeze, I don't think this meets the bar for a merge into the library. Sorry! =(\n"
] |
https://api.github.com/repos/psf/requests/issues/2842
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2842/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2842/comments
|
https://api.github.com/repos/psf/requests/issues/2842/events
|
https://github.com/psf/requests/issues/2842
| 112,891,328 |
MDU6SXNzdWUxMTI4OTEzMjg=
| 2,842 |
PEP8 compliance
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3768896?v=4",
"events_url": "https://api.github.com/users/pathcl/events{/privacy}",
"followers_url": "https://api.github.com/users/pathcl/followers",
"following_url": "https://api.github.com/users/pathcl/following{/other_user}",
"gists_url": "https://api.github.com/users/pathcl/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pathcl",
"id": 3768896,
"login": "pathcl",
"node_id": "MDQ6VXNlcjM3Njg4OTY=",
"organizations_url": "https://api.github.com/users/pathcl/orgs",
"received_events_url": "https://api.github.com/users/pathcl/received_events",
"repos_url": "https://api.github.com/users/pathcl/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pathcl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pathcl/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pathcl",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-22T21:04:31Z
|
2021-09-08T21:00:56Z
|
2015-10-22T21:08:20Z
|
NONE
|
resolved
|
Hello,
I've just forked requests and tried flake8 on it.
1 E122 continuation line missing indentation or outdented
1 E127 continuation line over-indented for visual indent
1 E129 visually indented line with same indent as next logical line
1 E401 multiple imports on one line
1 E713 test for membership should be 'not in'
1 E714 test for object identity should be 'is not'
1 F402 import '_' from line 13 shadowed by loop variable
1 W391 blank line at end of file
2 E121 continuation line under-indented for hanging indent
2 E225 missing whitespace around operator
2 F841 local variable 'url' is assigned to but never used
3 E125 continuation line with same indent as next logical line
3 E221 multiple spaces before operator
3 E301 expected 1 blank line, found 0
4 E126 continuation line over-indented for hanging indent
10 E303 too many blank lines (3)
10 W291 trailing whitespace
11 E131 continuation line unaligned for hanging indent
16 F821 undefined name 'unicode'
20 E251 unexpected spaces around keyword / parameter equals
20 E302 expected 2 blank lines, found 1
39 E128 continuation line under-indented for visual indent
65 E262 inline comment should start with '# '
84 F401 'urllib3' imported but unused
85 E261 at least two spaces before inline comment
111 E265 block comment should start with '# '
257 E501 line too long (80 > 79 characters)
2651 E231 missing whitespace after ','
Is this relevant? I think it should be available some linting check before doing setup.py install
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2842/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2842/timeline
| null |
completed
| null | null | false |
[
"Please check old issues before opening new ones.\n- https://github.com/kennethreitz/requests/issues/1899\n- https://github.com/kennethreitz/requests/pull/2298\n- https://github.com/kennethreitz/requests/pull/1964\n\nThis project does not follow PEP-0008.\n"
] |
https://api.github.com/repos/psf/requests/issues/2841
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2841/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2841/comments
|
https://api.github.com/repos/psf/requests/issues/2841/events
|
https://github.com/psf/requests/issues/2841
| 112,567,994 |
MDU6SXNzdWUxMTI1Njc5OTQ=
| 2,841 |
test_connection_error_invalid_domain and test_connection_error_invalid_port fail with proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1009685?v=4",
"events_url": "https://api.github.com/users/frispete/events{/privacy}",
"followers_url": "https://api.github.com/users/frispete/followers",
"following_url": "https://api.github.com/users/frispete/following{/other_user}",
"gists_url": "https://api.github.com/users/frispete/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/frispete",
"id": 1009685,
"login": "frispete",
"node_id": "MDQ6VXNlcjEwMDk2ODU=",
"organizations_url": "https://api.github.com/users/frispete/orgs",
"received_events_url": "https://api.github.com/users/frispete/received_events",
"repos_url": "https://api.github.com/users/frispete/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/frispete/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frispete/subscriptions",
"type": "User",
"url": "https://api.github.com/users/frispete",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2015-10-21T11:40:16Z
|
2021-09-08T12:01:08Z
|
2017-01-24T18:26:05Z
|
NONE
|
resolved
|
testing should handle environments with proxies consciously, otherwise these tests fail:
``` ======================================================================
ERROR: test_connection_error_invalid_domain (__main__.RequestsTestCase)
Connecting to an unknown domain should raise a ConnectionError
----------------------------------------------------------------------
Traceback (most recent call last):
File "test_requests.py", line 322, in test_connection_error_invalid_domain
requests.get("http://doesnotexist.google.com")
File "/usr/lib/python2.7/site-packages/_pytest/python.py", line 1307, in __exit__
pytest.fail("DID NOT RAISE")
File "/usr/lib/python2.7/site-packages/_pytest/runner.py", line 478, in fail
raise Failed(msg=msg, pytrace=pytrace)
Failed: DID NOT RAISE
======================================================================
ERROR: test_connection_error_invalid_port (__main__.RequestsTestCase)
Connecting to an invalid port should raise a ConnectionError
----------------------------------------------------------------------
Traceback (most recent call last):
File "test_requests.py", line 327, in test_connection_error_invalid_port
requests.get("http://httpbin.org:1", timeout=1)
File "/usr/lib/python2.7/site-packages/_pytest/python.py", line 1307, in __exit__
pytest.fail("DID NOT RAISE")
File "/usr/lib/python2.7/site-packages/_pytest/runner.py", line 478, in fail
raise Failed(msg=msg, pytrace=pytrace)
Failed: DID NOT RAISE
```
Environment: openSUSE 13.2/x86_64, squid 3 proxy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2841/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2841/timeline
| null |
completed
| null | null | false |
[
"Yeah, this is probably true. I suspect we'll fix it by moving to an entirely offline test suite though.\n",
"Neither of these tests exist anymore, so I think this can be closed out."
] |
https://api.github.com/repos/psf/requests/issues/2840
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2840/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2840/comments
|
https://api.github.com/repos/psf/requests/issues/2840/events
|
https://github.com/psf/requests/issues/2840
| 112,567,289 |
MDU6SXNzdWUxMTI1NjcyODk=
| 2,840 |
test_response_iter_lines_reentrant failing with python 2.7 and urllib 1.12
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1009685?v=4",
"events_url": "https://api.github.com/users/frispete/events{/privacy}",
"followers_url": "https://api.github.com/users/frispete/followers",
"following_url": "https://api.github.com/users/frispete/following{/other_user}",
"gists_url": "https://api.github.com/users/frispete/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/frispete",
"id": 1009685,
"login": "frispete",
"node_id": "MDQ6VXNlcjEwMDk2ODU=",
"organizations_url": "https://api.github.com/users/frispete/orgs",
"received_events_url": "https://api.github.com/users/frispete/received_events",
"repos_url": "https://api.github.com/users/frispete/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/frispete/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frispete/subscriptions",
"type": "User",
"url": "https://api.github.com/users/frispete",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 19 |
2015-10-21T11:35:53Z
|
2021-09-08T21:00:51Z
|
2015-10-21T11:37:40Z
|
NONE
|
resolved
|
FAIL: test_response_iter_lines_reentrant (__main__.RequestsTestCase)
Response.iter_lines() is not reentrant safe
----------------------------------------------------------------------
Traceback (most recent call last):
File "test_requests.py", line 1127, in test_response_iter_lines_reentrant
assert len(list(r.iter_lines())) == 3
AssertionError
----------------------------------------------------------------------
Ran 150 tests in 43.451s
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2840/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2840/timeline
| null |
completed
| null | null | false |
[
"@frispete That test is supposed to fail: it is marked `pytest.xfail`. You need to run the tests using py.test.\n",
"Hmm, PyPI hosts a py.test version 0.0, stating to use pytest for installation, and that is what I use: pytest 2.8.2. Otherwise the test suite wouldn't run at all.\n",
"@frispete How did you run the test? Because I clearly see it marked as `xfail`.\n",
"@Lukasa I build distribution packages for openSUSE. Since the tests will fail in Build Service due to \"no network\" policy during build, I test these packages in a local installation at least. \n\ncd requests-2.8.1\n./test_requests.py\n\nBTW, I considered to add a feature request to have an --local test option, e.g. using vcrpy.\n",
"@frispete We have a plan to swap to using pytest-httpbin, we just haven't done that yet.\n",
"@Lukasa Does that allow offline tests? Scratch that, I read the purpose: sounds great.\n",
"@frispete we have a similar policy in Debian and I follow the same workflow, testing requests locally.\n\nI'm testing calling py.test (and py.test-3: on Debian we had to version the name, I know it's ugly...) directly and all the tests are fine (with one xfailed):\n\n```\n% py.test test_requests.py\n% py.test-3 test_requests.py\n```\n\n@Lukasa it's a very good news the plan about switching to pytest-httpbin! I would like to help, is there a ticket about this?\n",
"@eriol Yup, see #2184. We're very close now, just got problems with one threaded test. \n",
"@Lukasa Confirming this issue too (requests 2.8.1). The invocation method is setup.py test, which runs py.test directly. I was a bit confused too when I saw the mark.xfail, but it's definitely making the tests fail:\n\n```\nEnsures that we properly deal with different kinds of IO streams. ... ok\n\n======================================================================\nFAIL: test_response_iter_lines_reentrant (test_requests.RequestsTestCase)\nResponse.iter_lines() is not reentrant safe\n----------------------------------------------------------------------\nTraceback (most recent call last):\n File \"/usr/home/koobs/repos/freebsd/ports/www/py-requests/work/requests-2.8.1/test_requests.py\", line 1127, in test_response_iter_lines_reentrant\n assert len(list(r.iter_lines())) == 3\nAssertionError\n\n----------------------------------------------------------------------\nRan 150 tests in 94.706s\n\nFAILED (failures=1)\n*** Error code 1\n```\n\npytest version is 2.7.1.\n",
"@koobs out of interest, what happens if you use a newer py.test?\n",
"@Lukasa It's on my list of ports to update, will report back post-update :)\n",
"@Lukasa Same failure with pytest 2.8.2 (latest py as well)\n\nEdit: Fails with setuptools test command. Direct py.test passes (that test), but fails on another :)\n\n```\nTestTimeout.test_total_timeout_connect\n<snip>\nrequests/adapters.py:423: ConnectionError\n<snip>\n1 failed, 164 passed, 1 xfailed in 87.75 seconds\n```\n",
"@koobs So this suggests there's some problem with the py.test to `setup.py test` integration (given that xfail doesn't work appropriately).\n",
"@Lukasa I would concur given the tests/evidence so far. Perhaps @hpk42 could shed some light on it\n",
"I found this, which may be related: https://github.com/pytest-dev/pytest/issues/750\n",
"Glimpsing through the various comments here i am not sure what the precise question related to requests/pytest interactions, sorry. Could you shed some light on this? :)\n",
"@hpk42 Thanks for responding :) TLDR is: tests marked xfail seem to be reported as failures (not expected failures, resulting in a OK on the test suite) for `setup.py test` and `python -m pytest` invocations of py.test (py.test direct works fine)\n",
"@koobs i can't reproduce. Both \"python -m pytest\" and \"setup.py test\" work fine for me for an xfailing test when using the standard integration https://pytest.org/latest/goodpractises.html#integration-with-setuptools-test-commands (using pytest-2.8.2 but i think it worked before). \n",
"@hpk42 Interesting. I'm out of ideas for isolating/debugging it from here, any thoughts? Just for clarity I was just +1 on the original report by @frispete \n"
] |
https://api.github.com/repos/psf/requests/issues/2839
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2839/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2839/comments
|
https://api.github.com/repos/psf/requests/issues/2839/events
|
https://github.com/psf/requests/pull/2839
| 112,534,727 |
MDExOlB1bGxSZXF1ZXN0NDgzMDA4MzI=
| 2,839 |
Make sure we build environment settings properly.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 3 |
2015-10-21T08:24:30Z
|
2021-09-08T04:01:06Z
|
2016-04-15T21:38:50Z
|
MEMBER
|
resolved
|
Resolves #2836.
This actually requires a slightly tricky re-ordering here: the proxies work needs to happen separately to make sure keys set to `None` DTRT.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2839/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2839/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2839.diff",
"html_url": "https://github.com/psf/requests/pull/2839",
"merged_at": "2016-04-15T21:38:50Z",
"patch_url": "https://github.com/psf/requests/pull/2839.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2839"
}
| true |
[
"This looks like :cake: to me!\n",
"Rebase. Also appears as though it should have been merged six months ago. \n",
"Rebase complete.\n"
] |
https://api.github.com/repos/psf/requests/issues/2838
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2838/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2838/comments
|
https://api.github.com/repos/psf/requests/issues/2838/events
|
https://github.com/psf/requests/issues/2838
| 112,356,995 |
MDU6SXNzdWUxMTIzNTY5OTU=
| 2,838 |
UnicodeEncodeError: 'latin-1' codec can't encode characters in position 52-55: ordinal not in range(256)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9416005?v=4",
"events_url": "https://api.github.com/users/DeadNumbers/events{/privacy}",
"followers_url": "https://api.github.com/users/DeadNumbers/followers",
"following_url": "https://api.github.com/users/DeadNumbers/following{/other_user}",
"gists_url": "https://api.github.com/users/DeadNumbers/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DeadNumbers",
"id": 9416005,
"login": "DeadNumbers",
"node_id": "MDQ6VXNlcjk0MTYwMDU=",
"organizations_url": "https://api.github.com/users/DeadNumbers/orgs",
"received_events_url": "https://api.github.com/users/DeadNumbers/received_events",
"repos_url": "https://api.github.com/users/DeadNumbers/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DeadNumbers/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DeadNumbers/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DeadNumbers",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-20T12:26:47Z
|
2021-09-08T21:00:57Z
|
2015-10-20T12:28:03Z
|
NONE
|
resolved
|
Requests 2.8.1
Python 3.5.0 (default, Sep 20 2015, 11:56:03)
[GCC 5.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
import requests
r = requests.post('https://2ch.hk/makaba/posting.fcgi', data='json=1&task=post&board=test&thread=80980710&comment=тест')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.5/site-packages/requests/api.py", line 109, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/lib/python3.5/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3.5/site-packages/requests/adapters.py", line 370, in send
timeout=timeout
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen
body=body, headers=headers)
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 353, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/lib/python3.5/http/client.py", line 1083, in request
self._send_request(method, url, body, headers)
File "/usr/lib/python3.5/http/client.py", line 1127, in _send_request
body = body.encode('iso-8859-1')
UnicodeEncodeError: 'latin-1' codec can't encode characters in position 52-55: ordinal not in range(256)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2838/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2838/timeline
| null |
completed
| null | null | false |
[
"Please do not attempt to send body data in unicode strings. You need to encode that before you send it: `data='json=1&task=post&board=test&thread=80980710&comment=тест'.encode('utf-8')`\n"
] |
https://api.github.com/repos/psf/requests/issues/2837
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2837/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2837/comments
|
https://api.github.com/repos/psf/requests/issues/2837/events
|
https://github.com/psf/requests/issues/2837
| 112,253,265 |
MDU6SXNzdWUxMTIyNTMyNjU=
| 2,837 |
Insecure Platform Warning
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3393794?v=4",
"events_url": "https://api.github.com/users/ueg1990/events{/privacy}",
"followers_url": "https://api.github.com/users/ueg1990/followers",
"following_url": "https://api.github.com/users/ueg1990/following{/other_user}",
"gists_url": "https://api.github.com/users/ueg1990/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ueg1990",
"id": 3393794,
"login": "ueg1990",
"node_id": "MDQ6VXNlcjMzOTM3OTQ=",
"organizations_url": "https://api.github.com/users/ueg1990/orgs",
"received_events_url": "https://api.github.com/users/ueg1990/received_events",
"repos_url": "https://api.github.com/users/ueg1990/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ueg1990/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ueg1990/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ueg1990",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-19T23:25:49Z
|
2021-09-08T21:00:57Z
|
2015-10-20T08:24:32Z
|
CONTRIBUTOR
|
resolved
|
I am contributing to a project that uses requests library (version 2.7.0) and even though the requests return outputs without any issue, I see the following warnings being printed:
/Users/ue/.virtualenvs/imgurpython/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/util/ssl_.py:90: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail.
Any help on how to fix this?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2837/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2837/timeline
| null |
completed
| null | null | false |
[
"@ueg1990 This is occurring because you're using this on an old version of Python which does not have a fully-featured `ssl` module. You can fix it either by upgrading to 2.7.9 or later, or by installing `pyopenssl`, `ndg-httpsclient`, and `pyasn1`.\n"
] |
https://api.github.com/repos/psf/requests/issues/2836
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2836/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2836/comments
|
https://api.github.com/repos/psf/requests/issues/2836/events
|
https://github.com/psf/requests/pull/2836
| 112,250,476 |
MDExOlB1bGxSZXF1ZXN0NDgxMzc5NzM=
| 2,836 |
Use session.verify
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4534764?v=4",
"events_url": "https://api.github.com/users/causton81/events{/privacy}",
"followers_url": "https://api.github.com/users/causton81/followers",
"following_url": "https://api.github.com/users/causton81/following{/other_user}",
"gists_url": "https://api.github.com/users/causton81/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/causton81",
"id": 4534764,
"login": "causton81",
"node_id": "MDQ6VXNlcjQ1MzQ3NjQ=",
"organizations_url": "https://api.github.com/users/causton81/orgs",
"received_events_url": "https://api.github.com/users/causton81/received_events",
"repos_url": "https://api.github.com/users/causton81/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/causton81/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/causton81/subscriptions",
"type": "User",
"url": "https://api.github.com/users/causton81",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2015-10-19T23:02:35Z
|
2021-09-08T06:00:48Z
|
2015-10-21T08:24:44Z
|
NONE
|
resolved
|
session.verify is currently ignored. Didn't have time for a unit test, but this is one solution.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2836/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2836/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2836.diff",
"html_url": "https://github.com/psf/requests/pull/2836",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2836.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2836"
}
| true |
[
"@causton81 What evidence do you have that `Session.verify` is ignored? This works fine for me:\n\n``` python\n>>> import requests\n>>> s = requests.Session()\n>>> s.verify = False\n>>> s.get('https://kennethreitz.org/')\n/Users/cory/.pyenv/versions/2.7.10/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py:789: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html\n InsecureRequestWarning)\n/Users/cory/.pyenv/versions/2.7.10/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py:789: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html\n InsecureRequestWarning)\n<Response [200]>\n```\n",
"Sorry for the lack of details. The problem is when either `REQUESTS_CA_BUNDLE` or `CURL_CA_BUNDLE` is set in the environment and session.trust_env==True, then session.verify is ignored. I added a unit test to demonstrate. Just for completeness I missed this for awhile because I was testing with Python 2.7.3 which ignores cert issues. I noticed it when testing with Python 2.7.9 and env variable REQUESTS_CA_BUNDLE.\n",
"Ah, yes, I see the problem now.\n\nWe have a similar issue with proxy settings, where proxies set in the environment can override proxies set on the session. I'm beginning to wonder if we can fix two problems in one go here, by taking the `trust_env` block of `merge_environment_settings` and moving it below the `merge_setting` lines in that function.\n\n@causton81 This is definitely a real bug, but we need a bit of time to work out what the fix is and what branch it'll go on to (while this is strictly a bug fix, it's been in the product long enough that it's also an API compatibility break and needs to be dealt with cautiously).\n\n@sigmavirus24 What do you think about my proposed rearrangement of `merge_environment_settings`?\n",
"@Lukasa the rearrangement is something we had discussed for 3.0.0 because it's fundamentally backwards incompatible. I'm in favor of it though because I think the order of precendence should be (in order of least to most important):\n- Environment\n- Session settings\n- Per-call settings\n\n(For those features to which this applies)\n",
"Ok, let's get this done for 3.0.0.\n",
"Ok, I'm closing this in favour of #2839.\n"
] |
https://api.github.com/repos/psf/requests/issues/2835
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2835/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2835/comments
|
https://api.github.com/repos/psf/requests/issues/2835/events
|
https://github.com/psf/requests/issues/2835
| 112,222,516 |
MDU6SXNzdWUxMTIyMjI1MTY=
| 2,835 |
readtimeout of zero will not send the request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/460530?v=4",
"events_url": "https://api.github.com/users/trcarden/events{/privacy}",
"followers_url": "https://api.github.com/users/trcarden/followers",
"following_url": "https://api.github.com/users/trcarden/following{/other_user}",
"gists_url": "https://api.github.com/users/trcarden/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/trcarden",
"id": 460530,
"login": "trcarden",
"node_id": "MDQ6VXNlcjQ2MDUzMA==",
"organizations_url": "https://api.github.com/users/trcarden/orgs",
"received_events_url": "https://api.github.com/users/trcarden/received_events",
"repos_url": "https://api.github.com/users/trcarden/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/trcarden/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/trcarden/subscriptions",
"type": "User",
"url": "https://api.github.com/users/trcarden",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2015-10-19T20:08:01Z
|
2021-09-08T21:00:57Z
|
2015-10-20T08:30:20Z
|
NONE
|
resolved
|
The following code will not send a request:
``` python
endpoint = 'some.server.com'
data = {'foo':123}
requests.post(self._endpoint , json=data,
verify=False, timeout=(5, 0.00))
```
I would expect the library to send the request but not wait for the server to respond. This would correspond to a fire and forget type of request particularly useful for logging of non critical information
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2835/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2835/timeline
| null |
completed
| null | null | false |
[
"What does \"will not send the request\" mean? On my system, the request _is_ sent, but a `ReadTimeoutError` is raised, which feels like exactly the expected behaviour. What are you seeing that makes you believe that's not happening for you?\n\nPlease note, however, that setting a timeout of exactly zero is [a _bad idea_](https://github.com/kennethreitz/requests/issues/2699#issuecomment-126494944): see #2699 for more. We plan to forbid doing that entirely in future: instead, you should choose a small floating point value.\n",
"So \"will not send the request\" means that the request isn't sent to my server when the timeout is set to zero. I also get the ReadTimeoutError but no sending occurs. My request is sent if i raise the read timeout from 0 to 0.01. Based on #2699 that it would be a bad idea for the library but its not bad from a usage perspective. In other words setting a read timeout of zero seems to me to be a valid use case even if the underlying socket implementation has a bug in it. \n\nHowever if you ended up forbidding it and provided a workaround, that would be better than it _sometimes_ working. I just hope that its not a race condition of some kind and that if you reduced the timeout to some arbitrarily small number that you would start to see what i was experiencing.\n",
"@trcarden Our intention is that we will forbid it: it's simply not a sensible thing to pass to this library. Generally speaking, the better logic for a 'fire-and-forget' request is to set `stream=True` and simply close the request afterwards. This will ensure that the server really did receive the request, but won't block on reading any of the response body.\n",
"> but won't block on reading any of the response body.\n\nIt will block on reading the headers though fwiw\n",
"Ok, I am trying to be practical here. I don't want to wait on the response coming back from the server but i do want to obligate the system to send the request. I could use UDP sockets but that requires a new server and a bunch more overhead. I completely understand TCP/IP and HTTP aren't really built for this but i think i can get 90%+ of what i need just setting a small value on the read-timeout. I did read in a couple places that setting a timeout of zero doesn't \"obligate\" the system to send the message (might only make it to buffers without getting transmitted) depending on the local implementation so i see why my problem might have been occurring. \n\n@sigmavirus24 thank you for the note on the headers. Emulating a \"best effort\" protocol on top of TCP/IP and HTTP with a upper bound on total wait time (limitations acknowledged) seems to work better with low read-timeout settings than enabling streaming responses. Eventually I think we will enable a UDP server.\n"
] |
https://api.github.com/repos/psf/requests/issues/2834
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2834/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2834/comments
|
https://api.github.com/repos/psf/requests/issues/2834/events
|
https://github.com/psf/requests/issues/2834
| 112,105,186 |
MDU6SXNzdWUxMTIxMDUxODY=
| 2,834 |
Response.text as byte string
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15190816?v=4",
"events_url": "https://api.github.com/users/rschwarz1711/events{/privacy}",
"followers_url": "https://api.github.com/users/rschwarz1711/followers",
"following_url": "https://api.github.com/users/rschwarz1711/following{/other_user}",
"gists_url": "https://api.github.com/users/rschwarz1711/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rschwarz1711",
"id": 15190816,
"login": "rschwarz1711",
"node_id": "MDQ6VXNlcjE1MTkwODE2",
"organizations_url": "https://api.github.com/users/rschwarz1711/orgs",
"received_events_url": "https://api.github.com/users/rschwarz1711/received_events",
"repos_url": "https://api.github.com/users/rschwarz1711/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rschwarz1711/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rschwarz1711/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rschwarz1711",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-19T09:19:08Z
|
2021-09-08T21:00:58Z
|
2015-10-19T09:20:13Z
|
NONE
|
resolved
|
Hey,
we are missing informations how to stop the requests library from guessing the content type of response.text.
http://docs.python-requests.org/en/latest/api/?highlight=text#requests.Response.text
For our use case we need the original byte string. The current docs don't explain how to get it.
Please improve the docs.
Best regards
René
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2834/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2834/timeline
| null |
completed
| null | null | false |
[
"The docs absolutely do say how to get the original byte string, just a few lines up from where you linked: http://docs.python-requests.org/en/latest/api/?highlight=text#requests.Response.content\n\nIt's `response.content`.\n"
] |
https://api.github.com/repos/psf/requests/issues/2833
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2833/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2833/comments
|
https://api.github.com/repos/psf/requests/issues/2833/events
|
https://github.com/psf/requests/pull/2833
| 111,888,014 |
MDExOlB1bGxSZXF1ZXN0NDc5NDUxOTE=
| 2,833 |
Check Content-Length when present.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4534764?v=4",
"events_url": "https://api.github.com/users/causton81/events{/privacy}",
"followers_url": "https://api.github.com/users/causton81/followers",
"following_url": "https://api.github.com/users/causton81/following{/other_user}",
"gists_url": "https://api.github.com/users/causton81/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/causton81",
"id": 4534764,
"login": "causton81",
"node_id": "MDQ6VXNlcjQ1MzQ3NjQ=",
"organizations_url": "https://api.github.com/users/causton81/orgs",
"received_events_url": "https://api.github.com/users/causton81/received_events",
"repos_url": "https://api.github.com/users/causton81/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/causton81/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/causton81/subscriptions",
"type": "User",
"url": "https://api.github.com/users/causton81",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2015-10-16T18:39:49Z
|
2021-09-08T06:00:51Z
|
2015-10-17T21:15:13Z
|
NONE
|
resolved
|
This is pretty hard to reproduce, but I have seen truncated responses in the presence of connection resets that are silently ignored by requests.
This link describes how the vagaries of TCP and socket API may return an error or 0 (end-of-file) when the stars align just right:
http://blog.netherlabs.nl/articles/2009/01/18/the-ultimate-so_linger-page-or-why-is-my-tcp-not-reliable
This patch checks that the number of bytes read matches the declared Content-Length (when declared). I'm not an HTTP expert, so I may be missing something, but I hope this is useful. This also only covers the urllib3 code path.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2833/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2833/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2833.diff",
"html_url": "https://github.com/psf/requests/pull/2833",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2833.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2833"
}
| true |
[
"@causton `httplib` should do this check by itself already. Can you try to reproduce this to see if that happens? If it doesn't, can you post your repro code here?\n",
"I used netcat to send a 1 MiB response that's one byte short of declared Content-Length. The script below shows the problem for me with the output shown. It fails silently and writes a file shorter than Content-Length.\n\n``` bash\n#!/usr/bin/env bash\n\nset -e\n\n# 1 MiB\nsize=1048576\n\n_16=1234567890123456\n\nsize_div_16=$(( $size / 16 ))\n\necho \"creating $size byte file\"\ntruncate -s0 body\nfor i in `seq $size_div_16`; do\n echo -n $_16 >> body\ndone\n\ncat >hdr <<EOF\nHTTP/1.1 200 OK\nContent-Length: $size\n\nEOF\n\ncat hdr body >response\n\nwc -c hdr body response\n\n\ncat >rst.py <<\"SCRIPT\"\nimport requests\n\nres = requests.get('http://localhost:1234')\n\nwith open('body.rst', 'w') as fp:\n fp.write(res.content)\nSCRIPT\n\none_less=$(( size - 1 ))\n\ndd if=response bs=$one_less count=1 | nc -l -p 1234 &>/dev/null &\n\njob=$!\n\nsleep 3\n\npython rst.py &\n\nsleep 3\n\nkill $job\n\nwait\n\nwc -c body body.rst\n```\n\nUnpatched output:\n\n```\n(requests2710)[causton@causton requests]$ bash shoddy-server.sh \ncreating 1048576 byte file\n 41 hdr\n1048576 body\n1048617 response\n2097234 total\n1+0 records in\n1+0 records out\n1048575 bytes (1.0 MB) copied, 3.08301 s, 340 kB/s\n1048576 body\n1048534 body.rst\n2097110 total\n(requests2710)[causton@causton requests]$ \n```\n\nOutput after this PR:\n\n```\n(requests2710)[causton@causton requests]$ bash shoddy-server.sh \ncreating 1048576 byte file\n 41 hdr\n1048576 body\n1048617 response\n2097234 total\n1+0 records in\n1+0 records out\n1048575 bytes (1.0 MB) copied, 3.10435 s, 338 kB/s\nTraceback (most recent call last):\n File \"rst.py\", line 3, in <module>\n res = requests.get('http://localhost:1234')\n File \"/home/causton/git/requests/requests/api.py\", line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"/home/causton/git/requests/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File 
\"/home/causton/git/requests/requests/sessions.py\", line 468, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/causton/git/requests/requests/sessions.py\", line 608, in send\n r.content\n File \"/home/causton/git/requests/requests/models.py\", line 747, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"/home/causton/git/requests/requests/models.py\", line 677, in generate\n expected_len, actual_len))\nrequests.exceptions.TruncatedContentError: Expected: 1048576, got: 1048534\n1048576 body\nwc: body.rst: No such file or directory\n1048576 total\n(requests2710)[causton@causton requests]$ \n```\n",
"Iiiinteresting. So @causton81, I think you're right. Diving into the `httplib` source code shows this:\n\n``` python\ns = self.fp.read(amt)\nif not s and amt:\n # Ideally, we would raise IncompleteRead if the content-length\n # wasn't satisfied, but it might break compatibility.\n self.close()\n```\n\nSo, I think you're right and this doesn't work. I think this would be best implemented in `urllib3`, frankly: in urllib3 they can take advantage of the `HTTPResponse.length` parameter that `httplib` maintains (but is weirdly refusing to use) to make sure this works in almost all cases. @shazow does that sound reasonable to you?\n",
"Could give it a try in urllib3. I've found that we can't rely on content-length being correct, though. Sometimes it'll be smaller than the response, other times it will be larger, and I'd prefer not to explode when it doesn't match up (but rather let the user handle it how they like).\n",
"> Sometimes it'll be smaller than the response,\n\nIt'll be smaller when the it's compressed and we're decoding the content for the user right? Couldn't we do a pre-decompression check in that case? Or are there other cases that I'm not aware of?\n\n> I'd prefer not to explode when it doesn't match up (but rather let the user handle it how they like).\n\nPerhaps a util function (to avoid yet another option/flag) that will take the other parameters for a call to stream as well as handling logic to check the size of the response and raise an exception? This then becomes a very explicit opt-in.\n",
"> It'll be smaller when the it's compressed and we're decoding the content for the user right? Couldn't we do a pre-decompression check in that case? Or are there other cases that I'm not aware of?\n\nI mean roflservers. There are lots of roflservers out there that yoloreturn incorrect lolvalues.\n\n> Perhaps a util function...\n\nUmm not sure what that might look like. How about we start a urllib3 issue thread with some specific API proposals?\n",
"Sounds like a plan, let's move to urllib3.\n",
"Should I start an issue on urllib3 then?\n",
"Yes please!\n"
] |
https://api.github.com/repos/psf/requests/issues/2832
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2832/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2832/comments
|
https://api.github.com/repos/psf/requests/issues/2832/events
|
https://github.com/psf/requests/issues/2832
| 111,879,729 |
MDU6SXNzdWUxMTE4Nzk3Mjk=
| 2,832 |
Not marked as python3.4 compatible in PyPi
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/419698?v=4",
"events_url": "https://api.github.com/users/jra3/events{/privacy}",
"followers_url": "https://api.github.com/users/jra3/followers",
"following_url": "https://api.github.com/users/jra3/following{/other_user}",
"gists_url": "https://api.github.com/users/jra3/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jra3",
"id": 419698,
"login": "jra3",
"node_id": "MDQ6VXNlcjQxOTY5OA==",
"organizations_url": "https://api.github.com/users/jra3/orgs",
"received_events_url": "https://api.github.com/users/jra3/received_events",
"repos_url": "https://api.github.com/users/jra3/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jra3/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jra3/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jra3",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2015-10-16T17:55:06Z
|
2021-09-08T21:00:59Z
|
2015-10-16T21:03:31Z
|
NONE
|
resolved
|
Hi there,
http://docs.python-requests.org/en/latest/ says there is support for python 2.6-3.4, but PyPi lists only 2.7. I discovered this when i used the tool `caniusepython3` and saw requests on the list of packages holding me back.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2832/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2832/timeline
| null |
completed
| null | null | false |
[
"Yeah, all our trove classifiers seem to be missing on PyPI. Looks like it was a while since it was registered. Resolved now. =)\n\nThanks for the report!\n"
] |