| column | dtype | values |
|---|---|---|
| url | string | lengths 50–53 |
| repository_url | string | 1 class |
| labels_url | string | lengths 64–67 |
| comments_url | string | lengths 59–62 |
| events_url | string | lengths 57–60 |
| html_url | string | lengths 38–43 |
| id | int64 | 597k–2.65B |
| node_id | string | lengths 18–32 |
| number | int64 | 1–6.83k |
| title | string | lengths 1–296 |
| user | dict | |
| labels | list | lengths 0–5 |
| state | string | 2 classes |
| locked | bool | 2 classes |
| assignee | dict | |
| assignees | list | lengths 0–4 |
| milestone | dict | |
| comments | int64 | 0–211 |
| created_at | string | length 20 |
| updated_at | string | length 20 |
| closed_at | string | length 20, nullable |
| author_association | string | 3 classes |
| active_lock_reason | string | 4 classes |
| body | string | lengths 0–65.6k, nullable |
| closed_by | dict | |
| reactions | dict | |
| timeline_url | string | lengths 59–62 |
| performed_via_github_app | null | |
| state_reason | string | 3 classes |
| draft | bool | 2 classes |
| pull_request | dict | |
| is_pull_request | bool | 2 classes |
| issue_comments | list | lengths 0–30 |
https://api.github.com/repos/psf/requests/issues/4061
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4061/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4061/comments
|
https://api.github.com/repos/psf/requests/issues/4061/events
|
https://github.com/psf/requests/issues/4061
| 231,404,276 |
MDU6SXNzdWUyMzE0MDQyNzY=
| 4,061 |
readthedocs is broken
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-05-25T17:25:28Z
|
2021-09-08T10:00:39Z
|
2017-05-26T00:12:16Z
|
CONTRIBUTOR
|
resolved
|
ugh https://readthedocs.org/projects/requests/builds/5472443/
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4061/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4061/timeline
| null |
completed
| null | null | false |
[
"very unclear what's going on from the error message. @ericholscher?",
"Does it happen locally?",
"I seem to have it reproducing locally, so I'm guessing it's a bug in something recent on your side. ",
"Locally I see your config version being set as `ipdb> version\r\n<module 'requests.__version__' from '/Users/eric/checkouts/requests/requests/__version__.pyc'>` which seems...wrong :)",
"Looks like this is a byproduct of #4059. Something is getting confused about requests.__version__ the module vs requests.__version__ the attr. The import statement in init may need to be more explicit, or the module name needs to change.",
"gah, sorry @ericholscher. I should have checked that before pinging you!"
] |
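
The thread above is about `requests.__version__` being a module in some import orders and a string in others, after #4059 added `requests/__version__.py`. The snippet below is not the project's fix, just a hedged sketch of how a consumer (such as a docs build) can normalise the two cases:

```python
import types

import requests

# Depending on how the package re-exports it, requests.__version__ may be
# the version string or the requests.__version__ submodule (the "module vs
# attr" confusion described in the comments above). Normalise to the string.
version = requests.__version__
if isinstance(version, types.ModuleType):
    version = version.__version__
print(version)
```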
https://api.github.com/repos/psf/requests/issues/4060
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4060/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4060/comments
|
https://api.github.com/repos/psf/requests/issues/4060/events
|
https://github.com/psf/requests/issues/4060
| 231,402,277 |
MDU6SXNzdWUyMzE0MDIyNzc=
| 4,060 |
"params" arg of requests.get does not properly encode json
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1395651?v=4",
"events_url": "https://api.github.com/users/mxndrwgrdnr/events{/privacy}",
"followers_url": "https://api.github.com/users/mxndrwgrdnr/followers",
"following_url": "https://api.github.com/users/mxndrwgrdnr/following{/other_user}",
"gists_url": "https://api.github.com/users/mxndrwgrdnr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mxndrwgrdnr",
"id": 1395651,
"login": "mxndrwgrdnr",
"node_id": "MDQ6VXNlcjEzOTU2NTE=",
"organizations_url": "https://api.github.com/users/mxndrwgrdnr/orgs",
"received_events_url": "https://api.github.com/users/mxndrwgrdnr/received_events",
"repos_url": "https://api.github.com/users/mxndrwgrdnr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mxndrwgrdnr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mxndrwgrdnr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mxndrwgrdnr",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2017-05-25T17:17:28Z
|
2021-09-08T10:00:41Z
|
2017-05-25T17:32:22Z
|
NONE
|
resolved
|
I have an API endpoint that handles json. This request returns a 400 error
```
requests.get(url, params={"json": json.dumps(myDict)})
```
while this works fine
```
requests.post(url,json=myDict)
```
Is there no proper way to pass json in a GET request?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4060/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4060/timeline
| null |
completed
| null | null | false |
[
"This is entirely up to your API endpoint. If the endpoint doesn't want JSON in the request URL, then it will error out: if it does, then it won't. You need to read your endpoint documentation or observe other requests to see how they send the data.",
"No, the API can handle JSON in the request URL. It's just that when I pass json as a value a-la `params={key:value}` in the GET request, the json gets formatted improperly. The url it returns places a `+` character after every `:` in the json. ",
"That's not improper, that's the result of `json.dumps`:\r\n\r\n```python\r\n>>> json.dumps({'test': 'name'})\r\n'{\"test\": \"name\"}'\r\n```\r\n\r\nIf that's not the way you want your data to look, you need to change the data. We're going to send exactly what you ask us to. :smile:",
"Ah, yes! So I guess by specifying `separators=(',' , ':')` the json.dumps() call I can get rid of the `+`'s. Thanks!"
] |
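
The resolution above hinges on two details: `json.dumps` uses `', '` and `': '` separators by default, and query-string encoding turns those spaces into `+`. A small sketch, using httpbin.org as a stand-in for the reporter's endpoint:

```python
import json

import requests

my_dict = {"test": "name"}

# Default separators put a space after ':', and urlencoding turns spaces into '+'.
print(json.dumps(my_dict))                           # {"test": "name"}

# Compact separators drop the spaces, so no '+' shows up in the URL.
compact = json.dumps(my_dict, separators=(",", ":"))
print(compact)                                       # {"test":"name"}

resp = requests.get("https://httpbin.org/get", params={"json": compact})
print(resp.url)   # braces and quotes are still percent-encoded, but no '+'
```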
https://api.github.com/repos/psf/requests/issues/4059
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4059/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4059/comments
|
https://api.github.com/repos/psf/requests/issues/4059/events
|
https://github.com/psf/requests/pull/4059
| 231,374,137 |
MDExOlB1bGxSZXF1ZXN0MTIyNDY0NDIw
| 4,059 |
__version__.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 14 |
2017-05-25T15:32:14Z
|
2021-09-06T00:06:54Z
|
2017-05-25T16:39:44Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4059/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4059/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4059.diff",
"html_url": "https://github.com/psf/requests/pull/4059",
"merged_at": "2017-05-25T16:39:44Z",
"patch_url": "https://github.com/psf/requests/pull/4059.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4059"
}
| true |
[
"waiting for ci to pass before merging",
"P.S. I'll be happy to pay Travis money if it means our tests will run faster. /cc @Lukasa ",
"I think appveyor is our bottleneck right now because it won't parallelize tests with the free version.\r\n\r\nAlso just wanted to make sure you'd seen #4058.",
"They'll only give you two concurrent builds at a time with paid though... sucks. ",
"And it's vastly overpriced for that. ",
"Yeah, not worth it for only one additional build slot.",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=h1) Report\n> Merging [#4059](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/539302a17d62c53ff6116144701a851cf700fe7b?src=pr&el=desc) will **increase** coverage by `0.02%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4059 +/- ##\n==========================================\n+ Coverage 89.75% 89.77% +0.02% \n==========================================\n Files 15 16 +1 \n Lines 1942 1946 +4 \n==========================================\n+ Hits 1743 1747 +4 \n Misses 199 199\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/\\_\\_version\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=tree#diff-cmVxdWVzdHMvX192ZXJzaW9uX18ucHk=) | `100% <100%> (ø)` | |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `92% <0%> (-1.55%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=footer). Last update [539302a...7abfd92](https://codecov.io/gh/kennethreitz/requests/pull/4059?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"this is the longest ci run in history. almost makes me want to setup a jenkins again.\r\n\r\nwonder if there's a better paid-for ci service we could use. ",
"i really don't like this waiting around stuff. ",
"wtf this is ridiculous ",
"omg",
"I think you've got builds queued up on appveyor. If you cancel the old ones, it'll speed things up.",
"Has the option to cancel old builds when superceded by a new commit for a PR been activated on Travis CI? Greatly speeds up build queue flushing for PRs. It's a newer feature. I'm not aware of a similar feature for AppVeyor.",
"Also doesn't Python 2.7 not have `exec()`? I only see `eval()` on the built-ins list."
] |
|
https://api.github.com/repos/psf/requests/issues/4058
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4058/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4058/comments
|
https://api.github.com/repos/psf/requests/issues/4058/events
|
https://github.com/psf/requests/pull/4058
| 231,210,521 |
MDExOlB1bGxSZXF1ZXN0MTIyMzUyMDc2
| 4,058 |
Stop relying on a regexp for version parsing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 3 |
2017-05-25T00:47:56Z
|
2021-09-06T00:06:54Z
|
2017-05-25T15:48:11Z
|
CONTRIBUTOR
|
resolved
|
Move all of our metadata into `requests/__about__.py` while keeping a
backwards compatible aliasing of metadata attributes. Also use this to
specify metadata in our setup.py
Closes #4057
---
For what it's worth, I based this off of [pypa/packaging](https://github.com/pypa/packaging/blob/master/setup.py)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4058/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4058/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4058.diff",
"html_url": "https://github.com/psf/requests/pull/4058",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4058.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4058"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=h1) Report\n> Merging [#4058](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/037108178c1fcc4484f5ccb16e390636ef02a7e6?src=pr&el=desc) will **decrease** coverage by `1.59%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4058 +/- ##\n=========================================\n- Coverage 89.75% 88.16% -1.6% \n=========================================\n Files 15 16 +1 \n Lines 1942 1943 +1 \n=========================================\n- Hits 1743 1713 -30 \n- Misses 199 230 +31\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/\\_\\_about\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvX19hYm91dF9fLnB5) | `100% <100%> (ø)` | |\n| [requests/packages/chardet/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvY2hhcmRldC9jb21wYXQucHk=) | `64.86% <0%> (-35.14%)` | :arrow_down: |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `80% <0%> (-13.55%)` | :arrow_down: |\n| [requests/\\_internal\\_utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvX2ludGVybmFsX3V0aWxzLnB5) | `93.75% <0%> (-6.25%)` | :arrow_down: |\n| [requests/models.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `91.15% <0%> (-2.27%)` | :arrow_down: |\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.23% <0%> (-0.72%)` | :arrow_down: |\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.45% <0%> (-0.48%)` | :arrow_down: |\n| [requests/utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=tree#diff-cmVxdWVzdHMvdXRpbHMucHk=) | `86% <0%> (-0.25%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=footer). Last update [0371081...108dfa9](https://codecov.io/gh/kennethreitz/requests/pull/4058?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"oh damn i did this too",
"Closing for #4059. @sigmavirus24 it would have been good to comment on #4057 when you started working on this, as I checked there before doing so :)"
] |
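
The PR body above describes the usual single-source-of-truth pattern: keep the metadata in one module and have `setup.py` read it without importing the package (which also touches the `exec()` question raised in the #4059 thread, since `exec(code, namespace)` is valid on both Python 2.7 and 3). A minimal, self-contained sketch of the pattern; the real `requests/__about__.py` contents are not reproduced here:

```python
# In a real setup.py the source would be open("requests/__about__.py").read();
# an inline string keeps this sketch runnable on its own.
metadata_source = '''
__title__ = "requests"
__version__ = "0.0.0"        # placeholder, not the real version
__author__ = "example"
'''

about = {}
exec(metadata_source, about)   # populate a plain dict instead of importing

print(about["__title__"], about["__version__"])
# setup(name=about["__title__"], version=about["__version__"], ...)
```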
https://api.github.com/repos/psf/requests/issues/4057
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4057/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4057/comments
|
https://api.github.com/repos/psf/requests/issues/4057/events
|
https://github.com/psf/requests/issues/4057
| 231,126,702 |
MDU6SXNzdWUyMzExMjY3MDI=
| 4,057 |
move setup.py to use a __version__.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 5 |
2017-05-24T18:12:45Z
|
2021-09-08T10:00:42Z
|
2017-05-25T16:39:57Z
|
CONTRIBUTOR
|
resolved
|
better pattern. death to regex.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4057/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4057/timeline
| null |
completed
| null | null | false |
[
"Would be nice to keep `__version__` exposed at the package level so that `requests.__version__` is still a thing.",
"Absolutely!",
"Only reason I note that is because I know that `pipenv` does not expose `.__version__` at the package level instead opting for `pipenv.__version__.__version__`.",
"@SethMichaelLarson can you open an issue about that?",
"Addressed in #4059"
] |
https://api.github.com/repos/psf/requests/issues/4056
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4056/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4056/comments
|
https://api.github.com/repos/psf/requests/issues/4056/events
|
https://github.com/psf/requests/pull/4056
| 231,109,093 |
MDExOlB1bGxSZXF1ZXN0MTIyMjc4MzE2
| 4,056 |
Disable pytest messing with warnings.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2017-05-24T17:05:15Z
|
2021-09-06T00:06:53Z
|
2017-05-24T17:46:12Z
|
MEMBER
|
resolved
|
We may be able to remove this if pytest resolves pytest-dev/pytest#2430, but until that point we need to turn the old behaviour back on.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4056/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4056/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4056.diff",
"html_url": "https://github.com/psf/requests/pull/4056",
"merged_at": "2017-05-24T17:46:12Z",
"patch_url": "https://github.com/psf/requests/pull/4056.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4056"
}
| true |
[
"I want to point out that it's pretty stupid that we have to do this, but there we are.",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=h1) Report\n> Merging [#4056](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/dbb67a13109edef8d02addf5b3fbc8c6f319f4aa?src=pr&el=desc) will **increase** coverage by `3.08%`.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4056 +/- ##\n==========================================\n+ Coverage 86.66% 89.75% +3.08% \n==========================================\n Files 15 15 \n Lines 1942 1942 \n==========================================\n+ Hits 1683 1743 +60 \n+ Misses 259 199 -60\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.92% <0%> (+0.47%)` | :arrow_up: |\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.95% <0%> (+1.42%)` | :arrow_up: |\n| [requests/models.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `93.42% <0%> (+2.26%)` | :arrow_up: |\n| [requests/\\_internal\\_utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvX2ludGVybmFsX3V0aWxzLnB5) | `100% <0%> (+6.25%)` | :arrow_up: |\n| [requests/utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvdXRpbHMucHk=) | `86.25% <0%> (+7%)` | :arrow_up: |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `93.54% <0%> (+9.67%)` | :arrow_up: |\n| [requests/packages/chardet/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvY2hhcmRldC9jb21wYXQucHk=) | `100% <0%> (+35.13%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=footer). Last update [dbb67a1...f3d310b](https://codecov.io/gh/kennethreitz/requests/pull/4056?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"> I want to point out that it's pretty stupid that we have to do this, but there we are.\r\n\r\nHi @Lukasa, sorry to hear that this affected you guys so. We are having some discussion related to this on pytest-dev/pytest#2430, it mean a lot if you could join the discussion and share your thoughts. 👍 ",
"@niccoddemus Yeah, I saw that thread. I am not sure how much value I can provide by weighing in: your warnings filter as added prevented us from tweaking the filter ourselves for testing purposes. I can drop in though. "
] |
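
For readers landing here from the pytest 3.1 fallout: pytest's warnings capture is a built-in plugin and can be disabled for a run with `-p no:warnings`. Whether that is exactly the change this PR made is an assumption; the snippet just shows the generic mechanism programmatically:

```python
# Programmatic equivalent of "pytest -p no:warnings": disable pytest's
# built-in warnings capture so the suite can manage the warnings filter
# itself (the pre-3.1 behaviour this PR wanted back).
import sys

import pytest

if __name__ == "__main__":
    sys.exit(pytest.main(["-p", "no:warnings", "tests/"]))  # test path is illustrative
```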
https://api.github.com/repos/psf/requests/issues/4055
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4055/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4055/comments
|
https://api.github.com/repos/psf/requests/issues/4055/events
|
https://github.com/psf/requests/issues/4055
| 231,101,432 |
MDU6SXNzdWUyMzExMDE0MzI=
| 4,055 |
remove 301 redirect cache
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 6 |
2017-05-24T16:37:46Z
|
2021-09-06T00:06:34Z
|
2017-05-29T19:14:03Z
|
CONTRIBUTOR
|
resolved
|
it's not thread-safe, it's problematic, and it's just not the best of ideas.
-----------
I'm assigning this to myself, but anyone can feel free to take this on if they're in the mood!
Just comment here when you start working on it, if so!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4055/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4055/timeline
| null |
completed
| null | null | false |
[
"I thought we'd already removed this for 3.0?",
"I think @kennethreitz has OK'd removing it for 2.x too.",
"Yeah, I'm personally fine with that. ",
"I opened a PR for this issue. (#4100)",
"✨🍰✨",
"@kennethreitz \r\n\r\nHow is it not threadsafe?\r\n\r\nThis `RecentlyUsedContainer` seems to be threadsafe. \r\n The implementation uses `threading.RLock` [see source](https://github.com/urllib3/urllib3/blob/master/src/urllib3/_collections.py)\r\n\r\nEven if the redirect cache was just a Python dictionary it would be threadsafe... right?\r\n\r\nhttps://docs.python.org/3/glossary.html#term-global-interpreter-lock\r\n\r\n> The mechanism used by the CPython interpreter to assure that only one thread executes Python bytecode at a time. This simplifies the CPython implementation by making the object model (including critical built-in types such as dict) implicitly safe against concurrent access.\r\n"
] |
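
On the thread-safety question in the last comment: each individual dict (or `RecentlyUsedContainer`) operation is indeed protected, but the cache is used as a compound check-then-update, and that sequence is not atomic. A hedged, illustrative sketch (not requests' actual code):

```python
import threading

redirect_cache = {}   # stands in for the session-level 301 cache

def follow(url):
    # The lookup and the store are each safe under the GIL, but another
    # thread can run between them, so both threads can miss the cache and
    # repeat the redirect round trip; with a bounded LRU container the
    # interleaving of inserts and evictions is even harder to reason about.
    target = redirect_cache.get(url)
    if target is None:
        target = url + "/moved"          # pretend this was a 301 round trip
        redirect_cache[url] = target
    return target

threads = [threading.Thread(target=follow, args=("http://example.com",))
           for _ in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(redirect_cache)
```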
https://api.github.com/repos/psf/requests/issues/4054
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4054/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4054/comments
|
https://api.github.com/repos/psf/requests/issues/4054/events
|
https://github.com/psf/requests/issues/4054
| 231,085,833 |
MDU6SXNzdWUyMzEwODU4MzM=
| 4,054 |
Easter Egg
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2017-05-24T15:45:15Z
|
2021-09-08T10:00:42Z
|
2017-05-24T15:50:48Z
|
CONTRIBUTOR
|
resolved
|
I'd like to embed an easter egg within requests — something very simple, perhaps just a single dunder variable that says "Kenneth was here".
OR!
```
>>> requests.__cake__
✨ 🍰 ✨
```
yes, i love this.
Any objections will be heard, and likely ignored, but open to thoughts if anyone feels strongly against it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4054/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4054/timeline
| null |
completed
| null | null | false |
[
"Done!",
"Did on my iPad, trusting CI to report if there are any failures caused by this one-line change. Keeping an eye out, as I don't have a dev environment on me at the moment. ",
"@Lukasa sorry! :sparkles: :cake: :sparkles:",
":wink: No worries."
] |
https://api.github.com/repos/psf/requests/issues/4053
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4053/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4053/comments
|
https://api.github.com/repos/psf/requests/issues/4053/events
|
https://github.com/psf/requests/issues/4053
| 230,978,998 |
MDU6SXNzdWUyMzA5Nzg5OTg=
| 4,053 |
Trouble with a specific request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7857364?v=4",
"events_url": "https://api.github.com/users/BobLamarley/events{/privacy}",
"followers_url": "https://api.github.com/users/BobLamarley/followers",
"following_url": "https://api.github.com/users/BobLamarley/following{/other_user}",
"gists_url": "https://api.github.com/users/BobLamarley/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/BobLamarley",
"id": 7857364,
"login": "BobLamarley",
"node_id": "MDQ6VXNlcjc4NTczNjQ=",
"organizations_url": "https://api.github.com/users/BobLamarley/orgs",
"received_events_url": "https://api.github.com/users/BobLamarley/received_events",
"repos_url": "https://api.github.com/users/BobLamarley/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/BobLamarley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BobLamarley/subscriptions",
"type": "User",
"url": "https://api.github.com/users/BobLamarley",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2017-05-24T09:38:25Z
|
2021-09-08T10:00:41Z
|
2017-05-24T17:33:33Z
|
NONE
|
resolved
|
Hi,
I have a trouble with my python code.
When i do this cUrl, it work :
`curl -i -XPOST 'http://51.255.80.74:8086/write?db=testdb' --data-binary 'lps0,host=serverlama,region=us-west value=0.65'`
My value is added in to my database.
I want to do the same in python with this script:
````
import requests
import xml.etree.ElementTree as ET

###
interestingData = ['lps0','wuc0']

###INFLUXDB INFOS
databaseName = 'testdb'
urlPostData = 'http://51.255.80.74:8086/write?db='+databaseName
urlGetData = 'http://192.168.168.110/admin/status.xml'

source = requests.get(urlGetData, stream=True)
tree = ET.parse(source.raw)
root = tree.getroot()

for child in root:
    if child.tag in interestingData:
        data = ''+child.tag+',host=serverlama,region=us-west value='+child.text
        response = requests.post(urlPostData, data=data, headers={'Content-Type': 'application/octet-stream'})
````
But i got this error :
````
Traceback (most recent call last):
File "script3.py", line 27, in <module>
response = requests.post(urlPostData, data=data, headers={'Content-Type': 'application/octet-stream'})
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 94, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/dist-packages/requests/adapters.py", line 407, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', error(111, 'Connection refused'))
````
My code seems good to me, i tried a lot of solutions but none works :(
If anyone has an idea, it would be great
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7857364?v=4",
"events_url": "https://api.github.com/users/BobLamarley/events{/privacy}",
"followers_url": "https://api.github.com/users/BobLamarley/followers",
"following_url": "https://api.github.com/users/BobLamarley/following{/other_user}",
"gists_url": "https://api.github.com/users/BobLamarley/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/BobLamarley",
"id": 7857364,
"login": "BobLamarley",
"node_id": "MDQ6VXNlcjc4NTczNjQ=",
"organizations_url": "https://api.github.com/users/BobLamarley/orgs",
"received_events_url": "https://api.github.com/users/BobLamarley/received_events",
"repos_url": "https://api.github.com/users/BobLamarley/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/BobLamarley/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BobLamarley/subscriptions",
"type": "User",
"url": "https://api.github.com/users/BobLamarley",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4053/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4053/timeline
| null |
completed
| null | null | false |
[
"Let's get discorse setup, for questions like this. ",
"Your GET is failing. Why are you trying that GET?",
"My get is working, and i make that get for dowload the xml content of the page.\r\nThe problem is on the requests.post().",
"So all I can tell you then is that the server is refusing our connection attempt.",
"Yes but i had already this information :)\r\nMy server doesn't refuse the connection attempt when i make a cUrl, why the connection would be refused with a request in python ?",
"Can we confirm you're not getting redirected? That is, can you set `allow_redirects=False`?",
"I'm so sorry, i was testing my curl on my PC, not in my container. \r\nThanks for your help ! :)\r\n",
"> Let's get discorse setup, for questions like this.\r\n\r\nWhy can't we simply use StackOverflow for that? That's something we don't need to host @kennethreitz ",
"@sigmavirus24 discourse will give us a free hosted account. I think it'll give our users the opportunity to talk to eachother and self-serve issues potentially. "
] |
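
The `allow_redirects=False` suggestion in the thread is a quick way to inspect what the very first hop returns before any redirect is followed; a sketch against a placeholder endpoint rather than the reporter's InfluxDB URL:

```python
import requests

# httpbin.org/redirect/1 stands in for the issue's /write endpoint so the
# snippet is runnable anywhere; it always answers with a redirect.
resp = requests.post("https://httpbin.org/redirect/1",
                     data="lps0,host=serverlama,region=us-west value=0.65",
                     allow_redirects=False)       # stop at the first response

print(resp.status_code)                 # 3xx means the server redirected us
print(resp.is_redirect)                 # True for a redirect with a Location header
print(resp.headers.get("Location"))     # where it wanted to send us
```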
https://api.github.com/repos/psf/requests/issues/4052
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4052/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4052/comments
|
https://api.github.com/repos/psf/requests/issues/4052/events
|
https://github.com/psf/requests/pull/4052
| 230,927,573 |
MDExOlB1bGxSZXF1ZXN0MTIyMTQ5NDU4
| 4,052 |
Fix regression in API caused by commit 85400d8d6751071ef78f042d1efa72…
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2484817?v=4",
"events_url": "https://api.github.com/users/hile/events{/privacy}",
"followers_url": "https://api.github.com/users/hile/followers",
"following_url": "https://api.github.com/users/hile/following{/other_user}",
"gists_url": "https://api.github.com/users/hile/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hile",
"id": 2484817,
"login": "hile",
"node_id": "MDQ6VXNlcjI0ODQ4MTc=",
"organizations_url": "https://api.github.com/users/hile/orgs",
"received_events_url": "https://api.github.com/users/hile/received_events",
"repos_url": "https://api.github.com/users/hile/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hile/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hile/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hile",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2017-05-24T05:51:20Z
|
2021-09-06T00:06:53Z
|
2017-05-27T01:34:30Z
|
CONTRIBUTOR
|
resolved
|
…bdcf76cc0e
- The added optional parameter changes API and should default to None
This utility call is used by for example requestbuilder package directly
which breaks because it passes only one argument to the function as it
used to be.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4052/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4052/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4052.diff",
"html_url": "https://github.com/psf/requests/pull/4052",
"merged_at": "2017-05-27T01:34:30Z",
"patch_url": "https://github.com/psf/requests/pull/4052.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4052"
}
| true |
[
"I don't believe this is a public API, and therefore we don't gaurentee API compatibility.\r\n\r\nBut, I could be wrong — either way, it seems like a nice change to make if it doesn't hurt anything. \r\n\r\n@nateprewitt any objections?",
"I think this is a good idea, but we should work out why the builders are failing first. ;)",
"Yeah, I don't see any issue with this, even if we're not supporting this API publicly.\r\n\r\nAs for the build, something in our dependency pipeline must have changed yesterday or earlier. Python <= 3.3 builds started failing on `test_https_warnings`, but I wasn't able to determine why at a quick glance. @Lukasa probably has more background about the architecture underneath this.",
"I think this has to be a pytest-httpbin thing. The error is ultimately because we now believe that the cert that pytest-httpbin is presenting on the secure endpoint now has a subjectAlternativeName field. I'm trying to locate it and work out why that warning isn't firing.",
"Hrm, no, that's not true. This test passes if run by itself, so this is a new state based test failure for some reason.",
"Ok I have a theory, testing now.",
"The new release of Pytest from yesterday has some interesting notes in the [changelog](https://docs.pytest.org/en/latest/changelog.html).\r\n\r\n* The pytest-warnings plugin has been integrated into the core, so now pytest automatically captures and displays warnings at the end of the test session. Thanks @nicoddemus for the PR.\r\n\r\n * pytest.warns now checks for subclass relationship rather than class equality. Thanks @lesteve for the PR (#2166)",
"Yeah, this is the warning issue: see #4056.",
"Ok cool, can we rebase this on the new master?",
"@Lukasa no conflicts!",
"@kennethreitz, I kicked the Travis builds and they seem to be working with the changes to master. If you can kick appveyor, we should be able to merge.",
"Hi everyone, sorry for barging in, but was this regression related to pytest's 3.1.0 warnings capture?",
"Hey @nicoddemus, no need to apologize 😊 The regression is unrelated to the pytest 3.1.0 changes but those changes were effecting our build when this PR was opened. We've since implemented the suggested workaround in #4056.",
"Great, thanks for the answer! 👍 ",
"So I'm not in favor of merging this. As was said earlier, this is not a public API.",
"@sigmavirus24, if we're not going to maintain this as a public API, it should probably be moved to `_internal_utils.py`? The doctstring for `utils.py` states \"This module provides utility functions that are used within Requests that are also useful for external consumption.\" which to me reads they're fair game for public use.\r\n\r\nEdit: or modify the docstring. There's a fair amount of stuff in `utils.py` that probably shouldn't be depended on externally.",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=h1) Report\n> Merging [#4052](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/0ba21a79c49117495e2c4f70264f366dea6e480e?src=pr&el=desc) will **increase** coverage by `1.5%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4052 +/- ##\n=========================================\n+ Coverage 88.15% 89.65% +1.5% \n=========================================\n Files 15 16 +1 \n Lines 1941 1962 +21 \n=========================================\n+ Hits 1711 1759 +48 \n+ Misses 230 203 -27\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvdXRpbHMucHk=) | `86% <100%> (ø)` | :arrow_up: |\n| [requests/\\_\\_version\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvX192ZXJzaW9uX18ucHk=) | `100% <0%> (ø)` | |\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.92% <0%> (+0.47%)` | :arrow_up: |\n| [requests/models.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `93.42% <0%> (+2.26%)` | :arrow_up: |\n| [requests/\\_internal\\_utils.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvX2ludGVybmFsX3V0aWxzLnB5) | `100% <0%> (+6.25%)` | :arrow_up: |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `92.68% <0%> (+9.34%)` | :arrow_up: |\n| [requests/packages/chardet/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvY2hhcmRldC9jb21wYXQucHk=) | `100% <0%> (+35.13%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=footer). Last update [0ba21a7...a9470fe](https://codecov.io/gh/kennethreitz/requests/pull/4052?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/4051
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4051/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4051/comments
|
https://api.github.com/repos/psf/requests/issues/4051/events
|
https://github.com/psf/requests/issues/4051
| 230,793,549 |
MDU6SXNzdWUyMzA3OTM1NDk=
| 4,051 |
gevent - 'module' object has no attribute 'epoll' with Python 2.7 and gevent 1.0.1; with requests 2.13.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6896731?v=4",
"events_url": "https://api.github.com/users/iamtennislover/events{/privacy}",
"followers_url": "https://api.github.com/users/iamtennislover/followers",
"following_url": "https://api.github.com/users/iamtennislover/following{/other_user}",
"gists_url": "https://api.github.com/users/iamtennislover/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/iamtennislover",
"id": 6896731,
"login": "iamtennislover",
"node_id": "MDQ6VXNlcjY4OTY3MzE=",
"organizations_url": "https://api.github.com/users/iamtennislover/orgs",
"received_events_url": "https://api.github.com/users/iamtennislover/received_events",
"repos_url": "https://api.github.com/users/iamtennislover/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/iamtennislover/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iamtennislover/subscriptions",
"type": "User",
"url": "https://api.github.com/users/iamtennislover",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2017-05-23T17:56:24Z
|
2021-09-08T10:00:43Z
|
2017-05-23T19:30:45Z
|
NONE
|
resolved
|
requests 2.13.0 fails
```
>>> import sys
>>> import gevent.monkey
>>> gevent.monkey.patch_all() # This must be called before any other modules calling socket/request
>>> import requests
>>>
>>> print 'py version:', sys.version_info
py version: sys.version_info(major=2, minor=7, micro=8, releaselevel='final', serial=0)
>>> print 'requests version:', requests.__version__
requests version: 2.13.0
>>> print 'gevent version:', gevent.__version__
gevent version: 1.0.1
>>>
>>> assert requests.get('http://www.python.org').content
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/api.py", line 70, in get
return request('get', url, params=params, **kwargs)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/api.py", line 56, in request
return session.request(method=method, url=url, **kwargs)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/sessions.py", line 488, in request
resp = self.send(prep, **send_kwargs)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/sessions.py", line 630, in send
history = [resp for resp in gen] if allow_redirects else []
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/sessions.py", line 190, in resolve_redirects
**adapter_kwargs
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request
self._validate_conn(conn)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 844, in _validate_conn
conn.connect()
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/connection.py", line 326, in connect
ssl_context=context)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py", line 324, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 438, in wrap_socket
rd = util.wait_for_read(sock, sock.gettimeout())
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 33, in wait_for_read
return _wait_for_io_events(socks, EVENT_READ, timeout)
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/util/wait.py", line 22, in _wait_for_io_events
with DefaultSelector() as selector:
File "/home/barikak/workplace/NetengCommonW/env/NetengCommon-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/util/selectors.py", line 364, in __init__
self._epoll = select.epoll()
AttributeError: 'module' object has no attribute 'epoll'
>>>
>>>
```

requests 2.9.1 works
```
>>> import sys
>>> import gevent.monkey
>>> gevent.monkey.patch_all() # This must be called before any other modules calling socket/request
>>>
>>> import requests
>>>
>>> print 'py version:', sys.version_info
py version: sys.version_info(major=2, minor=7, micro=8, releaselevel='final', serial=0)
>>> print 'requests version:', requests.__version__
requests version: 2.9.1
>>> print 'gevent version:', gevent.__version__
gevent version: 1.0.1
>>>
>>> assert requests.get('http://www.python.org').content
/home/barikak/workplace/NSDSwiftKanbanClientW/env/NSDSwiftKanbanClient-1.0/runtime/lib/python2.7/site-packages/requests/packages/urllib3/util/ssl_.py:120: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. For more information, see https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning.
InsecurePlatformWarning
>>>
```
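For readers landing here later: the resolution discussed in the comments below is purely about import order. A minimal sketch of the ordering that avoids this class of failure (not taken from the original report) is:

```python
# Sketch only: apply the gevent monkey patch before importing *anything*
# else, so urllib3's selector detection sees the already-patched `select`.
import gevent.monkey
gevent.monkey.patch_all()

import requests  # imported only after patching

assert requests.get('http://www.python.org').content
```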
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4051/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4051/timeline
| null |
completed
| null | null | false |
[
"Are you *absolutely certain* that nothing else imports requests before you get there? Because this seems to be a bug that only occurs if Requests is imported before gevent monkey patching.",
"In the meantime, please try 2.14.2.",
"I can't reproduce this on Python 2.7 with requests 2.13.0 and gevent 1.0.1 running the following:\r\n```python\r\n>>> import gevent.monkey\r\n>>> gevent.monkey.patch_all()\r\n>>> import requests\r\n>>> requests.get('http://www.google.com')\r\n<Response [200]>\r\n```\r\nThis means that `select` is probably patching before the import of `requests` as @Lukasa suggested.\r\n\r\nAfter upgrading to `2.14.2` you should be able to patch any time before `requests` is first used, including post-import.\r\n```python\r\n>>> import gevent.monkey\r\n>>> import requests\r\n>>> gevent.monkey.patch_all()\r\n>>> requests.get('http://www.google.com')\r\n<Response [200]>\r\n\r\n```",
"Thanks a lot... Yes, I am absolutely sure requests is not being imported before patching.\r\n\r\nGreat. I am pushing for 2.14.2, but also found a workaround:\r\n\r\n import sys\r\n import gevent.monkey\r\n gevent.monkey.patch_all() # This must be called before any other modules calling socket/request\r\n\r\n import requests\r\n\r\n print 'py version:', sys.version_info\r\n print 'requests version:', requests.__version__\r\n print 'gevent version:', gevent.__version__\r\n\r\n import select\r\n from requests.packages.urllib3.util.selectors import EpollSelector, SelectSelector, PollSelector\r\n\r\n if hasattr(select, \"epoll\"):\r\n print 'epollo'\r\n requests.packages.urllib3.util.wait.DefaultSelector = EpollSelector\r\n elif hasattr(select, \"select\"):\r\n print 'select'\r\n requests.packages.urllib3.util.wait.DefaultSelector = SelectSelector\r\n elif hasattr(select, \"poll\"):\r\n print 'poll'\r\n requests.packages.urllib3.util.wait.DefaultSelector = PollSelector\r\n\r\n assert requests.get('http://www.python.org').content",
"The error occurs even when you run the four lines that I posted above in a blank empty Python console?\r\n\r\nRun these lines in a new console and report back if this errors with the same error (It shouldn't for `requests==2.13.0`):\r\n```python\r\nimport gevent.monkey\r\ngevent.monkey.patch_all()\r\nimport requests\r\nrequests.get('http://www.google.com')\r\n```\r\n\r\nYour work-around is basically what `requests` is doing in the background in it's ported `urllib3.selectors` module. You shouldn't have to do this work-around given the proper patch->import order. If the patching occurs after the import and we assumed that you should use a selector (in this case, `select.epoll()`) that no longer exists in your post-patched `select` module then there are problems. This is why we believe that the patching is occurring before `requests` is imported.",
"I see... So, it looks like `import sys` if called before patch, throws an exception.\r\n\r\nThis doesn't throw error\r\n\r\n import gevent.monkey\r\n gevent.monkey.patch_all()\r\n import requests\r\n import sys\r\n requests.get('http://www.google.com')\r\n\r\n\r\nThis throws exception\r\n\r\n import sys\r\n import gevent.monkey\r\n gevent.monkey.patch_all()\r\n import requests\r\n requests.get('http://www.python.org')\r\n\r\n\r\n",
"Again, I cannot be emphatic enough: you must not import *anything* before applying a gevent monkey patch.",
"This is ultimately a gevent bug, not a Requests bug.",
"Thank you so much for your help"
] |
https://api.github.com/repos/psf/requests/issues/4050
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4050/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4050/comments
|
https://api.github.com/repos/psf/requests/issues/4050/events
|
https://github.com/psf/requests/pull/4050
| 230,791,794 |
MDExOlB1bGxSZXF1ZXN0MTIyMDU3NjM3
| 4,050 |
AttributeError raised when `files` file-pointer (fp) resolves to None
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3474715?v=4",
"events_url": "https://api.github.com/users/mayani/events{/privacy}",
"followers_url": "https://api.github.com/users/mayani/followers",
"following_url": "https://api.github.com/users/mayani/following{/other_user}",
"gists_url": "https://api.github.com/users/mayani/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mayani",
"id": 3474715,
"login": "mayani",
"node_id": "MDQ6VXNlcjM0NzQ3MTU=",
"organizations_url": "https://api.github.com/users/mayani/orgs",
"received_events_url": "https://api.github.com/users/mayani/received_events",
"repos_url": "https://api.github.com/users/mayani/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mayani/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mayani/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mayani",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2017-05-23T17:50:04Z
|
2021-09-03T00:10:56Z
|
2018-05-17T20:56:18Z
|
CONTRIBUTOR
|
resolved
|
The library raises an AttributeError ('NoneType' object has no attribute 'read') when the file pointer (fp) resolves to None.
```
>>> from requests import post
>>> r = post("https://example.com", files={"file-name": None})
AttributeError: 'NoneType' object has no attribute 'read'
```
However, when a param value or JSON field is None, it is not included in the request body.
```
>>> from requests import get
>>> r = get("https://example.com", params={"file-name": None})
>>> r.request.url
```
This commit makes the behaviour consistent for files.
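For illustration only (this is not the actual diff in this PR): the consistent behaviour amounts to dropping None-valued file entries before the multipart body is built, roughly like this hypothetical helper:

```python
import io

def _drop_none_files(files):
    """Hypothetical helper: return `files` without entries whose value is None."""
    return {name: fp for name, fp in files.items() if fp is not None}

files = {"file-name": None, "report": io.BytesIO(b"contents")}
print(_drop_none_files(files))  # only "report" survives
```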
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4050/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4050/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4050.diff",
"html_url": "https://github.com/psf/requests/pull/4050",
"merged_at": "2018-05-17T20:56:18Z",
"patch_url": "https://github.com/psf/requests/pull/4050.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4050"
}
| true |
[
"Love this!",
"# [Codecov](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=h1) Report\n> :exclamation: No coverage uploaded for pull request base (`master@5697148`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4050 +/- ##\n=========================================\n Coverage ? 88.17% \n=========================================\n Files ? 15 \n Lines ? 1945 \n Branches ? 0 \n=========================================\n Hits ? 1715 \n Misses ? 230 \n Partials ? 0\n```\n\n\n| [Impacted Files](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/models.py](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `91.23% <100%> (ø)` | |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=footer). Last update [5697148...916a0db](https://codecov.io/gh/requests/requests/pull/4050?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Hi @mayani, sorry this has been in limbo for so long. Would you mind rebasing this into a single commit on the current master branch? Otherwise, I'll look into cherry-picking this commit onto the master branch. Thanks!",
"You should probably rebase, there's a lot of unnecessary changes after syncing. :)",
"@nateprewitt I am unsure how to rebase after a sync. Might be best to cherry pick the commits or I can submit a fresh pull request.",
"@mayani, you can use squash to condense them into a single commit and then either use the `--onto` option of rebase or `cherry-pick` with the squashed hash will work too.\r\n\r\nYou’ll need to reset your squashed commit before merging in current master if you go the cherrypick route.",
"Hi @nateprewitt, I did the following\r\n```\r\ngit fetch upstream\r\ngit branch none_filepointer\r\n```\r\nI cherry picked the commits from mayani:master to non_filepointer. However, I won't be able to push as I don't have permissions on main repo's none_filepointer branch.\r\n\r\n",
"@mayani, apologies if the instructions weren’t very clear. I’ve already gathered your commits and squashed them into a single commit at the top of none_filepointer. You should be able to replace this branch with a copy of master and then cherry pick that commit on top. Then use a force push to replace this branch.",
"@nateprewitt Since the else part is now not being changed, where does it leave us with the none_filepointer branch?\r\n\r\n\"You should be able to replace this branch with a copy of master and then cherry pick that commit on top.\" Did you mean rename the branches to make local_nonepointer my new master?\r\n\r\nSorry, my git skills are pretty basic. ",
"@mayani, it looks like you created the pull request against your master branch. The easiest option is to replace it and then force-push to your repo. The following commands should do it.\r\n\r\n```\r\ngit branch -f master none_filepointer\r\ngit push -f origin master\r\n```\r\n\r\nOtherwise if we’re not making changes, I can just open up a PR with the none_filepointer branch and that will merge your commit in.",
"@nateprewitt Thanks. I have executed the commands as suggested\r\n\r\n```bash\r\ngit branch -f master none_filepointer\r\ngit push -f origin master\r\n```\r\n",
"Thanks @mayani!"
] |
https://api.github.com/repos/psf/requests/issues/4049
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4049/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4049/comments
|
https://api.github.com/repos/psf/requests/issues/4049/events
|
https://github.com/psf/requests/issues/4049
| 230,685,739 |
MDU6SXNzdWUyMzA2ODU3Mzk=
| 4,049 |
Prevent reading whole file when using the files= argument
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/314716?v=4",
"events_url": "https://api.github.com/users/astrofrog/events{/privacy}",
"followers_url": "https://api.github.com/users/astrofrog/followers",
"following_url": "https://api.github.com/users/astrofrog/following{/other_user}",
"gists_url": "https://api.github.com/users/astrofrog/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/astrofrog",
"id": 314716,
"login": "astrofrog",
"node_id": "MDQ6VXNlcjMxNDcxNg==",
"organizations_url": "https://api.github.com/users/astrofrog/orgs",
"received_events_url": "https://api.github.com/users/astrofrog/received_events",
"repos_url": "https://api.github.com/users/astrofrog/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/astrofrog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/astrofrog/subscriptions",
"type": "User",
"url": "https://api.github.com/users/astrofrog",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-23T12:14:05Z
|
2021-09-08T10:00:44Z
|
2017-05-23T12:25:49Z
|
NONE
|
resolved
|
I am trying to send a very large file (100 GB) using a POST request, with a call that looks like:
```
r = requests.post(url, data={'filename': filename}, files={'file': f})
```
but the issue is that internally, requests tries to read the whole file before sending:
```
File "upload.py", line 19, in <module>
files=files)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/sessions.py", line 349, in request
prep = self.prepare_request(req)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/sessions.py", line 287, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/models.py", line 290, in prepare
self.prepare_body(data, files)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/models.py", line 427, in prepare_body
(body, content_type) = self._encode_files(files, data)
File "/home/robitaille/anaconda/lib/python2.7/site-packages/requests/models.py", line 140, in _encode_files
rf = RequestField(name=k, data=fp.read(),
MemoryError
```
For other libraries, there are solutions that involve e.g. mmap:
https://stackoverflow.com/a/2504133
but this doesn't work for requests because of the hard-coded ``f.read()``, which forces loading the whole file into memory. Is there a way to avoid this, and if not, can support for ``mmap`` be implemented? (For mmap, ``f.read`` has to take an argument.)
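The reply below points at requests-toolbelt; a minimal sketch of that streaming approach (the URL and file name here are placeholders, not from this issue) looks like:

```python
import requests
from requests_toolbelt.multipart.encoder import MultipartEncoder

url = 'https://example.com/upload'  # placeholder
encoder = MultipartEncoder(fields={
    'filename': 'huge.bin',
    'file': ('huge.bin', open('huge.bin', 'rb'), 'application/octet-stream'),
})

# The encoder is read lazily, so the file is streamed instead of being
# loaded whole by a single f.read().
r = requests.post(url, data=encoder,
                  headers={'Content-Type': encoder.content_type})
```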
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4049/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4049/timeline
| null |
completed
| null | null | false |
[
"There *is* a solution for Requests in the [requests-toolbelt](https://toolbelt.readthedocs.io). Namely the [Streaming Multipart Encoder](https://toolbelt.readthedocs.io/en/latest/uploading-data.html#streaming-multipart-data-encoder).\r\n\r\nWe've talked about adding this support into Requests directly though, so ideally this won't be necessary in the *nearish* future."
] |
https://api.github.com/repos/psf/requests/issues/4048
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4048/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4048/comments
|
https://api.github.com/repos/psf/requests/issues/4048/events
|
https://github.com/psf/requests/pull/4048
| 230,629,303 |
MDExOlB1bGxSZXF1ZXN0MTIxOTQxMTE5
| 4,048 |
Add an enhancement for transformation between dict and cookiejar.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1980918?v=4",
"events_url": "https://api.github.com/users/justdoit0823/events{/privacy}",
"followers_url": "https://api.github.com/users/justdoit0823/followers",
"following_url": "https://api.github.com/users/justdoit0823/following{/other_user}",
"gists_url": "https://api.github.com/users/justdoit0823/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/justdoit0823",
"id": 1980918,
"login": "justdoit0823",
"node_id": "MDQ6VXNlcjE5ODA5MTg=",
"organizations_url": "https://api.github.com/users/justdoit0823/orgs",
"received_events_url": "https://api.github.com/users/justdoit0823/received_events",
"repos_url": "https://api.github.com/users/justdoit0823/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/justdoit0823/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/justdoit0823/subscriptions",
"type": "User",
"url": "https://api.github.com/users/justdoit0823",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-23T08:35:52Z
|
2021-09-06T00:06:55Z
|
2017-05-24T04:12:30Z
|
NONE
|
resolved
|
When using dict_from_cookiejar, I can only get the name and value of each cookie object; the other cookie attributes are lost. So I want to make an enhancement to the two methods dict_from_cookiejar and cookiejar_from_dict, making it more flexible and safe to do cookie transformation and persistence, as in the following code.
```
import requests
from requests.utils import dict_from_cookiejar
res = requests.get('https://github.com/justdoit0823/notes')
dict_from_cookiejar(res.cookies)
dict_from_cookiejar(res.cookies, with_cookie_attr=True)
```
And the output:
```
# with_cookie_attr is False
{'_gh_sess': 'somevalue', 'logged_in': 'no'}
# with_cookie_attr is True
{'_gh_sess': {'discard': True,
'domain': 'github.com',
'name': '_gh_sess',
'path': '/',
'rest': {'HttpOnly': None},
'rfc2109': False,
'secure': True,
'value': 'somevalue',
'version': 0},
'logged_in': {'discard': False,
'domain': '.github.com',
'expires': 2126671044,
'name': 'logged_in',
'path': '/',
'rest': {'HttpOnly': None},
'rfc2109': False,
'secure': True,
'value': 'no',
'version': 0}}
```
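For context, a sketch of what is already possible without changing these utilities: iterating the jar directly yields cookielib/http.cookiejar Cookie objects, so the full attributes are reachable today:

```python
import requests

res = requests.get('https://github.com/justdoit0823/notes')
for cookie in res.cookies:
    # RequestsCookieJar iterates as standard Cookie objects, so every
    # attribute the proposed dict would carry is already available here.
    print(cookie.name, cookie.value, cookie.domain, cookie.path,
          cookie.secure, cookie.expires)
```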
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4048/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4048/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4048.diff",
"html_url": "https://github.com/psf/requests/pull/4048",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4048.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4048"
}
| true |
[
"Thanks for this patch! I'm afraid I don't think it's necessary: these are primarily intended as internal utilities and so we're very disinclined to add features to them. Sorry about that! :sparkles:"
] |
https://api.github.com/repos/psf/requests/issues/4047
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4047/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4047/comments
|
https://api.github.com/repos/psf/requests/issues/4047/events
|
https://github.com/psf/requests/pull/4047
| 230,522,295 |
MDExOlB1bGxSZXF1ZXN0MTIxODY3NDQ5
| 4,047 |
More Verbose Exception Handling
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4099109?v=4",
"events_url": "https://api.github.com/users/bobstrecansky/events{/privacy}",
"followers_url": "https://api.github.com/users/bobstrecansky/followers",
"following_url": "https://api.github.com/users/bobstrecansky/following{/other_user}",
"gists_url": "https://api.github.com/users/bobstrecansky/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bobstrecansky",
"id": 4099109,
"login": "bobstrecansky",
"node_id": "MDQ6VXNlcjQwOTkxMDk=",
"organizations_url": "https://api.github.com/users/bobstrecansky/orgs",
"received_events_url": "https://api.github.com/users/bobstrecansky/received_events",
"repos_url": "https://api.github.com/users/bobstrecansky/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bobstrecansky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bobstrecansky/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bobstrecansky",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-22T21:17:24Z
|
2021-09-06T00:06:54Z
|
2017-05-24T17:57:58Z
|
NONE
|
resolved
|
So I tried to add more verbose exception handling without breaking current functionality (this feels really hacky). It seems to me that it may be pertinent to break these out and not raise a ConnectionError for each request (as noted in https://github.com/kennethreitz/requests/issues/2876).
In the current implementation, a ConnectionError is raised for each call, along with a max-retries message. It seems like the max-retries error message should either be removed or made into a more general string response. That general string response is here:
https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L503
Doing so would require removing that line and adding explicit responses within that max-retry block, but this might break current functionality if people are depending on the response from requests to make other decisions.
Looking forward to a discussion on this topic.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4099109?v=4",
"events_url": "https://api.github.com/users/bobstrecansky/events{/privacy}",
"followers_url": "https://api.github.com/users/bobstrecansky/followers",
"following_url": "https://api.github.com/users/bobstrecansky/following{/other_user}",
"gists_url": "https://api.github.com/users/bobstrecansky/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bobstrecansky",
"id": 4099109,
"login": "bobstrecansky",
"node_id": "MDQ6VXNlcjQwOTkxMDk=",
"organizations_url": "https://api.github.com/users/bobstrecansky/orgs",
"received_events_url": "https://api.github.com/users/bobstrecansky/received_events",
"repos_url": "https://api.github.com/users/bobstrecansky/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bobstrecansky/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bobstrecansky/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bobstrecansky",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4047/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4047/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4047.diff",
"html_url": "https://github.com/psf/requests/pull/4047",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4047.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4047"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=h1) Report\n> Merging [#4047](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/6fb9b1ce53e2b5b047d875a916cfae8c468649dc?src=pr&el=desc) will **decrease** coverage by `0.04%`.\n> The diff coverage is `50%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4047 +/- ##\n=========================================\n- Coverage 89.74% 89.7% -0.05% \n=========================================\n Files 15 15 \n Lines 1941 1943 +2 \n=========================================\n+ Hits 1742 1743 +1 \n- Misses 199 200 +1\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.05% <50%> (-0.87%)` | :arrow_down: |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `96.66% <0%> (+3.33%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=footer). Last update [6fb9b1c...0273f6c](https://codecov.io/gh/kennethreitz/requests/pull/4047?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Closing this in favor of parsing the execution in the client code."
] |
https://api.github.com/repos/psf/requests/issues/4046
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4046/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4046/comments
|
https://api.github.com/repos/psf/requests/issues/4046/events
|
https://github.com/psf/requests/issues/4046
| 230,243,695 |
MDU6SXNzdWUyMzAyNDM2OTU=
| 4,046 |
Force use of specific SSL version and/or ciphers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3749186?v=4",
"events_url": "https://api.github.com/users/devt/events{/privacy}",
"followers_url": "https://api.github.com/users/devt/followers",
"following_url": "https://api.github.com/users/devt/following{/other_user}",
"gists_url": "https://api.github.com/users/devt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/devt",
"id": 3749186,
"login": "devt",
"node_id": "MDQ6VXNlcjM3NDkxODY=",
"organizations_url": "https://api.github.com/users/devt/orgs",
"received_events_url": "https://api.github.com/users/devt/received_events",
"repos_url": "https://api.github.com/users/devt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/devt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/devt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/devt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2017-05-21T19:41:57Z
|
2021-09-08T10:00:45Z
|
2017-05-21T19:46:55Z
|
NONE
|
resolved
|
I need to override the default behavior of choosing which ssl_version or ciphers are allowed to be negotiated.
There is a 'broken' internal site which accepts TLS v1.0 only. It works with curl and Chrome (which end up selecting TLSv1.0 / DES-CBC3-SHA), but from Python it ends up 'choosing' SSLv3 and you get the error requests.exceptions.SSLError: [SSL: UNSUPPORTED_PROTOCOL] unsupported protocol (_ssl.c:590).
I implemented the override for requests.packages.urllib3.poolmanager as described in http://docs.python-requests.org/en/latest/user/advanced/#example-specific-ssl-version or https://lukasa.co.uk/2013/01/Choosing_SSL_Version_In_Requests/, but that still does not prohibit the use of SSLv3, and you end up using it again.
I implemented the SSLContext changes suggested at http://stackoverflow.com/questions/29153271/sending-tls-1-2-request-in-python-2-6 and this fixes the issue, but I had to directly patch the code in requests.packages.urllib3.connection, since there is no way to override which 'kind' of HTTPSConnection it will instantiate. Or is there a way to do that?
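For anyone hitting this later: the approach pointed to in the answer below is a transport adapter that supplies its own SSLContext, so no vendored module needs patching. A minimal sketch, with an illustrative class name and host:

```python
import ssl
import requests
from requests.adapters import HTTPAdapter

class TLS10Adapter(HTTPAdapter):
    """Illustrative adapter forcing TLS 1.0 via a custom SSLContext."""
    def init_poolmanager(self, *args, **kwargs):
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLSv1)  # negotiate TLS 1.0 only
        ctx.verify_mode = ssl.CERT_REQUIRED
        # Depending on the bundled urllib3 version you may also need to load
        # CA certificates onto the context yourself (e.g. load_default_certs()).
        kwargs['ssl_context'] = ctx
        return super(TLS10Adapter, self).init_poolmanager(*args, **kwargs)

session = requests.Session()
session.mount('https://broken-internal.example.com/', TLS10Adapter())
```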
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4046/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4046/timeline
| null |
completed
| null | null | false |
[
"You can create an SSLContext and configure it as needed by using [these instructions](https://lukasa.co.uk/2017/02/Configuring_TLS_With_Requests/). You can set a bunch of `OP_NO` flags to disable everything you don't need. ",
"Thanks a bunch. The explanation of the '3DES stream cipher' being removed makes sense.\r\n\r\nThis allowed me to overwrite the context, but it did not work literally (see below what worked). So I guess there is a way to overwrite the context being used. Did I miss some documentation or it is simply my inexperience with Python?\r\n\r\nThe code that worked for me was the following, effectively I did not overwrite the list of allowed ciphers, but rather disabled SSL v3, which is I guess not part of ssl.create_default_context(). Basically I do not require the ctx.set_ciphers() call\r\n\r\n\r\n```\r\nFORCED_CIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES'\r\n)\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n \"\"\"\r\n A TransportAdapter that re-enables 3DES support in Requests.\r\n \"\"\"\r\n def create_ssl_context(self):\r\n #ctx = create_urllib3_context(ciphers=FORCED_CIPHERS)\r\n ctx = ssl.create_default_context()\r\n # allow TLS 1.0 and TLS 1.2 and later (disable SSLv3 and SSLv2)\r\n ctx.options |= ssl.OP_NO_SSLv2\r\n ctx.options |= ssl.OP_NO_SSLv3 \r\n ctx.set_ciphers( FORCED_CIPHERS )\r\n return ctx\r\n \r\n def init_poolmanager(self, *args, **kwargs):\r\n logging.debug(' ----------- DESAdapter.init_poolmanager -------------- ')\r\n kwargs['ssl_context'] = self.create_ssl_context()\r\n return super(DESAdapter, self).init_poolmanager(*args, **kwargs)\r\n\r\n def proxy_manager_for(self, *args, **kwargs):\r\n logging.debug(' ----------- DESAdapter.proxy_manager_for -------------- ')\r\n kwargs['ssl_context'] = self.create_ssl_context()\r\n return super(DESAdapter, self).proxy_manager_for(*args, **kwargs)\r\n\r\n```",
"Yeah, that code was intended as an example for you to extend rather than as an expected complete solution for you. "
] |
https://api.github.com/repos/psf/requests/issues/4045
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4045/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4045/comments
|
https://api.github.com/repos/psf/requests/issues/4045/events
|
https://github.com/psf/requests/issues/4045
| 230,240,670 |
MDU6SXNzdWUyMzAyNDA2NzA=
| 4,045 |
Async-only for Requests v3?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
},
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 60669570,
"name": "Please Review",
"node_id": "MDU6TGFiZWw2MDY2OTU3MA==",
"url": "https://api.github.com/repos/psf/requests/labels/Please%20Review"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
}
] |
closed
| true | null |
[] | null | 18 |
2017-05-21T19:02:34Z
|
2021-09-08T10:00:40Z
|
2017-05-24T17:32:53Z
|
MEMBER
|
resolved
|
A discussion at PyCon in the Requests BoF has suggested two approaches to our glorious async future:
1. Write a codebase that is async and Python 3-only, and then apply an automated code transformation to turn that into a synchronous codebase that ships on Python 2. This will mean that Requests v3 supports *only* synchronous code on Python 2, and both sync and async code on Python 3.
2. Mark Requests v2 as an LTS release that we will support until 2020, and then ship Requests v3 as a Python-3-only release (like Django!)
I don't want to go into too much detail on (1), but I'd like to ask our users what they think about option (2). If it's acceptable to you, leave a :+1:, if it's unacceptable leave a :-1:, and if you have notes about why it's either please leave them below.
Let the comment period begin!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 24,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 8,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 32,
"url": "https://api.github.com/repos/psf/requests/issues/4045/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4045/timeline
| null |
completed
| null | null | false |
[
"Thumb up for v2 as LTS and v3 Py3 only :). 👍\r\nAdding the Django example, it gives good push in the right of direction of Py3 only. Hopefully this starts a trend and all libraries will follow that model. It helps me to promote that model as well internally, for our own libraries (if everybody does that, let's do the same :))\r\nSide note, like @kennethreitz was saying at the BoF, v3 being Py3 only opens great opportunities to have native type hints support (not based on comments), or to avoid adding that burden to a code transformer.",
"👍 ",
"I'm in favor of option 2, and here are my reasons:\r\n\r\n### Political\r\n\r\nRequests is in the privileged position of being an incredibly widely used—and thus influential—package. Python 3 is not just the future but also present for many of its users, and the Python community at large. Requests _can and should_ help move the community forward.\r\n\r\n### Practical\r\n\r\nMajor version numbers are meant to represent significant, breaking changes (at least in SemVer). Deprecating a slated-for-deprecation version of the language seems to me a rather fitting scenario.\r\n\r\nNothing stands in the way of the LTS version being updated even past 2020, depending on the needs of the community.\r\n\r\nAnd another shocker: should it prove desirable, it's entirely possible to restore Python 2 compatibility in a future major version.",
"As it has been for the past several years, my opinion is that Python 2 should have died long ago and thus I am 100% on board with this. I'd suggest that we reach out to @dims @sdague @ttx and @jakubroztocil though to ensure that some of our larger open source dependencies are quite aware of this discussion.",
"Thanks for the heads up. It seems pretty reasonable to me that requests 2.x goes into LTS mode for python2.\r\n\r\nI think the big question would be on how much the calling interface changes, so how much do we have to actively port to requests v3. Most of the base iaas in OpenStack is stradling both python2 and python3 right now, so understanding what libraries we'd need to embed requests 2 vs. requests 3 logic in would be helpful to know.",
"Thanks @sigmavirus24 +1 to @sdague 's comment above",
"I'm +1 on option 2, with some caveats stemming from our discussion last night, namely that rather than attempt to maintain some degree of superficial, possibly source-level API compatibility with the existing API, the change should be *visibly* breaking. Don't leave room for subtle compatibility issues that come about because of the existing users updating with unpinned dependencies.\r\n\r\nYou may even want to consider a full name change. Here's one question: is it feasible to determine how your dependencies are being pinned from a central location? It would be worth asking PyPi to see how requests is pulled in; is it via `requests>=2.0` or is it via something safer like `requests>=2.0,<3.0`? If not, then you're laying a bit of a time bomb. I suppose you can get away with this using trove classifiers, but it'll be hinky. Possibly pip fixes this for you now?\r\n\r\nShort answer: Yes, do a 2.x LTS version, and a 3.x future version. Break 2.x consumers HARD if they use the 3.x version, though.\r\n\r\n",
"> I suppose you can get away with this using trove classifiers, but it'll be hinky.\r\n\r\nI don't understand how Trove Classifiers would help in this case.\r\n\r\n> If not, then you're laying a bit of a time bomb.\r\n\r\nIf not I think there's plenty of time to advocate that users cap to `<3.0` or more simply pin to their current version.\r\n\r\n> Break 2.x consumers HARD if they use the 3.x version, though.\r\n\r\nWe broke 0.x consumers HARD with 1.x several years ago and people still take the opportunity to complain to me. That said, this is a significantly better reason to do so.",
">> I suppose you can get away with this using trove classifiers, but it'll be hinky.\r\n\r\n> I don't understand how Trove Classifiers would help in this case.\r\n\r\nI'm thinking that you could mark 3.x as only working for certain Python versions. If pip actually heeds those, then \"pip install requests >=2.0\" will stop at 2.* for Python 2.x and use 3.x for the 3.6+. If it doesn't, then they won't help at all.\r\n\r\n>> If not, then you're laying a bit of a time bomb.\r\n\r\n> If not I think there's plenty of time to advocate that users cap to <3.0 or more simply pin to their current version.\r\n\r\nNot to put too fine a point on it, but we both know that people will ignore this, or use text from old tutorials on the website, or just do the wrong damn thing.\r\n\r\n>> Break 2.x consumers HARD if they use the 3.x version, though.\r\n\r\n> We broke 0.x consumers HARD with 1.x several years ago and people still take the opportunity to complain to me. That said, this is a significantly better reason to do so.\r\n\r\nIndeed it is.",
"> I'm thinking that you could mark 3.x as only working for certain Python versions. If pip actually heeds those, then \"pip install requests >=2.0\" will stop at 2.* for Python 2.x and use 3.x for the 3.6+. If it doesn't, then they won't help at all.\r\n\r\nPip doesn't do that. The trove classifiers were always more suggestive than prescriptive and meant for users, not necessarily for tooling.\r\n\r\n> Not to put too fine a point on it, but we both know that people will ignore this, or use text from old tutorials on the website, or just do the wrong damn thing.\r\n\r\nOh, I absolutely agree. That doesn't mean we can't say \"That tutorial is very old. You should refer to this newer and more correct tutorial.\"",
"Starting Pip 9 you can limit the package version that gets installed per Python version. This is how iPython enforces iPython 5 for Python 2.\r\n\r\n - iPython devs wrote up a guide here: http://www.python3statement.org/practicalities/\r\n - PyCon 2017 talk about it: https://www.youtube.com/watch?v=2DkfPzWWC2Q",
"We figured this out. \r\n\r\n",
"Requests will (contingient upon reasonable prototyping) depend on Twisted to provide the following:\r\n\r\n- an event loop backend\r\n- async support for Python 2 and 3 simultaneouslly\r\n- support for the `async` and `await` keywords in Python 3\r\n\r\nIt's going to be beautiful, when we're done. \r\n\r\nExpect this work to be completed within the next year or so, unless I *really* get a coding itch soon. Who knows 👍 ",
"```pycon\r\n>>> from requests import async\r\n>>> async.get('...')\r\n```\r\n^ awaitable deferred repr here ^\r\n\r\n`async` will also be a Requests session instance, so... automatic connection-pooling. \r\n\r\n:heart:\r\n",
"there's a 10% chance i'll want to give it a more clever name, but that's the working name. ",
"`async` is a reserved key-word in Python 3.6 onwards.",
"@SethMichaelLarson thanks! someone brought this up during the conference too, i forgot.\r\n\r\nOkay, 100% chance I'll find a more clever name now. ",
"`requests.concurrent` wouldn't be the end of the world. I'll give it a deep think. "
] |
https://api.github.com/repos/psf/requests/issues/4044
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4044/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4044/comments
|
https://api.github.com/repos/psf/requests/issues/4044/events
|
https://github.com/psf/requests/pull/4044
| 230,228,907 |
MDExOlB1bGxSZXF1ZXN0MTIxNjY1Mzc3
| 4,044 |
Fix cookies storage on GAE
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/709053?v=4",
"events_url": "https://api.github.com/users/cmin764/events{/privacy}",
"followers_url": "https://api.github.com/users/cmin764/followers",
"following_url": "https://api.github.com/users/cmin764/following{/other_user}",
"gists_url": "https://api.github.com/users/cmin764/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cmin764",
"id": 709053,
"login": "cmin764",
"node_id": "MDQ6VXNlcjcwOTA1Mw==",
"organizations_url": "https://api.github.com/users/cmin764/orgs",
"received_events_url": "https://api.github.com/users/cmin764/received_events",
"repos_url": "https://api.github.com/users/cmin764/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cmin764/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cmin764/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cmin764",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 18 |
2017-05-21T15:39:06Z
|
2021-09-03T00:10:59Z
|
2017-05-30T17:12:52Z
|
NONE
|
resolved
|
When using requests on GAE, the urllib3 response containing cookies
is a little different, so we have to extract the cookies
directly from the headers on the response object.
Fix issue #4039
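For illustration, a condensed sketch of that idea (assuming the GAE-style urllib3 response exposes `getheader()` but carries no `_original_response`); the splitting regex mirrors the one proposed in #4039:

```python
# Sketch only, not the actual patch: read Set-Cookie straight off the
# urllib3 response headers and split the comma-joined values.
import re

SET_COOKIE_SEPARATOR = re.compile(r",\s*(?=[\w-]+=)")

def set_cookie_values(response):
    """Return individual Set-Cookie values from a urllib3-style response."""
    header = response.getheader("Set-Cookie")
    return SET_COOKIE_SEPARATOR.split(header) if header else []
```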
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4044/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4044/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4044.diff",
"html_url": "https://github.com/psf/requests/pull/4044",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4044.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4044"
}
| true |
[
"Just to clarify: are you interested in making further amendments to this patch, or would you like core devs or volunteers to build on it instead?",
"No, I would like to let the core devs take it and adapt it to what is right. What I want most, is to be aware of the fact that this is a serious issue affecting all GAE projects using requests with cookies.",
"No-one is disputing that this is an issue. =)",
"Is this the case on the latest offering (I fail to recall the name) on GAE, or only the legacy offering?",
"Latest. =)",
"If I recall correctly, GAE now has a \"container\" style offering that offers much more compatibility with standard Python code. ",
"Ah, thanks! Just checking. Good to know. \r\n\r\nDisappointing, but good to know :-/",
"Hey @jonparrott, I went to update this and bumped into this error:\r\n\r\n```\r\n_____________________ test_requests_are_updated_each_time ______________________\r\n[gw3] darwin -- Python 3.6.0 /Users/cory/Documents/Python/requests_org/requests/.tox/py36/bin/python\r\nhttpbin = <function prepare_url.<locals>.inner at 0x111194d08>\r\n\r\n def test_requests_are_updated_each_time(httpbin):\r\n session = RedirectSession([303, 307])\r\n prep = requests.Request('POST', httpbin('post')).prepare()\r\n r0 = session.send(prep)\r\n assert r0.request.method == 'POST'\r\n assert session.calls[-1] == SendCall((r0.request,), {})\r\n redirect_generator = session.resolve_redirects(r0, prep)\r\n default_keyword_args = {\r\n 'stream': False,\r\n 'verify': True,\r\n 'cert': None,\r\n 'timeout': None,\r\n 'allow_redirects': False,\r\n 'proxies': {},\r\n }\r\n> for response in redirect_generator:\r\n\r\ntests/test_requests.py:2145: \r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\nrequests/sessions.py:184: in resolve_redirects\r\n extract_cookies_to_jar(prepared_request._cookies, req, resp.raw)\r\nrequests/cookies.py:141: in extract_cookies_to_jar\r\n jar.extract_cookies(res, req)\r\n/Users/cory/.pyenv/versions/3.6.0/lib/python3.6/http/cookiejar.py:1664: in extract_cookies\r\n for cookie in self.make_cookies(response, request):\r\n_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ \r\n\r\nself = <RequestsCookieJar[]>\r\nresponse = <requests.cookies.MockResponse object at 0x110dc7a90>\r\nrequest = <requests.cookies.MockRequest object at 0x110dc7c88>\r\n\r\n def make_cookies(self, response, request):\r\n \"\"\"Return sequence of Cookie objects extracted from response object.\"\"\"\r\n # get cookie-attributes for RFC 2965 and Netscape protocols\r\n headers = response.info()\r\n> rfc2965_hdrs = headers.get_all(\"Set-Cookie2\", [])\r\nE AttributeError: 'MockResponse' object has no attribute 'get_all'\r\n```\r\n\r\nNow, for the mainline urllib3 case I can just proxy through to the base httplib class, but is there an equivalent method for GAE that I can use?",
"@kennethreitz requests works fine on \"flexible\" (the container offering) because none of this silly urlfetch crap exists there. This only affects the \"standard\" environment.\r\n\r\n@Lukasa help me understand what's going on here- why does cookie jar need to poke at the `_original_response` object (which by all accounts is *private*)? Why couldn't it just use the public methods?",
"@jonparrott Because cookielib is fundamentally written to work with *httplib*, which means it expects an httplib response object.",
"@Lukasa well that's fun! I guess we could update `AppEngineManager` to also set the `_original_response` object for compatibility.",
"I think that might be the better bugfix, tbh.",
"@Lukasa wanna file a bug on urllib3 and assign it to me? pypa is getting most of my OSS time right now, but I can probably do it within a few weeks. ",
"Alright, closing in favour of shazow/urllib3#1193.",
"Unfortunately this is still an issue...",
"@trickidicki, as I believe you've seen, there is an active PR (shazow/urllib3#1283) to address this and once that's merged it will likely be available in Requests next release. Thanks for taking the time to check in.",
"just running into this today, as best as I understand seems like https://github.com/urllib3/urllib3/pull/1283 got merged, would be amazing that requests incorporates it @nateprewitt ",
"Hey @euri10, we’re currently waiting for a release from urllib3 and then I’m hoping to have a Requests release soon™ after."
] |
https://api.github.com/repos/psf/requests/issues/4043
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4043/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4043/comments
|
https://api.github.com/repos/psf/requests/issues/4043/events
|
https://github.com/psf/requests/issues/4043
| 230,210,181 |
MDU6SXNzdWUyMzAyMTAxODE=
| 4,043 |
python requests runs infinitely if the called script executes more than 2 minutes
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28837804?v=4",
"events_url": "https://api.github.com/users/jj-one/events{/privacy}",
"followers_url": "https://api.github.com/users/jj-one/followers",
"following_url": "https://api.github.com/users/jj-one/following{/other_user}",
"gists_url": "https://api.github.com/users/jj-one/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jj-one",
"id": 28837804,
"login": "jj-one",
"node_id": "MDQ6VXNlcjI4ODM3ODA0",
"organizations_url": "https://api.github.com/users/jj-one/orgs",
"received_events_url": "https://api.github.com/users/jj-one/received_events",
"repos_url": "https://api.github.com/users/jj-one/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jj-one/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jj-one/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jj-one",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 36 |
2017-05-21T09:59:50Z
|
2021-09-08T10:00:42Z
|
2017-05-23T21:08:39Z
|
NONE
|
resolved
|
I am setting up a VPS. Everything has worked well with the Python requests module until I had to run a script on the server that exceeded two minutes. As soon as any script executed through Python requests on the remote server runs for more than 2 minutes, the request goes on infinitely. It never returns or terminates. If I call the script from any browser, it executes normally and returns the appropriate response IRRESPECTIVE OF THE DURATION. The only means of terminating any script lasting more than two minutes and executed through Python requests is by including the timeout parameter in the requests call, which normally gives the following traceback:
```
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 376, in _make_request
httplib_response = conn.getresponse(buffering=True)
TypeError: getresponse() got an unexpected keyword argument 'buffering'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 378, in _make_request
httplib_response = conn.getresponse()
File "C:\Python34\lib\http\client.py", line 1148, in getresponse
response.begin()
File "C:\Python34\lib\http\client.py", line 352, in begin
version, status, reason = self._read_status()
File "C:\Python34\lib\http\client.py", line 314, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
File "C:\Python34\lib\socket.py", line 371, in readinto
return self._sock.recv_into(b)
File "C:\Python34\lib\ssl.py", line 708, in recv_into
return self.read(nbytes, buffer)
File "C:\Python34\lib\ssl.py", line 580, in read
v = self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\adapters.py", line 376, in send
timeout=timeout
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 609, in urlopen
_stacktrace=sys.exc_info()[2])
File "C:\Python34\lib\site-packages\requests\packages\urllib3\util\retry.py", line 247, in increment
raise six.reraise(type(error), error, _stacktrace)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\packages\six.py", line 310, in reraise
raise value
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 559, in urlopen
body=body, headers=headers)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 380, in _make_request
self._raise_timeout(err=e, url=url, timeout_value=read_timeout)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 308, in _raise_timeout
raise ReadTimeoutError(self, url, "Read timed out. (read timeout=%s)" % timeout_value)
requests.packages.urllib3.exceptions.ReadTimeoutError: HTTPSConnectionPool(host='domain1.com', port=443): Read timed out. (read timeout=130)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "all-test.py", line 32, in <module>
cool = requests.post("https://domain1.com/test3.php", stream=True, verify=False, timeout=130)
File "C:\Python34\lib\site-packages\requests\api.py", line 107, in post
return request('post', url, data=data, json=json, **kwargs)
File "C:\Python34\lib\site-packages\requests\api.py", line 53, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python34\lib\site-packages\requests\sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python34\lib\site-packages\requests\sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "C:\Python34\lib\site-packages\requests\adapters.py", line 449, in send
raise ReadTimeout(e, request=request)
requests.exceptions.ReadTimeout: HTTPSConnectionPool(host='domain1.com', port=443): Read timed out. (read timeout=130)
```
Note: everything works perfectly, with no issues, if called from a browser.
Also, checking the access.log file on the remote server shows that the script completed at the right time with no issue and responded with a 200 status code even when called by Python requests that ran infinitely.
To eliminate some probable causes, I have set up and run a similar script on my local WAMP server, calling the same script via Python's requests with no issue.
Please, I would highly appreciate any help anybody can give me to resolve the above issue. It has taken me four days with no success.
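For anyone trying to reproduce this, a minimal sketch of the call pattern discussed in the comments (the URL is the reporter's placeholder): `stream=True` makes the call return as soon as the response headers arrive, and a `(connect, read)` timeout tuple bounds each phase explicitly.

```python
# Illustrative reproduction sketch; https://domain1.com is a placeholder.
import requests

resp = requests.post(
    "https://domain1.com/test3.php",
    stream=True,         # return once the header block is received
    timeout=(10, 300),   # 10 s to connect, up to 300 s between reads
    verify=False,        # as in the original report; not recommended generally
)
print(resp.status_code)
print(resp.headers)
print(resp.content)      # reading the body consumes the streamed response
```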
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28837804?v=4",
"events_url": "https://api.github.com/users/jj-one/events{/privacy}",
"followers_url": "https://api.github.com/users/jj-one/followers",
"following_url": "https://api.github.com/users/jj-one/following{/other_user}",
"gists_url": "https://api.github.com/users/jj-one/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jj-one",
"id": 28837804,
"login": "jj-one",
"node_id": "MDQ6VXNlcjI4ODM3ODA0",
"organizations_url": "https://api.github.com/users/jj-one/orgs",
"received_events_url": "https://api.github.com/users/jj-one/received_events",
"repos_url": "https://api.github.com/users/jj-one/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jj-one/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jj-one/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jj-one",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4043/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4043/timeline
| null |
completed
| null | null | false |
[
"This sounds very much like it relates to the web server you're running. Can you try with a different web server?",
"I think you might be right because I have a shared server somewhere where I normally upload files lasting more than 15 minutes using python requests. It has always worked well.\r\nOn the other hand, I want to believe it is not web server, but python requests, because the code completes well irrespective of duration when called from a browser. Moreover, checking the access.log file on the web server shows that the script completed at the right time with no issue and responded with 200 status code even when called by python requests that ran infinitely.\r\nI would easily have tried this on another server but like I said earlier, it's my first vps experience and have not got another somewhere\r\nPlease, Lukasa, can u think of any setting on the web server (apache2, php5.6 and Ubuntu 14.04) that might cause this?",
"I think I should add that the script in question is designed to echo output only at the end. So, could it be that if requests has to wait for more than two minutes without a response, something goes wrong?",
"Is your script sending its response using HTTP or HTTPS? If HTTP, can you use wireshark or tcpdump to intercept the request/response data?",
"it sends through HTTPS",
"Hrm. Can you set `stream=True` and print `response.headers` immediately after, before doing anything with the body?",
"Ok, I ill try that and get back soon. But note, it does not return except with timeout.",
"@jj-one My suspicion is that this is being caused by a parsing error on the response such that we are waiting for connection close. `stream=True` will cause it to return once the header block is complete, which should probably work.",
"Unfortunately, it crashed out because the supposed response object is NoneType. Remember, it was stopped by the timeout parameter and trying to response.headers in the except requests.exceptions.Timeout block cause another exception \"AttributeError: 'NoneType' object has no attribuete 'headers'.",
"The most confusing issue here is the fact that the requests never returns by itself once two minutes is exceeded except by timeout parameter which make the response object to still remain as NoneType. If it could stop by itself, we could investigate the response object.",
"Does it not return by itself even if you set `stream=True`, without a timeout?",
"No. It only stop when the timeout value is reached.",
"Ok, that suggests something really bad is going on, because it means the data we're receiving must be *extremely* ill-formed: it must really not be HTTP at all.",
"Ok, testing with stream=True, without a timeout.",
"Just terminated it through Windows Task Manager after 5 minutes (i.e while running with stream=True, without timeout).",
"Ok, so this strongly suggests that the data coming from the server is pretty much not valid HTTP. ",
"I strongly suspect that requests abandons response after waiting for two minutes and goes into whatever mode which never ends. ",
"The server access.log still shows that all the call calls so far completed successfully at the right time.",
"Remember that if I reduce the script's iterations such that it ends in less than two minutes, it completes and requests receives appropriate response.",
"This is really driving me crazy. PLEASE, PLEASE, PLEASE, SOMEBODY HELP!!!",
"Requests does not abandon anything, it's simply not how the codebase works. Please try to remain calm: we are busy people and are all only volunteers. \r\n\r\nAt this point I think we need to see the plaintext of the server response ",
"I really appreciate your effort both in dev and volunteering to help users troubleshoot Requests. Don't misunderstand my cry for help. This has stalled my project for four days and I don't know how soon I can be able to resolve it. Requests is awesome. I have used it in several other project: sending data to and receiving data from remote servers, tt hasn't given me any issue before. The current project that has developed the above issue is almost at completion level. All other scripts in the same project have run and still run successfully and return the expected results (Array of five) because each one of them complete within two minutes. It is only this one that has taken up to two minutes to execute and has caused me this much delay.",
"So it would be helpful to see if you can try to remove code to get a minimal reproducer. Try pulling chunks of code out one at a time and see if the problem keeps happening. If we can get a small code example that'll help us narrow the problem down, and if at any point you remove some code and the problem goes away you have a bit of an idea of where the problem might be. ",
"Here is the response header when successful:\r\n{'Content-Length': '52', 'Date': 'Sun, 21 May 2017 20:27:28 GMT', 'Server': 'Apache/2.4.7 (Ubuntu)', 'Keep-Alive': 'timeout=5', 'Connection': 'Keep-Alive', 'Content-Type': 'text/html; charset=UTF-8'}\r\n\r\nAnd the following the response.content when successful:\r\nb'[\"no\", \"dummy39\", \"dummy40\", \"dummy41\", \"BEST CASE\"]'\r\n\r\n",
"Also, here is the content of test3.php, which is a file I created only to test the above issue:\r\n`<?php\r\n$start = $end = time();\r\nwhile(($end - $start) < 50){\r\n\t$nt = \"\";\r\n\t$end = time();\r\n}\r\necho '[\"no\", \"dummy39\", \"dummy40\", \"dummy41\", \"BEST CASE\"]';\r\nexit;`\r\n\r\nOf course, the above code is successful, but if I increase 50 to anything 120 or more, Requests never returns unless by timeout parameter or I forcefully end task via Windows Task Manager.",
"Can you thing of any setting on Apache2, PHP5.6 or Ubuntu 14.04 that will interfere with Requests but not browser's execution time?",
"No, but can you show the raw data the browser saw?",
"Sorry, it was late night at my place. Meanwhile, the following code would go infinitely in Requests:\r\n\r\n`<?php $start = $end = time(); while(($end - $start) < 140){ $nt = \"\"; $end = time(); } echo '[\"no\", \"dummy39\", \"dummy40\", \"dummy41\", \"BEST CASE\"]'; exit;`\r\n\r\nBut with browser, here is the output:\r\n\r\n[\"no\", \"dummy39\", \"dummy40\", \"dummy41\", \"BEST CASE\"]",
"I just attempted this locally with ubuntu 14.04, apache2 and PHP5. Using your exact script, and Requests 2.14.2, I didn't reproduce the problem: the request returns fine.\r\n\r\nCan you provide us with more details about what your Python environment looks like? What Python version, what version of Requests, etc?",
"Python 3.4, Requests 2.9.1, Window8 (64Bit) and calling the request script via command prompt.\r\nOn the remote server, Ubuntu 14.04, PHP5.6, Apache2, HTTPS requests and response.\r\n\r\nAlso, I tried this locally on wamp server. It failed first. But when I increased Timeout in php.ini in Apache directory of wamp (that of php directory has always been at 600), it worked perfectly; both via Requests and browser. \r\n\r\nThe same thing happened with the browser. The first time above two minutes, the browser was timed out, but setting timeout in /etc/apache2/apache2.conf to 600 solved the problem.\r\n\r\nIf it will be of use, I could upload the php.ini and apache2.conf on the remote server."
] |
https://api.github.com/repos/psf/requests/issues/4042
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4042/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4042/comments
|
https://api.github.com/repos/psf/requests/issues/4042/events
|
https://github.com/psf/requests/pull/4042
| 230,187,218 |
MDExOlB1bGxSZXF1ZXN0MTIxNjQyNzI2
| 4,042 |
Persist Session-level CookiePolicy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2017-05-21T00:06:13Z
|
2021-09-06T00:06:55Z
|
2017-05-23T14:39:37Z
|
MEMBER
|
resolved
|
This is a follow-up to #3463, refactored without the persistence flag, which makes this a breaking change. Currently, Requests silently throws away any cookie policy set on a CookieJar, as noted in #3416. This patch preserves the CookiePolicy set on a session-level CookieJar for both redirected and subsequent requests.
The only caveat is that this won't work for a non-RequestsCookieJar, since the policy isn't exposed publicly. We may want to add some documentation about this caveat, but given the rarity of the case, the information in the issue tracker may be sufficient.
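For illustration, a minimal sketch of the intended usage (using the stdlib `DefaultCookiePolicy` as a stand-in for a custom policy); per the review discussion, either constructing `RequestsCookieJar(policy)` or calling `set_policy()` should work:

```python
# Sketch: attach a custom cookie policy to a session-level jar; with this
# patch the policy is preserved across redirects and subsequent requests.
from http.cookiejar import DefaultCookiePolicy  # `cookielib` on Python 2

import requests
from requests.cookies import RequestsCookieJar

policy = DefaultCookiePolicy(allowed_domains=["httpbin.org"])
jar = RequestsCookieJar(policy)

session = requests.Session()
session.cookies = jar
# session.get("https://httpbin.org/cookies/set?k=v")  # the policy now applies
```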
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4042/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4042/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4042.diff",
"html_url": "https://github.com/psf/requests/pull/4042",
"merged_at": "2017-05-23T14:39:37Z",
"patch_url": "https://github.com/psf/requests/pull/4042.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4042"
}
| true |
[
"@Lukasa we're going to need to merge in master to get the builds running in 3.0.0 again. I can do it but probably won't get to it till tomorrow. I'll need to strip out get_policy too because it's no longer needed either.",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=h1) Report\n> Merging [#4042](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=desc) into [proposed/3.0.0](https://codecov.io/gh/kennethreitz/requests/commit/a889b62c5025ed064d4058f27cb1cc62df78f2e8?src=pr&el=desc) will **increase** coverage by `<.01%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## proposed/3.0.0 #4042 +/- ##\n==================================================\n+ Coverage 89.97% 89.98% +<.01% \n==================================================\n Files 15 15 \n Lines 1986 1987 +1 \n==================================================\n+ Hits 1787 1788 +1 \n Misses 199 199\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `94.01% <100%> (+0.02%)` | :arrow_up: |\n| [requests/cookies.py](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=tree#diff-cmVxdWVzdHMvY29va2llcy5weQ==) | `77.11% <100%> (ø)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=footer). Last update [a889b62...009b80c](https://codecov.io/gh/kennethreitz/requests/pull/4042?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Alright, I've got the 3.0.0 branch hiccups worked out and I think this is ready for a peek. It's a trivial patch at this point with the flag gone and works across anything that looks like a cookielib.CookieJar.",
"I'm confused. Does this mean that we expect people to set the cookie policy on the cookie jar directly or specify a new RequestsCookieJar with a policy that persists that way? I expected that this added an attribute to the Session that was used to create cookie-jars. Also, since we're adding this to 3.0.0, I'd like us to use a stricter cookie policy. The default one is very relaxed.",
"@sigmavirus24, having the user either set it with `set_policy` or creating a new `RequestsCookieJar(policy)` will work.\r\n\r\nThere are two related problems this is trying to solve. The first is the CookieJar is currently copied on a redirect, but our `copy` method throws away the policy so we get a new `RequestsCookieJar` with the default policy. The second is we merge our session cookies into an empty `RequestsCookieJar` before merging in request level cookies, and that also silently discards a user set policy.\r\n\r\nThis patch is going with the assumption that if you assign a CookieJar to the Session, we continue using everything that was set on that instance of the CookieJar. We could add an attr to the Session, but I *think* that's unneeded surface area at this point. If you don't like the policy on the CookieJar you're handing us, then simply change it back to the default with `set_policy`.\r\n\r\nI agree with upping the default policy for the RequestsCookieJar but don't know if that's directly related to #3416. I'm happy to spin off another PR for that, unless you feel it needs to be done here."
] |
https://api.github.com/repos/psf/requests/issues/4041
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4041/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4041/comments
|
https://api.github.com/repos/psf/requests/issues/4041/events
|
https://github.com/psf/requests/pull/4041
| 230,157,603 |
MDExOlB1bGxSZXF1ZXN0MTIxNjI1Nzk0
| 4,041 |
Add setup.cfg to declare wheel as universal
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/347634?v=4",
"events_url": "https://api.github.com/users/jdufresne/events{/privacy}",
"followers_url": "https://api.github.com/users/jdufresne/followers",
"following_url": "https://api.github.com/users/jdufresne/following{/other_user}",
"gists_url": "https://api.github.com/users/jdufresne/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jdufresne",
"id": 347634,
"login": "jdufresne",
"node_id": "MDQ6VXNlcjM0NzYzNA==",
"organizations_url": "https://api.github.com/users/jdufresne/orgs",
"received_events_url": "https://api.github.com/users/jdufresne/received_events",
"repos_url": "https://api.github.com/users/jdufresne/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jdufresne/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jdufresne/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jdufresne",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-20T14:41:02Z
|
2021-09-04T00:06:46Z
|
2017-05-20T15:11:12Z
|
CONTRIBUTOR
|
resolved
|
As the project is pure Python, a built wheel should always be universal,
so declare this globally in the project. The --universal flag can then be removed from the
Makefile.
See:
http://pythonwheels.com/
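For reference, the declaration described here is presumably along these lines (a sketch, not necessarily the exact file contents):

```ini
# setup.cfg
[bdist_wheel]
universal = 1
```

With that in place, `python setup.py bdist_wheel` produces a py2.py3 wheel without passing `--universal` on the command line.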
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4041/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4041/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4041.diff",
"html_url": "https://github.com/psf/requests/pull/4041",
"merged_at": "2017-05-20T15:11:12Z",
"patch_url": "https://github.com/psf/requests/pull/4041.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4041"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=h1) Report\n> Merging [#4041](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/fb3f2a030f82475145369b01adbee68d17bd4f43?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4041 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1941 1941 \n=======================================\n Hits 1742 1742 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=footer). Last update [fb3f2a0...2c71f2d](https://codecov.io/gh/kennethreitz/requests/pull/4041?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/4040
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4040/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4040/comments
|
https://api.github.com/repos/psf/requests/issues/4040/events
|
https://github.com/psf/requests/issues/4040
| 230,123,627 |
MDU6SXNzdWUyMzAxMjM2Mjc=
| 4,040 |
Custom Auth Handler - HTTP Method and Body changes not functioning with re-direct
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8226358?v=4",
"events_url": "https://api.github.com/users/pfoppe/events{/privacy}",
"followers_url": "https://api.github.com/users/pfoppe/followers",
"following_url": "https://api.github.com/users/pfoppe/following{/other_user}",
"gists_url": "https://api.github.com/users/pfoppe/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pfoppe",
"id": 8226358,
"login": "pfoppe",
"node_id": "MDQ6VXNlcjgyMjYzNTg=",
"organizations_url": "https://api.github.com/users/pfoppe/orgs",
"received_events_url": "https://api.github.com/users/pfoppe/received_events",
"repos_url": "https://api.github.com/users/pfoppe/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pfoppe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pfoppe/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pfoppe",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-05-20T02:09:49Z
|
2021-09-08T10:00:38Z
|
2017-05-26T14:18:25Z
|
NONE
|
resolved
|
Hello,
I've written a pretty basic Custom Authentication Handler to work with a vendor product we have deployed. This product requires an HTTP request with a username/password to a specific web-endpoint which will generate an 'authentication token' that expires at a set time. This authentication token is then passed with future requests which will authorize access to the services.
I successfully have this working by setting the PreparedRequest "method" to POST and updating the body to include the authorization token.
This works if I access a URL that does not have a re-direct. Unfortunately I've stumbled across a few re-directs in the system, and it seems that my method override (from GET -> POST) and the token I include in the body of the request are lost on re-directs (they are not persisted in the requests that follow the re-direct).
I enabled IIS Failed Request Tracing on the server side to confirm this scenario. I can see the following requests:
**1st Request** - Generate the authentication token with username/password. Successful HTTP 200 response with the token returned. My Custom Auth Handler uses this token to update the PreparedRequest.body
**2nd Request** - Executed HTTP POST to the protected management console (that requires this token). Server responds with an HTTP Status code 302 (re-direct) with a new location in the response header (in this case it just added a '/' at the end of the path). I can see my authorization token in the request body
**3rd Request** - Requests does follow the re-direct, however it is re-converted back to an HTTP GET and my token (from the body) is also lost. It does appear that my request headers did persist.
Thoughts? Thanks for any help!!!
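A minimal sketch of the kind of handler described above (class and parameter names are hypothetical, not the vendor code): it rewrites the prepared request to a POST and injects the token into the body, which is exactly the mutation that is lost when the 302 redirect is followed.

```python
# Hypothetical illustration of the auth handler described in this report.
import requests
from requests.auth import AuthBase

class TokenBodyAuth(AuthBase):
    def __init__(self, token):
        self.token = token

    def __call__(self, prepared):
        # Force the request to POST and carry the token in the body.
        prepared.method = 'POST'
        prepared.prepare_body(data={'token': self.token}, files=None)
        return prepared

# resp = requests.get('https://example.invalid/admin', auth=TokenBodyAuth('abc123'))
```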
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8226358?v=4",
"events_url": "https://api.github.com/users/pfoppe/events{/privacy}",
"followers_url": "https://api.github.com/users/pfoppe/followers",
"following_url": "https://api.github.com/users/pfoppe/following{/other_user}",
"gists_url": "https://api.github.com/users/pfoppe/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pfoppe",
"id": 8226358,
"login": "pfoppe",
"node_id": "MDQ6VXNlcjgyMjYzNTg=",
"organizations_url": "https://api.github.com/users/pfoppe/orgs",
"received_events_url": "https://api.github.com/users/pfoppe/received_events",
"repos_url": "https://api.github.com/users/pfoppe/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pfoppe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pfoppe/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pfoppe",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4040/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4040/timeline
| null |
completed
| null | null | false |
[
"Have you considered having the auth handler be stored on the Session object, and also have it do its own request creation to obtain auth tokens? Essentially, if it has an auth token it will just add it to all requests, and if it doesn't it will obtain one by making its own brand new request instead of by mutating the one the user is making?",
"Thanks for the feedback!\r\n\r\nI will take a look at that. Currently every request will generate another token passing the username/password every time (increasing the overall chatter). The session enhancement should fix that issue as well. \r\n\r\nSince this posting... the code base was uploaded to: https://github.com/DOI-BLM/requests-agstoken",
"So, I've spent some time playing with requests Sessions and can confirm that my authentication handler works with that approach, however, I think there is still an issue to be discussed. Here is the current control flow: \r\n\r\n- Developer creates an auth instance providing a username/password. \r\n\r\n- The auth instance is added to a request (or a session). \r\n\r\n- On the **first request**... the auth instance will execute its own request to the server 'get token' endpoint, provide the username/password and obtain the authorization token. The auth instance will store the token and the date/time it expires (default is 60 min). \r\n\r\n- On all requests (first and subsequent), the auth instance will: \r\n - Check token 'expires' timestamp and acquire a new one if needed\r\n - Change the PreparedRequest.method to POST\r\n - Add the token parameter to the PreparedRequest body\r\n\r\nI can confirm that if I am accessing a URL that does not have a re-direct (HTTP 302), that everything works as expected. The request is modified to a POST and the token is present in the body. \r\n\r\nIf, however, there is an HTTP re-direct (302) then the requests API will follow the new **location** (defined in the response header), but the method gets set back to GET (if it was truly a GET originally), and the token is dropped from the body. So I'm guessing there is an issue with re-directs not persisting the method and body changes of the request. \r\n\r\nSo a couple questions... \r\n\r\n**Should it be a legitimate implementation approach to:**\r\n\r\n1. **allow an auth handler to adjust the request body?** I looked through a couple of other auth handlers (like the HTTP Basic, NTLM and Kerberos handlers) and it appears those just modify request headers (www-authenticate). I think yes. If so, should this be classified as a bug?\r\n\r\n2. **allow an auth handler to force the request method (for this example POST)?** I could see this getting a little hairy, however HTTP POST requests are supposed to be more secure than HTTP GET requests (and in this case the token represents a users identity, hence the need to protect confidentiality of this data payload as best as possible). \r\n\r\nI might be able to register a response hook for HTTP 302 (or other re-directs) and possibly handle this scenario, if it is deemed that the underlying requests API will not be adjusted to account for this scenario. \r\n\r\nThanks!\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n",
"I'm not a contributor so feel free to take what I say with a grain of salt, but #3107 looks at the fact that POST gets changed to GET upon a HTTP 302 as that's what sending a 302 is telling requests to do, so that much is working as intended. You could always set `allow_redirects=False` in the request, capture the Location header yourself and then perform the new request. Or... get the server to return a 307/308 not a 302, as mentioned in #3107 ",
"An auth handler is free to do *whatever it likes*. It can arbitrarily transform a request as it sees fit.\r\n\r\nAs to the redirect auth-stripping, I think that that is actually a separate bug: `requests.sessions.SessionRedirectMixin.rebuild_auth` should probably re-apply any auth handler that is present, which it currently does not.",
"Thank you all for the feedback. \r\n\r\nI've updated the code base (https://github.com/DOI-BLM/requests-agstoken) to handle re-directs. I'm currently having problems with the vendor product recognizing my token in the body and had to revert to placing that in the URL query string (which I would rather not do since I know it is logged all over the server access logs). This mimics the behavior I see in the vendor products management console (requests to the back-end 'admin' REST API are all HTTP GET with the token passed in a URL Query-String Parameter). \r\n\r\nAt any rate, I think this is functioning well enough to start using in some operational tools. I'll leave this issue open for another day or two in case someone wants to collaborate on this scenario, otherwise I'll close it. \r\n\r\nThanks!!\r\n"
] |
https://api.github.com/repos/psf/requests/issues/4039
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4039/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4039/comments
|
https://api.github.com/repos/psf/requests/issues/4039/events
|
https://github.com/psf/requests/issues/4039
| 230,024,036 |
MDU6SXNzdWUyMzAwMjQwMzY=
| 4,039 |
GAE - not retrieving cookies from the response
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/709053?v=4",
"events_url": "https://api.github.com/users/cmin764/events{/privacy}",
"followers_url": "https://api.github.com/users/cmin764/followers",
"following_url": "https://api.github.com/users/cmin764/following{/other_user}",
"gists_url": "https://api.github.com/users/cmin764/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cmin764",
"id": 709053,
"login": "cmin764",
"node_id": "MDQ6VXNlcjcwOTA1Mw==",
"organizations_url": "https://api.github.com/users/cmin764/orgs",
"received_events_url": "https://api.github.com/users/cmin764/received_events",
"repos_url": "https://api.github.com/users/cmin764/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cmin764/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cmin764/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cmin764",
"user_view_type": "public"
}
|
[
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
},
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| true | null |
[] | null | 6 |
2017-05-19T16:22:24Z
|
2021-09-08T01:21:05Z
|
2018-08-04T12:31:39Z
|
NONE
|
resolved
|
I've found an issue when using requests with the toolbelt on Google App Engine, caused by urllib3 being patched to use urlfetch. The problem is that the cookie jar is not populated due to a missing attribute (original_response) on the urllib3 response (when switching from sockets to urlfetch and using the HTTPAdapter through the toolbelt monkeypatch).
The bug is self-explanatory from this patch I've made, which works flawlessly in both the GAE production and development environments:
```python
"""Runtime monkey-patching for various problematic libraries."""
import re
import requests
from requests.cookies import MockRequest
class MockResponse(object):
"""Wraps a `httplib.HTTPMessage` to mimic a `urllib.addinfourl`.
...what? Basically, expose the parsed HTTP headers from the server response
the way `cookielib` expects to see them.
"""
SEPARATOR = re.compile(r",\s*(?=[\w-]+=)")
def __init__(self, response):
"""Make a MockResponse for `cookielib` to read.
:param response: a HTTPResponse or analogous carrying the headers
"""
self._response = response
def info(self):
return self
def getheaders(self, name):
headers = self._response.getheader(name)
return self.SEPARATOR.split(headers) if headers else []
def extract_cookies_to_jar(jar, request, response):
"""Extract the cookies from the response into a CookieJar.
:param jar: cookielib.CookieJar (not necessarily a RequestsCookieJar)
:param request: our own requests.Request object
:param response: urllib3.HTTPResponse object
"""
# the _original_response field is the wrapped httplib.HTTPResponse object,
req = MockRequest(request)
# pull out the HTTPMessage with the headers and put it in the mock:
res = MockResponse(response)
jar.extract_cookies(res, req)
def _monkeypatch_requests():
"""Monkey patch that problematic cookie extraction function."""
modules = [
# Patch the entire namespace within Python.
requests.adapters,
requests.auth,
requests.cookies,
requests.sessions,
]
for module in modules:
setattr(module, "extract_cookies_to_jar", extract_cookies_to_jar)
def monkeypatch():
_monkeypatch_requests()
```
https://pastebin.com/FgDZfT8y
You should integrate this somehow because it is affecting everyone (that's the reason people switch to urlfetch, the standard urllib, or other primitives). Thanks!
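Presumably the patch is applied once at application start-up, along these lines (`patches` is a hypothetical module name for the code above):

```python
# Hypothetical App Engine start-up hook, e.g. in appengine_config.py:
import patches  # the module containing the monkey-patch code shown above

patches.monkeypatch()
```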
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/4039/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4039/timeline
| null |
completed
| null | null | false |
[
"Sorry, but it's not entirely clear to me where this bug is coming from. What code is causing this attribute to be missing? Because I don't think that problem is in the Requests code, is it?",
"The issue: http://stackoverflow.com/questions/18080556/python-gae-requests-not-returning-cookies-after-get-request/44079378#44079378\r\n\r\nWhen using requests library in GAE, the cookies are not stored within the session, therefore you can't do authentications or request some pages that requires enabling cookies.\r\n\r\nThe problem is in the `MockResponse` class and into the `extract_cookies_to_jar`, because is using the old original response within the urllib3 response object and this is absent when ran through GAE which leads to the incapability of retrieving and storing the response cookies which make requests fail badly in most situations.\r\n\r\nYou should replace the code here: https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L98-L133 with the code I gave you above.. or something similar (selecting the old attribute first and if it's not existing, then getting the headers and splitting them from the new one). Line **126** is the real issue/bug. Also, there is a method called `getheaders` which is not returning. The bug didn't replicate yet because the cookielib from Python is not using that, but how could you let such things into production? https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L116",
"Ok, so that seems fair.\r\n\r\nPlease feel free to open a pull request with this change.",
"Here's the PR: https://github.com/kennethreitz/requests/pull/4044. Feel free to refactor it as you want.\r\n\r\nTested with GAE on development and production environments and also on normal runs (outside GAE). Thanks!",
"Also, through my patch, I've just discovered that there are some unstable/wrong tests whose issues weren't replicating due to unrelated circumstances: https://travis-ci.org/kennethreitz/requests/jobs/234560896",
"Should be fixed now that we've had a release of urllib3 and requests."
] |
https://api.github.com/repos/psf/requests/issues/4038
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4038/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4038/comments
|
https://api.github.com/repos/psf/requests/issues/4038/events
|
https://github.com/psf/requests/issues/4038
| 229,882,019 |
MDU6SXNzdWUyMjk4ODIwMTk=
| 4,038 |
The Host header is ignored?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1421477?v=4",
"events_url": "https://api.github.com/users/GitBruno/events{/privacy}",
"followers_url": "https://api.github.com/users/GitBruno/followers",
"following_url": "https://api.github.com/users/GitBruno/following{/other_user}",
"gists_url": "https://api.github.com/users/GitBruno/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/GitBruno",
"id": 1421477,
"login": "GitBruno",
"node_id": "MDQ6VXNlcjE0MjE0Nzc=",
"organizations_url": "https://api.github.com/users/GitBruno/orgs",
"received_events_url": "https://api.github.com/users/GitBruno/received_events",
"repos_url": "https://api.github.com/users/GitBruno/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/GitBruno/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GitBruno/subscriptions",
"type": "User",
"url": "https://api.github.com/users/GitBruno",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2017-05-19T06:25:31Z
|
2021-09-08T10:00:45Z
|
2017-05-19T13:48:17Z
|
NONE
|
resolved
|
I am unable to create a request using a relative path, even when I set the Host header.
`requests.exceptions.MissingSchema: Invalid URL '/wiki/List_of_HTTP_header_fields': No schema supplied. Perhaps you meant http:///wiki/List_of_HTTP_header_fields?`
I expect to be able to configure a Host header and use it in my request:
```python
headers = {'Host': 'en.wikipedia.org/wiki' }
resp = requests.get("/List_of_HTTP_header_fields", headers=headers)
resp = requests.get("/Virtual_hosting", headers=headers)
```
> The domain name of the server (for virtual hosting), and the TCP port number on which the server is listening. The port number may be omitted if the port is the standard port for the service requested. (Mandatory since HTTP/1.1)
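For context, a sketch of the equivalent request the way Requests expects it, as the maintainers explain in the comments: the URL must be absolute, and the Host header is derived from it rather than supplied by the caller.

```python
# Requests wants an absolute URL and generates the Host header itself.
import requests

resp = requests.get("https://en.wikipedia.org/wiki/List_of_HTTP_header_fields")
# On the wire this becomes, roughly:
#   GET /wiki/List_of_HTTP_header_fields HTTP/1.1
#   Host: en.wikipedia.org
```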
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4038/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4038/timeline
| null |
completed
| null | null | false |
[
"Thanks for this!\r\n\r\nI don't know why you expect to be able to do this: Requests has never allowed this, and never documented that it would be possible. We very intentionally require that users provide absolute URIs so there is no ambiguity: for example, in your examples above should Requests be using HTTP or HTTPS?\r\n\r\nThis is just not how Requests works. ",
"I expected this to work as it is part of the protocol (HTTP/1.1):\r\n\r\n> The most common form of Request-URI is that used to identify a resource on an origin server or gateway. In this case the absolute path of the URI MUST be transmitted (see section 3.2.1, abs_path) as the Request-URI, and the network location of the URI (authority) MUST be transmitted in a Host header field. For example, a client wishing to retrieve the resource above directly from the origin server would create a TCP connection to port 80 of the host \"www.w3.org\" and send the lines:\r\n\r\n GET /pub/WWW/TheProject.html HTTP/1.1\r\n Host: www.w3.org\r\n\r\nI can see how this is a can of worms from a security perspective and it deserves a warning. But when I set a hostname manually following the spec I do expect it to work. \r\n\r\nYou know way more about the subject then I do, so I really appreciate your feedback. Either way the error messaging can be improved as I clearly did not intent to go here: `http:///wiki/List_of_HTTP_header_fields` (Check for forward slash character?)",
"Requests is a high-level HTTP library: it performs a number of transformations for you. Ultimately, if you want to be able to craft low-level HTTP requests like this, Requests is not the tool for you: its value comes entirely from providing high-level abstractions and transforming them into the low-level Python bytes format.\r\n\r\nSpecifically, if you do `requests.get('http://host/some/path')`, that will lead to Requests sending:\r\n\r\n```\r\nGET /some/path HTTP/1.1\r\nHost: host\r\n```\r\n\r\nIn general, Requests does not introspect user headers: they are considered to be opaque bytes from our perspective. URLs, however, are structured data, and thus make a much better way to provide us with information about the host you want.",
"That is cool, I did not realise Requests is automatically generating the Host header. I assumed the Host header was empty and Requests kept full path in the GET command.\r\n\r\nI think this issue needs two action points:\r\n\r\n1. Update [Request Manual Custom Headers](http://docs.python-requests.org/en/master/user/quickstart/#custom-headers) to include Host headers info:\r\n\r\n- Authorization headers set with headers= will be overridden if credentials are specified in .netrc, which in turn will be overridden by the auth= parameter.\r\n- Authorization headers will be removed if you get redirected off-host.\r\n- Proxy-Authorization headers will be overridden by proxy credentials provided in the URL.\r\n- Content-Length headers will be overridden when we can determine the length of the content.\r\n- Host headers will be removed, they are automatically generated by Requests.\r\n\r\n2. Create seperate error to make `Invalid URL` more clear. Checking for single forward slash should make clear user is trying to use relative path? Something like `No host found in URL`",
"Host headers will not be removed: we will let you override them, we just don't let them affect our processing at all. But yes, a PR that adds a note about Host headers to that section is a good idea. \r\n\r\nAs to `Invalid URL`, we can certainly do it, but I'm not sure how useful it is. This really doesn't come up very often, as evidenced by the fact that this is the first time in the seven year history of this project that anyone has had this problem. Still, I'm happy to review a pull request adding that to see how it looks. "
] |
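A minimal sketch (not from the thread) of the behaviour described in the comments above: Requests derives the request target and the Host header from the absolute URL, and a Host header passed via `headers=` is sent as-is without changing where the connection goes. The URL is a placeholder.

```python
import requests

# Requests splits the absolute URL into the request line and the Host header:
#   GET /some/path HTTP/1.1
#   Host: example.com
response = requests.get('http://example.com/some/path')
print(response.request.url)
# The prepared request usually has no explicit Host header; the host is taken
# from the URL lower in the stack when the request is sent.
print(response.request.headers.get('Host'))

# Overriding Host is allowed, but it does not affect URL handling or routing.
response = requests.get('http://example.com/some/path',
                        headers={'Host': 'virtual-host.example.com'})
print(response.request.headers['Host'])
```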
https://api.github.com/repos/psf/requests/issues/4037
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4037/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4037/comments
|
https://api.github.com/repos/psf/requests/issues/4037/events
|
https://github.com/psf/requests/issues/4037
| 229,640,099 |
MDU6SXNzdWUyMjk2NDAwOTk=
| 4,037 |
Two transport 'location' headers cause an incorrect url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4683118?v=4",
"events_url": "https://api.github.com/users/evgenykovalyov/events{/privacy}",
"followers_url": "https://api.github.com/users/evgenykovalyov/followers",
"following_url": "https://api.github.com/users/evgenykovalyov/following{/other_user}",
"gists_url": "https://api.github.com/users/evgenykovalyov/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/evgenykovalyov",
"id": 4683118,
"login": "evgenykovalyov",
"node_id": "MDQ6VXNlcjQ2ODMxMTg=",
"organizations_url": "https://api.github.com/users/evgenykovalyov/orgs",
"received_events_url": "https://api.github.com/users/evgenykovalyov/received_events",
"repos_url": "https://api.github.com/users/evgenykovalyov/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/evgenykovalyov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evgenykovalyov/subscriptions",
"type": "User",
"url": "https://api.github.com/users/evgenykovalyov",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-18T11:36:54Z
|
2021-09-08T10:00:46Z
|
2017-05-18T15:22:20Z
|
NONE
|
resolved
|
I noticed a bug while trying to simulate a login process for a service. At some point I receive a 302 response where there are two Location headers.
```
HTTP/1.1 302 Found
Location: https://some_url.com/
Accept-Ranges: bytes
Location: https://some_url.com/
```
While using the ```Session``` object, I noticed that both Location values are merged into one value like this:
```python
{'Location':'https://some_url.com/, https://some_url.com/'}
```
I added a temporary split workaround in ```SessionRedirectMixin.get_redirect_target()```, but I was wondering if you could come up with a permanent fix for this, please?
```python
if resp.is_redirect:
location = resp.headers['location']
location = location.split(', ')[0]
```
Thanks in advance.
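A possible client-side workaround (my own sketch, not something endorsed in this thread) is to normalise the merged header in a response hook, since hooks run before Requests resolves the redirect. Splitting on `', '` assumes the duplicated Location values contain no commas; the URL is a placeholder.

```python
import requests

def split_location(response, *args, **kwargs):
    # Two identical Location headers arrive joined as
    # 'https://some_url.com/, https://some_url.com/'; keep only the first
    # value so redirect resolution sees a single valid URL.
    location = response.headers.get('Location')
    if location and ', ' in location:
        response.headers['Location'] = location.split(', ')[0]
    return response

session = requests.Session()
session.hooks['response'].append(split_location)
response = session.get('https://example.com/login')  # placeholder URL
```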
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4037/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4037/timeline
| null |
completed
| null | null | false |
[
"So I'm rather certain this is an issue with urllib3's Header handler. It needs a whitelist of which headers are allowed to be repeated and which can be joined with a `,` like you found here.",
"Hey @evgenykovalyov, thanks for taking the time to open this issue. This is actually a dupe of #2939 and should be solved in Requests 3.0.0 with #3417. You'll likely have to wait until that's released for a more permanent solution. That is unless @sigmavirus24 has something else in mind.\r\n\r\nI'm going to close this as resolved for now. Thanks again!"
] |
https://api.github.com/repos/psf/requests/issues/4036
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4036/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4036/comments
|
https://api.github.com/repos/psf/requests/issues/4036/events
|
https://github.com/psf/requests/pull/4036
| 229,502,963 |
MDExOlB1bGxSZXF1ZXN0MTIxMTY0MzE0
| 4,036 |
Pin sphinx
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-17T22:16:11Z
|
2021-09-06T00:06:57Z
|
2017-05-17T22:47:49Z
|
MEMBER
|
resolved
|
This will fix the build issues in Python 2.6 due to Sphinx's new requirement of `typing` in versions 1.6+. This pinning (capping) should only be in place until a solution for representing markers is introduced in Pipfile.
While we're here doing cleanup, I've also removed the no-longer-needed `pipenv lock` before each build. Some preliminary tests show it saving 15-20% of the total build time for CI. We can scrap that, though, if it's not in line with the goal of this PR: getting the build working.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4036/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4036/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4036.diff",
"html_url": "https://github.com/psf/requests/pull/4036",
"merged_at": "2017-05-17T22:47:49Z",
"patch_url": "https://github.com/psf/requests/pull/4036.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4036"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=h1) Report\n> Merging [#4036](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/0280a18b53aef5e09fa0a2886077b0ede5e89e79?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4036 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=footer). Last update [0280a18...dbdbffd](https://codecov.io/gh/kennethreitz/requests/pull/4036?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/4035
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4035/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4035/comments
|
https://api.github.com/repos/psf/requests/issues/4035/events
|
https://github.com/psf/requests/issues/4035
| 229,441,939 |
MDU6SXNzdWUyMjk0NDE5Mzk=
| 4,035 |
The case of the haunted stream request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/381280?v=4",
"events_url": "https://api.github.com/users/jacquerie/events{/privacy}",
"followers_url": "https://api.github.com/users/jacquerie/followers",
"following_url": "https://api.github.com/users/jacquerie/following{/other_user}",
"gists_url": "https://api.github.com/users/jacquerie/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jacquerie",
"id": 381280,
"login": "jacquerie",
"node_id": "MDQ6VXNlcjM4MTI4MA==",
"organizations_url": "https://api.github.com/users/jacquerie/orgs",
"received_events_url": "https://api.github.com/users/jacquerie/received_events",
"repos_url": "https://api.github.com/users/jacquerie/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jacquerie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jacquerie/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jacquerie",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-17T18:08:29Z
|
2021-09-08T10:00:48Z
|
2017-05-17T18:15:29Z
|
NONE
|
resolved
|
Hi,
I'm not sure if this is a bug in `requests` or a misconfiguration in the server I'm interacting with, but I have the following very puzzling result:
```python
>>> from requests import get
>>> response = get('http://edok01.tib.uni-hannover.de/edoks/e01dh15/815761511.pdf', stream=True)
>>> next(response.iter_content(4))
'%PD'
```
which returns 3 characters instead of the 4 that I requested, but also:
```python
>>> from requests import get
>>> response = get('http://edok01.tib.uni-hannover.de/edoks/e01dh15/815761511.pdf', stream=True)
>>> iter = response.iter_content(1)
>>> next(iter)
'%'
>>> next(iter)
'P'
>>> next(iter)
'D'
>>> next(iter)
'F'
```
which instead returns all 4 characters. Note also that this happens with both `decode_unicode=True` and `decode_unicode=False`.
How is this possible? This is happening with `requests==2.14.2`. The description of the problem in https://github.com/kennethreitz/requests/issues/3980#issue-223287665 looks relevant, but its fix in #3984 looks less relevant. In fact, I still see the problem on a local checkout of `master`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4035/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4035/timeline
| null |
completed
| null | null | false |
[
"`iter_content` transparently decompresses gzipped data received from the server. This means that when we receive bytes from the network we don't know how many bytes that will actually resolve into. For that reason, we don't guarantee that you'll get literally that number of bytes: we just promise that you won't get the empty string. That's why `iter_content(1)` appears to \"work\".\r\n\r\nThis is the expected behaviour, I'm afraid.",
"Ah, thanks a lot. At one point I had the feeling it had something to do with compression, but I didn't investigate in that direction."
] |
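To illustrate the behaviour explained in the reply above: once transparent decompression is involved, the `iter_content` chunk size is only a hint, so code that needs exactly N bytes should accumulate chunks itself. This is my own sketch; the URL is a placeholder.

```python
import requests

def read_exact(response, n, chunk_size=1024):
    """Accumulate iter_content() chunks until at least n bytes are available."""
    buf = b''
    for chunk in response.iter_content(chunk_size):
        buf += chunk
        if len(buf) >= n:
            break
    return buf[:n]

response = requests.get('https://example.com/file.pdf', stream=True)  # placeholder URL
header = read_exact(response, 4)
print(header)  # e.g. b'%PDF' regardless of whether the body was gzip-encoded
```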
https://api.github.com/repos/psf/requests/issues/4034
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4034/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4034/comments
|
https://api.github.com/repos/psf/requests/issues/4034/events
|
https://github.com/psf/requests/issues/4034
| 229,441,147 |
MDU6SXNzdWUyMjk0NDExNDc=
| 4,034 |
POST header Content-Length with negative value
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28763083?v=4",
"events_url": "https://api.github.com/users/msvicky1206/events{/privacy}",
"followers_url": "https://api.github.com/users/msvicky1206/followers",
"following_url": "https://api.github.com/users/msvicky1206/following{/other_user}",
"gists_url": "https://api.github.com/users/msvicky1206/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/msvicky1206",
"id": 28763083,
"login": "msvicky1206",
"node_id": "MDQ6VXNlcjI4NzYzMDgz",
"organizations_url": "https://api.github.com/users/msvicky1206/orgs",
"received_events_url": "https://api.github.com/users/msvicky1206/received_events",
"repos_url": "https://api.github.com/users/msvicky1206/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/msvicky1206/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/msvicky1206/subscriptions",
"type": "User",
"url": "https://api.github.com/users/msvicky1206",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-17T18:05:20Z
|
2021-09-08T10:00:48Z
|
2017-05-17T18:10:06Z
|
NONE
|
resolved
|
I'm new to requests and to HTTP methods.
I'm trying to test a server that accepts HTTP calls. As part of validation, I'm trying to set a negative value for the 'Content-Length' header and expect a 400 'Bad Request', but requests.post() returns 200.
I also observed that in the POST request headers, Content-Length is modified to the actual length of the file data being uploaded.
```python
In [75]: headers={'Date': 'Wed, 17 May 2017 08:45:39 GMT', 'Content-Length': '-50'}
In [76]: data = open('/tmp/50').read()
In [77]: out = requests.post(url=url, headers=headers, data=data)
In [78]: out.request.headers
Out[78]: {'Content-Length': '50', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.14.2', 'Connection': 'keep-alive', 'Date': 'Wed, 17 May 2017 08:45:39 GMT'}
In [79]: out.headers
Out[79]: { 'Content-Length': '37', 'Server': '1/1.0/', 'Connection': 'keep-alive', 'Date': 'Wed, 17 May 2017 18:01:08 GMT'}
In [80]: out.status_code
Out[80]: 201
```
Am I missing something here?
I have tested the same with curl, where the server responds with a 400 status code.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4034/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4034/timeline
| null |
completed
| null | null | false |
[
"Requests is not intended as a HTTP testing tool: in particular, it is not intended to allow you to craft malformed HTTP requests. If that's what you're looking for you'll need to find a different tool for that."
] |
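For the kind of malformed-request testing asked about above, a raw socket is the usual escape hatch, since Requests deliberately recomputes Content-Length. A rough sketch with a placeholder target host:

```python
import socket

# Hand-craft an HTTP/1.1 POST with a deliberately bogus Content-Length.
# Requests will not send this; a plain socket will.
host, port = 'server.example.com', 80   # placeholder target
body = b'x' * 50
request = (
    b'POST /upload HTTP/1.1\r\n'
    b'Host: ' + host.encode() + b'\r\n'
    b'Content-Length: -50\r\n'
    b'Connection: close\r\n'
    b'\r\n'
) + body

with socket.create_connection((host, port)) as conn:
    conn.sendall(request)
    print(conn.recv(4096).decode(errors='replace'))  # expect the server's 400 here
```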
https://api.github.com/repos/psf/requests/issues/4033
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4033/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4033/comments
|
https://api.github.com/repos/psf/requests/issues/4033/events
|
https://github.com/psf/requests/pull/4033
| 229,436,675 |
MDExOlB1bGxSZXF1ZXN0MTIxMTE1MzAw
| 4,033 |
Test that the readme renders in CI.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-17T17:47:58Z
|
2021-09-06T00:06:56Z
|
2017-05-18T01:41:01Z
|
MEMBER
|
resolved
|
Resolves #4030.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4033/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4033/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4033.diff",
"html_url": "https://github.com/psf/requests/pull/4033",
"merged_at": "2017-05-18T01:41:01Z",
"patch_url": "https://github.com/psf/requests/pull/4033.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4033"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=h1) Report\n> Merging [#4033](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/da292953695122c84ac9e95b115142532a180fe1?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4033 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=footer). Last update [da29295...570c4c8](https://codecov.io/gh/kennethreitz/requests/pull/4033?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Merge this when everything else is ✔️ "
] |
https://api.github.com/repos/psf/requests/issues/4032
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4032/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4032/comments
|
https://api.github.com/repos/psf/requests/issues/4032/events
|
https://github.com/psf/requests/issues/4032
| 229,423,331 |
MDU6SXNzdWUyMjk0MjMzMzE=
| 4,032 |
Allow verify parameter to take a file-like object such as StringIO
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7624997?v=4",
"events_url": "https://api.github.com/users/ronanpaixao/events{/privacy}",
"followers_url": "https://api.github.com/users/ronanpaixao/followers",
"following_url": "https://api.github.com/users/ronanpaixao/following{/other_user}",
"gists_url": "https://api.github.com/users/ronanpaixao/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ronanpaixao",
"id": 7624997,
"login": "ronanpaixao",
"node_id": "MDQ6VXNlcjc2MjQ5OTc=",
"organizations_url": "https://api.github.com/users/ronanpaixao/orgs",
"received_events_url": "https://api.github.com/users/ronanpaixao/received_events",
"repos_url": "https://api.github.com/users/ronanpaixao/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ronanpaixao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ronanpaixao/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ronanpaixao",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2017-05-17T16:56:54Z
|
2021-09-01T00:11:24Z
|
2017-05-17T16:59:05Z
|
NONE
|
resolved
|
Sometimes it is useful and more efficient to have SSL certificate data in memory, for example when grabbing certificates from the Windows Certificate Store or some other non-file source.
As it is implemented now, requests neither grabs the certificates from the OS nor allows simple usage from a file-like object. Thus, the user is forced to store the certificates in a temporary file to allow usage, or the request fails when trying `os.path.isdir` (line 224 of `adapters.py` on requests version 2.12.4).
The unfortunate consequence is also that the developer has to keep the file around for as long as the session lives, which is awkward especially for sessions that only request data occasionally. Otherwise, one has to grab all certificates again for each request. Both options also have the problem of leaving temp files behind if the process is terminated abruptly.
Example use case (on Windows 7, Python 3.6):
```python
import ssl
import requests
from io import StringIO # For Python 3.x
from tempfile import NamedTemporaryFile
# A site for which we have the CA in Windows Certificate Store (case of intranet on AD)
url = "https://some.secure.intranet.site"
requests.get(url) # Raises SSLError
# Grab certificates from Windows Certificate Store
# delete=False is required for some reason
tempcertfile = NamedTemporaryFile('w', encoding='utf8', delete=False)
memcertfile = StringIO()
context = ssl.create_default_context()
der_certs = context.get_ca_certs(binary_form=True)
pem_certs = [ssl.DER_cert_to_PEM_cert(der) for der in der_certs]
for pem in pem_certs:
tempcertfile.write(pem + '\n')
memcertfile.write(pem + '\n')
tempcertfile.seek(0)
memcertfile.seek(0)
requests.get(url, verify=tempcertfile.name) # Works
requests.get(url, verify=memcertfile) # Errors with a TypeError on adapters.py line 224
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 4,
"-1": 0,
"confused": 0,
"eyes": 1,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 5,
"url": "https://api.github.com/repos/psf/requests/issues/4032/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4032/timeline
| null |
completed
| null | null | false |
[
"Requests is unlikely to support this on the `verify` kwarg because it interacts very poorly with the API for pointing to a directory or file. However, you can use a Transport Adapter to pass an SSLContext object to do the validation, and this object can almost certainly load the data from a string. See [this explanation](https://lukasa.co.uk/2017/02/Configuring_TLS_With_Requests/), which indicates how to provide an SSLContext.",
"Also came here to see how can we provide string alike object. Very useful if you store certificates in database and do not want to use temp files.",
"I think accepting a file-like object for `verify` should be re-considered. Adding a trusted certificate in-memory to `requests` is way harder than it should be.\r\n\r\nI took a look at @Lukasa's `SSLContext` suggestion but it seems far too low-level. Certainly not something I would even consider using as it's bound to eventually break in future versions.",
"Having the ability to use an in-memory key and cert is very useful in oddball environments like cloud functions/lambda where otherwise one must do the named temp file dance. \r\n\r\nDistinguishing between a str and file is fairly straightforward: `isinstance(something, io.IOBase)` in Python 3. \r\n\r\nUsing SSLContext is very un-Requests. It requires a deep understanding of the nature of TLS connections to do right, which is what Requests shouldn't require.",
"> Having the ability to use an in-memory key and cert is very useful in oddball environments like cloud functions/lambda where otherwise one must do the named temp file dance.\r\n\r\n> Using SSLContext is very un-Requests. It requires a deep understanding of the nature of TLS connections to do right, which is what Requests shouldn't require.\r\n\r\n:+1: \r\n\r\nWe are progressively moving towards cloud environments where programs are packaged as [12 factor apps](https://12factor.net/). Many deployment environments promote passing config as environment variables (or reading from remote secret stores) over traditional file-based config.\r\n\r\n> Requests is unlikely to support this on the verify kwarg because it interacts very poorly with the API for pointing to a directory or file.\r\n\r\nWould you be willing to reconsider this in 2020 if someone provided a patch?\r\n\r\nPossible workaround by using tempfiles in https://stackoverflow.com/questions/30598950/python-requests-send-certificate-as-string/46570264",
"@Lukasa can you provide an example of using an adapter. I've tried the [example in your blog](https://lukasa.co.uk/2017/02/Configuring_TLS_With_Requests/), but it doesn't work.\r\n\r\n```python\r\nimport ssl\r\nimport requests\r\nfrom requests.adapters import HTTPAdapter\r\n\r\nclass Adapter(HTTPAdapter):\r\n def init_poolmanager(self, *args, **kwargs):\r\n ctx = ssl.create_default_context(cadata=<data>)\r\n ctx.check_hostname = False\r\n ctx.verify_mode = ssl.CERT_NONE\r\n kwargs['ssl_context'] = ctx\r\n return super().init_poolmanager(*args, **kwargs)\r\n\r\ns = requests.Session()\r\ns.mount('https://foo', Adapter())\r\n```",
"As far as I can tell this _requires_ changes to the stdlib ssl module or the use of pyOpenSSL's implementation of SSL contexts since that exposes the required functionality. \r\n\r\nUnfortunately it appears that [BPO-16487](https://bugs.python.org/issue16487) has been outstanding for _8 years_ and counting..."
] |
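Building on the adapter example in the comments above, here is a hedged sketch that keeps verification enabled by loading in-memory PEM data via `cadata` (the adapter class name is made up; the intranet URL is the placeholder from the issue body, and proxied HTTPS would need extra handling):

```python
import ssl
import requests
from requests.adapters import HTTPAdapter

class InMemoryCAAdapter(HTTPAdapter):
    """Transport adapter whose SSLContext trusts CA certificates held in memory."""

    def __init__(self, pem_data, **kwargs):
        self._pem_data = pem_data
        super(InMemoryCAAdapter, self).__init__(**kwargs)

    def init_poolmanager(self, *args, **kwargs):
        # cadata accepts a string of concatenated PEM certificates.
        kwargs['ssl_context'] = ssl.create_default_context(cadata=self._pem_data)
        return super(InMemoryCAAdapter, self).init_poolmanager(*args, **kwargs)

# Gather the system CAs as in the issue body (Windows Certificate Store on Windows).
context = ssl.create_default_context()
pem_certs = '\n'.join(
    ssl.DER_cert_to_PEM_cert(der) for der in context.get_ca_certs(binary_form=True)
)

session = requests.Session()
session.mount('https://some.secure.intranet.site', InMemoryCAAdapter(pem_certs))
# session.get('https://some.secure.intranet.site')  # verified against the in-memory CAs
```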
https://api.github.com/repos/psf/requests/issues/4031
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4031/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4031/comments
|
https://api.github.com/repos/psf/requests/issues/4031/events
|
https://github.com/psf/requests/issues/4031
| 229,392,725 |
MDU6SXNzdWUyMjkzOTI3MjU=
| 4,031 |
unvendor dependencies for 3.0.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
},
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
},
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 17 |
2017-05-17T15:26:43Z
|
2021-09-08T10:00:37Z
|
2017-05-27T03:50:42Z
|
CONTRIBUTOR
|
resolved
|
thoughts?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4031/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4031/timeline
| null |
completed
| null | null | false |
[
"/cc @sigmavirus24 @Lukasa \r\n\r\nI'm not sold that it's a good idea. I just like change. Maybe it's time. ",
"I'm not sold on it being a good idea either. Presently projects *may* do the following:\r\n\r\n```\r\n# requirements.txt\r\nrequests>=2.14.2\r\nurllib3==1.20.0\r\nchardet<2.0.0\r\n```\r\n\r\nIn this case, that would break Requests 2.14.2. We'd have to expand our CI to start testing against a wider breadth of our dependencies to allow for users doing something like above. In part, this problem is due to Pip not having a dependency resolver still that would tell the user: `Hey, Requests needs at least urllib3 1.21.2 and chardet 2.0.0 so please resolve your dependencies.`\r\n\r\nAt the same time, I'd like us to stop being the project everyone points at and berates because once a year someone on reddit realizes we vendor our dependencies for good reason but is shocked, horrified, mortified, and supposedly going back to httplib.",
"So, I'm open to doing this if and when pip gets a proper dependency solver. A little birdie told me that pip has a GSOC student who is working on doing exactly that, so if that lands I'm open to us considering that change.",
"hell let's just do it. assigning to myself. \r\n\r\nminor version bump. \r\n\r\nI'll be pragmatic and require the latest version of urllib3 or greater (or less than the next major). ",
"We'll also emit warnings for incompatible versions at import time. ",
"> We'll also emit warnings for incompatible versions at import time.\r\n\r\nWarnings are often ignored by users. This should be a hard exception if we're going to bother with version checks at all.",
"interesting idea. I like it. \r\n\r\n@sigmavirus24 can you provide me with a list of our compatible version ranges, if you're up to helping out with this? Would be much appreciated. ",
"would be nice if Python had an IncompatiblePackages warning. Should we subclass or use RuntimeError?",
"The closest thing to an `IncompatiblePackages` warning is using `pkg_resources` and by \"using\" I mean importing it. The problem with that is that importing `pkg_resources` kicks off a scan of your entire site-packages directory which can be *very* slow especially if you've done something silly like install several OpenStack services into one site-packages directory (hint, that's not actually that silly).\r\n\r\nI don't *like* subclassing RuntimeError but I can't think of a better option at the moment (then again I'm not really thinking too hard about this).\r\n\r\nWhat I'm worried about is maintaining versions that we use to check for incompatible requirements and maintaining the versions in our setup.py/setup.cfg and how we might do that with the *least* amount of pain.",
"I'll find a DRY solution. ",
"Thanks for your thoughts! Much appreciated. I think this will reduce maintinence burden, in an ideal world. ",
"Ian, do you want to get me that list? If not, I'll make another issue for a new contributor to maybe ",
"Meaning, no rush, just wondering :)",
"List of dependencies? As they are now,\r\n\r\n```\r\nchardet >= 3.0.2, < 3.1.0\r\nidna >= 2.5, < 2.6\r\nurllib3 >= 1.21.1, < 1.22\r\n```\r\n\r\nThat said, I'm struggling to think of a way to cover all of our bases:\r\n\r\n1. setup.cfg needs a static listing of requirements (that's not python and thus not easily determined unless we try to generate it as part of our release/build process somehow ... which I don't think we should try)\r\n\r\n2. setup.py can use a dynamic list in a file like `__version__.py`\r\n\r\n3. We then need to use the list and dynamically assert things about our dependencies when we import them.\r\n\r\nI am thinking what we might want is to make an attempt to go for 90% static package configuration via `setup.cfg`. It's technically plausible and we could then use something like `pkginfo` to introspect our selves to find our dependencies. It'll be gross and incur another dependency that will make little sense, but it's a start?",
"@sigmavirus24 do you want to get a working prototype in place?",
"related: #4067 4067",
"Done!"
] |
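A rough sketch of the import-time check discussed in the comments above, using the version ranges listed there. The helper names are illustrative, not the code that was actually shipped:

```python
import urllib3
import chardet

def _version_tuple(version):
    # '1.21.1' -> (1, 21, 1); tolerate short version strings like '1.21'
    return tuple(int(part) for part in version.split('.')[:3] if part.isdigit())

def check_compatibility(urllib3_version, chardet_version):
    # Ranges from the comment above: urllib3 >= 1.21.1, < 1.22
    assert (1, 21, 1) <= _version_tuple(urllib3_version) < (1, 22), \
        'incompatible urllib3 version: %s' % urllib3_version
    # chardet >= 3.0.2, < 3.1.0
    assert (3, 0, 2) <= _version_tuple(chardet_version) < (3, 1, 0), \
        'incompatible chardet version: %s' % chardet_version

try:
    check_compatibility(urllib3.__version__, chardet.__version__)
except AssertionError as exc:
    # A hard failure, per the discussion, rather than an easily ignored warning.
    raise RuntimeError('requests dependency check failed: %s' % exc)
```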
https://api.github.com/repos/psf/requests/issues/4030
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4030/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4030/comments
|
https://api.github.com/repos/psf/requests/issues/4030/events
|
https://github.com/psf/requests/issues/4030
| 229,375,476 |
MDU6SXNzdWUyMjkzNzU0NzY=
| 4,030 |
fix pypi page
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
},
{
"color": "fad8c7",
"default": false,
"description": null,
"id": 136616769,
"name": "Documentation",
"node_id": "MDU6TGFiZWwxMzY2MTY3Njk=",
"url": "https://api.github.com/repos/psf/requests/labels/Documentation"
}
] |
closed
| true | null |
[] | null | 3 |
2017-05-17T14:38:38Z
|
2021-09-08T10:00:47Z
|
2017-05-18T01:41:01Z
|
CONTRIBUTOR
|
resolved
|
markup is bad
https://pypi.python.org/pypi/requests/
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4030/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4030/timeline
| null |
completed
| null | null | false |
[
"The `readme` package would help us detect this if we added it to our CI",
"@sigmavirus24 i'm all for it! I think having that page in tip-top shape is top-priority!",
"See #4033."
] |
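A small sketch of the kind of CI check mentioned above, using the `readme_renderer` package; the exact wiring into the test suite here is assumed, not taken from the project:

```python
import io
from readme_renderer.rst import render

def test_readme_renders():
    with io.open('README.rst', encoding='utf-8') as f:
        source = f.read()
    # render() returns None when the reStructuredText fails to render,
    # which is what makes PyPI fall back to an unstyled plain-text page.
    assert render(source) is not None, 'README.rst does not render cleanly'
```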
https://api.github.com/repos/psf/requests/issues/4029
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4029/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4029/comments
|
https://api.github.com/repos/psf/requests/issues/4029/events
|
https://github.com/psf/requests/pull/4029
| 229,373,463 |
MDExOlB1bGxSZXF1ZXN0MTIxMDcyOTE3
| 4,029 |
Fix #4025
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/839339?v=4",
"events_url": "https://api.github.com/users/kaptajnen/events{/privacy}",
"followers_url": "https://api.github.com/users/kaptajnen/followers",
"following_url": "https://api.github.com/users/kaptajnen/following{/other_user}",
"gists_url": "https://api.github.com/users/kaptajnen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kaptajnen",
"id": 839339,
"login": "kaptajnen",
"node_id": "MDQ6VXNlcjgzOTMzOQ==",
"organizations_url": "https://api.github.com/users/kaptajnen/orgs",
"received_events_url": "https://api.github.com/users/kaptajnen/received_events",
"repos_url": "https://api.github.com/users/kaptajnen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kaptajnen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kaptajnen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kaptajnen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2017-05-17T14:32:46Z
|
2021-09-06T00:06:55Z
|
2017-05-18T16:05:41Z
|
CONTRIBUTOR
|
resolved
|
Fixes #4025 by basically not doing anything when HTTP is used
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4029/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4029/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4029.diff",
"html_url": "https://github.com/psf/requests/pull/4029",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4029.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4029"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=h1) Report\n> Merging [#4029](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/0280a18b53aef5e09fa0a2886077b0ede5e89e79?src=pr&el=desc) will **increase** coverage by `<.01%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4029 +/- ##\n==========================================\n+ Coverage 89.74% 89.74% +<.01% \n==========================================\n Files 15 15 \n Lines 1940 1941 +1 \n==========================================\n+ Hits 1741 1742 +1 \n Misses 199 199\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.92% <100%> (+0.03%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=footer). Last update [0280a18...2c6ad70](https://codecov.io/gh/kennethreitz/requests/pull/4029?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"I've updated the request with your suggestion. Initially I was thinking about doing this, however it seemed like a bad practice to add random attribute to objects (at least to me). I do agree that it is a simpler fix though.",
"> Initially I was thinking about doing this, however it seemed like a bad practice to add random attribute to objects (at least to me). \r\n\r\nI get that, but we were already doing it in the line above: this time around we're just being consistent. :wink:",
"Ah, yes, I forgot the Travis builds are busted. We'll need to hold off until they get fixed.",
"Ok cool, mind rebasing on the new master?",
"This looks like you rebased *and* merged in master, so you have duplicate commits. Do you mind fixing that up for me?",
"Sorry I guess I do not know what I'm doing (have not used rebase before). What would I need to do to fix it?",
"The other option is I can just handle the merge manually, I suppose. =) Might be the simpler thing to do. Is that suitable to you?",
"That's fine with me",
"Ok, I merged manually in fb3f2a030f82475145369b01adbee68d17bd4f43. Thanks for the work!"
] |
https://api.github.com/repos/psf/requests/issues/4028
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4028/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4028/comments
|
https://api.github.com/repos/psf/requests/issues/4028/events
|
https://github.com/psf/requests/issues/4028
| 229,316,732 |
MDU6SXNzdWUyMjkzMTY3MzI=
| 4,028 |
Session send auth only with one request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16837024?v=4",
"events_url": "https://api.github.com/users/Smosker/events{/privacy}",
"followers_url": "https://api.github.com/users/Smosker/followers",
"following_url": "https://api.github.com/users/Smosker/following{/other_user}",
"gists_url": "https://api.github.com/users/Smosker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Smosker",
"id": 16837024,
"login": "Smosker",
"node_id": "MDQ6VXNlcjE2ODM3MDI0",
"organizations_url": "https://api.github.com/users/Smosker/orgs",
"received_events_url": "https://api.github.com/users/Smosker/received_events",
"repos_url": "https://api.github.com/users/Smosker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Smosker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Smosker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Smosker",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-17T11:10:20Z
|
2021-09-08T10:00:49Z
|
2017-05-17T13:00:48Z
|
NONE
|
resolved
|
Using a session object like this:
```
import requests
session = requests.Session()
session.auth = (u'user', 'test')
session.verify = False
response = session.get(url='https://my_url/rest/api/1.0/users')
```
If I look at this response's request headers, I see:
`{'Authorization': 'Basic auth_data', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.12.3'}`
But if I send the next request using the same URL:
`response = session.get(url='https://my_url/rest/api/1.0/users')`
I can see that there is no auth header in the request:
```
print response.request.headers
{'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.12.3'}
```
And I get a 401 response because of it.
Why is that? Shouldn't the session send auth with every request made using it? How can I send auth data with every request using a session?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16837024?v=4",
"events_url": "https://api.github.com/users/Smosker/events{/privacy}",
"followers_url": "https://api.github.com/users/Smosker/followers",
"following_url": "https://api.github.com/users/Smosker/following{/other_user}",
"gists_url": "https://api.github.com/users/Smosker/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Smosker",
"id": 16837024,
"login": "Smosker",
"node_id": "MDQ6VXNlcjE2ODM3MDI0",
"organizations_url": "https://api.github.com/users/Smosker/orgs",
"received_events_url": "https://api.github.com/users/Smosker/received_events",
"repos_url": "https://api.github.com/users/Smosker/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Smosker/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Smosker/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Smosker",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4028/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4028/timeline
| null |
completed
| null | null | false |
[
"Solve problem - it was because of redirect"
] |
https://api.github.com/repos/psf/requests/issues/4027
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4027/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4027/comments
|
https://api.github.com/repos/psf/requests/issues/4027/events
|
https://github.com/psf/requests/pull/4027
| 229,196,312 |
MDExOlB1bGxSZXF1ZXN0MTIwOTQ2MTY0
| 4,027 |
Include tests in sdist tarball
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/504287?v=4",
"events_url": "https://api.github.com/users/floppym/events{/privacy}",
"followers_url": "https://api.github.com/users/floppym/followers",
"following_url": "https://api.github.com/users/floppym/following{/other_user}",
"gists_url": "https://api.github.com/users/floppym/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/floppym",
"id": 504287,
"login": "floppym",
"node_id": "MDQ6VXNlcjUwNDI4Nw==",
"organizations_url": "https://api.github.com/users/floppym/orgs",
"received_events_url": "https://api.github.com/users/floppym/received_events",
"repos_url": "https://api.github.com/users/floppym/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/floppym/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/floppym/subscriptions",
"type": "User",
"url": "https://api.github.com/users/floppym",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-16T23:24:10Z
|
2021-09-06T00:06:56Z
|
2017-05-18T15:24:27Z
|
CONTRIBUTOR
|
resolved
|
Fixes: https://github.com/kennethreitz/requests/issues/4024
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/504287?v=4",
"events_url": "https://api.github.com/users/floppym/events{/privacy}",
"followers_url": "https://api.github.com/users/floppym/followers",
"following_url": "https://api.github.com/users/floppym/following{/other_user}",
"gists_url": "https://api.github.com/users/floppym/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/floppym",
"id": 504287,
"login": "floppym",
"node_id": "MDQ6VXNlcjUwNDI4Nw==",
"organizations_url": "https://api.github.com/users/floppym/orgs",
"received_events_url": "https://api.github.com/users/floppym/received_events",
"repos_url": "https://api.github.com/users/floppym/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/floppym/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/floppym/subscriptions",
"type": "User",
"url": "https://api.github.com/users/floppym",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4027/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4027/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4027.diff",
"html_url": "https://github.com/psf/requests/pull/4027",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/4027.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4027"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=h1) Report\n> Merging [#4027](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/0280a18b53aef5e09fa0a2886077b0ede5e89e79?src=pr&el=desc) will **decrease** coverage by `0.15%`.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4027 +/- ##\n==========================================\n- Coverage 89.74% 89.58% -0.16% \n==========================================\n Files 15 15 \n Lines 1940 1940 \n==========================================\n- Hits 1741 1738 -3 \n- Misses 199 202 +3\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `83.33% <0%> (-10%)` | :arrow_down: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=footer). Last update [0280a18...f823125](https://codecov.io/gh/kennethreitz/requests/pull/4027?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/4026
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4026/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4026/comments
|
https://api.github.com/repos/psf/requests/issues/4026/events
|
https://github.com/psf/requests/pull/4026
| 229,192,828 |
MDExOlB1bGxSZXF1ZXN0MTIwOTQzNTg0
| 4,026 |
Include tests in sdist tarball
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/504287?v=4",
"events_url": "https://api.github.com/users/floppym/events{/privacy}",
"followers_url": "https://api.github.com/users/floppym/followers",
"following_url": "https://api.github.com/users/floppym/following{/other_user}",
"gists_url": "https://api.github.com/users/floppym/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/floppym",
"id": 504287,
"login": "floppym",
"node_id": "MDQ6VXNlcjUwNDI4Nw==",
"organizations_url": "https://api.github.com/users/floppym/orgs",
"received_events_url": "https://api.github.com/users/floppym/received_events",
"repos_url": "https://api.github.com/users/floppym/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/floppym/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/floppym/subscriptions",
"type": "User",
"url": "https://api.github.com/users/floppym",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2017-05-16T23:02:02Z
|
2021-09-06T00:06:56Z
|
2017-05-18T14:50:40Z
|
CONTRIBUTOR
|
resolved
|
Fixes: https://github.com/kennethreitz/requests/issues/4024
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4026/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4026/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4026.diff",
"html_url": "https://github.com/psf/requests/pull/4026",
"merged_at": "2017-05-18T14:50:40Z",
"patch_url": "https://github.com/psf/requests/pull/4026.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4026"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=h1) Report\n> Merging [#4026](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/0280a18b53aef5e09fa0a2886077b0ede5e89e79?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4026 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=footer). Last update [0280a18...2667c77](https://codecov.io/gh/kennethreitz/requests/pull/4026?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"I'd rather take the recursive-include for now. ",
"Given that this was overlooked for a year, it seems like an easy place to forget about again. Perhaps simply changing it to `recursive-include tests *` which has worked for `urllib3`? You handle the releases though, so whatever you think works best, I'm on board with.\r\n\r\nAs for the failure, it's due to our Sphinx dependency in dev-packages. A release that came out today finally broke 2.6 which Sphinx stopped supporting a while ago. We'll likely want to pin it at `<=1.5.5` until we deprecate 2.6 support ourselves.",
"> Given that this was overlooked for a year, it seems like an easy place to forget about again.\r\n\r\nIt's especially easy given that tests aren't part of the actual package and should only be included in source archives, not source distributions. But *shrug* I'm not about to have this disagreement with another downstream redistributor.\r\n\r\nI'd also like to avoid capping Sphinx to preserve 2.6 support. I'd rather Pipfile joined the time-period it was born into and understood dependencies that were for specific versions of Python or specific environments.",
"> It's especially easy given that tests aren't part of the actual package and should only be included in source archives, not source distributions.\r\n\r\nWhat's the difference between a \"source archive\" and a \"source distribution\"? I'm missing your point.\r\n\r\nMy goal is to be able to run the tests as part of the Gentoo Linux build script for this package. We skip this currently; otherwise I would have noticed the missing files sooner.",
"> What's the difference between a \"source archive\" and a \"source distribution\"? I'm missing your point.\r\n\r\nOne (a source archive) is literally the archive of the repository made by setuptools and uploaded to PyPI. The other (a source distribution) is meant to actually be installed by things Python tooling for an end user. One is meant as historical data, the other as an actual installable product (for lack of a better term). One can be downloaded by downstream redistributors and used to run tests, the other is meant to be downloaded and used by users.\r\n\r\n> My goal is to be able to run the tests as part of the Gentoo Linux build script for this package. We skip this currently; otherwise I would have noticed the missing files sooner.\r\n\r\nAnd that's an admirable goal. No one wants a downstream redistributor to package something incorrectly and only find out about it from users.",
"@sigmavirus24 I'm sympathetic to wanting the improvement from pipfile, but adding that feature will likely take longer than us just pinning sphinx. We can remove the pin when pipfile improves. Does that sound reasonable?",
"+1 to pinning",
"If we want to pin, let's pin. At the moment we're capping. I'm fine with pinning *temporarily*.",
"Closing and re-opening to trigger new builds with our fixed dependencies"
] |
https://api.github.com/repos/psf/requests/issues/4025
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4025/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4025/comments
|
https://api.github.com/repos/psf/requests/issues/4025/events
|
https://github.com/psf/requests/issues/4025
| 229,176,490 |
MDU6SXNzdWUyMjkxNzY0OTA=
| 4,025 |
Session with cert set can't do HTTP connections
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/839339?v=4",
"events_url": "https://api.github.com/users/kaptajnen/events{/privacy}",
"followers_url": "https://api.github.com/users/kaptajnen/followers",
"following_url": "https://api.github.com/users/kaptajnen/following{/other_user}",
"gists_url": "https://api.github.com/users/kaptajnen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kaptajnen",
"id": 839339,
"login": "kaptajnen",
"node_id": "MDQ6VXNlcjgzOTMzOQ==",
"organizations_url": "https://api.github.com/users/kaptajnen/orgs",
"received_events_url": "https://api.github.com/users/kaptajnen/received_events",
"repos_url": "https://api.github.com/users/kaptajnen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kaptajnen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kaptajnen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kaptajnen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-16T21:37:15Z
|
2021-09-08T10:00:46Z
|
2017-05-18T16:05:15Z
|
CONTRIBUTOR
|
resolved
|
It is not clear from the documentation if this is supposed to work, but it did work previously.
If you set the cert attribute on a Session and then make an HTTP request, an AttributeError exception is thrown since `key_file` does not exist on an HTTPConnectionPool object. This behavior was introduced in [this commit](https://github.com/kennethreitz/requests/commit/7d8b87c37f3a5fb993fd83eda6888ac217cd108e#diff-31f6e77c031977d33226530924b4337aR243)
From what I can tell it would continue to work if the certificate was set as a tuple instead, since that would then add the key_file attribute to the HTTPConnectionPool a few lines above.
```
>>> import requests
>>> s = requests.Session()
>>> s.cert='/path/to/file.pem'
>>> s.get('http://example.com')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 531, in get
    return self.request('GET', url, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 518, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/sessions.py", line 639, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 407, in send
    self.cert_verify(conn, request.url, verify, cert)
  File "/usr/local/lib/python2.7/site-packages/requests/adapters.py", line 247, in cert_verify
    if conn.key_file and not os.path.exists(conn.key_file):
AttributeError: 'HTTPConnectionPool' object has no attribute 'key_file'
```
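A minimal sketch of the tuple workaround mentioned above (the paths are placeholders); passing cert as a (cert, key) pair sets key_file on the connection pool, so the plain HTTP request no longer crashes:
```
import requests

s = requests.Session()
s.cert = ('/path/to/cert.pem', '/path/to/key.pem')  # (certificate, private key)
r = s.get('http://example.com')
print(r.status_code)
```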
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4025/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4025/timeline
| null |
completed
| null | null | false |
[
"Yup, this is definitely a bug. Feel free to take a swing at this if you're interested, otherwise I'll try to fix it tomorrow. ",
"I've added a pull request. I'm not familiar with the requests code but I think my simple fix of just not doing anything unless HTTPS is used should work (the tests pass at least)."
] |
https://api.github.com/repos/psf/requests/issues/4024
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4024/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4024/comments
|
https://api.github.com/repos/psf/requests/issues/4024/events
|
https://github.com/psf/requests/issues/4024
| 229,157,926 |
MDU6SXNzdWUyMjkxNTc5MjY=
| 4,024 |
pypi tarballs are missing tests directory
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/504287?v=4",
"events_url": "https://api.github.com/users/floppym/events{/privacy}",
"followers_url": "https://api.github.com/users/floppym/followers",
"following_url": "https://api.github.com/users/floppym/following{/other_user}",
"gists_url": "https://api.github.com/users/floppym/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/floppym",
"id": 504287,
"login": "floppym",
"node_id": "MDQ6VXNlcjUwNDI4Nw==",
"organizations_url": "https://api.github.com/users/floppym/orgs",
"received_events_url": "https://api.github.com/users/floppym/received_events",
"repos_url": "https://api.github.com/users/floppym/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/floppym/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/floppym/subscriptions",
"type": "User",
"url": "https://api.github.com/users/floppym",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-05-16T20:24:06Z
|
2021-09-08T10:00:47Z
|
2017-05-18T14:50:40Z
|
CONTRIBUTOR
|
resolved
|
The tests are missing from the pypi tarballs for at least a few releases.
Oddly, when I run `python setup.py sdist` from a git checkout, the resulting tarball does have the tests; I'm not sure how that is working.
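The fix that came out of the discussion below is a MANIFEST.in change; the line suggested there (mirroring urllib3's approach) is:
```
recursive-include tests *
```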
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4024/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4024/timeline
| null |
completed
| null | null | false |
[
"Which releases are affected?",
"@Lukasa, I believe this is related to moving the tests into their own directory starting in 2.10.\r\n\r\n#4015 probably should have been to change test_requests.py to 'tests' in MANIFEST.in rather than removing the bad filename.",
"I checked the tarballs for the following releases, they are all missing the tests directory.\r\n\r\n2.11.1\r\n2.12.3\r\n2.13.0\r\n2.14.2",
"That sounds right @nateprewitt. Want to fix that up with a PR?",
"Yep, I'll throw the PR up when I get back to a computer, unless you have any interest @floppym.",
"> Oddly, when I run python setup.py sdist from a git checkout, the resulting tarball does have the tests; I'm not sure how that is working.\r\n\r\nThis was working for me because I had the `hgdistver` python package installed. This installs a setuptools plugin that auto-detects any files under version control."
] |
https://api.github.com/repos/psf/requests/issues/4023
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4023/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4023/comments
|
https://api.github.com/repos/psf/requests/issues/4023/events
|
https://github.com/psf/requests/issues/4023
| 229,144,728 |
MDU6SXNzdWUyMjkxNDQ3Mjg=
| 4,023 |
Requests extremely slow compared to urllib.request.urlopen
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1117703?v=4",
"events_url": "https://api.github.com/users/dhimmel/events{/privacy}",
"followers_url": "https://api.github.com/users/dhimmel/followers",
"following_url": "https://api.github.com/users/dhimmel/following{/other_user}",
"gists_url": "https://api.github.com/users/dhimmel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dhimmel",
"id": 1117703,
"login": "dhimmel",
"node_id": "MDQ6VXNlcjExMTc3MDM=",
"organizations_url": "https://api.github.com/users/dhimmel/orgs",
"received_events_url": "https://api.github.com/users/dhimmel/received_events",
"repos_url": "https://api.github.com/users/dhimmel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dhimmel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dhimmel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dhimmel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2017-05-16T19:32:59Z
|
2018-11-04T13:37:11Z
|
2017-05-17T17:20:05Z
|
NONE
|
resolved
|
I'm having poor performance getting the following URL using requests:
```
http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhimmel.com%2Firreproducible-timestamps%2F
```
The following notebook screenshot highlights the issue:

Less than a second using `urllib.request.urlopen` but 15 seconds using `requests.get`. I've also tried `curl` and my web browser, both of which retrieve the response quickly. It looks like the holdup is on `{method 'recv_into' of '_socket.socket' objects}`.
Any help would be appreciated.
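A minimal workaround sketch taken from the discussion below; sending `Connection: close` forces the server to close the connection, so the read does not sit waiting for the keep-alive timeout:
```
import requests

url = "http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhimmel.com%2Firreproducible-timestamps%2F"
response = requests.get(url, headers={"Connection": "close"})
print(response.status_code)
```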
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4023/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4023/timeline
| null |
completed
| null | null | false |
[
"Just to remove influence from importing the modules, could you repeat the tests measuring the time, after the `import` statements?",
"> could you repeat the tests measuring the time, after the import statements?\r\n\r\n@HelioGuilherme66 see below:\r\n\r\n\r\n\r\nTo see if the problem replicates on your system, you can see if the following is slow or fast:\r\n\r\n```python\r\nimport requests\r\nurl = \"http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhimmel.com%2Firreproducible-timestamps%2F\"\r\nresponse = requests.get(url)\r\n```\r\n\r\nNote that I've also confirmed the request is slow in requests version 2.14.2.",
"The remote server is at fault here: its response is not really valid HTTP/1.1. Here are the response headers:\r\n\r\n```\r\nHTTP/1.1 200 OK\r\nDate: Wed, 17 May 2017 17:11:52 GMT\r\nServer: Apache\r\nContent-Type : application/citeproc+json\r\nAccess-Control-Allow-Origin: *\r\nContent-Length: 426\r\nKeep-Alive: timeout=15, max=50\r\nConnection: Keep-Alive\r\nContent-Type: text/html; charset=UTF-8\r\n```\r\n\r\nThe problem is the `Content-Type` header. You are **not allowed** to have a space between the header name value and the colon: the specification forbids it. Python 3 hits a parsing problem on this, and so only sees the headers before that one:\r\n\r\n```python\r\n>>> r.headers\r\n{'Date': 'Wed, 17 May 2017 17:11:52 GMT', 'Server': 'Apache'}\r\n```\r\n\r\nBecause we only see those headers, we don't know what the content-length is, so we have to wait for the TCP FIN to work out when the end of the body is. That's why we're taking so long. I suspect this is related to a bug in the standard library: probably [CPython issue 24363](https://bugs.python.org/issue24363).",
"(For note, the reason urllib.request is so fast is because it doesn't use persistent connections: that is, it sends the header `Connection: close`. This forces the server to close the connection immediately, so that TCP FIN comes quickly. You can reproduce this in Requests by sending that same header.)",
"@Lukasa thanks for diagnosing the problem. I've confirmed that I get the same faulty header of `Content-Type : application/citeproc+json` when I run:\r\n\r\n```sh\r\ncurl --head http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhimmel.com%2Firreproducible-timestamps%2F\r\n```\r\n\r\nThe `Connection: close` workaround was successful as shown below:\r\n\r\n\r\n\r\nTagging @phillord who I believe is an author of [Greycite](https://arxiv.org/abs/1304.7151) and thus may be able to fix the incorrect header.",
"Thanks for the information. I'll forward it to Lindsay Marshall (who is actually the author of greycite) and will see if we can get this fixed.",
"\r\nHopefully fixed at our end now.\r\n\r\n\r\n In [1]: import requests\r\n ...: url = \"http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhim\r\n ...: mel.com%2Firreproducible-timestamps%2F\"\r\n ...: \r\n \r\n In [2]: url\r\n Out[2]: 'http://greycite.knowledgeblog.org/json?uri=http%3A%2F%2Fblog.dhimmel.com%2Firreproducible-timestamps%2F'\r\n \r\n In [3]: %%time\r\n ...: response = requests.get(url)\r\n ...: \r\n CPU times: user 8 ms, sys: 4 ms, total: 12 ms\r\n Wall time: 104 ms\r\n\r\n\r\nCan you confirm? Thanks for the report BTW.",
"@phillord confirming that the `requests.get` without the `Connection: close` header is now speedy. Thanks!\r\n\r\nHave you considered putting the Greycite source on GitHub? It's a really great application, and I'm sure many people would be interested in seeing how it works and contributing enhancements.",
"Hello, \r\n\r\nI have a problem very similar to the previous one exposed, with the following url:\r\n\r\n> https://api.coinmarketcap.com/v1/ticker/\r\n\r\ncurl --head https://api.coinmarketcap.com/v1/ticker/\r\nHTTP/1.1 200 OK\r\nDate: Tue, 20 Mar 2018 03:19:00 GMT\r\nContent-Type: application/json\r\nContent-Length: 53976\r\nConnection: keep-alive\r\nSet-Cookie: __cfduid=dda658faa04cd96eb95607fa899d2bcc41521515940; expires=Wed, 20-Mar-19 03:19:00 GMT; path=/; domain=.coinmarketcap.com; HttpOnly; Secure\r\nAccess-Control-Allow-Origin: *\r\nCF-Cache-Status: HIT\r\nExpect-CT: max-age=604800, report-uri=\"https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct\"\r\nServer: cloudflare\r\nCF-RAY: 3fe508e2ba069b68-DFW\r\n\r\n\r\nThe response is very slow \r\n\r\n\r\nimport requests\r\nAPI_URL = \"https://api.coinmarketcap.com/v1/ticker/\"\r\nr = requests.get(API_URL)\r\n\r\nheaders --->\r\n{'Set-Cookie': '__cfduid=d3a92212eccddc19e51f7d9872801cc5d1521520491; expires=Wed, 20-Mar-19 04:34:51 GMT; path=/; domain=.coinmarketcap.com; HttpOnly; Secure', 'Date': 'Tue, 20 Mar 2018 04:34:51 GMT', 'Access-Control-Allow-Origin': '*', 'Content-Encoding': 'gzip', 'Vary': 'Accept-Encoding', 'Connection': 'keep-alive', 'Expect-CT': 'max-age=604800, report-uri=\"https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct\"', 'Server': 'cloudflare', 'CF-Cache-Status': 'HIT', 'Transfer-Encoding': 'chunked', 'CF-RAY': '3fe57802aeac1fe8-DFW', 'Content-Type': 'application/json'}\r\n\r\n\r\nRegards,\r\nEd"
] |
https://api.github.com/repos/psf/requests/issues/4022
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4022/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4022/comments
|
https://api.github.com/repos/psf/requests/issues/4022/events
|
https://github.com/psf/requests/issues/4022
| 228,820,126 |
MDU6SXNzdWUyMjg4MjAxMjY=
| 4,022 |
Handling of URLs with unicode in python 2.7
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2691631?v=4",
"events_url": "https://api.github.com/users/lee-hodg/events{/privacy}",
"followers_url": "https://api.github.com/users/lee-hodg/followers",
"following_url": "https://api.github.com/users/lee-hodg/following{/other_user}",
"gists_url": "https://api.github.com/users/lee-hodg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lee-hodg",
"id": 2691631,
"login": "lee-hodg",
"node_id": "MDQ6VXNlcjI2OTE2MzE=",
"organizations_url": "https://api.github.com/users/lee-hodg/orgs",
"received_events_url": "https://api.github.com/users/lee-hodg/received_events",
"repos_url": "https://api.github.com/users/lee-hodg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lee-hodg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lee-hodg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lee-hodg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2017-05-15T19:33:39Z
|
2021-09-08T10:00:49Z
|
2017-05-16T23:44:03Z
|
NONE
|
resolved
|
Consider the URL: `https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl`. This URL contains the unicode character `è`. If you open it in most major browsers there is no problem; similarly, cURLing it with `curl -L 'https://www.sainsburys.co.uk/shop/gb/groceries/chabllis-premi%C3%A8r-cru-brocard-75cl'` works without issue (i.e. we request the percent-encoded, UTF-8-encoded URL).
However if we try to get this url with python requests as follows:
```
unicode_url = u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'
r = requests.get(unicode_url)
```
then it will 404. At this point I considered that maybe I just need to UTF8 encode the url and pass the byte str to requests:
```
r = requests.get(unicode_url.encode('utf8'))
```
Nope, 404 again. Finally I considered also manually doing the percent encoding (although I'm sure requests deals with this):
```
r = requests.get(urllib.quote(unicode_url.encode('utf8')))
```
Nope, it really doesn't like that.
I also tried `requests.get(unicode_url.encode('latin1'))` but now got a UnicodeDecodeError (presumably because requests is attempting to decode with utf8).
So how on earth can I request this URL in python-requests in python 2.7?
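For what it's worth, a sketch of the behaviour observed later in the thread: the first, cookie-less request trips over the server's latin-1 encoded Location header and 404s, while a second request on the same Session skips the redirect and succeeds:
```
# -*- coding: utf-8 -*-
import requests

session = requests.Session()
url = u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'
first = session.get(url)    # may 404 while the session picks up cookies
second = session.get(url)   # typically 200 once cookies sidestep the redirect
print(first.status_code, second.status_code)
```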
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4022/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4022/timeline
| null |
completed
| null | null | false |
[
"Just adding some local testing to see what the web server is seeing. With a local HTTP server running and then running this:\r\n```python\r\nimport requests\r\nrequests.get(u'http://127.0.0.1:8000/è')\r\n```\r\nThe web server receives this as the HTTP header:\r\n`127.0.0.1 - - [15/May/2017 14:52:35] \"GET /%C3%A8 HTTP/1.1\" 404 -`\r\n\r\nSimilarly running this:\r\n```python\r\nimport requests\r\nrequests.get(u'http://127.0.0.1:8000/%C3%A8')\r\n```\r\nnets the same result on the web server:\r\n`127.0.0.1 - - [15/May/2017 14:52:45] \"GET /%C3%A8 HTTP/1.1\" 404 -`\r\n\r\nRunning this:\r\n```python\r\nimport requests\r\nrequests.get(u'http://127.0.0.1:8000/è'.encode('utf-8'))\r\n```\r\nalso nets the same result on the web server.\r\n`127.0.0.1 - - [15/May/2017 14:55:18] \"GET /%C3%A8 HTTP/1.1\" 404 -`",
"To add to those local tests, I also tried fetching with Scrapy in the Scrapy shell:\r\n\r\n fetch(Request(u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'.encode('utf8')))\r\n\r\nThis is successful with 200 status.\r\n\r\nTesting against a local server with Scrapy: \r\n\r\n fetch(Request(u'http://127.0.0.1:8000/è'.encode('utf8'), dont_filter=True))\r\n\r\ngives `127.0.0.1 - - [15/May/2017 20:03:47] \"GET /%C3%A8 HTTP/1.1\" 404 -` \r\n\r\nso I'm slightly puzzled at how the local log looks the same for Scrapy and python-requests for the u`è` char, ye, the Scrapy request is successful (I tried setting the User-Agent in python requests too, just in case, but to no avail).",
"Whats `response.request.url` in the failing case?",
"@Lukasa Oh that's interesting, the `response.request.url` is the monsterous \r\n\r\n`'https://www.sainsburys.co.uk/webapp/wcs/stores/servlet/gb/groceries/chablis/chablis-premi%E8r-cru-brocard-75cl?langId=44&storeId=10151&krypto=avT10G07EA5Ef5vUiHvBmeJymCkV1iJKI%2BICpLkmAAfHIZJZ6oAoS5YpWh0JsNL4Y5DVN%2F%2Bx%2BLg23H1HqHhekHiyPEml4aL4FBurJg9kEqyOjrk%2BymN9sOje5plJkgTf9f8w05kssDkEByvy5LHAVUPfnEURk0xqW7M2Za9MJWY%3D&ddkey=https%3Agb%2Fgroceries%2Fchablis%2Fchablis-premi%C3%A8r-cru-brocard-75cl'`\r\n\r\nin the failing case.",
"If I turn off redirects I get something more sensible.\r\n```python\r\nprint(requests.get(u'http://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl', allow_redirects=False).request.url)\r\n```\r\nprints out `'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premi%C3%A8r-cru-brocard-75cl'`",
"Yeah, so this is unlikely to be directly related to the URL and much more likely to be about the rest of the Request. I wonder if one of the request headers is causing problems. ",
"Here are the headers for the HTTP redirect response, note the `LOCATION` field having `LATIN-1 \\xe8` in place of our `UTF-8 è`.\r\n\r\n```\r\nCONTENT-LENGTH=0\r\nCONTENT-TYPE=text/html;charset=UTF-8\r\nCONTENT-LANGUAGE=en-GB\r\nSET-COOKIE=Apache=10.173.10.13.1494879611612576; path=/, JSESSIONID=0000VFtqeHv_MVwpT3yTbZ_ESeK:17buic8rt; HTTPOnly; Path=/, WC_PERSISTENT=F1JXlopv9%2Fvb%2BgnuCpQnfmVuEsE%3D%0A%3B2017-05-15+21%3A20%3A11.615_1494879611615-630867_0; HTTPOnly; Expires=Wed, 14-Jun-17 20:20:10 GMT; Path=/, SESSION_COOKIEACCEPT=true; HTTPOnly; Path=/, WC_SESSION_ESTABLISHED=true; HTTPOnly; Path=/, WC_PERSISTENT=kbGcXW5r9J0l1UOqxJS3jGdEr8U%3D%0A%3B2017-05-15+21%3A20%3A11.624_1494879611615-630867_10151; HTTPOnly; Expires=Wed, 14-Jun-17 20:20:10 GMT; Path=/, WC_AUTHENTICATION_-1002=-1002%2CJLwz3JSOi44I7duGuk3ubPJibnI%3D; HTTPOnly; Path=/; Secure, WC_ACTIVEPOINTER=44%2C10151; HTTPOnly; Path=/, WC_USERACTIVITY_-1002=-1002%2C10151%2Cnull%2Cnull%2Cnull%2Cnull%2Cnull%2Cnull%2Cnull%2Cnull%2CP4jUxIHNtjybZiX1clIkTMr4hqkqtEkvAIxD%2FjBtgOmmRSV2%2BQ8pSlCkdE16%2FPpdKxlgI5DnXS478Bd6xrPa4VXYvNhRPMujgCRlnF%2B%2BOFLZWIxk9XQATZ0yADjEJ8qATKm0GlxV3il0UPwL8JPngI9OCC03gHm16Asva4TFbGfceuE4G0WqHEODeNQhoTgflGUcYt0Xmzvoz4bRwaOL7w%3D%3D; HTTPOnly; Path=/, WC_GENERIC_ACTIVITYDATA=[18456303419%3Atrue%3Afalse%3A0%3Azx1HjzLxjqoYBsHke%2Bd7Ug4RlL0%3D][com.ibm.commerce.context.audit.AuditContext|1494879611615-630867][com.ibm.commerce.store.facade.server.context.StoreGeoCodeContext|null%26null%26null%26null%26null%26null][CTXSETNAME|Store][com.sol.commerce.context.SolBusinessContext|null%26null%26null%26null%26null%26null%26null%26null%26false%26false%26false%26null%26false%26null%26false%26false%26null%26false%26false%26false%26null%26false%26false][com.ibm.commerce.context.globalization.GlobalizationContext|44%26GBP%2644%26GBP][com.ibm.commerce.catalog.businesscontext.CatalogContext|10241%26null%26false%26false%26false][com.ibm.commerce.context.preview.PreviewContext|null%26null%26false][com.ibm.commerce.context.base.BaseContext|10151%26-1002%26-1002%26-1][com.ibm.commerce.context.experiment.ExperimentContext|null][com.ibm.commerce.context.entitlement.EntitlementContext|10502%2610502%26null%26-2000%26null%26null%26null]; HTTPOnly; Path=/, sbrycookie1=630263751;path=/;, sbrycookie2=FdrHbjOczkHbHbkHbHbjOc;path=/;domain=sainsburys.co.uk;, TS017d4e39=01a4f6ca7c6475495093834d7fd522443e41ba56ffd1d2b30a200be8149f886bad6c420c09120ee1903630ce23e061783b2d82ec56032188b7e79057cc61fc425a3ba8e099db7903081dfb4b084a60dde7e751a8c857388621c15d6db47574064f7f5b835cfa9ca301c69113251139456ef238a2fddc83350fdd1bae711a372b15aac6cf10d2f4af97ae38a37c42ca8bbe6d08b9a8201f4b85c31ede51f9a83cb56b17287ff5628ab383d33ee9d3d34e673aecd4c15be6b32839f1a5e76e6d04c84bc43cb4d7ba9114f0698080e2e9611f8f75dda6; Path=/, TS01cd64fc=01a4f6ca7c59e5c22be507a4b33c6594f8edbdc5d729b3fac53ed692e9fe894f1042e3fc38857b9063f35a2b81a9e7253e8aaf8ff8; path=/; domain=sainsburys.co.uk\r\nEXPIRES=Thu, 01 Jan 1970 00:00:00 GMT\r\nKEEP-ALIVE=timeout=15, max=4955\r\nCONNECTION=Keep-Alive\r\nLOCATION=https://www.sainsburys.co.uk/webapp/wcs/stores/servlet/gb/groceries/chablis/chablis-premi�r-cru-brocard-75cl?langId=44&storeId=10151&krypto=EhgWKRYw8NiBb2J%2B2NCnNPKz3cPG2ogpLvUgcZ7dD%2F71VThR7j0Sa%2BaCyf%2B5aGWf%2BXEqhAjOjcJr3xWVANaQnlPKA5uCK%2BnI36mOKAmflrzCj8mTkF1pcc85xQVnUMSMZMhnCBDRNRB0th%2B%2FJb6u4gBcjXp5FEpO5fNyWYz0TCI%3D&ddkey=https%3Agb%2Fgroceries%2Fchablis%2Fchablis-premi%C3%A8r-cru-brocard-75cl\r\nPRAGMA=No-cache\r\nCACHE-CONTROL=no-cache,no-store,max-age=0\r\nDATE=Mon, 15 May 
2017 20:20:11 GMT\r\nX-FRAME-OPTIONS=SAMEORIGIN, SAMEORIGIN\r\nTTFB=D=23924\r\nREQ_TIMESTAMP=t=1494879611612501\r\n```",
"I observed something odd:\r\n\r\n```\r\nIn [1]: import requests\r\n\r\nIn [2]: s = requests.Session()\r\n\r\nIn [3]: unicode_url = u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'\r\n\r\nIn [4]: s.get(unicode_url)\r\nOut[4]: <Response [404]>\r\n\r\nIn [5]: s.get(unicode_url)\r\nOut[5]: <Response [200]>\r\n```\r\n\r\nSeems to work in a session the second time around! \r\n\r\nTempting to say this is actually about cookies. The first request is without cookies and the webserver 404s because of that whilst still setting some session cookies. The next request sends the cookies and the webserver 200s.\r\n\r\n**BUT**, notice that the second request now does not need a redirect; you could replace the second request with `s.get(unicode_url, allow_redirects=False)` and still get a 200, not a 302. Hence the only reason it works now is that redirection is being sidestepped with cookies. This indicates the encoding issues are taking place somewhere in the redirect chain.\r\n\r\n**There is one weird thing too:**\r\n\r\n In [11]: requests.get(u'http://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl')\r\n Out[11]: <Response [200]>\r\n\r\ndespite no cookies/sessions. I am struggling to understand that. The location header of the redirect in this case looks like:\r\n\r\n 'Location': 'http://www.sainsburys.co.uk/webapp/wcs/stores/servlet/gb/groceries/chablis/chablis-premi\\xc3\\xa8r-cru-brocard-75cl?langId=44&storeId=10151&krypto=dZB7Mt97QsHQQ%2BGMpb1iMZwdVfmbg%2BbRUdkh%2FciAItm7%2F4VSUi8NRUiszN3mSofKSCyAv%2F0QRKSsjhHzoo1x7in7Ctd4vzPIDIW5CcjiksLKE48%2BFU9nLNGkVzGj92PknAgP%2FmIFz63xpKhvPkxbJrtUmwi%2FUpbXNW9XIygHyTA%3D&ddkey=http%3Agb%2Fgroceries%2Fchablis%2Fchablis-premi%C3%83%C2%A8r-cru-brocard-75cl'\r\n\r\ni.e. we have the utf8 encoded u'è' instead of latin1 encoded.\r\n\r\nOur `response.request.url` in this case was:\r\n\r\n 'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premi%C3%83%C2%A8r-cru-brocard-75cl'\r\n\r\nfrom which the original unicode url can be recovered with\r\n\r\n In [46]: print urllib.unquote(response.request.url).decode('utf8').encode('latin1').decode('utf8')\r\n https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl\r\n\r\nSo webserver is using latin1 it seems....\r\n\r\n",
"I observed this behaviour even if a clean session of Chrome, 404 on the first attempt at the URL and 200 on the second.",
"**Summary**\r\n\r\nGiven that the first request (the request that actually has no cookies and relies on redirects) fails in every platform I have tried - Chrome, Scrapy, python-requests - I would put this down to a bug on the host server itself. It latin1 encodes it's location header in the redirect, but wants utf8 encoded URLs and 404s when the browser actually requests that redirect location URL, because the server was actually expecting a utf8 encoded URL. It should really be utf8 encoding the location header of it's redirect response in order to be consistent with the URL encoding it is using elsewhere.\r\n\r\nThis is why when you cheat and use `u'http://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'` you actually end up with the correct utf8 encoded location header in the redirect, because `u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'.encode('latin1')` is `'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premi\\xc3\\xa8r-cru-brocard-75cl'`, which happens to be the correct utf8 encoded byte str of `u'https://www.sainsburys.co.uk/shop/gb/groceries/chablis/chablis-premièr-cru-brocard-75cl'`, so when the browser redirects it works.\r\n\r\nIf you have cookies set from visiting the URL already or perhaps somewhere else on the website, you sidestep the need for a redirect, and avoid the broken redirect process altogether.\r\n\r\nAll this is absolutely true in a recent version of Chrome and Firefox too.\r\n",
"Issue is avoided in python3 because location header is str not bytes however.",
"I think this explains a lot:\r\n\r\nhttps://github.com/kennethreitz/requests/blob/eae38b8d131e8b51c3daf3583e69879d1c02f9a4/requests/sessions.py#L101-L114",
"Ok, so there isn't really much Requests can do about this I think: the fact that the server is encoding the URL in an encoding that it doesn't itself accept is not really very sensible. We avoid this problem by sheer good luck on Python 3, but I don't think there is any sensible approach to fixing this generally. I think this might just be Sainsbury's crappy web server being crappy again.",
"@Lukasa I would agree with that. \r\n\r\n>> We avoid this problem by sheer good luck on Python 3\r\n\r\nCould you point me to the relevant bit of the code that does this please? I'm just curious. I've been trying to work out how in the python3 version the byte str received from the location headers is correctly decoded in latin1 to the correct str, and then re-encoded to utf8 byte before being sent.",
"The decode is actually done inside the standard library in `http.client`.",
"Thanks, well feel free to close this if you like."
] |
https://api.github.com/repos/psf/requests/issues/4021
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4021/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4021/comments
|
https://api.github.com/repos/psf/requests/issues/4021/events
|
https://github.com/psf/requests/issues/4021
| 228,673,701 |
MDU6SXNzdWUyMjg2NzM3MDE=
| 4,021 |
Requests issue with PyInstaller: Import error: No module named ordered_dict
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28704581?v=4",
"events_url": "https://api.github.com/users/pablocoo/events{/privacy}",
"followers_url": "https://api.github.com/users/pablocoo/followers",
"following_url": "https://api.github.com/users/pablocoo/following{/other_user}",
"gists_url": "https://api.github.com/users/pablocoo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pablocoo",
"id": 28704581,
"login": "pablocoo",
"node_id": "MDQ6VXNlcjI4NzA0NTgx",
"organizations_url": "https://api.github.com/users/pablocoo/orgs",
"received_events_url": "https://api.github.com/users/pablocoo/received_events",
"repos_url": "https://api.github.com/users/pablocoo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pablocoo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pablocoo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pablocoo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-15T10:38:53Z
|
2021-09-08T10:00:51Z
|
2017-05-15T11:49:58Z
|
NONE
|
resolved
|
I'm trying to create an executable for my script, which uses the `requests` module.
When running I get error:
> Import error: No module named ordered_dict
After further inspection, the problem occurs in the requests/compat.py file, on:
`from .packages.urllib3.packages.ordered_dict import OrderedDict`
I think the problem is that the module is imported from a subdirectory, but in my environment requests/packages does not contain the other modules, as they are imported externally.
requests/packages/__init__.py contains a workaround that allows the required libraries to be imported correctly:
> '''
Debian and other distributions "unbundle" requests' vendored dependencies, and
rewrite all imports to use the global versions of ``urllib3`` and ``chardet``.
The problem with this is that not only requests itself imports those
dependencies, but third-party code outside of the distros' control too.
In reaction to these problems, the distro maintainers replaced
``requests.packages`` with a magical "stub module" that imports the correct
modules. The implementations were varying in quality and all had severe
problems. For example, a symlink (or hardlink) that links the correct modules
into place introduces problems regarding object identity, since you now have
two modules in `sys.modules` with the same API, but different identities::
`requests.packages.urllib3 is not urllib3`
With version ``2.5.2``, requests started to maintain its own stub, so that
distro-specific breakage would be reduced to a minimum, even though the whole
issue is not requests' fault in the first place. See
https://github.com/kennethreitz/requests/pull/2375 for the corresponding pull
request.'''
```
from __future__ import absolute_import
import sys
try:
from . import urllib3
except ImportError:
import urllib3
sys.modules['%s.urllib3' % __name__] = urllib3
try:
from . import chardet
except ImportError:
import chardet
sys.modules['%s.chardet' % __name__] = chardet
```
But PyInstaller doesn't seem to recognize it. Is there an option/hook to use this configuration?
I'm using:
Ubuntu Mate, 16.04.2 LTS
requests - 2.9.1 - installed through pip
Pyinstaller - 3.2.1 - compiled from source
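As the maintainers point out below, this is a PyInstaller packaging problem; purely as an illustration (an assumption, not the fix from this thread), a custom PyInstaller hook can force the vendored subpackages to be collected:
```
# hook-requests.py (hypothetical hook file placed on PyInstaller's hook path)
from PyInstaller.utils.hooks import collect_submodules

# Make sure requests' vendored packages (urllib3, chardet, ...) are bundled.
hiddenimports = collect_submodules('requests.packages')
```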
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4021/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4021/timeline
| null |
completed
| null | null | false |
[
"This is a PyInstaller problem, not a Requests one. When you install Requests from pip, it comes with the packages subdirectory populated. If it's not being populated when you use PyInstaller, then PyInstaller has lost track of what code it needs. You'll need to take it up with the maintainers of PyInstaller."
] |
https://api.github.com/repos/psf/requests/issues/4020
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4020/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4020/comments
|
https://api.github.com/repos/psf/requests/issues/4020/events
|
https://github.com/psf/requests/issues/4020
| 228,543,096 |
MDU6SXNzdWUyMjg1NDMwOTY=
| 4,020 |
GAE - Error during the import _winreg
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11619400?v=4",
"events_url": "https://api.github.com/users/avilla-br/events{/privacy}",
"followers_url": "https://api.github.com/users/avilla-br/followers",
"following_url": "https://api.github.com/users/avilla-br/following{/other_user}",
"gists_url": "https://api.github.com/users/avilla-br/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/avilla-br",
"id": 11619400,
"login": "avilla-br",
"node_id": "MDQ6VXNlcjExNjE5NDAw",
"organizations_url": "https://api.github.com/users/avilla-br/orgs",
"received_events_url": "https://api.github.com/users/avilla-br/received_events",
"repos_url": "https://api.github.com/users/avilla-br/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/avilla-br/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/avilla-br/subscriptions",
"type": "User",
"url": "https://api.github.com/users/avilla-br",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| true | null |
[] | null | 13 |
2017-05-14T13:49:36Z
|
2021-09-08T07:00:35Z
|
2017-07-29T23:01:08Z
|
NONE
|
resolved
|
Hi,
I use GAE - Google App Engine - and my project raises an error during 'import requests' when it is run under the GAE sandbox (locally).
GAE doesn't have _winreg in sys.modules, and _winreg is pulled in by platform.py: the system() function -> uname() -> win32_ver() -> import _winreg.
'platform' is now used in requests/utils.py after the commit '**proxy bypass on Windows without DNS lookups**' by @schlamar.
I don't know why GAE doesn't import _winreg. It seems GAE was developed on a Unix-like platform.

If I copy an old requests version (one that doesn't import platform) into my project, it works fine.
I'm not sure if this is a problem with the Requests project or Google.
Regards,
André
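As the maintainers note in the comments, the failure comes from dropping the guarded import; an illustrative guard (not the actual requests patch) looks like this:
```
try:
    import _winreg as winreg  # only available on Windows CPython builds
except ImportError:
    winreg = None  # non-Windows platforms or restricted sandboxes such as GAE
```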
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4020/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4020/timeline
| null |
completed
| null | null | false |
[
"It's a problem with Requests: we removed some error handling code there as we didn't think there was any situation where that module would not be present: apparently we were wrong. 😟\r\n\r\n@schlamar; want to fix this up?",
"Remark: I use Windows 10 and Python 2.7, because it is a GAE limitation. I want to import google.auth.transport.requests, consequently it uses requests library. \r\n\r\nThanks.",
"I don't entirely agree that this is a Requests bug. We can and *should* fix it there, but I think the bug is actually in the standard library (it should handle an ImportError like this). That stacktrace, however, is very odd. Why would `platform.py` be going through `C:\\Program Files (x86)\\Google\\google_appengine\\google\\appengine\\tools\\devappserver2\\python\\sandbox.py`. It seems like this might be a bug in the GAE sandbox utility. It seems to have an import hook and it seems to be broken. :/",
"https://issuetracker.google.com/issues/35895346 it seems there's a whitelist of modules in the devappserver2 source code that may need to be updated to allow the use of `_winreg`.",
"cc @jonparrott",
"I'll definitely look into this. Quick question- does this work when deployed? Does it only fail locally?",
"It works fine when it is deployed. Only at the local level (GAE sandbox) that the error occurs.",
"Cool, I'll report this to the SDK team. It should be a relatively easy fix,\nbut fair warning that the release cycle is generally at least 2 weeks. I'll\nkeep this bug updated.\n\nOn Sun, May 14, 2017, 10:38 AM André Avilla <[email protected]>\nwrote:\n\n> It works fine when it is deployed.\n>\n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/kennethreitz/requests/issues/4020#issuecomment-301327649>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/AAPUc7smNQhA8Aq_MWd0gpWP19tJlYdJks5r5zv3gaJpZM4NaYgf>\n> .\n>\n",
"Google-side public issue: https://issuetracker.google.com/issues/38290292",
"This was fixed 3 weeks ago. Based on @jonparrott's previous comments, this should be released. I believe this is fixed now so I'm going to close this. If I'm mistaken, please let me know and we can reopen this.",
"I see this exact same case in my local GAE sandbox on Windows 10 machine using Python 2.7\r\nIs the fix available in master already? Thanks!",
"It should be *released*, not just in master, but the fix is upstream of us in the GAE sandbox.",
"> It should be released, not just in master, but the fix is upstream of us in the GAE sandbox.\r\n\r\nCorrect, and if it's still present, it should be reported there as well."
] |
https://api.github.com/repos/psf/requests/issues/4019
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4019/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4019/comments
|
https://api.github.com/repos/psf/requests/issues/4019/events
|
https://github.com/psf/requests/issues/4019
| 228,407,219 |
MDU6SXNzdWUyMjg0MDcyMTk=
| 4,019 |
OpenSSL issue when vendoring requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14895024?v=4",
"events_url": "https://api.github.com/users/jonathan-ku-reflektion/events{/privacy}",
"followers_url": "https://api.github.com/users/jonathan-ku-reflektion/followers",
"following_url": "https://api.github.com/users/jonathan-ku-reflektion/following{/other_user}",
"gists_url": "https://api.github.com/users/jonathan-ku-reflektion/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jonathan-ku-reflektion",
"id": 14895024,
"login": "jonathan-ku-reflektion",
"node_id": "MDQ6VXNlcjE0ODk1MDI0",
"organizations_url": "https://api.github.com/users/jonathan-ku-reflektion/orgs",
"received_events_url": "https://api.github.com/users/jonathan-ku-reflektion/received_events",
"repos_url": "https://api.github.com/users/jonathan-ku-reflektion/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jonathan-ku-reflektion/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jonathan-ku-reflektion/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jonathan-ku-reflektion",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-12T21:01:22Z
|
2021-09-08T10:00:50Z
|
2017-05-13T14:14:28Z
|
NONE
|
resolved
|
I was trying to vendor requests in my own library, using a vendoring structure similar to requests' own, with requests placed under ceslib/packages.
However, when I try to send an HTTPS request, I see the following error.
It seems like OpenSSL needs to be vendored by requests too?
Code Snippet:
```python
import packages.requests as requests
self._session = requests.Session()
req = requests.Request('GET', 'https://www.google.com')
req = self._session.prepare_request(req)
response = self._session.send(req)
```
Error:
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/connectionpool.py", line 600, in urlopen
chunked=chunked)
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request
self._validate_conn(conn)
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/connectionpool.py", line 844, in _validate_conn
conn.connect()
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/connection.py", line 326, in connect
ssl_context=context)
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/util/ssl_.py", line 308, in ssl_wrap_socket
context.load_verify_locations(ca_certs, ca_cert_dir)
File "/usr/lib/python2.7/site-packages/ceslib/packages/requests/packages/urllib3/contrib/pyopenssl.py", line 411, in load_verify_locations
self._ctx.load_verify_locations(cafile, capath)
File "/usr/lib/python2.7/site-packages/OpenSSL/SSL.py", line 525, in load_verify_locations
_raise_current_error()
File "/usr/lib/python2.7/site-packages/OpenSSL/_util.py", line 48, in exception_from_error_queue
raise exception_type(errors)
OpenSSL.SSL.Error: [('system library', 'fopen', 'No such file or directory'), ('BIO routines', 'BIO_new_file', 'no such file'), ('x509 certificate routines', 'X509_load_cert_crl_file', 'system lib')]
```
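A minimal sketch of how the missing-bundle diagnosis from the comments can be checked and worked around; the bundle path below is only illustrative, not taken from the report:
```python
# Sketch only: check where the vendored copy thinks its CA bundle lives, and
# point a request at an explicit bundle if that file is missing. The verify
# path is a placeholder for wherever cacert.pem actually ends up after vendoring.
import packages.requests as requests

print(requests.certs.where())  # should name a cacert.pem that really exists on disk

session = requests.Session()
response = session.get(
    'https://www.google.com',
    verify='/usr/lib/python2.7/site-packages/ceslib/packages/requests/cacert.pem',
)
print(response.status_code)
```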
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4019/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4019/timeline
| null |
completed
| null | null | false |
[
"Nope, you aren't missing OpenSSL, you're missing the certs bundle `cacerts.pem` in the requests code directory. Looks like you misplaced it in your vendoring.",
"Thank you. Was able to solve this problem by updating the vendored version from 2.13.0 to 2.14.2"
] |
https://api.github.com/repos/psf/requests/issues/4018
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4018/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4018/comments
|
https://api.github.com/repos/psf/requests/issues/4018/events
|
https://github.com/psf/requests/issues/4018
| 228,350,588 |
MDU6SXNzdWUyMjgzNTA1ODg=
| 4,018 |
requests.exceptions.SSLError.__str__ returns non-string type
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10983875?v=4",
"events_url": "https://api.github.com/users/dev-dull/events{/privacy}",
"followers_url": "https://api.github.com/users/dev-dull/followers",
"following_url": "https://api.github.com/users/dev-dull/following{/other_user}",
"gists_url": "https://api.github.com/users/dev-dull/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dev-dull",
"id": 10983875,
"login": "dev-dull",
"node_id": "MDQ6VXNlcjEwOTgzODc1",
"organizations_url": "https://api.github.com/users/dev-dull/orgs",
"received_events_url": "https://api.github.com/users/dev-dull/received_events",
"repos_url": "https://api.github.com/users/dev-dull/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dev-dull/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dev-dull/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dev-dull",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-12T16:58:01Z
|
2021-09-08T10:00:52Z
|
2017-05-12T17:02:10Z
|
NONE
|
resolved
|
requests 2.4.3
python 2.7.9
os: Debian Jessie
I looked through the release notes from 2.4.3 to present and didn't see this one as fixed.
Issue:
requests.exceptions.SSLError is raised, but when the error is printed or cast to string, another error is thrown stating that __str__ returned a non-string.
Just to restate: the non-string value is the issue here. I'm not yet sure how to reproduce it, as we were doing some wacky things. If I'm successful in reproducing this, I'll follow up with steps.
I was using python/requests to troubleshoot a 'connection reset by peer' error at work and we made the following manipulations:
- changed `/etc/hosts` to point a domain at a test IP.
- added a 'Host' header that referred back to that domain.
- hit an HTTPS/SSL URL on that domain
-- because of previous changes, the cert and domain likely mismatch
-- This URL does a 302 redirect (it's a redirect service)
- requests.exceptions.SSLError is raised and caught as expected
- printing the error and casting the error as `str()` raises an error about __str__ returning a non-string.
- iirc, this issue still existed when I set `verify=False`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4018/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4018/timeline
| null |
completed
| null | null | false |
[
"Before reporting this issue to us, can you please report to the Debian maintainers? We are no longer providing support for 2.4.3, so unless it can be reproduced on the current release we're not really able to do much for you. However, the Debian maintainers from whom you installed the package *are* supporting that version of the software, so contacting them is more likely to be useful for you.\r\n\r\nI should note, per the [Debian bug reporting notes](https://www.debian.org/Bugs/Reporting), that you should refrain from reporting bugs to the upstream maintainers:\r\n\r\n> **Don't file bugs upstream**\r\n>\r\n> If you file a bug in Debian, don't send a copy to the upstream software maintainers yourself, as it is possible that the bug exists only in Debian. If necessary, the maintainer of the package will forward the bug upstream.\r\n\r\nThanks for reaching out!"
] |
https://api.github.com/repos/psf/requests/issues/4017
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4017/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4017/comments
|
https://api.github.com/repos/psf/requests/issues/4017/events
|
https://github.com/psf/requests/issues/4017
| 227,782,610 |
MDU6SXNzdWUyMjc3ODI2MTA=
| 4,017 |
Cookies not passed to redirect
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7799980?v=4",
"events_url": "https://api.github.com/users/allComputableThings/events{/privacy}",
"followers_url": "https://api.github.com/users/allComputableThings/followers",
"following_url": "https://api.github.com/users/allComputableThings/following{/other_user}",
"gists_url": "https://api.github.com/users/allComputableThings/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/allComputableThings",
"id": 7799980,
"login": "allComputableThings",
"node_id": "MDQ6VXNlcjc3OTk5ODA=",
"organizations_url": "https://api.github.com/users/allComputableThings/orgs",
"received_events_url": "https://api.github.com/users/allComputableThings/received_events",
"repos_url": "https://api.github.com/users/allComputableThings/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/allComputableThings/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/allComputableThings/subscriptions",
"type": "User",
"url": "https://api.github.com/users/allComputableThings",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-10T19:17:30Z
|
2021-09-08T10:00:53Z
|
2017-05-10T20:29:27Z
|
NONE
|
resolved
|
If I:
`requests.post(LOGIN_URL, data=DATA, headers=HEADERS)`
and the initial response is a redirect with new cookies in the response, it seems the cookies are passed on to the redirected request and the call above may receive no cookies back. :-(
If I do
`requests.post(LOGIN_URL, data=DATA, headers=HEADERS, allow_redirects=False)`
then I do get back cookies, but then I have to extract the cookies and add them to a new request for the redirected URL.
If the redirect is to the same domain, it would be helpful to have the returned cookies passed to the second request where `allow_redirects=True`, emulating a browser's treatment of cookies.
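A minimal sketch of the Session-based approach (also suggested later in the comments); the URL, form data and headers are placeholders:
```python
import requests

LOGIN_URL = 'https://example.com/login'            # placeholder
DATA = {'username': 'user', 'password': 'secret'}  # placeholder
HEADERS = {'User-Agent': 'example-client'}         # placeholder

with requests.Session() as session:
    response = session.post(LOGIN_URL, data=DATA, headers=HEADERS)  # redirects followed
    # Cookies set anywhere in the redirect chain (including the 302 login
    # response) persist on the session, even if response.cookies is empty.
    print(session.cookies.get_dict())
```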
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7799980?v=4",
"events_url": "https://api.github.com/users/allComputableThings/events{/privacy}",
"followers_url": "https://api.github.com/users/allComputableThings/followers",
"following_url": "https://api.github.com/users/allComputableThings/following{/other_user}",
"gists_url": "https://api.github.com/users/allComputableThings/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/allComputableThings",
"id": 7799980,
"login": "allComputableThings",
"node_id": "MDQ6VXNlcjc3OTk5ODA=",
"organizations_url": "https://api.github.com/users/allComputableThings/orgs",
"received_events_url": "https://api.github.com/users/allComputableThings/received_events",
"repos_url": "https://api.github.com/users/allComputableThings/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/allComputableThings/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/allComputableThings/subscriptions",
"type": "User",
"url": "https://api.github.com/users/allComputableThings",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4017/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4017/timeline
| null |
completed
| null | null | false |
[
"Sorry, I need some more information. Requests absolutely *does* persist cookies like a browser does. Where are your cookies coming from? Are you setting them in the headers, or is the server setting them?",
"This is the behavior for 2.14.2\r\n\r\nWithout redirects:\r\n```\r\n r = requests.post(LOGIN_URL, data=DATA, headers=HEADERS, allow_redirects=False)\r\n r.cookies # is not empty\r\n```\r\n\r\nWith redirects:\r\n```\r\n r = requests.post(LOGIN_URL, data=DATA, headers=HEADERS, allow_redirects=False)\r\n r.cookies # is empty\r\n```\r\n\r\nHEADERS doesn't have any cookies (or any fields that are reserved by http).\r\nI didn't supply the cookies, they are returned from the request.\r\n\r\nI think the sequence of events is as follows:\r\n\r\n- Request: post LOGIN_URL (with form data, and boring headers, no cookies)\r\n Response: \r\n has a set-cookie\r\n Is a 302 with location='/'\r\n \r\n- Request: get with path='/' (and same domain), supply cookie\r\n Reponse: 200, but no set-cookie is returned from this final response.\r\n\r\nOnly the final response requests.post is returned -- this doesn't have any cookies from the login response as I'd hoped.\r\n\r\nLooking at the implementation of requests.post I see I can get what I want with:\r\n```\r\nwith sessions.Session() as session:\r\n r = session.request(url=LOGIN_URL,\r\n method=\"post\",\r\n data=LOGIN_FORMDATA,\r\n headers=LOGIN_HEADERS)\r\n return r, session.cookies\r\n```\r\n\r\nI'd would be nice to be able to collect the final response and the collected session cookies in a single command. However, thinking about it, it probably wouldn't help the request API -- is probably better to use the session object to do what I want.\r\n"
] |
https://api.github.com/repos/psf/requests/issues/4016
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4016/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4016/comments
|
https://api.github.com/repos/psf/requests/issues/4016/events
|
https://github.com/psf/requests/issues/4016
| 227,760,094 |
MDU6SXNzdWUyMjc3NjAwOTQ=
| 4,016 |
Error when requesting subdomain from localhost
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9379317?v=4",
"events_url": "https://api.github.com/users/markokajzer/events{/privacy}",
"followers_url": "https://api.github.com/users/markokajzer/followers",
"following_url": "https://api.github.com/users/markokajzer/following{/other_user}",
"gists_url": "https://api.github.com/users/markokajzer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/markokajzer",
"id": 9379317,
"login": "markokajzer",
"node_id": "MDQ6VXNlcjkzNzkzMTc=",
"organizations_url": "https://api.github.com/users/markokajzer/orgs",
"received_events_url": "https://api.github.com/users/markokajzer/received_events",
"repos_url": "https://api.github.com/users/markokajzer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/markokajzer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/markokajzer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/markokajzer",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-10T17:50:07Z
|
2021-09-08T10:00:54Z
|
2017-05-10T19:07:42Z
|
NONE
|
resolved
|
Requesting a subdomain from localhost does not work.
As you can see, requesting a URL without a subdomain from localhost works. When adding a subdomain, "[Errno 8] nodename nor servname provided, or not known" is raised.
```python
import requests
requests.get('http://localhost:3000/highlights')
==> <Response [404]>
requests.get('http://api.localhost:3000/highlights')
==> Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connection.py", line 138, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/util/connection.py", line 75, in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/socket.py", line 743, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 8] nodename nor servname provided, or not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connectionpool.py", line 594, in urlopen
chunked=chunked)
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connectionpool.py", line 361, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1239, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1285, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1234, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1026, in _send_output
self.send(msg)
File "/usr/local/Cellar/python3/3.6.1/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 964, in send
self.connect()
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connection.py", line 163, in connect
conn = self._new_conn()
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connection.py", line 147, in _new_conn
self, "Failed to establish a new connection: %s" % e)
requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urllib3.connection.HTTPConnection object at 0x105a18470>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/connectionpool.py", line 643, in urlopen
_stacktrace=sys.exc_info()[2])
File "/usr/local/lib/python3.6/site-packages/requests/packages/urllib3/util/retry.py", line 363, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='api.localhost', port=3000): Max retries exceeded with url: /highlights (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x105a18470>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python3.6/site-packages/requests/api.py", line 70, in get
return request('get', url, params=params, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/api.py", line 56, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 488, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.6/site-packages/requests/adapters.py", line 487, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='api.localhost', port=3000): Max retries exceeded with url: /highlights (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x105a18470>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))
```
Any ideas how to make this work?
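For reference, a minimal workaround sketch, assuming the local server on port 3000 routes on the Host header (the alternative, per the reply below, is an /etc/hosts entry mapping api.localhost to 127.0.0.1):
```python
import requests

# Connect to the loopback address directly and carry the subdomain in the Host
# header, so no DNS lookup for api.localhost is needed.
response = requests.get('http://127.0.0.1:3000/highlights',
                        headers={'Host': 'api.localhost'})
print(response.status_code)
```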
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4016/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4016/timeline
| null |
completed
| null | null | false |
[
"You need to add a hosts file entry. Sub domains of local host do not automatically resolve to the loop back address, and generally speaking are not well defined. This problem is really nothing to do with Requests: your OS is responsible for resolving names into IP addresses."
] |
https://api.github.com/repos/psf/requests/issues/4015
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4015/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4015/comments
|
https://api.github.com/repos/psf/requests/issues/4015/events
|
https://github.com/psf/requests/pull/4015
| 227,732,129 |
MDExOlB1bGxSZXF1ZXN0MTE5OTIxNDc4
| 4,015 |
remove old test_requests.py listing from MANIFEST.in
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-10T16:06:54Z
|
2021-09-05T00:07:17Z
|
2017-05-10T19:04:04Z
|
MEMBER
|
resolved
|
This is a trivial fix to address a warning pointed out in #4012.
`test_requests.py` was moved from the top level directory into the `tests` directory in 2.10.0 but the MANIFEST.in wasn't updated afterwards. Installing from setup.py produces a warning (`no files found matching 'test_requests.py'`). This patch will remove the unneeded listing and keep the warning from cluttering stderr.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4015/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4015/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4015.diff",
"html_url": "https://github.com/psf/requests/pull/4015",
"merged_at": "2017-05-10T19:04:04Z",
"patch_url": "https://github.com/psf/requests/pull/4015.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4015"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=h1) Report\n> Merging [#4015](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/e4693eba4f429230d482ca50d0a537fc13ea6884?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4015 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=footer). Last update [e4693eb...f4b9082](https://codecov.io/gh/kennethreitz/requests/pull/4015?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/4014
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4014/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4014/comments
|
https://api.github.com/repos/psf/requests/issues/4014/events
|
https://github.com/psf/requests/pull/4014
| 227,677,833 |
MDExOlB1bGxSZXF1ZXN0MTE5ODgxMjc2
| 4,014 |
Remove range operators from markers.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-10T13:20:58Z
|
2021-09-06T00:06:58Z
|
2017-05-10T14:02:55Z
|
MEMBER
|
resolved
|
Resolves #4013.
It turns out that older setuptools were pretty stupid and couldn't handle range operators, but they *can* handle parentheses and the word "or". So let's do that instead.
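Illustrative only (this is not the actual diff from this pull request): the kind of marker rewrite described above, shown with a hypothetical dependency name.
```python
# Hypothetical example of trading a range comparison for parentheses + "or" in
# an environment marker, as described in the pull request text above.
install_requires = [
    # Form that old setuptools rejects (range operators):
    #   'somedep>=1.0; python_version >= "2.6" and python_version < "2.8"',
    # Equivalent form it can parse (parentheses and "or"):
    'somedep>=1.0; (python_version == "2.6" or python_version == "2.7")',
]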
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4014/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4014/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4014.diff",
"html_url": "https://github.com/psf/requests/pull/4014",
"merged_at": "2017-05-10T14:02:55Z",
"patch_url": "https://github.com/psf/requests/pull/4014.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4014"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=h1) Report\n> Merging [#4014](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/3618f3f68ed7b4efe5a6adf6ddf2f413dc71d0e8?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4014 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=footer). Last update [3618f3f...f3f4c3e](https://codecov.io/gh/kennethreitz/requests/pull/4014?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Cool, ok. Let's ship a 2.14.2 with this and see if this cleans up the last few lingering pain points."
] |
https://api.github.com/repos/psf/requests/issues/4013
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4013/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4013/comments
|
https://api.github.com/repos/psf/requests/issues/4013/events
|
https://github.com/psf/requests/issues/4013
| 227,666,894 |
MDU6SXNzdWUyMjc2NjY4OTQ=
| 4,013 |
Amazon Linux easy_install incompatible with 2.14.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/19824146?v=4",
"events_url": "https://api.github.com/users/jfwarner/events{/privacy}",
"followers_url": "https://api.github.com/users/jfwarner/followers",
"following_url": "https://api.github.com/users/jfwarner/following{/other_user}",
"gists_url": "https://api.github.com/users/jfwarner/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jfwarner",
"id": 19824146,
"login": "jfwarner",
"node_id": "MDQ6VXNlcjE5ODI0MTQ2",
"organizations_url": "https://api.github.com/users/jfwarner/orgs",
"received_events_url": "https://api.github.com/users/jfwarner/received_events",
"repos_url": "https://api.github.com/users/jfwarner/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jfwarner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jfwarner/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jfwarner",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-05-10T12:43:45Z
|
2021-09-08T10:00:55Z
|
2017-05-10T14:02:55Z
|
NONE
|
resolved
|
I've been following #4006 and #4012 and I'm aware the guidance is to update to a newer pip, and that easy_install is not supported. I just wanted to make the case that not all of us have a choice of Python package manager.
Take my case - I'm using Amazon Linux bootstrapped by Amazon CloudFormation. We give the CloudFormation initializer code (cfn-init for short) a list of python packages to install, and it installs them in any order it wants, using any mechanism it wants. Studying the logs, it looks like under the hood it's using easy_install.
I'm not even trying to install requests. I'm getting a floating version of it transitively through trying to install docker-py. So I see:
`[ERROR] easy_install failed. Output: Searching for docker-py==1.7.2`
...
`Searching for requests>=2.5.2`
...
`Best match: requests 2.14.1`
...
```
Running requests-2.14.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-wfkGhb/requests-2.14.1/egg-dist-tmp-H3aUMy
error: Setup script exited with error in requests setup command: Invalid environment marker: sys_platform == "win32" and python_version<"3.3"
```
Version information:
Amazon Linux AMI release 2016.09
Python 2.7.12
```
easy_install --version
setuptools 12.2
```
So at the moment I have a cumbersome workaround where I run cfn-init in two phases: first I expressly install requests 2.13.0, then I install docker-py, which is happy with the requests already installed. But I just wanted to raise awareness that there is a category of people who are seeing this error through transitive use of requests (I didn't even know we were using it) and for whom upgrading or changing package managers is non-trivial.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4013/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4013/timeline
| null |
completed
| null | null | false |
[
"@jfwarner Thanks for this report.\r\n\r\nIs it not possible to [upgrade the version of setuptools before you run cloudformation](https://gist.github.com/kixorz/10194688)? That will likely resolve your issue.",
"Unfortunately setuptools 12.2 is the latest and greatest available from Amazon Linux's yum repo.\r\n\r\n```\r\nyum update python27-setuptools\r\nLoaded plugins: priorities, update-motd, upgrade-helper\r\namzn-main\r\namzn-updates\r\nNo packages marked for update\r\n```\r\n\r\n```\r\nyum info python27-setuptools\r\nLoaded plugins: priorities, update-motd, upgrade-helper\r\nInstalled Packages\r\nName : python27-setuptools\r\nArch : noarch\r\nVersion : 12.2\r\nRelease : 1.32.amzn1\r\nSize : 1.9 M\r\nRepo : installed\r\n```",
"Gah, that's pretty frustrating. This seems to be down to setuptools 12 not knowing how to use range operators in markers. We can probably fix it though. See #4014.",
"Thanks - I know it's our own fault for not locking down all our transitive dependencies, but this is causing us a production outage on the ability to boot new servers. ",
"Confirmed that the issue has been resolved in 2.14.2 - thanks!",
"No problem, glad we could help!"
] |
https://api.github.com/repos/psf/requests/issues/4012
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4012/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4012/comments
|
https://api.github.com/repos/psf/requests/issues/4012/events
|
https://github.com/psf/requests/issues/4012
| 227,649,635 |
MDU6SXNzdWUyMjc2NDk2MzU=
| 4,012 |
Invalid environment marker
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1065155?v=4",
"events_url": "https://api.github.com/users/atarkowska/events{/privacy}",
"followers_url": "https://api.github.com/users/atarkowska/followers",
"following_url": "https://api.github.com/users/atarkowska/following{/other_user}",
"gists_url": "https://api.github.com/users/atarkowska/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/atarkowska",
"id": 1065155,
"login": "atarkowska",
"node_id": "MDQ6VXNlcjEwNjUxNTU=",
"organizations_url": "https://api.github.com/users/atarkowska/orgs",
"received_events_url": "https://api.github.com/users/atarkowska/received_events",
"repos_url": "https://api.github.com/users/atarkowska/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/atarkowska/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/atarkowska/subscriptions",
"type": "User",
"url": "https://api.github.com/users/atarkowska",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 13 |
2017-05-10T11:32:28Z
|
2021-09-08T10:00:54Z
|
2017-05-10T11:56:46Z
|
NONE
|
resolved
|
requests 2.14+ centos7 py27
```
omeroweb_1 | Running setup.py egg_info for package requests
omeroweb_1 | error in requests setup command: Invalid environment marker: platform_system == "Windows" and python_version<"3.3"
omeroweb_1 | Complete output from command python setup.py egg_info:
omeroweb_1 | error in requests setup command: Invalid environment marker: platform_system == "Windows" and python_version<"3.3"
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4012/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4012/timeline
| null |
completed
| null | null | false |
[
"Are you sure you receive the exact same output for 2.14 and 2.14.1? Because this appears to be a duplicate of #4006, which should have been resolved by #4007 in 2.14.1.",
"yes because https://github.com/ome/omero-mapr/blob/master/requirements.txt#L1\r\n\r\nI have to `requests==2.13.0`",
"@aleksandra-tarkowska When are you encountering the error. Right now?",
"What version of pip are you using to install requests? What version of setuptools is currently installed?",
"yes, at the moment, it is new container\r\n\r\n```\r\nomeroweb_1 | Downloading/unpacking requests==2.14.1 (from omero-mapr==0.1.14->-r /home/omero/requirements-webapps.txt (line 2))\r\nomeroweb_1 | Running setup.py egg_info for package requests\r\nomeroweb_1 | error in requests setup command: Invalid environment marker: sys_platform == \"win32\" and python_version<\"3.3\"\r\nomeroweb_1 | Complete output from command python setup.py egg_info:\r\nomeroweb_1 | error in requests setup command: Invalid environment marker: sys_platform == \"win32\" and python_version<\"3.3\"\r\nomeroweb_1 | \r\n```\r\n\r\nCentOS 7\r\n```\r\n[omero@496158a7ba55 ~]$ easy_install --version\r\nsetuptools 0.9.8\r\n[omero@496158a7ba55 ~]$ pip --version\r\npip 8.1.2 from /usr/lib/python2.7/site-packages (python 2.7)\r\n```",
"default rpm provided by yum",
"Ok, so I think your setuptools is too old and unsupported by Requests. Try doing `pip install -U setuptools`: that should resolve your issue. This is a duplicate of #4006.",
"I agree. This is a duplicate of #4006. I'll also add that it appears that you're using easy_install to install requests. That may have worked even recently but has not been supported for *years*. Please use pip to install requests.",
"I'm running into this on Travis. Installing 'requests' fails since today with:\r\n```\r\nerror: Setup script exited with error in requests setup command: Invalid environment marker: sys_platform == \"win32\" and python_version<\"3.3\"\r\n```\r\nI am confused since this job is running on Linux with Python 3.4.",
"Job URL: https://travis-ci.org/remram44/usagestats/jobs/230593204",
"@remram44 Try again, we shipped a 2.14.2 release that should have resolved your issue.",
"👍 \r\n```\r\n Running setup.py install for requests\r\n \r\n warning: no files found matching 'test_requests.py'\r\nSuccessfully installed omero-mapr requests\r\n```",
"New build passed, thanks for the quick turnaround!"
] |
https://api.github.com/repos/psf/requests/issues/4011
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4011/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4011/comments
|
https://api.github.com/repos/psf/requests/issues/4011/events
|
https://github.com/psf/requests/issues/4011
| 227,611,425 |
MDU6SXNzdWUyMjc2MTE0MjU=
| 4,011 |
Install fails 2.14.x
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/95203?v=4",
"events_url": "https://api.github.com/users/rohityadavcloud/events{/privacy}",
"followers_url": "https://api.github.com/users/rohityadavcloud/followers",
"following_url": "https://api.github.com/users/rohityadavcloud/following{/other_user}",
"gists_url": "https://api.github.com/users/rohityadavcloud/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rohityadavcloud",
"id": 95203,
"login": "rohityadavcloud",
"node_id": "MDQ6VXNlcjk1MjAz",
"organizations_url": "https://api.github.com/users/rohityadavcloud/orgs",
"received_events_url": "https://api.github.com/users/rohityadavcloud/received_events",
"repos_url": "https://api.github.com/users/rohityadavcloud/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rohityadavcloud/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rohityadavcloud/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rohityadavcloud",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-05-10T09:00:32Z
|
2021-09-08T10:00:55Z
|
2017-05-10T09:24:22Z
|
NONE
|
resolved
|
The PyPI page is not loading either:
https://pypi.python.org/pypi/requests/2.14.1
Logs:
```
07:58:51 Processing requests-2.14.1.tar.gz
07:58:51 Writing /tmp/easy_install-WACg1D/requests-2.14.1/setup.cfg
07:58:51 Running requests-2.14.1/setup.py -q bdist_egg --dist-dir /tmp/easy_install-WACg1D/requests-2.14.1/egg-dist-tmp-MeAs3M
07:58:51
07:58:51 stderr: no previously-included directories found matching 'docs/_build'
07:58:51 warning: no previously-included files matching '*.py[cdo]' found anywhere in distribution
07:58:51 warning: no previously-included files matching '__pycache__' found anywhere in distribution
07:58:51 warning: no previously-included files matching '*.so' found anywhere in distribution
07:58:51 warning: no previously-included files matching '*.pyd' found anywhere in distribution
07:58:51 zip_safe flag not set; analyzing archive contents...
07:58:51 error: Setup script exited with error in requests setup command: Invalid environment marker: sys_platform == "win32" and python_version<"3.3"
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/95203?v=4",
"events_url": "https://api.github.com/users/rohityadavcloud/events{/privacy}",
"followers_url": "https://api.github.com/users/rohityadavcloud/followers",
"following_url": "https://api.github.com/users/rohityadavcloud/following{/other_user}",
"gists_url": "https://api.github.com/users/rohityadavcloud/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rohityadavcloud",
"id": 95203,
"login": "rohityadavcloud",
"node_id": "MDQ6VXNlcjk1MjAz",
"organizations_url": "https://api.github.com/users/rohityadavcloud/orgs",
"received_events_url": "https://api.github.com/users/rohityadavcloud/received_events",
"repos_url": "https://api.github.com/users/rohityadavcloud/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rohityadavcloud/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rohityadavcloud/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rohityadavcloud",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4011/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4011/timeline
| null |
completed
| null | null | false |
[
"Thanks, I think this is an issue with your system. What OS are you running, and can you print out the version of pip you have installed please? This is almost certainly related to #4006.",
"@Lukasa I just checked our Jenkins/Ansible setup, we're running `easy_install cloudmonkey` that depends on requests: https://github.com/apache/cloudstack-cloudmonkey/blob/5.3.3/setup.py#L36\r\nThe OS is CentOS7.1.\r\n\r\nThe pypi page used to render correctly for old versions: \r\nhttps://pypi.python.org/pypi/requests/2.13.0\r\n\r\nBut not for the latest:\r\nhttps://pypi.python.org/pypi/requests/2.14.1\r\n\r\nThis looks like broke it for me: https://github.com/kennethreitz/requests/commit/800a074b5d7adbbcaf96a5e6b3754c560f6768ff",
"Requests does not support being installed via `easy_install`. Have you tried running `pip install cloudmonkey`?",
"@Lukasa okay will amend changes, looks like the failure happens when `sys_platform` is not defined, pip install --upgrade distribute may fix it.",
"Thanks @Lukasa for your support, using pip fixed the issue for me. Though bear in mind, our scripts used to work well with requests `1.13.0`, so something has changed.\r\nI'll close the ticket now.",
"See the discussion in #4006 for an idea of what has changed."
] |
https://api.github.com/repos/psf/requests/issues/4010
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4010/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4010/comments
|
https://api.github.com/repos/psf/requests/issues/4010/events
|
https://github.com/psf/requests/issues/4010
| 227,527,675 |
MDU6SXNzdWUyMjc1Mjc2NzU=
| 4,010 |
requests 2.14.x erroring when using a custom HTTPSConnectionPool
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/151667?v=4",
"events_url": "https://api.github.com/users/ludokx/events{/privacy}",
"followers_url": "https://api.github.com/users/ludokx/followers",
"following_url": "https://api.github.com/users/ludokx/following{/other_user}",
"gists_url": "https://api.github.com/users/ludokx/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ludokx",
"id": 151667,
"login": "ludokx",
"node_id": "MDQ6VXNlcjE1MTY2Nw==",
"organizations_url": "https://api.github.com/users/ludokx/orgs",
"received_events_url": "https://api.github.com/users/ludokx/received_events",
"repos_url": "https://api.github.com/users/ludokx/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ludokx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ludokx/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ludokx",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-09T23:34:42Z
|
2021-09-08T10:00:56Z
|
2017-05-10T00:58:38Z
|
NONE
|
resolved
|
This worked fine with 2.13.0 but started failing today with 2.14.0. This is the stack trace:
```
Traceback (most recent call last):
File "requests_test.py", line 31, in <module>
s.request(method='GET', url='https://google.com')
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/sessions.py", line 518, in request
resp = self.send(prep, **send_kwargs)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/sessions.py", line 639, in send
r = adapter.send(request, **kwargs)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/adapters.py", line 403, in send
conn = self.get_connection(request.url, proxies)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/adapters.py", line 307, in get_connection
conn = self.poolmanager.connection_from_url(url)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/poolmanager.py", line 279, in connection_from_url
pool_kwargs=pool_kwargs)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/poolmanager.py", line 227, in connection_from_host
return self.connection_from_context(request_context)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/poolmanager.py", line 240, in connection_from_context
return self.connection_from_pool_key(pool_key, request_context=request_context)
File "/Users/ludo/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/poolmanager.py", line 261, in connection_from_pool_key
pool = self._new_pool(scheme, host, port, request_context=request_context)
TypeError: _new_pool() got an unexpected keyword argument 'request_context'
```
Here's the sample code to reproduce the issue:
```python
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager, HTTPSConnectionPool


class MyHTTPSConnectionPool(HTTPSConnectionPool):
    def __init__(self, host, port=None, **conn_kw):
        HTTPSConnectionPool.__init__(self, host, port, **conn_kw)


class MyPoolManager(PoolManager):
    def __init__(self, **kw_args):
        PoolManager.__init__(self, **kw_args)

    def _new_pool(self, scheme, host, port):
        if scheme == 'https':
            kwargs = self.connection_pool_kw
            return MyHTTPSConnectionPool(host, port, **kwargs)
        return super(MyPoolManager, self)._new_pool(scheme, host, port)


class MyAdapter(HTTPAdapter):
    def __init__(self, **kw_args):
        HTTPAdapter.__init__(self, **kw_args)

    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = MyPoolManager(num_pools=connections,
                                         maxsize=maxsize, block=block)


s = requests.Session()
s.mount('https://', MyAdapter())
s.request(method='GET', url='https://google.com')
```
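A sketch of the signature fix suggested in the comments below: accept (and forward) the request_context keyword that urllib3 1.21+ passes to _new_pool.
```python
from requests.packages.urllib3.poolmanager import PoolManager

class MyPoolManager(PoolManager):
    # MyHTTPSConnectionPool is the subclass defined in the snippet above.
    def _new_pool(self, scheme, host, port, request_context=None):
        if scheme == 'https':
            return MyHTTPSConnectionPool(host, port, **self.connection_pool_kw)
        return super(MyPoolManager, self)._new_pool(
            scheme, host, port, request_context=request_context)
```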
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4010/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4010/timeline
| null |
completed
| null | null | false |
[
"I believe this is fixed with #4007 please upgrade to 2.14.1.",
"I'm not sure if this is actually related to #4007. Some work was done in urllib3 1.21 to allow the use of pool_kwargs in the `connection_from_*` functions. Part of [this commit](https://github.com/shazow/urllib3/commit/7c14966fc134146223650a8393e17e91a358ba6d) introduced the passing of a new argument, `request_context` in `_new_pool`.\r\n\r\nYou'll find the function signature you're providing in your custom PoolManager no longer matches what urllib3 expects. Modifying your example code to `def _new_pool(self, scheme, host, port, request_context=None)` will fix this issue."
] |
https://api.github.com/repos/psf/requests/issues/4009
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4009/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4009/comments
|
https://api.github.com/repos/psf/requests/issues/4009/events
|
https://github.com/psf/requests/issues/4009
| 227,521,850 |
MDU6SXNzdWUyMjc1MjE4NTA=
| 4,009 |
OSError Tunnel connection failed: 400 Bad Request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11723331?v=4",
"events_url": "https://api.github.com/users/ed-biz23/events{/privacy}",
"followers_url": "https://api.github.com/users/ed-biz23/followers",
"following_url": "https://api.github.com/users/ed-biz23/following{/other_user}",
"gists_url": "https://api.github.com/users/ed-biz23/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ed-biz23",
"id": 11723331,
"login": "ed-biz23",
"node_id": "MDQ6VXNlcjExNzIzMzMx",
"organizations_url": "https://api.github.com/users/ed-biz23/orgs",
"received_events_url": "https://api.github.com/users/ed-biz23/received_events",
"repos_url": "https://api.github.com/users/ed-biz23/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ed-biz23/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ed-biz23/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ed-biz23",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2017-05-09T22:58:10Z
|
2019-04-03T12:58:44Z
|
2017-07-29T23:09:45Z
|
NONE
|
resolved
|
Hi, I'm using Requests version 2.14.1, but I'm having a problem with HTTPS requests when using proxies (it works fine with http URLs). *My proxies are already IP-authorized.* I'm a novice to programming, so I can't figure out exactly what I'm doing wrong.
```python
import requests, random, time
proxies_list = [
'104.153.xx.xxx:xxxx',
'104.153.xx.xxx:xxxx'
]
while True:
try:
r = requests.get('https://icanhazip.com', proxies={'https': 'http://{}'.format(random.choice(proxies_list))})
print (r.text), time.sleep(0.5)
except Exception as e:
print(e)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4009/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4009/timeline
| null |
completed
| null | null | false |
[
"Can you give us more information about your system and the actual exception traceback? It'd be helpful to know exactly which OSError you're receiving.",
"sorry, here is the trackback. and im using Mac OSX\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 594, in urlopen\r\n self._prepare_proxy(conn)\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 810, in _prepare_proxy\r\n conn.connect()\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connection.py\", line 294, in connect\r\n self._tunnel()\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py\", line 832, in _tunnel\r\n message.strip()))\r\nOSError: Tunnel connection failed: 400 Bad Request\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py\", line 438, in send\r\n timeout=timeout\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 649, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py\", line 388, in increment\r\n raise MaxRetryError(_pool, url, error or ResponseError(cause))\r\nrequests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='icanhazip.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 400 Bad Request',)))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/Users/edwardbiswas/Desktop/Script/proxy.py\", line 107, in <module>\r\n r = requests.get('https://icanhazip.com', proxies={'https': 'http://{}'.format(random.choice(proxies_list))})\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py\", line 72, in get\r\n return request('get', url, params=params, **kwargs)\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py\", line 58, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py\", line 518, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py\", line 639, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py\", line 500, in send\r\n raise ProxyError(e, request=request)\r\nrequests.exceptions.ProxyError: HTTPSConnectionPool(host='icanhazip.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', OSError('Tunnel connection failed: 400 Bad Request',)))\r\n```",
"So this is the proxy saying that it doesn't like the tunnel request. In a situation like this, if you get a `requests.exceptions.ProxyError`, you should consider sleeping and then retrying. You may even want to remove the proxy that fired the error from your list of proxies for some time, as it clearly doesn't like you using it very much.",
"All of my 100 proxies returning same error. I also added 5sec sleep, but still giving me this issue. However these proxies working fine when I change the link to http.\r\n\r\n>r = requests.get('http://icanhazip.com', proxies={'http': random.choice(proxies_list)}\r\n\r\nSo I really don't understand why it wont work with https.",
"@eviled23 That suggests that your proxies don't like the CONNECT verb. This is quite possible: you may not be able to tunnel HTTPS through your proxies.",
"I just tested the proxies on my chrome browser with extension and they were able to connect fine with https url. Perhaps requests has problem with reverse backconnect proxies?",
"Not as far as I know. Are you interested in trying to use Wireshark or tcpdump to compare what Requests is doing with what Chrome is doing?",
"yea sure. ",
"I think it's issue with rotate proxies because I just tried https requests with non rotate proxies, and it seem to be working fine.",
"This issue has stalled. I'm going to close this. If there is anything for Requests to do, feel free to let us know and we can reopen this.",
"hi, i have a big problem with proxy rotation. I have few bot for parse a web site and proxyHandler class doesen't work. Proxy are tested, and i don't have response from the web site that i'm trying to connect using proxies.\r\n __need help__"
] |
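To make the advice in the comments concrete — catch the proxy failure, back off, and retire a proxy that keeps refusing the tunnel — a rough sketch along those lines (the proxy list, target URL, and sleep intervals are placeholders):

```python
import random
import time

import requests

proxies_list = ['104.153.xx.xxx:xxxx', '104.153.xx.xxx:xxxx']  # placeholders

while proxies_list:
    proxy = random.choice(proxies_list)
    try:
        r = requests.get('https://icanhazip.com',
                         proxies={'https': 'http://{}'.format(proxy)})
        print(r.text)
        time.sleep(0.5)
    except requests.exceptions.ProxyError:
        # This proxy rejected the tunnel; retire it and back off before retrying.
        proxies_list.remove(proxy)
        time.sleep(5)
```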
https://api.github.com/repos/psf/requests/issues/4008
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4008/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4008/comments
|
https://api.github.com/repos/psf/requests/issues/4008/events
|
https://github.com/psf/requests/issues/4008
| 227,465,441 |
MDU6SXNzdWUyMjc0NjU0NDE=
| 4,008 |
[2.14.0] NameError: name 'platform_system' is not defined for pip 6.0.7
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6644187?v=4",
"events_url": "https://api.github.com/users/moylop260/events{/privacy}",
"followers_url": "https://api.github.com/users/moylop260/followers",
"following_url": "https://api.github.com/users/moylop260/following{/other_user}",
"gists_url": "https://api.github.com/users/moylop260/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/moylop260",
"id": 6644187,
"login": "moylop260",
"node_id": "MDQ6VXNlcjY2NDQxODc=",
"organizations_url": "https://api.github.com/users/moylop260/orgs",
"received_events_url": "https://api.github.com/users/moylop260/received_events",
"repos_url": "https://api.github.com/users/moylop260/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/moylop260/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/moylop260/subscriptions",
"type": "User",
"url": "https://api.github.com/users/moylop260",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-09T19:02:14Z
|
2021-09-08T10:00:57Z
|
2017-05-09T19:02:54Z
|
NONE
|
resolved
|
Travis uses `pip 6.0.7`.
Today we had the following error:
```python
pip install -U requests==2.14.0
Using cached requests-2.14.0-py2.py3-none-any.whl
Exception:
Traceback (most recent call last):
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/basecommand.py", line 232, in main
status = self.run(options, args)
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/commands/install.py", line 339, in run
requirement_set.prepare_files(finder)
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/req/req_set.py", line 436, in prepare_files
req_to_install.extras):
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2496, in requires
dm = self._dep_map
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2697, in _dep_map
self.__dep_map = self._compute_dependencies()
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2730, in _compute_dependencies
common = frozenset(reqs_for_extra(None))
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2727, in reqs_for_extra
if req.marker_fn(override={'extra':extra}):
File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/pip/_vendor/_markerlib/markers.py", line 113, in marker_fn
return eval(compiled_marker, environment)
File "<environment marker>", line 1, in <module>
NameError: name 'platform_system' is not defined
```
We can fix it using:
`pip install -U pip`
`# Successfully installed pip-9.0.1`
But many Travis builds may still be using the default pip version.
Could this requests package be made compatible with both?
We had a similar fix for pylint: https://github.com/PyCQA/astroid/pull/360 based on https://github.com/pypa/packaging/issues/72
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4008/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4008/timeline
| null |
completed
| null | null | false |
[
"Thanks, this is a duplicate of #4006. See that issue for more discussion."
] |
https://api.github.com/repos/psf/requests/issues/4007
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4007/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4007/comments
|
https://api.github.com/repos/psf/requests/issues/4007/events
|
https://github.com/psf/requests/pull/4007
| 227,452,221 |
MDExOlB1bGxSZXF1ZXN0MTE5NzE5Nzg0
| 4,007 |
Use sys_platform instead of platform_system
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2017-05-09T18:12:19Z
|
2021-09-06T00:06:58Z
|
2017-05-09T18:55:57Z
|
MEMBER
|
resolved
|
This is more compatible as a marker and so will break fewer people. Potentially resolves #4006.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 4,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/4007/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4007/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4007.diff",
"html_url": "https://github.com/psf/requests/pull/4007",
"merged_at": "2017-05-09T18:55:57Z",
"patch_url": "https://github.com/psf/requests/pull/4007.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4007"
}
| true |
[
"I should note that this fix does not entirely remove the issue: sufficiently old pips will still fail to install Requests. We're just moving the goalposts on what counts as \"sufficiently old\".",
"I'm in favor of breaking less people's builds as long as people using really old `pip` versions are the ones getting broken still. This change will still carry the benefit of getting users of very old `pip` versions to upgrade, and I actually think that's a positive.",
"> sufficiently old pips will still fail to install Requests. We're just moving the goalposts on what counts as \"sufficiently old\".\r\n\r\n@Lukasa are you aware of what the specific pip version requirement would be if this got merged?",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=h1) Report\n> Merging [#4007](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/508d47dc6e7458f2db3bc3b3bf629e0bdb1b798a?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4007 +/- ##\n=======================================\n Coverage 89.74% 89.74% \n=======================================\n Files 15 15 \n Lines 1940 1940 \n=======================================\n Hits 1741 1741 \n Misses 199 199\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=footer). Last update [508d47d...800a074](https://codecov.io/gh/kennethreitz/requests/pull/4007?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"@thieman I'm not. I think it actually boils down to a setuptools requirement, but setuptools from as early as 2014 should handle this I think.",
"My proposal is that we push a 2.14.1 with this fix, and then see whether things calm down on #4006.",
"Just to let know I tried with pip 8.0.0 to 8.1.1, and it works.\r\nI also tried with pip 6.0.0, and it works too.\r\nFor the cases where the dependency is on setuptools version, I could not test different version so far, sorry.",
"Still broken on Travis https://travis-ci.org/stripe/stripe-python/jobs/230553497#L498",
"@ofek thanks for pointing this out. It looks like (some or all of) the Travis-CI instances are running on `setuptools` 12.0.5. While `sys_platform` was supported then, `python_version` doesn't seem to have been added until 18.0 (although the change log says 18.5). You'll need to upgrade `setuptools` on Travis-CI to at least 18.0, or run `pip install -U setuptools` to get the latest version."
] |
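For readers unfamiliar with environment markers, a hypothetical setup.py fragment in the style this pull request moves towards might look as follows (the extra and package names are illustrative, not the exact Requests metadata):

```python
from setuptools import setup

setup(
    name='example-package',
    version='0.0.1',
    extras_require={
        # Old form, which only fairly recent pip/setuptools releases evaluate:
        #   'socks:platform_system == "Windows"': ['win_inet_pton'],
        # The more widely supported marker adopted here:
        'socks:sys_platform == "win32"': ['win_inet_pton'],
    },
)
```

As noted in the last comment, evaluating such markers still requires a reasonably recent setuptools on the installing side.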
https://api.github.com/repos/psf/requests/issues/4006
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4006/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4006/comments
|
https://api.github.com/repos/psf/requests/issues/4006/events
|
https://github.com/psf/requests/issues/4006
| 227,421,573 |
MDU6SXNzdWUyMjc0MjE1NzM=
| 4,006 |
requests 2.14.0 cannot be installed using pip < 8.1.2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1050156?v=4",
"events_url": "https://api.github.com/users/lmazuel/events{/privacy}",
"followers_url": "https://api.github.com/users/lmazuel/followers",
"following_url": "https://api.github.com/users/lmazuel/following{/other_user}",
"gists_url": "https://api.github.com/users/lmazuel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lmazuel",
"id": 1050156,
"login": "lmazuel",
"node_id": "MDQ6VXNlcjEwNTAxNTY=",
"organizations_url": "https://api.github.com/users/lmazuel/orgs",
"received_events_url": "https://api.github.com/users/lmazuel/received_events",
"repos_url": "https://api.github.com/users/lmazuel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lmazuel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lmazuel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lmazuel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 91 |
2017-05-09T16:19:18Z
|
2021-09-08T08:00:45Z
|
2017-05-10T07:06:36Z
|
NONE
|
resolved
|
Example with pip 6.1.1 (but same with pip 8.1.1):
```console
> pip install requests
You are using pip version 6.1.1, however version 9.0.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
Collecting requests
Using cached requests-2.14.0-py2.py3-none-any.whl
Exception:
Traceback (most recent call last):
File "D:\VEnvs\testpip\lib\site-packages\pip\basecommand.py", line 246, in main
status = self.run(options, args)
File "D:\VEnvs\testpip\lib\site-packages\pip\commands\install.py", line 342, in run
requirement_set.prepare_files(finder)
File "D:\VEnvs\testpip\lib\site-packages\pip\req\req_set.py", line 345, in prepare_files
functools.partial(self._prepare_file, finder))
File "D:\VEnvs\testpip\lib\site-packages\pip\req\req_set.py", line 290, in _walk_req_to_install
more_reqs = handler(req_to_install)
File "D:\VEnvs\testpip\lib\site-packages\pip\req\req_set.py", line 557, in _prepare_file
set(req_to_install.extras) - set(dist.extras)
File "D:\VEnvs\testpip\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 2758, in extras
return [dep for dep in self._dep_map if dep]
File "D:\VEnvs\testpip\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 2781, in _dep_map
self.__dep_map = self._compute_dependencies()
File "D:\VEnvs\testpip\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 2814, in _compute_dependencies
common = frozenset(reqs_for_extra(None))
File "D:\VEnvs\testpip\lib\site-packages\pip\_vendor\pkg_resources\__init__.py", line 2811, in reqs_for_extra
if req.marker_fn(override={'extra':extra}):
File "D:\VEnvs\testpip\lib\site-packages\pip\_vendor\_markerlib\markers.py", line 113, in marker_fn
return eval(compiled_marker, environment)
File "<environment marker>", line 1, in <module>
NameError: name 'platform_system' is not defined
```
So, installing requests 2.14.0 implicitly requires pip >= 9.x. This should at least be noted in the release notes as a disclaimer, or fixed if it's not intentional.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 6,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 6,
"url": "https://api.github.com/repos/psf/requests/issues/4006/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4006/timeline
| null |
completed
| null | null | false |
[
"Yeah, we can add that as a release note. That said, pip *9* is a bit newer than I thought we were going to need: @dstufft, is the `platform_system` marker not provided before then?",
"I just bisect, and it works starting 8.1.2 (broken in 8.1.1).",
"Ok, so that means it works with a version of pip that's a year old. I think I'm ok with this, though naturally the rest of the dev team should express opinions here. Generally, I'm a bit loathe to be tied down by older pips: generally speaking for most people it's easy to upgrade pip, and if it's *hard* to upgrade pip they're usually using a distro and so should use a virtualenv to install Requests, at which point they should just install a new pip inside the virtualenv.\r\n\r\nAny thoughts on that team? @nateprewitt @sigmavirus24 @kennethreitz?",
"(For those who want the data point, pip 8.1.2 was released one year ago on Thursday).",
"I just found this issue because the default pip on Travis CI is 6.x (easily fixed with the addition of `pip install --upgrade pip` but that'll be in every project until they update the base image)",
"Hrm. That's a bit hard for Travis CI to justify: the most recent 6.x release of pip is two years old, which is really a bit much!",
"@Lukasa I'm opening an issue now",
"I'm pretty sure the justifcation is they're Rubyists and don't think much about pip :)",
"@dstufft Do you have an opinion on this, btw?",
"I think everyone should always upgrade to the latest pip! Whether enough people have done that for requests to consider it reasonable to depend on it or not, I dunno.",
"Ok, so I decided to get some data. I ran [this query over the pip BigQuery data](https://bigquery.cloud.google.com/savedquery/977624438626:e5f2a2c4215148a58419ac8c3c9d7c2b) to see what pip version was used to install Requests over the past 30 days. Here are our answers, down to 5k downloads (I don't really care about any number lower than that):\r\n\r\n| Row | details_installer_name | details_installer_version | total_downloads |\r\n|-----|------------------------|---------------------------|----------------:|\r\n| 1 | pip | 9.0.1 | 5020104 |\r\n| 2 | pip | 1.5.4 | 1271017 |\r\n| 3 | pip | 8.1.1 | 860806 |\r\n| 4 | pip | 8.1.2 | 605871 |\r\n| 5 | pip | 8.0.3 | 500970 |\r\n| 6 | pip | 7.1.2 | 471446 |\r\n| 7 | pip | 6.1.1 | 370775 |\r\n| 8 | pip | 1.5.6 | 317236 |\r\n| 9 | pip | 6.0.8 | 153005 |\r\n| 10 | pip | 6.0.7 | 149881 |\r\n| 11 | pip | 7.1.0 | 78024 |\r\n| 12 | pip | 9.0.0 | 41436 |\r\n| 13 | pip | 1.4.1 | 38936 |\r\n| 14 | pip | 8.0.2 | 34807 |\r\n| 15 | pip | 1.5.5 | 30253 |\r\n| 16 | pip | 1.5 | 28426 |\r\n| 17 | pip | 7.0.1 | 9441 |\r\n| 18 | pip | 7.0.3 | 8717 |\r\n| 19 | pip | 8.1.0 | 7126 |\r\n| 20 | pip | 1.5.1 | 4979 |\r\n\r\n",
"So the versions that support these markers are 8.1.2, 9.0.0, and 9.0.1. Together they account for 5,667,411 of our 10,015,055 pip downloads in that time frame. Put another way, 56.6% of our downloads last month came from pip installs that are compatible with the markers.",
"For some extra data points, some of the other large numbers are people using the ancient pips their distros provided. For example, pip 1.5.4 is almost entirely going to be Ubuntu Trusty, with their 1,271,017 downloads. I'm inclined not to worry about them too much: if they're installing Requests with the distro-provided pip all they're going do is *break* their pip. Same note with 1.5.6 (Debian Jessie), which together provide another 17% or so of downloads that are basically beyond us.",
"This also broke our CI a minute ago: so is Travis gonna fix their env or are projects supposed to do an update within their Travis scripts?",
"Ok, so right now with 56% of our downloaders *already* safe and most of the rest one `pip install -U pip` away from being ok, I'm pretty tempted to say that we should consider toughing this out. It's going to annoy a whole bunch of people, but if we can drag the ecosystem forward a bit to the point that people can actually meaningfully use environment markers for conditional dependencies that would be one hell of a public service.",
"@oberstet If it were me, I'd add a `pip install -U pip` to the start of my travis scripts. Travis has been known to be slow with their Python environments at the best of time, and that way you guarantee you'll never encounter a problem with a dependency that does this again in the future. Consider it a forward looking defense against this kind of pain happening again.",
"As a fellow maintainer of OSS projects who wants to use environment markers, <3. It seems like we should also engage with the travis folks to fix this, as that'll reduce a lot of the pain.",
"@Lukasa, 8.1.1 seems to likely be coming from Ubuntu Xenial too. It seems reasonable for people to be upgrading a version of pip that in some cases is more than 3 years old. I'm on board with waiting this one out as well with the current information. We can't support legacy versions of pip forever.",
"@Lukasa I'm fine with that. Travis _should_ fix that, instead of myriads of projects, but what the hell: your advice is pragmatic! =)",
"@oberstet Very much agreed, I'd prefer a world where Travis has some kind of rolling approach to pip versions, and I'm happy to work with them to make that possible if they need outside help. In the meantime, though, we shall do what we can in their absence! :grin:",
"> the rest are one `pip install -U pip` away from being ok\r\n\r\nJust to point out, they are also one `pip install requests==2.13.0` away from being ok. Seriously, this is going to impact _a lot_ of people who use Travis. Before today I had never even heard of requests, but all of a sudden every unit test for our project is failing because codecov depends on requests. I agree that Travis should install a newer version of pip, but that doesn't mean this is the right way to do it. I would prefer rolling back whatever change requires environment markers until Travis updates pip.",
"@adamjstewart Yeah, that's absolutely true, and some number of projects will probably do that. All we can do is make a decision about at what point we decide to stop supporting the older releases of `pip`: at what level of usage do we give up on them?\r\n\r\nI should note that Travis-CI is included in the above numbers, which means even with Travis-CI less than half of our downloads are currently affected. I don't consider that number to be at a level where I want to *immediately* revert this.\r\n\r\nI'm still keeping this open for discussion: I want the community and the other maintainers to weigh in. @nateprewitt and I have had our say, two maintainers have yet to get involved, and there are many more community folk who should feel free to drop in and tell us what they think. All I am saying is that *right now* I am open to leaving this as-is.",
"> Let's see how many people's computers we can break.\r\n\r\nSeeing this issue made me smile. :) I'm :+1: on keeping it how it is, on with progress!",
"> Just to point out, they are also one pip install requests==2.13.0 away from being ok.\r\n\r\nTo be fair, we ran into this issue when installing a package which depends on an unbounded `requests` version but I agree that a popular library like this forcing a `pip upgrade` is probably a good thing but will mess up a lot of people's day (like mine!).",
"This is a common feature of shipping Requests releases: when you have a userbase as large as ours it's basically impossible to avoid breaking *someone*. This was a little bit bigger than I expected, I admit, but still. :grin:",
"@blamarvt Yeah, so I should also make something clear: we are *sorry* about this. We didn't do it on purpose: when we committed the change we thought it had much broader support than it actually did. However, now that we're here we should evaluate whether this was a blessing in disguise.",
"> when you have a userbase as large as ours it's basically impossible to avoid breaking someone\r\n\r\nIs \"less than half of our users are affected\" the bar? When you have a lot of users, half your users is still a lot of users.",
"@tgamblin I don't think there is a singular bar, but rather a combination of how bad the breakage is, *when* the breakage occurs, how many people are going to be broken, how bad the remedy is, and whether the new thing is an improvement or just different.",
"@tgamblin Is it really anything beyond upgrading pip? This had to happen eventually, I think expecting users to run `pip install -U pip` at least once a year is a really low bar.",
"@tgamblin Right now we don't have a defined bar. If we did, this would be a much shorter conversation: either it would meet it, or it would not. Part of the reason we're having this discussion is to try to kick around the idea of how much pain we're willing to inflict for (IMO) positive ecosystem gain. As I've said before, we are not committed to keeping this.\r\n\r\nThe flip side to this is that if we can bring even 20% of our users forward to a much newer pip, that opens up an enormous chunk of the ecosystem to being able to use these newer Python packaging tools. Python packaging is such a commonly complained about topic, and one reason is that an enormous amount of the newer nicer features cannot actually be used because we have to support older pips. Putting on some ecosystem pressure to upgrade might be a good thing. But it *might* not. Hence: this thread."
] |
https://api.github.com/repos/psf/requests/issues/4005
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4005/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4005/comments
|
https://api.github.com/repos/psf/requests/issues/4005/events
|
https://github.com/psf/requests/issues/4005
| 227,401,963 |
MDU6SXNzdWUyMjc0MDE5NjM=
| 4,005 |
response.apparent encoding throws error after using iter_lines
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/157223?v=4",
"events_url": "https://api.github.com/users/eka/events{/privacy}",
"followers_url": "https://api.github.com/users/eka/followers",
"following_url": "https://api.github.com/users/eka/following{/other_user}",
"gists_url": "https://api.github.com/users/eka/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/eka",
"id": 157223,
"login": "eka",
"node_id": "MDQ6VXNlcjE1NzIyMw==",
"organizations_url": "https://api.github.com/users/eka/orgs",
"received_events_url": "https://api.github.com/users/eka/received_events",
"repos_url": "https://api.github.com/users/eka/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/eka/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eka/subscriptions",
"type": "User",
"url": "https://api.github.com/users/eka",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-09T15:18:10Z
|
2021-09-08T10:00:56Z
|
2017-05-09T15:31:48Z
|
NONE
|
resolved
|
When using iter_lines with stream=True, and later trying to get the response.apparent_encoding, I get an exception as follows:
`RuntimeError: The content for this response was already consumed`
So my question is: Is this the proper behavior? I mean, shouldn't I be able to get the apparent_encoding even after consuming the content using stream=True?
I was thinking of running chardet on the buffered result of consuming the iteration, but I think that defeats the purpose since the result is already encoded.
Am I missing something?
Thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4005/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4005/timeline
| null |
completed
| null | null | false |
[
"Thanks for opening this issue!\r\n\r\n> Is this the proper behavior?\r\n\r\nYes. :smile:\r\n\r\n> I mean, shouldn't I be able to get the apparent_encoding even after consuming the content using stream=True?\r\n\r\nNo. :smile: When you set `stream=True`, you are telling Requests that you *do not* want us to buffer the content for any reason. That means we do not store it, which means we cannot hand it off to chardet to process. `apparent_encoding` cannot be used in conjunction with `stream=True`. If you are going to consume all the content into a single buffer, you can just safely use `stream=False`.",
"Thanks for your answer. Thought would be that.\r\n\r\n\r\n"
] |
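To make the trade-off concrete, a rough sketch of the two options implied by the answer: let Requests buffer the body so `apparent_encoding` works, or keep the raw streamed bytes yourself and run chardet on them (the URL is a placeholder, and chardet is assumed to be importable — on versions that still vendor it, `requests.packages.chardet` can be used instead):

```python
import chardet  # assumption: the chardet package is installed/importable
import requests

# Option 1: let Requests buffer the body; apparent_encoding is then available.
resp = requests.get('http://example.com/', stream=False)
print(resp.apparent_encoding)

# Option 2: stream the body, buffer the raw bytes yourself, and detect manually.
resp = requests.get('http://example.com/', stream=True)
raw = b''.join(resp.iter_content(chunk_size=8192))
print(chardet.detect(raw)['encoding'])
```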
https://api.github.com/repos/psf/requests/issues/4004
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4004/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4004/comments
|
https://api.github.com/repos/psf/requests/issues/4004/events
|
https://github.com/psf/requests/pull/4004
| 227,290,465 |
MDExOlB1bGxSZXF1ZXN0MTE5NjA5NDYx
| 4,004 |
Prepare for 2.14 release.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2017-05-09T08:38:51Z
|
2021-09-06T00:06:59Z
|
2017-05-09T15:42:02Z
|
MEMBER
|
resolved
|
Changelog updated, vendored libraries all updated. Let's see how this goes. Looks like it'll be a good release!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4004/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4004/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4004.diff",
"html_url": "https://github.com/psf/requests/pull/4004",
"merged_at": "2017-05-09T15:42:02Z",
"patch_url": "https://github.com/psf/requests/pull/4004.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4004"
}
| true |
[
"Ok, so this patch has changed the way we normalize URLs of non-HTTP schemes. Specifically, we no longer lowercase `http+x` URLs. This is because of shazow/urllib3#1089. My reading of [this discussion](https://github.com/kennethreitz/requests/pull/3738#issuecomment-268805728) is that we did this more-or-less on purpose, and that this is an intended effect: we now preserve the case of these schemes.\r\n\r\nI think this deserves to be called out specifically in the changelog, and that the tests should be updated to make it clear that this is intentional behaviour. I'll do that now, but I'd like the other maintainers to weigh in.",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=h1) Report\n> Merging [#4004](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/f8189eda10f1756b36717912b96ecdb5cafe625e?src=pr&el=desc) will **increase** coverage by `0.38%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #4004 +/- ##\n==========================================\n+ Coverage 89.35% 89.74% +0.38% \n==========================================\n Files 18 15 -3 \n Lines 2039 1940 -99 \n==========================================\n- Hits 1822 1741 -81 \n+ Misses 217 199 -18\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/packages/chardet/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvY2hhcmRldC9jb21wYXQucHk=) | `100% <100%> (+35.13%)` | :arrow_up: |\n| [requests/exceptions.py](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree#diff-cmVxdWVzdHMvZXhjZXB0aW9ucy5weQ==) | | |\n| [requests/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree#diff-cmVxdWVzdHMvY29tcGF0LnB5) | | |\n| [requests/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree#diff-cmVxdWVzdHMvX19pbml0X18ucHk=) | | |\n| [...es/urllib3/packages/ssl\\_match\\_hostname/\\_\\_init\\_\\_.py](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=tree#diff-cmVxdWVzdHMvcGFja2FnZXMvdXJsbGliMy9wYWNrYWdlcy9zc2xfbWF0Y2hfaG9zdG5hbWUvX19pbml0X18ucHk=) | `93.33% <0%> (+10%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=footer). Last update [f8189ed...ddada2d](https://codecov.io/gh/kennethreitz/requests/pull/4004?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"@Lukasa, that's correct. Host names were changed to be invariably lowercased in urllib3 1.17 (Requests 2.12) which broke unix sockets for some. The patch you've linked to scoped the normalization a bit more tightly to only schemes we are certain must be case-insensitive.",
"Any particular reason for specifying packages manually over using `setuptools.find_packages()`? The list is getting quite long.",
"@SethMichaelLarson Uh basically historical reasons?",
"It's fine to leave it, I just know I've forgotten to add a package once on a release and had to revert and re-release.",
"Yeah, it's an ongoing problem here. PR for the next release? ;)",
"Sounds good to me :)",
"Ok, I'm calling this good. Let's see how many people's computers we can break.",
"And done. Welcome to our exciting 2.14 world!",
"Huzzah! :tada:"
] |
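The `find_packages()` suggestion from the thread above would look roughly like this in a setup.py (the project name and exclude patterns are illustrative):

```python
from setuptools import setup, find_packages

setup(
    name='example-package',
    version='0.0.1',
    # Discover sub-packages automatically instead of listing them by hand,
    # so a newly added package cannot be forgotten at release time.
    packages=find_packages(exclude=['tests', 'tests.*']),
)
```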
https://api.github.com/repos/psf/requests/issues/4003
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4003/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4003/comments
|
https://api.github.com/repos/psf/requests/issues/4003/events
|
https://github.com/psf/requests/issues/4003
| 227,202,370 |
MDU6SXNzdWUyMjcyMDIzNzA=
| 4,003 |
`requote_uri` changes the original URL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/201694?v=4",
"events_url": "https://api.github.com/users/andreyfedoseev/events{/privacy}",
"followers_url": "https://api.github.com/users/andreyfedoseev/followers",
"following_url": "https://api.github.com/users/andreyfedoseev/following{/other_user}",
"gists_url": "https://api.github.com/users/andreyfedoseev/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/andreyfedoseev",
"id": 201694,
"login": "andreyfedoseev",
"node_id": "MDQ6VXNlcjIwMTY5NA==",
"organizations_url": "https://api.github.com/users/andreyfedoseev/orgs",
"received_events_url": "https://api.github.com/users/andreyfedoseev/received_events",
"repos_url": "https://api.github.com/users/andreyfedoseev/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/andreyfedoseev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andreyfedoseev/subscriptions",
"type": "User",
"url": "https://api.github.com/users/andreyfedoseev",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-08T23:17:25Z
|
2021-09-08T10:00:57Z
|
2017-05-09T07:33:08Z
|
NONE
|
resolved
|
I'm trying to do something like:
```python
>>> import requests
>>> url = "http://beta.url2png.com/v6/P4ED6C68627746/d86b36effc2ea6862a2301273c40a2a2/png/?url=http%3A%2F%2Fwww3.geosc.psu.edu%2F%7Edmb53%2FDaveSTELLA%2FDaisyworld%2Fdaisyworld_model.htm&delay=4&fullpage=false&viewport=1280x1024"
>>> response = requests.get(url)
>>> url
'http://beta.url2png.com/v6/P4ED6C68627746/d86b36effc2ea6862a2301273c40a2a2/png/?url=http%3A%2F%2Fwww3.geosc.psu.edu%2F%7Edmb53%2FDaveSTELLA%2FDaisyworld%2Fdaisyworld_model.htm&delay=4&fullpage=false&viewport=1280x1024'
>>> response.request.url
'http://beta.url2png.com/v6/P4ED6C68627746/d86b36effc2ea6862a2301273c40a2a2/png/?url=http%3A%2F%2Fwww3.geosc.psu.edu%2F~dmb53%2FDaveSTELLA%2FDaisyworld%2Fdaisyworld_model.htm&delay=4&fullpage=false&viewport=1280x1024'
```
Note how `%7E` from the original URL is changed to `~` in the actual request. Also note the `d86b36effc2ea6862a2301273c40a2a2` part of the URL. It's a hash of the query string parameters which is required to authorize the request. Since `requests.get` changes the original URL the hash becomes invalid and thus the request can't be authorized.
So, the original URL is a correct URL which you can open in a browser or download using `curl`. However the URL that `requests.get` fetches is invalid and returns `401 Unauthorized` response.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4003/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4003/timeline
| null |
completed
| null | null | false |
[
"The URL that `requests.get` fetches is not invalid in the sense that it's perfectly legitimate to represent the query string in this manner. The relevant portion of a specification is [RFC 3986 § 2.3](https://tools.ietf.org/html/rfc3986#section-2.3), with added emphasis on the most important section:\r\n\r\n> Characters that are allowed in a URI but do not have a reserved purpose are called unreserved. These include uppercase and lowercase letters, decimal digits, hyphen, period, underscore, and tilde.\r\n> ...[snip]...\r\n> URIs that differ in the replacement of an unreserved character with its corresponding percent-encoded US-ASCII octet are equivalent: they identify the same resource.\r\n> ...[snip]...\r\n> For consistency, percent-encoded octets in the ranges of ALPHA (%41-%5A and %61-%7A), DIGIT (%30-%39), hyphen (%2D), period (%2E), underscore (%5F), or tilde (%7E) **should not be created by URI producers and, when found in a URI, should be decoded to their corresponding unreserved characters by URI normalizers**.\r\n\r\nRequests tries to be intelligent about the URLs that are passed to it, because they do not consistently come encoded or unencoded. Users have passed us in the past both encoded and unencoded URLs, and we generally try to beat them into a fairly consistent shape.\r\n\r\nIf you don't want us to do that, you can opt-out by using the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) and set the `url` yourself:\r\n\r\n```python\r\nfrom requests import Request, Session\r\n\r\ns = Session()\r\nreq = Request('GET', url, data=data, headers=headers)\r\n\r\nprepped = s.prepare_request(req)\r\nprepped.url = \"http://beta.url2png.com/v6/P4ED6C68627746/d86b36effc2ea6862a2301273c40a2a2/png/?url=http%3A%2F%2Fwww3.geosc.psu.edu%2F%7Edmb53%2FDaveSTELLA%2FDaisyworld%2Fdaisyworld_model.htm&delay=4&fullpage=false&viewport=1280x1024\"\r\n\r\nresp = s.send(prepped)\r\nprint(resp.status_code)\r\n```",
"It seems that it's a problem with `urllib.urlencode` function then since it doesn't treat tilde as safe character and doesn't allow to override this behavior.\r\n\r\nThank you for the proposed solution."
] |
https://api.github.com/repos/psf/requests/issues/4002
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4002/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4002/comments
|
https://api.github.com/repos/psf/requests/issues/4002/events
|
https://github.com/psf/requests/issues/4002
| 226,880,788 |
MDU6SXNzdWUyMjY4ODA3ODg=
| 4,002 |
redirect loop with manual host header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1901770?v=4",
"events_url": "https://api.github.com/users/lr1980/events{/privacy}",
"followers_url": "https://api.github.com/users/lr1980/followers",
"following_url": "https://api.github.com/users/lr1980/following{/other_user}",
"gists_url": "https://api.github.com/users/lr1980/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lr1980",
"id": 1901770,
"login": "lr1980",
"node_id": "MDQ6VXNlcjE5MDE3NzA=",
"organizations_url": "https://api.github.com/users/lr1980/orgs",
"received_events_url": "https://api.github.com/users/lr1980/received_events",
"repos_url": "https://api.github.com/users/lr1980/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lr1980/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lr1980/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lr1980",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] | null | 10 |
2017-05-07T18:06:35Z
|
2024-05-19T18:54:33Z
|
2024-05-19T18:54:33Z
|
NONE
| null |
I use this for a test server:
`requests.get('http://1.2.3.4', headers={'host': 'domain.de'})`
But the request will loop because the "host" header will also be used for the redirect.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4002/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4002/timeline
| null |
completed
| null | null | false |
[
"So, this is a tricky point. Generally, when the user sets a custom Host header we have a problem: to what hosts does that custom host header apply? What about redirects?\r\n\r\nIn this case, Requests takes the easy route out and just decides to use the custom Host header for everything. That's not necessarily the *smartest* choice but it is very easy to understand, and lacks subtle bugs caused by edge cases that are hard to follow. The workaround is simple: either (1) set `allow_redirects` to `False` and handle them yourself, or (2) add a hosts file entry instead of the `Host` overload. (Technically there is a (3), which is to override the `resolve_redirects` method of the `SessionRedirectMixin` to remove the `Host` header from the `headers` dict as appropriate.)\r\n\r\nIf we want to consider this a bug (and I'm not sure we do), then there are a few possible fixes:\r\n\r\n1. Throw away the `Host` header on any redirect that supplies a redirect location with a hostname. This is pretty unsubtle approach, frankly, but would probably handle 80% or more of the uses that users want. It's also in-line with our approach for handling the `Authorization` header.\r\n2. Allow users to specify a mapping of host to IP that is used before DNS lookups. This is essentially a fake, per-`Session` hosts file. curl provides this option. This is moderately more subtle and probably useful.\r\n\r\nThoughts from other contributors?",
"Tools like curl handle this very smart and the way I would expect it:\r\n```\r\n$ curl -vLs --header 'Host: mydomain.de' --header \"TEST: yes\" http://1.2.3.4\r\n* Rebuilt URL to: http://1.2.3.4/\r\n* Trying 1.2.3.4...\r\n* TCP_NODELAY set\r\n* Connected to 1.2.3.4 (1.2.3.4) port 80 (#0)\r\n> GET / HTTP/1.1\r\n> Host: mydomain.de\r\n> User-Agent: curl/7.51.0\r\n> Accept: */*\r\n> TEST: yes\r\n>\r\n< HTTP/1.1 302 Found\r\n< Cache-Control: no-cache\r\n< Content-length: 0\r\n< Location: //www.mydomain.de/\r\n<\r\n* Curl_http_done: called premature == 0\r\n* Connection #0 to host 1.2.3.4 left intact\r\n* Issue another request to this URL: 'http://www.mydomain.de/'\r\n* Trying 1.2.3.4...\r\n* TCP_NODELAY set\r\n* Connected to www.mydomain.de (1.2.3.4) port 80 (#1)\r\n> GET / HTTP/1.1\r\n> Host: www.mydomain.de\r\n> User-Agent: curl/7.51.0\r\n> Accept: */*\r\n> TEST: yes\r\n>\r\n< HTTP/1.1 200 OK\r\n< Server: nginx\r\n< Date: Sun, 07 May 2017 21:09:12 GMT\r\n< Content-Type: text/html; charset=iso-8859-1\r\n```",
"This has been our behaviour for years and has been surprising to a very noisy select few. I'm not sure it needs changing. Users who want different behaviour have absolutely every ability to do so, for example, by controlling the redirect behaviour with `allow_redirects=False`. That said, if others believe we should throw away the `Host` header specified by users on redirects, we need 2 things:\r\n\r\n- To throw away most of the other user-specified headers on redirects\r\n- To provide a way for users to opt back into the old behaviour.",
"I believe we've also documented why we encourage users *not* to specify a `Host` header, so if users run into this, they should be able to sort themselves out without opening issues for our assistance.",
"As discussed, I'm open to changing it but I don't think it's high priority. I think I disagree with you a bit @sigmavirus24: the `Host` header is different to most other user-specified headers, and I'm ok with throwing away some set of user-specified headers when they're likely to be substantially confused if persisted across requests.\r\n\r\nI think, basically, this is a low-priority concern for me, but if someone comes along and wants to change it I'd be ok with that happening. The change would have to go in 3.0 though: this behaviour, while potentially confusing, is not a bug and so can only be changed in 3.0.",
"I would recommend turning off redirects with `allow_redirects=False` and using the `Response.next` property to get the next `PreparedResponse` to send to a `requests.Session()`.\r\n\r\ne.g:\r\n```python\r\nsession = requests.Session()\r\nresponse = session.get('http://blah', headers={'host': 'blah'}, allow_redirects=False)\r\n\r\nrequest2 = response.next\r\ndel request2.headers['host']\r\n\r\nr = session.send(request2)\r\n\r\n",
"P.S. This will only work with the latest versions of Requests available on PyPi ",
"slight chance this won't solve your problem... but off the top of my head, it works. ",
"It's difficult to understand,the host set for the first request will affect the next requests.Can you fix this? @Lukasa \r\nEspecially when we use request to implement Single Sign On feature using client dns,it's confusing that request can't redirect to the thrid website.Because we use the client dns to decide which is nearest destination.",
"I'm going to recommend users managing their own Host headers disable redirects and handle them outside of Requests."
] |
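For readers landing on this thread, here is a minimal sketch of the manual-redirect workaround described in the comments above. The IP address and hostname are placeholders, not values from the issue, and `Response.next` is assumed to be available (recent Requests releases only).

```python
import requests

# Send the first request with a custom Host header and redirects disabled,
# then follow any redirects manually without that header.
session = requests.Session()
response = session.get(
    'http://203.0.113.10/',            # placeholder origin IP
    headers={'Host': 'example.com'},   # placeholder Host override
    allow_redirects=False,
)

while response.is_redirect:
    next_request = response.next             # PreparedRequest for the redirect target
    next_request.headers.pop('Host', None)   # let the target URL determine the Host header
    response = session.send(next_request, allow_redirects=False)
```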
https://api.github.com/repos/psf/requests/issues/4001
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4001/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4001/comments
|
https://api.github.com/repos/psf/requests/issues/4001/events
|
https://github.com/psf/requests/pull/4001
| 226,818,345 |
MDExOlB1bGxSZXF1ZXN0MTE5MzIyMzcx
| 4,001 |
Specify that the timeout parameter is in seconds.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/354506?v=4",
"events_url": "https://api.github.com/users/bowlofeggs/events{/privacy}",
"followers_url": "https://api.github.com/users/bowlofeggs/followers",
"following_url": "https://api.github.com/users/bowlofeggs/following{/other_user}",
"gists_url": "https://api.github.com/users/bowlofeggs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bowlofeggs",
"id": 354506,
"login": "bowlofeggs",
"node_id": "MDQ6VXNlcjM1NDUwNg==",
"organizations_url": "https://api.github.com/users/bowlofeggs/orgs",
"received_events_url": "https://api.github.com/users/bowlofeggs/received_events",
"repos_url": "https://api.github.com/users/bowlofeggs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bowlofeggs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bowlofeggs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bowlofeggs",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-07T01:26:09Z
|
2021-09-06T00:06:59Z
|
2017-05-07T04:07:02Z
|
CONTRIBUTOR
|
resolved
|
Signed-off-by: Randy Barlow <[email protected]>
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/4001/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4001/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/4001.diff",
"html_url": "https://github.com/psf/requests/pull/4001",
"merged_at": "2017-05-07T04:07:02Z",
"patch_url": "https://github.com/psf/requests/pull/4001.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/4001"
}
| true |
[
"thanks! "
] |
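Since the pull request above only clarifies documentation, a small hedged illustration of the parameter it documents may help; the URL is a placeholder.

```python
import requests

# The timeout parameter is expressed in seconds.
requests.get('https://example.com/', timeout=3.5)

# A (connect, read) tuple sets the two timeouts separately, also in seconds.
requests.get('https://example.com/', timeout=(3.05, 27))
```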
https://api.github.com/repos/psf/requests/issues/4000
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/4000/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/4000/comments
|
https://api.github.com/repos/psf/requests/issues/4000/events
|
https://github.com/psf/requests/issues/4000
| 226,807,047 |
MDU6SXNzdWUyMjY4MDcwNDc=
| 4,000 |
Cookies never returned in response object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/556503?v=4",
"events_url": "https://api.github.com/users/dubhunter/events{/privacy}",
"followers_url": "https://api.github.com/users/dubhunter/followers",
"following_url": "https://api.github.com/users/dubhunter/following{/other_user}",
"gists_url": "https://api.github.com/users/dubhunter/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dubhunter",
"id": 556503,
"login": "dubhunter",
"node_id": "MDQ6VXNlcjU1NjUwMw==",
"organizations_url": "https://api.github.com/users/dubhunter/orgs",
"received_events_url": "https://api.github.com/users/dubhunter/received_events",
"repos_url": "https://api.github.com/users/dubhunter/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dubhunter/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dubhunter/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dubhunter",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2017-05-06T21:55:39Z
|
2021-09-08T10:00:51Z
|
2017-05-15T08:04:02Z
|
NONE
|
resolved
|
I have been trying to get the cookies from a response, as your docs suggest (http://docs.python-requests.org/en/master/user/quickstart/#cookies [first example]), but it does not seem to be working.
When I dug through the `sessions.py` (https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L655), it looks like cookies are extracted to the cookie jar on the `Session` object, but never set on the response.
I had to do the following to get my intended behavior (which is probably not entirely correct, since it will copy all cookies on the `Session`, not just the ones from the most recent request):
```
>>> url = 'http://example.com/some/cookie/setting/url'
>>> s = requests.Session()
>>> r = s.get(url)
>>> r.cookies = s.cookies
>>> r.cookies['example_cookie_name']
'example_cookie_value'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/4000/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/4000/timeline
| null |
completed
| null | null | false |
[
"Hey @dubhunter, I'm actually unable to reproduce this in either Python 2 or 3 for Requests 2.13.0.\r\n\r\nCould you please provide the version of Requests you're using and verify for me that the script below returns `<RequestsCookieJar[]>`? No need to provide the contents if the RequestsCookieJar is populated.\r\n\r\n```python\r\nimport requests\r\n\r\ns = requests.Session()\r\nresp = s.get('https://github.com/kennethreitz/requests')\r\nprint(resp.cookies)\r\n```\r\n\r\nEdit: As an after thought, is your response being redirected? `r.history` will provide the redirect chain of your final response object. If the cookies are set on a redirect response, they'll live on that object, rather than the final object you're receiving.",
"@nateprewitt You are correct. After further digging, it looks like I am having trouble because the `CookiePolicy` set on the `Session` object is not copied to `Response` and there is no way to set the `CookiePolicy` for the `Response`.\r\n\r\nWe can close this issue and open another, or reword this one.\r\n\r\nThoughts?",
"I was able to work around this with the following code in my adapter:\r\n```\r\ndef build_response(self, req, resp):\r\n response = super().build_response(req, resp)\r\n response.cookies = cookiejar_from_dict({})\r\n response.cookies.set_policy(self.cookie_policy)\r\n extract_cookies_to_jar(response.cookies, req, resp)\r\n return response\r\n\r\n@property\r\ndef cookie_policy(self):\r\n return CookiePolicy()\r\n```",
"I think the cookie policy concern is fundamentally part of #3669. However, you may find a better workaround is to patch `requests.cookies.requests_cookie_policy` to return whatever it is you want. That function is used to initialize all `RequestsCookieJar` objects.",
"It seems like this may be more closely related to #3416 and #3463. @dubhunter, would you mind clarifying a bit further your desired use case/behaviour here? It seems like you want a `Response` specific cookie policy propagated by the Session?\r\n\r\n#3416 was asking for the user to be able to persist a Session-wide policy that would scope what cookies were allowed into the Session CookieJar. While this would still allow the user to inspect ALL cookies returned on an individual response, it would only carry policy-approved cookies to the next request, which still seems reasonable to me. Is this anywhere inline with your use case?",
"Yes, #3416 seems to cover my use case. It would also be nice to be able to override `RequestsCookieJar` (since the cookie policy is never consulted when sending cookies), but that has lower utility.",
"@Lukasa @sigmavirus24, do either of you have any further thoughts on persisting cookie policies? I'm still not ecstatic about the implementation in #3463 but it seems a step in the right direction for cases like this. I only ask since #3998 will make doing this more difficult.\r\n\r\nEither way we can probably close this out as a dupe of #3416.",
"Ok, let's close this as a dupe. I think persisting cookie policies would solve the problem in #3998, so I'm open to just preferring doing that."
] |
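A sketch of the debugging approach suggested in the thread above: when a Set-Cookie header arrives on a redirect, the cookie lives on the intermediate response in `r.history` rather than on the final response. The URL is a placeholder.

```python
import requests

s = requests.Session()
r = s.get('http://example.com/some/cookie/setting/url')

for intermediate in r.history:                  # redirect chain, oldest first
    print(intermediate.url, intermediate.cookies.get_dict())

print(r.url, r.cookies.get_dict())              # cookies set on the final response only
print(s.cookies.get_dict())                     # everything accumulated on the session
```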
https://api.github.com/repos/psf/requests/issues/3999
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3999/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3999/comments
|
https://api.github.com/repos/psf/requests/issues/3999/events
|
https://github.com/psf/requests/issues/3999
| 226,506,430 |
MDU6SXNzdWUyMjY1MDY0MzA=
| 3,999 |
Install in Docker does not work as expected
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2445709?v=4",
"events_url": "https://api.github.com/users/PetaPetaPeta/events{/privacy}",
"followers_url": "https://api.github.com/users/PetaPetaPeta/followers",
"following_url": "https://api.github.com/users/PetaPetaPeta/following{/other_user}",
"gists_url": "https://api.github.com/users/PetaPetaPeta/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/PetaPetaPeta",
"id": 2445709,
"login": "PetaPetaPeta",
"node_id": "MDQ6VXNlcjI0NDU3MDk=",
"organizations_url": "https://api.github.com/users/PetaPetaPeta/orgs",
"received_events_url": "https://api.github.com/users/PetaPetaPeta/received_events",
"repos_url": "https://api.github.com/users/PetaPetaPeta/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/PetaPetaPeta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PetaPetaPeta/subscriptions",
"type": "User",
"url": "https://api.github.com/users/PetaPetaPeta",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-05-05T08:33:23Z
|
2021-09-08T10:00:58Z
|
2017-05-06T12:01:13Z
|
NONE
|
resolved
|
I am attempting to build a Docker image (base image ubuntu:14.04) on a Windows machine where one of the commands is:
```
RUN pip install git+https://github.com/kennethreitz/requests
```
This command uses pip inside the container to install the latest version of requests. It has worked correctly in the past, but it currently throws an error.
```
Step X/Y : RUN pip install git+https://github.com/kennethreitz/requests
---> Running in 33fa1f8103a0
Downloading/unpacking git+https://github.com/kennethreitz/requests
Cloning https://github.com/kennethreitz/requests to /tmp/pip-ZPkqbQ-build
Running setup.py (path:/tmp/pip-ZPkqbQ-build/setup.py) egg_info for package from git+https://github.com/kennethreitz/requests
error in requests setup command: Invalid environment marker: platform_system == "Windows" and python_version<"3.3"
Complete output from command python setup.py egg_info:
error in requests setup command: Invalid environment marker: platform_system == "Windows" and python_version<"3.3"
```
The error is being triggered by the following extra requirement introduced [here](https://github.com/kennethreitz/requests/commit/d89f8c0d70195d1ee707426ac7f3bdab870e0a18#diff-2eeaed663bd0d25b7e608891384b7298R100).
The correct thing to do in this case is of course to pin the version of requests in the `pip install` command, and I have done this. I do, however, suspect that this could be an issue in the future if this commit is released.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3999/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3999/timeline
| null |
completed
| null | null | false |
[
"This error is caused because the version of `pip` and `setuptools` being used in Ubuntu 14.04 is extremely old (necessarily it's from before 2014), and so doesn't understand markers. Frankly, that doesn't bother me very much. Assuming you're using a virtualenv inside your Docker image, you can resolve this by running `pip install -U setuptools` and `pip install -U pip` and that will fix your problems. If you aren't, you shouldn't be installing from `pip` as you'll totally bust your OS.",
"Hi @PetaPetaPeta thanks for opening this, but as @Lukasa explained there is nothing we can do to fix this for ancient versions of pip and setuptools. As such, I'm closing this issue."
] |
https://api.github.com/repos/psf/requests/issues/3998
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3998/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3998/comments
|
https://api.github.com/repos/psf/requests/issues/3998/events
|
https://github.com/psf/requests/pull/3998
| 226,431,882 |
MDExOlB1bGxSZXF1ZXN0MTE5MDg0MDYz
| 3,998 |
Enable sessions to bypass creating/parsing cookies.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/808798?v=4",
"events_url": "https://api.github.com/users/rowillia/events{/privacy}",
"followers_url": "https://api.github.com/users/rowillia/followers",
"following_url": "https://api.github.com/users/rowillia/following{/other_user}",
"gists_url": "https://api.github.com/users/rowillia/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rowillia",
"id": 808798,
"login": "rowillia",
"node_id": "MDQ6VXNlcjgwODc5OA==",
"organizations_url": "https://api.github.com/users/rowillia/orgs",
"received_events_url": "https://api.github.com/users/rowillia/received_events",
"repos_url": "https://api.github.com/users/rowillia/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rowillia/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rowillia/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rowillia",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 15 |
2017-05-04T23:00:12Z
|
2021-09-03T00:11:03Z
|
2018-04-29T18:58:10Z
|
NONE
|
resolved
|
We use requests for some service-to-service communication, neither side
of which utilizes cookies. When we profiled our services, we see ~13%
of the time spend sending requests is either creating cookie jars or
extracting cookies to jars.
This PR will enable us to disable cookies for a given `requests.Session`
by setting `session.cookies = None`. If `sessions.cookies` is None, we
won't create any cookies for the request nor will we parse them on the
response.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3998/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3998/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3998.diff",
"html_url": "https://github.com/psf/requests/pull/3998",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3998.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3998"
}
| true |
[
"# [Codecov](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=h1) Report\n> :exclamation: No coverage uploaded for pull request base (`master@5697148`). [Click here to learn what that means](https://docs.codecov.io/docs/error-reference#section-missing-base-commit).\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3998 +/- ##\n=========================================\n Coverage ? 89.79% \n=========================================\n Files ? 15 \n Lines ? 1950 \n Branches ? 0 \n=========================================\n Hits ? 1751 \n Misses ? 199 \n Partials ? 0\n```\n\n\n| [Impacted Files](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/models.py](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `93.45% <100%> (ø)` | |\n| [requests/sessions.py](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.97% <100%> (ø)` | |\n| [requests/adapters.py](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.92% <100%> (ø)` | |\n| [requests/cookies.py](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=tree#diff-cmVxdWVzdHMvY29va2llcy5weQ==) | `77.68% <100%> (ø)` | |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=footer). Last update [5697148...8a902cf](https://codecov.io/gh/requests/requests/pull/3998?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Thanks for this suggestion!\r\n\r\nSo, before I go any further, I'd like to get an understanding of your results. You say that 13% of the time is spent \"either creating cookie jars or extracting cookies to jars\", but also that your services never use cookies. I find that somewhat tricky to believe: in the event that there are no cookies there will be relatively little processing of the absent cookies.\r\n\r\nFirstly, the most expensive things we do in general are related to environment evaluation: checking for proxies, reading `.netrc` files, etc. So I assume you're already setting `trust_env` to `False` to avoid that overhead.\r\n\r\nSecondly, as an attempt to reproduce your results, I profiled Requests making 1000 requests through a single Session to a Twisted instance serving a directory on my machine, with `trust_env=False`. This sets no cookies. The total runtime of the test was 4 seconds. Here's the profiling output, sorted by cumulative time, for any function that we spend more than 100ms in:\r\n\r\n```\r\n\r\n ncalls tottime percall cumtime percall filename:lineno(function)\r\n 1 0.018 0.018 4.038 4.038 test.py:1(<module>)\r\n 1000 0.007 0.000 3.942 0.004 sessions.py:492(get)\r\n 1000 0.020 0.000 3.935 0.004 sessions.py:411(request)\r\n 1000 0.029 0.000 3.283 0.003 sessions.py:569(send)\r\n 1000 0.019 0.000 3.007 0.003 adapters.py:375(send)\r\n 1000 0.032 0.000 2.536 0.003 connectionpool.py:446(urlopen)\r\n 1000 0.047 0.000 2.099 0.002 connectionpool.py:321(_make_request)\r\n 1000 0.018 0.000 1.709 0.002 httplib.py:1084(getresponse)\r\n 1000 0.022 0.000 1.674 0.002 httplib.py:431(begin)\r\n 8000 0.053 0.000 1.503 0.000 socket.py:410(readline)\r\n 1000 0.022 0.000 1.477 0.001 httplib.py:392(_read_status)\r\n 1000 1.409 0.001 1.409 0.001 {method 'recv' of '_socket.socket' objects}\r\n 1000 0.020 0.000 0.570 0.001 sessions.py:371(prepare_request)\r\n 1000 0.008 0.000 0.341 0.000 models.py:297(prepare)\r\n 1000 0.005 0.000 0.305 0.000 httplib.py:1040(request)\r\n 1000 0.018 0.000 0.300 0.000 httplib.py:1067(_send_request)\r\n 1000 0.015 0.000 0.291 0.000 adapters.py:240(build_response)\r\n 5001 0.020 0.000 0.203 0.000 structures.py:42(__init__)\r\n 1000 0.013 0.000 0.199 0.000 response.py:437(from_httplib)\r\n 7000 0.020 0.000 0.192 0.000 sessions.py:42(merge_setting)\r\n 1000 0.002 0.000 0.191 0.000 httplib.py:1025(endheaders)\r\n 3009 0.010 0.000 0.190 0.000 socket.py:227(meth)\r\n 1000 0.005 0.000 0.189 0.000 httplib.py:867(_send_output)\r\n 1000 0.007 0.000 0.182 0.000 httplib.py:840(send)\r\n 1000 0.169 0.000 0.169 0.000 {method 'sendall' of '_socket.socket' objects}\r\n 1000 0.010 0.000 0.162 0.000 mimetools.py:24(__init__)\r\n 85456 0.066 0.000 0.155 0.000 {isinstance}\r\n 8004 0.041 0.000 0.140 0.000 _abcoll.py:548(update)\r\n 1000 0.007 0.000 0.137 0.000 models.py:784(content)\r\n 1000 0.004 0.000 0.135 0.000 rfc822.py:88(__init__)\r\n 1000 0.045 0.000 0.132 0.000 httplib.py:271(readheaders)\r\n 2000 0.015 0.000 0.129 0.000 cookies.py:121(extract_cookies_to_jar)\r\n 1000 0.005 0.000 0.129 0.000 connectionpool.py:214(_get_conn)\r\n 19027 0.008 0.000 0.122 0.000 {method 'join' of 'str' objects}\r\n 1000 0.016 0.000 0.122 0.000 _collections.py:307(from_httplib)\r\n 1000 0.006 0.000 0.121 0.000 adapters.py:277(get_connection)\r\n 2000 0.004 0.000 0.114 0.000 models.py:715(generate)\r\n 999 0.004 0.000 0.110 0.000 connection.py:7(is_connection_dropped)\r\n 2000 0.005 0.000 0.109 0.000 response.py:411(stream)\r\n 1000 0.022 0.000 0.109 0.000 models.py:350(prepare_url)\r\n 999 
0.002 0.000 0.105 0.000 wait.py:29(wait_for_read)\r\n 999 0.010 0.000 0.102 0.000 wait.py:9(_wait_for_io_events)\r\n 1000 0.019 0.000 0.100 0.000 models.py:591(__init__)\r\n 1000 0.019 0.000 0.100 0.000 response.py:342(read)\r\n```\r\n\r\nAs you can see, cookies *are* in there, but only in one place (`extract_cookies_to_jar`), with a total runtime of 129ms. Now, admittedly, I suspect the content we're sending back and forth is longer in my case than in yours, so we can assume that you'd spend less time in `socket.recv`, but even if we assumed you spent *zero* time there we don't get anywhere near to cookies being 30% of the overhead. In the case of my test, the other cookie-based code that is not part of `extract_cookies_to_jar` adds maybe another 80ms of overhead, leading to 209ms. To be pessimistic, let's double that estimate to 420ms. 420ms spread across 1000 requests is 420μs: not an entirely unreasonable overhead, given that we spend about 1/4 of that time checking that that the connection is still down.\r\n\r\nSo, certainly if we were going to add this feature I'd want the code to be *vastly* simpler. I'd also want to ask whether or not it's easier in your case simply to patch the `extract_cookies_to_jar` function to be a no-op.",
"@Lukasa Thanks a ton for your detailed analysis! I really appreciate all of the work you put into evaluating this 😄 .\r\n\r\nHere's a snippit of flame graph (Generated using https://github.com/uber/pyflame) from one of our services running in production, filtering out to only focus on `requests`:\r\n[only_requests.zip](https://github.com/kennethreitz/requests/files/979777/only_requests.zip)\r\n\r\nHere's a screenshot, highlighting all stacks that contain the word `cookies`:\r\n<img width=\"1203\" alt=\"screen shot 2017-05-05 at 9 53 53 am\" src=\"https://cloud.githubusercontent.com/assets/808798/25755693/d97b28fe-3178-11e7-966a-271f44b6051a.png\">\r\n\r\n(If you're not comfortable opening random zips, here's the raw output -\r\n https://gist.github.com/rowillia/e7232fc12e9010c4663811d2166a1228 , run it through https://github.com/brendangregg/FlameGraph to get an SVG)\r\n\r\nThanks for the `trust_env` tip, I noticed how expensive `netrc` and merging environment settings were but haven't had a chance to change things in production.\r\n\r\nAny pointers on how we can make the code simpler? This is my first requests PR so I'm not super familiar with the coding style here beyond what's in the docs.",
"Making it simpler would focus on moving the complexity into helper functions that deal with this. For example, `extract_cookies_to_jar` could be enhanced to handle `None` arguments, keeping this complexity out of the main code paths.",
"Thanks for the suggestion @Lukasa, I've simplified the implementation to encapsulate some of the complexity inside of `cookies.py`.\r\n\r\nOne other thing I wanted to run by you: setting `cookies` to `None` to avoid parsing/creating cookies feels somewhat magic to me. Do you think it would be better if there were simply a `discard cookies` flag? The main motivator for structuring like this is the behavior I'd expect if the `session` has no cookies but the `request` does is we would pass through the requests cookies and having `discard cookies` set to `True` while specifying cookies would be illegal (I tend to be a big fan of the idea that illegal states should be unrepresentable when possible).",
"@Lukasa one additional thought I just had reading your comment - I'm most worried about optimizing CPU consumption, not wall time. Totally agree that this will have minimal impact on wall time, but for some services that simply farm out a bunch of requests and wait for their response (using gevent) the CPU consumed here is non-trivial.\r\n\r\nFor servers using threads, I worry that the RLock on the cookiejar is creating contention so it'd be best if we could avoid it all together.",
"> Do you think it would be better if there were simply a discard cookies flag?\r\n\r\nNo. =) We strongly, strongly resist adding API surface because it adds to the possible confusion of interacting with Requests. This goes double for situations like yours which are extremely uncommon: adding an entire additional API property for a bit of function that fewer than 0.5% of users will ever use is just complete overkill. I'd much rather add a subtle use of an existing property.\r\n\r\n> (I tend to be a big fan of the idea that illegal states should be unrepresentable when possible).\r\n\r\nWhat makes this state illegal? Setting `session.cookies = None` requests that no cookies be *persisted*, but there's no reason to prevent you putting cookie headers in the request. In fact, this is a good example as to why we don't want to add such a flag: it is unclear what exactly it applies to, and there are numerous other flags and options that it *could* apply to, so I'd say it impedes clarity more than anything else.\r\n\r\n> For servers using threads, I worry that the RLock on the cookiejar is creating contention so it'd be best if we could avoid it all together.\r\n\r\nIt probably is, but not by much. After all, we have a lock in the connection pool too, though admittedly it's one in the default configuration that you can't contend on (more a semaphore than a lock in that case). Merely *having* the lock, even if you don't contend on it, is very expensive in Python.\r\n\r\nI agree that minimising CPU time is important, but if you want to do that I *strongly* encourage you to look away from Requests. Requests' attempts to provide you with a helpful, intuitive API involve a *lot* of Python code that is strictly overhead. If you can ensure that your HTTP data will always be ready to go and require basically zero transformation you may find urllib3 is a better fit for your needs.",
"I don't think you pushed your changes. :smile:",
"@Lukasa yeah jumped the gun on the comments 😄 ",
"Can you rebase this onto the current master for me? It's having some trouble with testing because it's not mergeable.",
"I really like this suggestion, as well as the API proposed. Very Requests. ",
"add to whitelist",
"add to whitelist",
"add to whitelist",
"Given that it's been almost a year since we've seen progress on this PR, I'm going to close this out. Please feel free to reopen if you'd like to continue. Thanks!"
] |
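A rough sketch of the monkey-patch suggested in the review discussion above, for services that genuinely never use cookies: make cookie extraction a no-op. This is not a supported Requests API; it silently drops all Set-Cookie handling, trading correctness for a small amount of CPU time. Setting `session.trust_env = False`, as also mentioned above, avoids the separate `.netrc`/proxy lookup cost.

```python
import requests.adapters
import requests.cookies
import requests.sessions

def _noop_extract_cookies_to_jar(jar, request, response):
    # Deliberately do nothing: incoming cookies are never parsed or stored.
    pass

requests.cookies.extract_cookies_to_jar = _noop_extract_cookies_to_jar
# sessions.py and adapters.py import the function by name, so patch those references too.
requests.sessions.extract_cookies_to_jar = _noop_extract_cookies_to_jar
requests.adapters.extract_cookies_to_jar = _noop_extract_cookies_to_jar
```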
https://api.github.com/repos/psf/requests/issues/3997
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3997/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3997/comments
|
https://api.github.com/repos/psf/requests/issues/3997/events
|
https://github.com/psf/requests/issues/3997
| 226,267,556 |
MDU6SXNzdWUyMjYyNjc1NTY=
| 3,997 |
ModuleNotFoundError: No module named 'email.parser'; 'email' is not a package
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/24445418?v=4",
"events_url": "https://api.github.com/users/astronaut0131/events{/privacy}",
"followers_url": "https://api.github.com/users/astronaut0131/followers",
"following_url": "https://api.github.com/users/astronaut0131/following{/other_user}",
"gists_url": "https://api.github.com/users/astronaut0131/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/astronaut0131",
"id": 24445418,
"login": "astronaut0131",
"node_id": "MDQ6VXNlcjI0NDQ1NDE4",
"organizations_url": "https://api.github.com/users/astronaut0131/orgs",
"received_events_url": "https://api.github.com/users/astronaut0131/received_events",
"repos_url": "https://api.github.com/users/astronaut0131/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/astronaut0131/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/astronaut0131/subscriptions",
"type": "User",
"url": "https://api.github.com/users/astronaut0131",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-04T12:39:53Z
|
2021-09-08T11:00:28Z
|
2017-05-04T12:44:53Z
|
NONE
|
resolved
|
When I run `import requests`, I get the following error:
~~~shell
D:\python\Python36\python.exe "D:/Python Work Space/test003.py"
Traceback (most recent call last):
File "D:/Python Work Space/test003.py", line 1, in <module>
import requests
File "D:\python\Python36\lib\site-packages\requests\__init__.py", line 60, in <module>
from .packages.urllib3.exceptions import DependencyWarning
File "D:\python\Python36\lib\site-packages\requests\packages\__init__.py", line 29, in <module>
import urllib3
File "D:\python\Python36\lib\site-packages\urllib3\__init__.py", line 8, in <module>
from .connectionpool import (
File "D:\python\Python36\lib\site-packages\urllib3\connectionpool.py", line 11, in <module>
from .exceptions import (
File "D:\python\Python36\lib\site-packages\urllib3\exceptions.py", line 2, in <module>
from .packages.six.moves.http_client import (
File "D:\python\Python36\lib\site-packages\urllib3\packages\six.py", line 203, in load_module
mod = mod._resolve()
File "D:\python\Python36\lib\site-packages\urllib3\packages\six.py", line 115, in _resolve
return _import_module(self.mod)
File "D:\python\Python36\lib\site-packages\urllib3\packages\six.py", line 82, in _import_module
__import__(name)
File "D:\python\Python36\lib\http\client.py", line 71, in <module>
import email.parser
ModuleNotFoundError: No module named 'email.parser'; 'email' is not a package
~~~
but I have already installed it; see:
~~~shell
D:\Python Work Space>pip install email
Requirement already satisfied: email in d:\python\python36\lib\site-packages\email-6.0.0a1-py3.6.egg
~~~
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3997/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3997/timeline
| null |
completed
| null | null | false |
[
"Email is a standard library module that should have come with Python. I *suspect* that `pip install email` made that worse, not better. However, more likely you have a file in the same directory as your code called `email.py`, you need to change that filename."
] |
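A quick check for the shadowing problem described above: print which file actually provides the `email` package. It should live inside the Python installation, not next to your own script.

```python
import email

# If this prints a path inside your project, a local email.py is shadowing
# the standard library package and should be renamed.
print(email.__file__)
```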
https://api.github.com/repos/psf/requests/issues/3996
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3996/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3996/comments
|
https://api.github.com/repos/psf/requests/issues/3996/events
|
https://github.com/psf/requests/pull/3996
| 225,675,686 |
MDExOlB1bGxSZXF1ZXN0MTE4NTQxMzk3
| 3,996 |
Allowing users to override the default agent using env variable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234253?v=4",
"events_url": "https://api.github.com/users/amir-hadi/events{/privacy}",
"followers_url": "https://api.github.com/users/amir-hadi/followers",
"following_url": "https://api.github.com/users/amir-hadi/following{/other_user}",
"gists_url": "https://api.github.com/users/amir-hadi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/amir-hadi",
"id": 234253,
"login": "amir-hadi",
"node_id": "MDQ6VXNlcjIzNDI1Mw==",
"organizations_url": "https://api.github.com/users/amir-hadi/orgs",
"received_events_url": "https://api.github.com/users/amir-hadi/received_events",
"repos_url": "https://api.github.com/users/amir-hadi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/amir-hadi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amir-hadi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/amir-hadi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-02T12:08:27Z
|
2021-09-06T00:07:00Z
|
2017-05-02T12:12:52Z
|
NONE
|
resolved
|
We need to override the default user agent that requests uses. I think it's OK to allow overriding the default user agent using an environment variable.
I didn't find another option (short of monkey-patching the `default_user_agent` function in `utils`); maybe this can help other users as well.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3996/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3996/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3996.diff",
"html_url": "https://github.com/psf/requests/pull/3996",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3996.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3996"
}
| true |
[
"Thanks for this!\r\n\r\nI don't really understand why Requests should merge this into the tree. If you want to have this behaviour in your application, you can just use `Session` objects and set `Session.headers['User-Agent']` to whatever you like.\r\n\r\nIn fact, I'd say that having this functionality in Requests as provided here is not good. Mostly this is because this ignores the `trust_env` variable that Requests normally uses, preventing applications from opting-out of the environment. I am also concerned about unintended side-effects: environment variables are a very sledgehammer approach to this problem, such that any application using Requests with the default user-agent variable will be affected, even if that's not the one the user was concerned about. It's almost always better for applications to work with environment variables, rather than have the libraries go looking around for environment variables to control their behaviour.\r\n\r\nSo I'm afraid I don't think we'll be merging this in its current form, sorry!"
] |
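A minimal sketch of the alternative suggested in the review above: set the User-Agent once on a `Session` instead of reading an environment variable. The header value and URL are placeholders.

```python
import requests

session = requests.Session()
session.headers['User-Agent'] = 'my-app/1.0'   # placeholder value

# Every request made through this session now carries the custom User-Agent.
response = session.get('https://example.com/')
```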
https://api.github.com/repos/psf/requests/issues/3995
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3995/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3995/comments
|
https://api.github.com/repos/psf/requests/issues/3995/events
|
https://github.com/psf/requests/issues/3995
| 225,568,259 |
MDU6SXNzdWUyMjU1NjgyNTk=
| 3,995 |
Global session
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17710333?v=4",
"events_url": "https://api.github.com/users/hangll/events{/privacy}",
"followers_url": "https://api.github.com/users/hangll/followers",
"following_url": "https://api.github.com/users/hangll/following{/other_user}",
"gists_url": "https://api.github.com/users/hangll/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hangll",
"id": 17710333,
"login": "hangll",
"node_id": "MDQ6VXNlcjE3NzEwMzMz",
"organizations_url": "https://api.github.com/users/hangll/orgs",
"received_events_url": "https://api.github.com/users/hangll/received_events",
"repos_url": "https://api.github.com/users/hangll/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hangll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hangll/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hangll",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-05-02T01:29:49Z
|
2021-09-08T11:00:28Z
|
2017-05-02T07:06:23Z
|
NONE
|
resolved
|
I'd like to have persistent connections for all my HTTP requests to improve performance. Is it recommended to have a global session object and use it everywhere? Or should I create a session dictionary to have a different session per request type?
What would be the best practice here?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3995/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3995/timeline
| null |
completed
| null | null | false |
[
"A single global session is absolutely fine."
] |
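One common way to apply the answer above is a hypothetical module that holds the single shared `Session`; every importer then reuses the same connection pool.

```python
# http_client.py -- hypothetical module holding one shared Session.
import requests

session = requests.Session()

def get(url, **kwargs):
    """Thin wrapper so callers reuse the shared session's keep-alive connections."""
    return session.get(url, **kwargs)
```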
https://api.github.com/repos/psf/requests/issues/3994
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3994/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3994/comments
|
https://api.github.com/repos/psf/requests/issues/3994/events
|
https://github.com/psf/requests/issues/3994
| 225,334,047 |
MDU6SXNzdWUyMjUzMzQwNDc=
| 3,994 |
Add requests.Session(timeout) parameter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/681260?v=4",
"events_url": "https://api.github.com/users/giampaolo/events{/privacy}",
"followers_url": "https://api.github.com/users/giampaolo/followers",
"following_url": "https://api.github.com/users/giampaolo/following{/other_user}",
"gists_url": "https://api.github.com/users/giampaolo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/giampaolo",
"id": 681260,
"login": "giampaolo",
"node_id": "MDQ6VXNlcjY4MTI2MA==",
"organizations_url": "https://api.github.com/users/giampaolo/orgs",
"received_events_url": "https://api.github.com/users/giampaolo/received_events",
"repos_url": "https://api.github.com/users/giampaolo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/giampaolo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/giampaolo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/giampaolo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-30T15:28:41Z
|
2021-09-08T11:00:29Z
|
2017-04-30T15:58:41Z
|
NONE
|
resolved
|
...as a parameter which gets inherited globally by `get()` and other HTTP methods.
```
session = requests.Session(timeout=30)
session.get(...)
session.get(...)
```
Would be equivalent to:
```
session = requests.Session()
session.get(..., timeout=30)
session.get(..., timeout=30)
```
Thoughts?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 3,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 3,
"url": "https://api.github.com/repos/psf/requests/issues/3994/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3994/timeline
| null |
completed
| null | null | false |
[
"Thanks for opening this! This is a duplicate of #3341: please read the discussion there for more.",
"Oh sorry about that."
] |
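A common workaround for the feature requested above (not a built-in Requests API, see the linked discussion in #3341) is a small `Session` subclass that injects a default timeout, in seconds, for every request.

```python
import requests

class TimeoutSession(requests.Session):
    """Session that applies a default timeout unless the caller overrides it."""

    def __init__(self, timeout=30):
        super(TimeoutSession, self).__init__()
        self.timeout = timeout

    def request(self, method, url, **kwargs):
        kwargs.setdefault('timeout', self.timeout)
        return super(TimeoutSession, self).request(method, url, **kwargs)

session = TimeoutSession(timeout=30)
session.get('https://example.com/')  # placeholder URL; inherits the 30 s default
```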
https://api.github.com/repos/psf/requests/issues/3993
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3993/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3993/comments
|
https://api.github.com/repos/psf/requests/issues/3993/events
|
https://github.com/psf/requests/issues/3993
| 225,222,584 |
MDU6SXNzdWUyMjUyMjI1ODQ=
| 3,993 |
Deadlocks when retry with pool_block=True
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/299649?v=4",
"events_url": "https://api.github.com/users/mattbillenstein/events{/privacy}",
"followers_url": "https://api.github.com/users/mattbillenstein/followers",
"following_url": "https://api.github.com/users/mattbillenstein/following{/other_user}",
"gists_url": "https://api.github.com/users/mattbillenstein/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mattbillenstein",
"id": 299649,
"login": "mattbillenstein",
"node_id": "MDQ6VXNlcjI5OTY0OQ==",
"organizations_url": "https://api.github.com/users/mattbillenstein/orgs",
"received_events_url": "https://api.github.com/users/mattbillenstein/received_events",
"repos_url": "https://api.github.com/users/mattbillenstein/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mattbillenstein/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mattbillenstein/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mattbillenstein",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2017-04-29T00:49:37Z
|
2021-09-08T11:00:29Z
|
2017-04-29T21:52:34Z
|
NONE
|
resolved
|
I created a repo with a testcase for this issue:
https://github.com/mattbillenstein/requests-retry-pool-hang
I can repro using gevent and threads. It seems there are a few other issues exhibiting deadlocks, so hopefully this testcase is helpful in getting to the bottom of this.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3993/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3993/timeline
| null |
completed
| null | null | false |
[
"Ok, so I have something that fixes this for me -- retries are implemented using recursion and before recursing, the connection isn't released back to the pool, so for every retry, we would grab another connection while still holding onto the old one for no real reason afaict. See:\r\n\r\nhttps://github.com/mattbillenstein/requests/commit/524b43154553489bf60216acb9030ce5b36f7215",
"Connections should be released here: https://github.com/mattbillenstein/requests/commit/524b43154553489bf60216acb9030ce5b36f7215#diff-56e4c9a9f8955a91d87d7da5335a7dddR668.",
"Yeah, I traced through that, the connection is not put back to the pool there. If I add a print in there:\r\n\r\n finally:\r\n if not clean_exit:\r\n # We hit some kind of exception, handled or otherwise. We need\r\n # to throw the connection away unless explicitly told not to.\r\n # Close the connection, set the variable to None, and make sure\r\n # we put the None back in the pool to avoid leaking it.\r\n conn = conn and conn.close()\r\n release_this_conn = True\r\n\r\n print clean_exit, release_this_conn\r\n if release_this_conn:\r\n # Put the connection back to be reused. If the connection is\r\n # expired then it will be None, which will get replaced with a\r\n # fresh connection during _get_conn.\r\n self._put_conn(conn)\r\n\r\nAnd reduce my test to just one one request:\r\n\r\n $ time ./retry_pool_hang_gevent.py --retry\r\n True False\r\n True False\r\n True False\r\n True False\r\n True False\r\n .\r\n real 0m0.898s\r\n user 0m0.206s\r\n sys 0m0.041s",
"So it seems like Requests is not asking for that connection to be returned to the pool. This feels like a urllib3 issue; it seems like urllib3 leaks connections with `preload_content=False` and retries set. Mind opening an issue there?"
] |
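For context, a rough sketch of the kind of configuration under discussion: a small blocking pool combined with urllib3 retries. With the leak described above, a retried request could hold one connection while waiting for another, so a fully blocked pool never frees up. The retry settings are illustrative; in 2017-era releases the `Retry` class was vendored under `requests.packages.urllib3`.

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
adapter = HTTPAdapter(
    pool_connections=1,
    pool_maxsize=1,
    pool_block=True,   # callers block waiting for a free connection
    max_retries=Retry(total=3, backoff_factor=0.1, status_forcelist=[502, 503]),
)
session.mount('http://', adapter)
session.mount('https://', adapter)
```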
https://api.github.com/repos/psf/requests/issues/3992
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3992/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3992/comments
|
https://api.github.com/repos/psf/requests/issues/3992/events
|
https://github.com/psf/requests/issues/3992
| 224,479,168 |
MDU6SXNzdWUyMjQ0NzkxNjg=
| 3,992 |
Invalid file object with requests >= 2.13.0 on Jython
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/546693?v=4",
"events_url": "https://api.github.com/users/LordGaav/events{/privacy}",
"followers_url": "https://api.github.com/users/LordGaav/followers",
"following_url": "https://api.github.com/users/LordGaav/following{/other_user}",
"gists_url": "https://api.github.com/users/LordGaav/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/LordGaav",
"id": 546693,
"login": "LordGaav",
"node_id": "MDQ6VXNlcjU0NjY5Mw==",
"organizations_url": "https://api.github.com/users/LordGaav/orgs",
"received_events_url": "https://api.github.com/users/LordGaav/received_events",
"repos_url": "https://api.github.com/users/LordGaav/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/LordGaav/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LordGaav/subscriptions",
"type": "User",
"url": "https://api.github.com/users/LordGaav",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2017-04-26T14:03:02Z
|
2021-11-26T05:00:32Z
|
2021-08-28T04:18:22Z
|
NONE
|
resolved
|
I'm running into a repeatable error when testing [omniconf](https://github.com/cyso/omniconf/tree/1.1.1) on Jython with requests >= 2.13.0:
```
======================================================================
1) ERROR: test suite for <class 'omniconf.tests.backends.test_vault.TestVaultBackend'>
----------------------------------------------------------------------
Traceback (most recent call last):
.tox/jython/Lib/site-packages/nose/suite.py line 209 in run
self.setUp()
.tox/jython/Lib/site-packages/nose/suite.py line 292 in setUp
self.setupContext(ancestor)
.tox/jython/Lib/site-packages/nose/suite.py line 315 in setupContext
try_run(context, names)
.tox/jython/Lib/site-packages/nose/util.py line 471 in try_run
return func()
omniconf/tests/backends/test_vault.py line 87 in setUpClass
cls.manager.initialize()
.tox/jython/Lib/site-packages/hvac/tests/util.py line 44 in initialize
assert not self.client.is_initialized()
.tox/jython/Lib/site-packages/hvac/v1/__init__.py line 92 in is_initialized
return self._get('/v1/sys/init').json()['initialized']
.tox/jython/Lib/site-packages/hvac/v1/__init__.py line 944 in _get
return self.__request('get', url, **kwargs)
.tox/jython/Lib/site-packages/hvac/v1/__init__.py line 971 in _Client__request
response = self.session.request(method, url, headers=headers,
.tox/jython/Lib/site-packages/requests/sessions.py line 488 in request
resp = self.send(prep, **send_kwargs)
.tox/jython/Lib/site-packages/requests/sessions.py line 609 in send
r = adapter.send(request, **kwargs)
.tox/jython/Lib/site-packages/requests/adapters.py line 413 in send
resp = conn.urlopen(
.tox/jython/Lib/site-packages/requests/adapters.py line 413 in send
resp = conn.urlopen(
.tox/jython/Lib/site-packages/requests/packages/urllib3/connectionpool.py line 588 in urlopen
conn = self._get_conn(timeout=pool_timeout)
.tox/jython/Lib/site-packages/requests/packages/urllib3/connectionpool.py line 588 in urlopen
conn = self._get_conn(timeout=pool_timeout)
.tox/jython/Lib/site-packages/requests/packages/urllib3/connectionpool.py line 241 in _get_conn
if conn and is_connection_dropped(conn):
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/connection.py line 27 in is_connection_dropped
return bool(wait_for_read(sock, timeout=0.0))
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/connection.py line 27 in is_connection_dropped
return bool(wait_for_read(sock, timeout=0.0))
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/wait.py line 33 in wait_for_read
return _wait_for_io_events(socks, EVENT_READ, timeout)
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/wait.py line 24 in _wait_for_io_events
selector.register(sock, events)
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/selectors.py line 314 in register
key = super(PollSelector, self).register(fileobj, events, data)
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/selectors.py line 179 in register
key = SelectorKey(fileobj, self._fileobj_lookup(fileobj), events, data)
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/selectors.py line 163 in _fileobj_lookup
return _fileobj_to_fd(fileobj)
.tox/jython/Lib/site-packages/requests/packages/urllib3/util/selectors.py line 47 in _fileobj_to_fd
raise ValueError("Invalid file object: {0!r}".format(fileobj))
ValueError: Invalid file object: <_socket._socketobject object at 0x1cc>
-------------------- >> begin captured logging << --------------------
requests.packages.urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): localhost
requests.packages.urllib3.connectionpool: DEBUG: http://localhost:18200 "GET /v1/sys/init HTTP/1.1" 200 22
--------------------- >> end captured logging << ---------------------
```
My test [initializes hvac](https://github.com/cyso/omniconf/blob/1.1.1/omniconf/tests/backends/test_vault.py#L78), which in turn checks if the requests client [is already initialized](https://github.com/ianunruh/hvac/blob/v0.2.17/hvac/v1/__init__.py#L92). This seems to do a GET request, but when [opening the response](https://github.com/kennethreitz/requests/blob/v2.13.0/requests/packages/urllib3/util/selectors.py#L45), the resulting file object is reported as being invalid.
I can reproduce this error when running omniconf tests with `tox -e jython`, after installing Jython and Hashicorp Vault (see the [.travis-ci.yml](https://github.com/cyso/omniconf/blob/1.1.1/.travis.yml#L34) for steps).
Older versions of requests work fine, specifically when specifying `requests < 1.12.0` (because of the earlier issues with `Method code too large!`).
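For reference, the assumption that breaks here is easy to check in isolation: urllib3's selector code expects `socket.fileno()` to return a small integer file descriptor, which CPython guarantees but Jython apparently does not. A minimal check, run under both interpreters:
```python
import socket

s = socket.socket()
print(repr(s.fileno()))  # CPython: a small int (e.g. 3); Jython: a _realsocket wrapper object
s.close()
```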
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3992/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3992/timeline
| null |
completed
| null | null | false |
[
"This seems extremely likely to be related to the use of selector objects in Python. This error is raised when we attempt to call the `fileno` method on the socket object, so that we can wait on it. Can you confirm that this simply doesn't work on Jython?",
"Ah, nevermind, I can:\r\n\r\n```python\r\n>>> import socket\r\n>>> s = socket.socket()\r\n>>> s.fileno()\r\n<_realsocket at 0x2 type=unknown open_count=1 channel=None timeout=None>\r\n>>> int(s.fileno())\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\nTypeError: int() argument must be a string or a number, not '<_realsocket at 0x2 type=unknown open_count=1 channel=None timeout=None>'\r\n```\r\n\r\nSo I'm pretty sure this is Jython just doing exactly the wrong thing. The [CPython documentation makes it very clear](https://docs.python.org/2.7/library/socket.html#socket.socket.fileno) that `socket.fileno` should return a small integer, which it does not on Jython. I'm not really sure what Jython is up to here, but it doesn't look at all right.\r\n\r\nCan we get a Jython developer to weigh in on how we're supposed to use selectors with Jython?",
"I won't be much use for you there unfortunately, except for testing fixes. The [Jython docs](http://www.jython.org/docs/library/select.html) do have a section on it, and the language seems to imply that `file.fileno` or `socket.fileno` should be usable.\r\n\r\nThis bug on Jython might also be related: http://bugs.jython.org/issue2320",
"I think you probably need to report this to the Jython devs directly: the result just seems incorrect and inconsistent with their docs.",
"What has changed between 2.12.x and 2.13.x in relation to selectors in `requests`? It seems to have worked fine before on Jython.",
"We started using them. =)\r\n\r\nHistorically we had only ever used `select`, directly on socket objects. That's not so efficient, and limits us in high-concurrency situations, so we moved to allowing different kinds of selector by embracing the abstraction used in Python 3's `selectors` module via a backport. That was the change you saw here.",
"cc @jimbaker",
"I'm surprised it took so long for this issue to be noticed if Jython has been returning non-integers from `socket.socket.fileno()`.",
"So actually it looks like this is both documented and intended behavior? https://wiki.python.org/jython/NewSocketModule#socket.fileno.28.29_does_not_return_an_integer\r\n\r\nMeaning `urllib3` shouldn't be using anything except `select.select` on Jython even if all the other selectors are available. That'll necessitate a PR.\r\n\r\nI'll download Jython tonight and see if I can get something with minimal changes to `urllib3/util/selectors.py` working.\r\n\r\nTo make matters worse the current implementation of `BaseSelector` relies heavily on getting back something hashable to identify `SeelectorKeys` by (ie `fileno`) so I may have to move to a subclass of `BaseSelector` that uses `select.select` and is Jython specific.",
"I uninstalled requests. \r\n# /usr/local/jython/bin/jython /usr/local/jython/pip uninstall requests \r\nand download request 2.7.0 source code. (requests-2.7.0.tar.gz) \r\n# /usr/local/jython/bin/jython setup.py install \r\nit is no necessary to reinstall your package (my case is pysolr )",
"This is ultimately an issue with Jython deviating from the file object spec laid out by Python. If we are eventually able to resolve this is will need to happen at the urllib3 layer rather than Requests."
] |
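A sketch of the Jython-friendly fallback discussed in the comments above: wait on the socket object itself with `select.select()` instead of registering an integer file descriptor with a selector. This is not urllib3's actual implementation, just an illustration of the approach (the function name mirrors the one in the traceback for readability):

```python
import select

def is_connection_dropped(sock):
    # A socket that is readable while no request is outstanding usually means
    # the peer has closed the connection.
    readable, _, _ = select.select([sock], [], [], 0.0)
    return bool(readable)
```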
https://api.github.com/repos/psf/requests/issues/3991
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3991/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3991/comments
|
https://api.github.com/repos/psf/requests/issues/3991/events
|
https://github.com/psf/requests/pull/3991
| 224,268,510 |
MDExOlB1bGxSZXF1ZXN0MTE3NTc4Mzc4
| 3,991 |
Remove some unused imports.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5584439?v=4",
"events_url": "https://api.github.com/users/chrisgavin/events{/privacy}",
"followers_url": "https://api.github.com/users/chrisgavin/followers",
"following_url": "https://api.github.com/users/chrisgavin/following{/other_user}",
"gists_url": "https://api.github.com/users/chrisgavin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chrisgavin",
"id": 5584439,
"login": "chrisgavin",
"node_id": "MDQ6VXNlcjU1ODQ0Mzk=",
"organizations_url": "https://api.github.com/users/chrisgavin/orgs",
"received_events_url": "https://api.github.com/users/chrisgavin/received_events",
"repos_url": "https://api.github.com/users/chrisgavin/repos",
"site_admin": true,
"starred_url": "https://api.github.com/users/chrisgavin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chrisgavin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chrisgavin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-04-25T20:52:20Z
|
2021-09-06T00:07:01Z
|
2017-04-26T12:09:35Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3991/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3991/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3991.diff",
"html_url": "https://github.com/psf/requests/pull/3991",
"merged_at": "2017-04-26T12:09:35Z",
"patch_url": "https://github.com/psf/requests/pull/3991.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3991"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=h1) Report\n> Merging [#3991](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/44106f18b75bfbafa63b1a1910366d57d554bc41?src=pr&el=desc) will **decrease** coverage by `<.01%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3991 +/- ##\n==========================================\n- Coverage 89.72% 89.72% -0.01% \n==========================================\n Files 15 15 \n Lines 1947 1946 -1 \n==========================================\n- Hits 1747 1746 -1 \n Misses 200 200\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/auth.py](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree#diff-cmVxdWVzdHMvYXV0aC5weQ==) | `86.41% <ø> (-0.09%)` | :arrow_down: |\n| [requests/adapters.py](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree#diff-cmVxdWVzdHMvYWRhcHRlcnMucHk=) | `92.89% <ø> (ø)` | :arrow_up: |\n| [requests/models.py](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `93.42% <100%> (ø)` | :arrow_up: |\n| [requests/utils.py](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=tree#diff-cmVxdWVzdHMvdXRpbHMucHk=) | `84.98% <100%> (ø)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=footer). Last update [44106f1...19ba9f1](https://codecov.io/gh/kennethreitz/requests/pull/3991?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3990
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3990/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3990/comments
|
https://api.github.com/repos/psf/requests/issues/3990/events
|
https://github.com/psf/requests/issues/3990
| 223,918,494 |
MDU6SXNzdWUyMjM5MTg0OTQ=
| 3,990 |
Can't get proxy to work with https, and http works only with auth, not as the docs describe
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/27972108?v=4",
"events_url": "https://api.github.com/users/eisbaer9/events{/privacy}",
"followers_url": "https://api.github.com/users/eisbaer9/followers",
"following_url": "https://api.github.com/users/eisbaer9/following{/other_user}",
"gists_url": "https://api.github.com/users/eisbaer9/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/eisbaer9",
"id": 27972108,
"login": "eisbaer9",
"node_id": "MDQ6VXNlcjI3OTcyMTA4",
"organizations_url": "https://api.github.com/users/eisbaer9/orgs",
"received_events_url": "https://api.github.com/users/eisbaer9/received_events",
"repos_url": "https://api.github.com/users/eisbaer9/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/eisbaer9/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eisbaer9/subscriptions",
"type": "User",
"url": "https://api.github.com/users/eisbaer9",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-04-24T18:57:18Z
|
2021-09-08T11:00:30Z
|
2017-04-25T08:09:44Z
|
NONE
|
resolved
|
I'm trying to get requests to work with a proxy. According to the documentation, it should work with this dictionary:
`proxies = {'http' : 'http://username:password@proxy:8080',
'https': 'https://proxy:8080'}
`
When I try:
`r = requests.get('http://google.com', proxies=proxies)`
I get this error:
requests.exceptions.ProxyError:
HTTPConnectionPool(host='username', port=80):
Max retries exceeded with url:
http://google.com/
(Caused by ProxyError('Cannot connect to proxy.',
NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection
object at 0x04755510>: Failed to establish a new connection:
[Errno 11001] getaddrinfo failed',)))
If I use it like this:
```
auth = HTTPProxyAuth(username, password)
proxies = {'http' : 'http://www-proxy2-de001.d.grp:8080'}
r = requests.get('http://google.com', proxies=proxies, auth=auth)
```
Everything is fine.
Now I'm trying the same with https, but most of the time I get a 407, with this:
```
auth = HTTPProxyAuth(username, password)
proxies = {'http' : 'http://proxy:8080',
'https': 'https://proxy:8080'}
r = requests.get('https://google.com', proxies=proxies, auth=auth)
```
And with this code:
` proxies = {'http' : 'www-proxy2-de001.d.grp:8080',
'https': 'http://username:password@proxy:8080'}
r = requests.get('https://google.com', proxies=proxies)`
I get this error:
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPSConnectionPool(host='google.com', port=443): Max retries exceeded with url: / (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x045344D0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',)))
I have no more ideas why it is not working.
Any help is greatly appreciated.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/27972108?v=4",
"events_url": "https://api.github.com/users/eisbaer9/events{/privacy}",
"followers_url": "https://api.github.com/users/eisbaer9/followers",
"following_url": "https://api.github.com/users/eisbaer9/following{/other_user}",
"gists_url": "https://api.github.com/users/eisbaer9/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/eisbaer9",
"id": 27972108,
"login": "eisbaer9",
"node_id": "MDQ6VXNlcjI3OTcyMTA4",
"organizations_url": "https://api.github.com/users/eisbaer9/orgs",
"received_events_url": "https://api.github.com/users/eisbaer9/received_events",
"repos_url": "https://api.github.com/users/eisbaer9/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/eisbaer9/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/eisbaer9/subscriptions",
"type": "User",
"url": "https://api.github.com/users/eisbaer9",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3990/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3990/timeline
| null |
completed
| null | null | false |
[
"Forgott something:\r\n\r\ni use Python on WIndows WinPython-32bit-3.6.1.0Qt5\r\nand Requests: requests==2.10.0",
"It's *extremely* likely that your proxy password contains an `@` symbol, which is causing our URL parsing to get confused. You need to urlencode your password (that is, run it through `urllib.quote`) before passing it to us or it gets pretty confusing.\r\n",
"Hi Cory, \n\nmy password has no @ symbol. But it has an # sign. Can a # Make Problems? \nIf I use http with proxy and fill in username and password in the proxies dictionary. I see my username in the error message in the URL field. \n\n\n-------- Originalnachricht --------\nBetreff: Re: [kennethreitz/requests] Can't get Proxy to work with https and http works only with auth, not like the Docu descripes (#3990)\nVon: Cory Benfield <[email protected]>\nAn: kennethreitz/requests <[email protected]>\nCc: eisbaer9 <[email protected]>,Author <[email protected]>\n\n>It's *extremely* likely that your proxy password contains an `@` symbol, which is causing our URL parsing to get confused. You need to urlencode your password (that is, run it through `urllib.quote`) before passing it to us or it gets pretty confusing.\n>\n>\n>-- \n>You are receiving this because you authored the thread.\n>Reply to this email directly or view it on GitHub:\n>https://github.com/kennethreitz/requests/issues/3990#issuecomment-296790541",
"Yes, a `#` can also cause issues. Do you have any symbols in your username, either? Both should be encoded.",
"No, username has only letters and number. The pass word are an # and letters and numbers. \n\nI try with encoding or change password to test and get back to you. ",
"Hi,\r\n\r\nmany thanks, urllib.quote did it! If someone has the same problem, here is my small working code Example:\r\n`import requests\r\nfrom urllib.parse import quote\r\n\r\nusername = quote('type in username')\r\npassword = quote(' type in password')\r\nproxy = 'type in proxy name or ip'\r\nproxy_port = '8080'\r\n\r\nurl = 'https://google.com'\r\n\r\nproxy_http = 'http://' + username + ':' + password + '@' + proxy + ':' + proxy_port\r\nproxy_https = 'https://' + username + ':' + password + '@' + proxy + ':' + proxy_port\r\n \r\nproxies = {'http' : proxy_http, 'https': proxy_https}\r\n\r\nr = requests.get(url, proxies=proxies)\r\n\r\nprint(r.status_code)\r\nprint(r.text)\r\n`"
] |
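Condensing the resolution from the thread above into one sketch: percent-encode the credentials before embedding them in the proxy URL, so characters such as `#` or `@` cannot confuse the URL parsing (host, port, and credentials here are placeholders):

```python
import requests
from urllib.parse import quote  # on Python 2: from urllib import quote

username = quote('myuser', safe='')
password = quote('p#ssw@rd', safe='')  # '#' and '@' must not appear raw in a URL
proxy_url = 'http://{0}:{1}@proxy.example.com:8080'.format(username, password)

proxies = {'http': proxy_url, 'https': proxy_url}
r = requests.get('https://example.org', proxies=proxies)
print(r.status_code)
```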
https://api.github.com/repos/psf/requests/issues/3989
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3989/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3989/comments
|
https://api.github.com/repos/psf/requests/issues/3989/events
|
https://github.com/psf/requests/issues/3989
| 223,908,316 |
MDU6SXNzdWUyMjM5MDgzMTY=
| 3,989 |
Upload data progress
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/987796?v=4",
"events_url": "https://api.github.com/users/gladiacxtylish/events{/privacy}",
"followers_url": "https://api.github.com/users/gladiacxtylish/followers",
"following_url": "https://api.github.com/users/gladiacxtylish/following{/other_user}",
"gists_url": "https://api.github.com/users/gladiacxtylish/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gladiacxtylish",
"id": 987796,
"login": "gladiacxtylish",
"node_id": "MDQ6VXNlcjk4Nzc5Ng==",
"organizations_url": "https://api.github.com/users/gladiacxtylish/orgs",
"received_events_url": "https://api.github.com/users/gladiacxtylish/received_events",
"repos_url": "https://api.github.com/users/gladiacxtylish/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gladiacxtylish/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gladiacxtylish/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gladiacxtylish",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-24T18:18:00Z
|
2021-09-08T11:00:30Z
|
2017-04-25T11:52:32Z
|
NONE
|
resolved
|
I saw examples of using `iter_content` to get the progress of a download, but I do not see a corresponding API for uploading a blob. I am requesting such a feature. Thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3989/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3989/timeline
| null |
completed
| null | null | false |
[
"If you are uploading a blob of data you can provide either an iterable object (which will be uploaded using chunked encoding) or a file-like object that provides a `read` method (which, if we can determine a length, will be streamed to the server). In either case, we have to call into your code to retrieve the next chunk of data, so you will get notified whenever we've sent the last chunk you gave us. That's the expected way to be notified about uploading data.\n\nIs that sufficient for your needs?",
"@gladiacxtylish you can look at an example in [requests-toolbelt](https://github.com/sigmavirus24/requests-toolbelt/blob/3ac79c3ca358a7d3152a3d06a4f33306f9b6772f/requests_toolbelt/multipart/encoder.py#L316) for this. Cheers!"
] |
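A sketch of the mechanism described in the comments above (this is not a Requests API, just one way to apply it): wrap the file object so every `read()` call reports progress; Requests streams any object that exposes `read()` once it can determine a length. All names below are illustrative.

```python
import os
import requests

class ProgressFile(object):
    def __init__(self, path, callback):
        self._f = open(path, 'rb')
        self._total = os.path.getsize(path)
        self._sent = 0
        self._callback = callback
        self.len = self._total  # exposing a length lets Requests send Content-Length instead of chunking

    def read(self, size=-1):
        chunk = self._f.read(size)
        self._sent += len(chunk)
        self._callback(self._sent, self._total)
        return chunk

    def close(self):
        self._f.close()

def report(sent, total):
    print('%d/%d bytes sent' % (sent, total))

body = ProgressFile('blob.bin', report)
try:
    requests.post('https://httpbin.org/post', data=body)
finally:
    body.close()
```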
https://api.github.com/repos/psf/requests/issues/3988
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3988/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3988/comments
|
https://api.github.com/repos/psf/requests/issues/3988/events
|
https://github.com/psf/requests/pull/3988
| 223,896,978 |
MDExOlB1bGxSZXF1ZXN0MTE3MzE4NTY3
| 3,988 |
[WIP] Fix up appveyor testing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 44 |
2017-04-24T17:36:21Z
|
2021-09-06T00:07:00Z
|
2017-05-03T17:25:26Z
|
MEMBER
|
resolved
|
This PR will track work as I try to work out where the hell `make` is on appveyor.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3988/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3988/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3988.diff",
"html_url": "https://github.com/psf/requests/pull/3988",
"merged_at": "2017-05-03T17:25:26Z",
"patch_url": "https://github.com/psf/requests/pull/3988.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3988"
}
| true |
[
"Ok, some progress has been made. We are now getting build failures due to our dependencies.\r\n\r\nPython 2.6 is failing because of click, presumably because click doesn't support Python 2.6 + Windows:\r\n\r\n```\r\npipenv lock\r\nTraceback (most recent call last):\r\n File \"c:\\python266-x64\\lib\\runpy.py\", line 122, in _run_module_as_main\r\n \"__main__\", fname, loader, pkg_name)\r\n File \"c:\\python266-x64\\lib\\runpy.py\", line 34, in _run_code\r\n exec code in run_globals\r\n File \"C:\\Python266-x64\\Scripts\\pipenv.exe\\__main__.py\", line 5, in <module>\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\__init__.py\", line 14, in <module>\r\n from .cli import cli\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\cli.py\", line 11, in <module>\r\n import click\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\vendor\\click\\__init__.py\", line 18, in <module>\r\n from .core import Context, BaseCommand, Command, MultiCommand, Group, \\\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 8, in <module>\r\n from .types import convert_type, IntRange, BOOL\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\vendor\\click\\types.py\", line 4, in <module>\r\n from ._compat import open_stream, text_type, filename_to_ui, \\\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\vendor\\click\\_compat.py\", line 536, in <module>\r\n from ._winconsole import _get_windows_console_stream\r\n File \"c:\\python266-x64\\lib\\site-packages\\pipenv\\vendor\\click\\_winconsole.py\", line 19, in <module>\r\n from ctypes import byref, POINTER, c_int, c_char, c_char_p, \\\r\nImportError: cannot import name c_ssize_t\r\n```\r\n\r\nPython 2.7 and later are failing at install time:\r\n\r\n```\r\npipenv lock\r\nCreating a virtualenv for this project...\r\nWarning! 
If you are running on Windows, you should use the --python option instead.\r\nUsing base prefix 'c:\\\\python33-x64'\r\nNew python executable in C:\\Users\\appveyor\\.virtualenvs\\requests-hjhYtdkP\\Scripts\\python.exe\r\nInstalling setuptools, pip, wheel...done.\r\nVirtualenv location: C:\\Users\\appveyor\\.virtualenvs\\requests-hjhYtdkP\r\nLocking [dev-packages] dependencies...\r\nTraceback (most recent call last):\r\n File \"c:\\python33-x64\\lib\\runpy.py\", line 160, in _run_module_as_main\r\n \"__main__\", fname, loader, pkg_name)\r\n File \"c:\\python33-x64\\lib\\runpy.py\", line 73, in _run_code\r\n exec(code, run_globals)\r\n File \"C:\\Python33-x64\\Scripts\\pipenv.exe\\__main__.py\", line 9, in <module>\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 722, in __call__\r\n return self.main(*args, **kwargs)\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 697, in main\r\n rv = self.invoke(ctx)\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 1066, in invoke\r\n return _process_result(sub_ctx.command.invoke(sub_ctx))\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 895, in invoke\r\n return ctx.invoke(self.callback, **ctx.params)\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\vendor\\click\\core.py\", line 535, in invoke\r\n return callback(*args, **kwargs)\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\cli.py\", line 893, in lock\r\n do_lock(no_hashes=no_hashes)\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\cli.py\", line 440, in do_lock\r\n results = get_downloads_info(names_map, 'dev-packages')\r\n File \"c:\\python33-x64\\lib\\site-packages\\pipenv\\cli.py\", line 405, in get_downloads_info\r\n name = list(convert_deps_from_pip(names_map[fname]))[0]\r\nKeyError: 'Babel-2.4.0-py2.py3-none-any.whl'\r\n```\r\n\r\nMy *suspicion* here is that this is pipenv related. @nateprewitt, any insights into what's happening there?",
"As to the Python 2.6 thing, how severe that is is unclear. We technically count 2.6 as a supported platform, so it's pretty ungraceful that you can't run our tests on it. On the other hand, pipenv doesn't support Python 2.6 on Windows (because, transitively, click doesn't), so maybe that's just not a surprise?",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=h1) Report\n> Merging [#3988](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/84099eea9f99168b73ff48270dcab24e5a1ee959?src=pr&el=desc) will **increase** coverage by `0.04%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3988 +/- ##\n==========================================\n+ Coverage 89.72% 89.76% +0.04% \n==========================================\n Files 15 15 \n Lines 1946 1955 +9 \n==========================================\n+ Hits 1746 1755 +9 \n Misses 200 200\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.95% <100%> (+0.2%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=footer). Last update [84099ee...fcc3b7d](https://codecov.io/gh/kennethreitz/requests/pull/3988?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"@Lukasa yep, that's definitely something wrong in Pipenv. I'm unsure if it's a transient issue or something more. Can we kick the build? I don't seem to have permissions on appveyor, or perhaps it doesn't have an option to rebuild the test.",
"Yup, this is a duplicate of kennethreitz/pipenv#213. @nateprewitt, on that issue you said you wanted a repro environment: I think I got you one. ;)",
"@nateprewitt The easiest thing to do, honestly, if @kennethreitz doesn't mind too much, is to just open a new PR and push to it repeatedly to kick off builds. That'll hopefully allow you to play around and work out what's happening.\r\n\r\nFailing that, I think you'd either have to spin up a cloud Windows machine or wait for about an hour for me to get to my local Windows box.",
"Aha! Good catch, I'll spin up appveyor for pipenv after lunch and try to get that pinned down.",
"@nateprewitt You may not be able to enable it for pipenv: we needed Kenneth to do it for Requests. ",
"@lukasa I have 2.7 running tests on Windows now (although they're failing on Socks support) with a patched branch of pipenv. It's a casing conflict for file names. I think we can get that fix rolled out pretty quickly.\r\n\r\nThe click issue seems to be noted in pallets/click#735 but hasn't received any response. Not sure if they have any intent to support 2.6 in this environment.",
"Ok cool, I'll strip 2.6 support when the pipenv fix lands.",
"Alright @Lukasa, sorry for the delay. I've cut 3.6.0 for pipenv, so this issue should be fixed if you want to rebase or push another commit.\r\n\r\nThere are a few more errors when you get past the pipenv failure that I haven't had time to fully investigate. From a cursory glance though, this is what I've found so far.\r\n\r\n * 2.6 needs to be removed from the test list for appveyor.\r\n * `win_inet_pton` appears to be a requirement for PySocks to run in Python 2.X on Windows.\r\n * The `test_time_elapsed_blank` was failing with what appeared to be a division rounding issue, but my latest builds this morning aren't producing this. Something to keep an eye out for though.",
"Thanks @nateprewitt, I'll start working through that stuff now. =)",
"Hey look, this is why we should be testing on Windows! It seems that `elapsed` doesn't work as expected on Windows, because `datetime.datetime.utcnow` has a pretty low resolution, meaning that sufficiently fast requests may be labelled as taking zero time.\r\n\r\nI believe this will work more sensibly if we use a different function to build the timedelta object.",
"Hrm. Sadly, using `time.time()` also doesn't fix the problem on Windows: the two timestamps remain the same. I think it's possible that we cannot get an appropriately accurate clock on a Windows machine for this test to run as expected.",
"So, on Windows, we need to use `time.clock` to get the accurate timestamps we want. In this case, we probably need to use `time.time` and `time.clock` to get the accuracy we want, and then create the `timedelta` ourselves. Should be easy enough.",
"Probably shouldn't rely on `time.clock` as it's been [deprecated](https://docs.python.org/3/library/time.html#time.clock) since 3.3.\r\nIs it valid to use `time.perf_counter` when available in this context?",
"Oh god, who knows? I guess, based on the reading of that documentation, the answer is yes?",
"Hrm, these clock changes do not appear to be helping. I wonder what that's about.",
"Uh...awkward. Let's not talk about what I got wrong there.",
"@nateprewitt Is it possible to declare a conditional dependency in a Pipfile?",
"@Lukasa I was thinking about that last week with the pton dependency requirement. We don't currently have a way to handle something like that, but from what I can tell neither does pip without conditional code in setup.py.\r\n\r\nPerhaps we define the list used for `socks` in `extras_require` earlier and then append `win_inet_pton` if `platform.system()` is Windows? That should work consistently for users without any syntax changes, and pipenv already uses `extras_requires` from setup.py.",
"I think pip can do it fine with markers. I tried to add something in 63e5b80a6b1b386f9f43ac07948306d0fa2d333b but have yet to test it live: will do so shortly. @dstufft, am I right in assuming that I can add a marker that says \"if you do `pip install requests[socks]` on Windows, install this other dependency?\"\r\n\r\n@nateprewitt Adding dependencies in code in `setup.py` is almost always a bad idea, because wheels don't execute `setup.py`. That means whether or not users get the Windows dependency boils down to the system on which the wheel was built: that is, if I build the wheel on my Mac then no-one gets the Windows dep, while if I build it on my Windows box *everyone* does. That's not really a winning strategy. =)",
"@Lukasa 😬 I'm glad you're here to vet my advice. So the marker is evaluated at install time even for wheels? Good to know!",
"@Lukasa, sorry I should have had my coffee before replying to this. This is probably what you were referencing in your original question, but we aren't installing anything from setup.py in the current Makefile.\r\n\r\nSince we can't declare a conditional dependency in Pipfile, this will either need to be handled in the Makefile or appveyor.yml, rather than setup.py. It seems pretty kludgy to shove the conditional into `init`, but `make` should give you what you need to run the test suite. appveyor.yml would fix the testing on our end, but probably result in tickets complaining about tests not working locally on Windows.",
"Hrm.\r\n\r\nWell, in the first instance we should consider this a feature request for conditional dependencies in Pipfile. I've used them a bit in Python, they're a fairly useful feature, so Pipfile should probably aim to support it at some point. :wink:\r\n\r\nDoing this in the Makefile is gonna suck, so for now we'll do it in appveyor.yml.",
"Hrm, it looks a lot like appveyor isn't reporting coverage. I wonder why that is.",
"What happens if we just run `codecov`? I don't know much about `pipenv`.",
"We need to run it via `pipenv`, that will automatically use the virtualenv in which codecov is installed.",
"*Realizing I should probably read more on `pipenv`*\r\n:thinking::thought_balloon: ",
"There we go, that appeared to work. Ok, so let's not merge this yet: I want to confirm with @dstufft that the marker I put in place here actually, you know, works."
] |
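A sketch of the conditional-dependency approach discussed in the thread above, using a setuptools "extra:environment marker" key so `win_inet_pton` is only pulled in on Windows under Python 2. The exact marker and version pins here are illustrative, not necessarily what requests ended up shipping:

```python
from setuptools import setup

setup(
    name='example',
    version='0.0.1',
    extras_require={
        'socks': ['PySocks>=1.5.6, !=1.5.7'],
        # The marker is evaluated at install time, even when installing from a wheel.
        'socks:sys_platform == "win32" and python_version == "2.7"': ['win_inet_pton'],
    },
)
```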
https://api.github.com/repos/psf/requests/issues/3987
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3987/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3987/comments
|
https://api.github.com/repos/psf/requests/issues/3987/events
|
https://github.com/psf/requests/pull/3987
| 223,885,370 |
MDExOlB1bGxSZXF1ZXN0MTE3MzExMjUx
| 3,987 |
Revert "Add appveyor testing of Requests."
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2017-04-24T16:56:10Z
|
2021-09-06T00:07:02Z
|
2017-04-24T16:56:19Z
|
CONTRIBUTOR
|
resolved
|
Reverts kennethreitz/requests#3983
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3987/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3987/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3987.diff",
"html_url": "https://github.com/psf/requests/pull/3987",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3987.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3987"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3986
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3986/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3986/comments
|
https://api.github.com/repos/psf/requests/issues/3986/events
|
https://github.com/psf/requests/issues/3986
| 223,624,710 |
MDU6SXNzdWUyMjM2MjQ3MTA=
| 3,986 |
Allow empty data values in multipart form upload
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/160010?v=4",
"events_url": "https://api.github.com/users/charyorde/events{/privacy}",
"followers_url": "https://api.github.com/users/charyorde/followers",
"following_url": "https://api.github.com/users/charyorde/following{/other_user}",
"gists_url": "https://api.github.com/users/charyorde/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/charyorde",
"id": 160010,
"login": "charyorde",
"node_id": "MDQ6VXNlcjE2MDAxMA==",
"organizations_url": "https://api.github.com/users/charyorde/orgs",
"received_events_url": "https://api.github.com/users/charyorde/received_events",
"repos_url": "https://api.github.com/users/charyorde/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/charyorde/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/charyorde/subscriptions",
"type": "User",
"url": "https://api.github.com/users/charyorde",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2017-04-23T09:47:32Z
|
2021-09-08T10:00:51Z
|
2017-05-14T15:33:54Z
|
NONE
|
resolved
|
A multipart form upload with a data field such as `{'resources': []}` is transformed into `[]`, leaving out the `resources` key.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3986/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3986/timeline
| null |
completed
| null | null | false |
[
"Can you provide sample code that demonstrates the problem, as well as an example of what you believe the correct body should look like?",
"For example, \r\n```\r\ndata = {'resources': []}\r\nfiles = {'file': open('file.zip', 'rb')}\r\nrequests.put('/upload', data=data, files=files)\r\n```",
"And what is your expected outcome?",
"Thanks for the response. I think `new_fields` can take up the value as it is without any transformation if the data field is in that structure. \r\n\r\nAt line https://github.com/kennethreitz/requests/blob/master/requests/models.py#L139 , `new_fields` can contain the data value. At least, we should be able to maintain the `resources` key.",
"Sorry, I meant literally \"what should the request body look like in this case?\"",
"This should probably work\r\n```\r\nbody --7c5c7d822e7646a5b3c6a1a6b9e3d72f\r\nContent-Disposition: form-data; name=\"resources\"\r\n\r\n[]\r\n--7c5c7d822e7646a5b3c6a1a6b9e3d72f\r\nContent-Disposition: form-data; name=\"file\"; filename=\"file.zip\"\r\n```\r\n\r\nIf `resources` is populated for example:\r\n```\r\nbody --7c5c7d822e7646a5b3c6a1a6b9e3d72f\r\nContent-Disposition: form-data; name=\"resources\"\r\n\r\n{'sha1': '4fc46b3bd86860a07f84bce98e1cd6fb0f7ef0c4', 'size': 9212}\r\n--7c5c7d822e7646a5b3c6a1a6b9e3d72f\r\nContent-Disposition: form-data; name=\"file\"; filename=\"file.zip\"\r\n```\r\n\r\nHere's what a sample request body looks like with curl:\r\n```\r\ncurl -v -k -X PUT -F 'resources=[]' -F \"application=@/tmp/application.zip\" https://api.system.domain.net/v2/apps/130adaba-b017-4234-8d2c-8f2874334e3d/bits \r\n* TCP_NODELAY set\r\n> PUT /v2/apps/130adaba-b017-4234-8d2c-8f2874334e3d/bits HTTP/1.1\r\n> User-Agent: curl/7.51.0\r\n> Accept: */*\r\n> Content-Length: 44852\r\n> Expect: 100-continue\r\n> Content-Type: multipart/form-data; boundary=------------------------854d952145f669ed\r\n>\r\n< HTTP/1.1 100 Continue\r\n< HTTP/1.1 201 Created\r\n< Content-Length: 2 \r\n< Content-Type: application/json;charset=utf-8\r\n< Date: Sat, 01 Apr 2017 22:05:26 GMT\r\n< Server: nginx\r\n< X-Content-Type-Options: nosniff\r\n< X-Vcap-Request-Id: ffd5d2c7-f9f3-44da-7db4-e62e71fbadde\r\n< X-Vcap-Request-Id: ffd5d2c7-f9f3-44da-7db4-e62e71fbadde::da9da2b6-8afc-48d5-af5a-aebbbd9a9a68\r\n<\r\n* Curl_http_done: called premature == 0\r\n{}\r\n```",
"The curl data you showed does not show the request body, only the request header. Based on your example, you can get the result you want by doing:\r\n\r\n```python\r\ndata = {'resources': '[]'}\r\nfiles = {'file': open('file.zip', 'rb')}\r\nrequests.put('/upload', data=data, files=files)\r\n```",
"Ha. Hackity. Thanks "
] |
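Spelling out the workaround from the thread above: pass the empty JSON array as a literal string so the `resources` part is kept in the multipart body (the URL is a placeholder):

```python
import requests

data = {'resources': '[]'}  # a string, not an empty Python list
files = {'file': open('file.zip', 'rb')}
r = requests.put('https://api.example.com/upload', data=data, files=files)
print(r.status_code)
```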
https://api.github.com/repos/psf/requests/issues/3985
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3985/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3985/comments
|
https://api.github.com/repos/psf/requests/issues/3985/events
|
https://github.com/psf/requests/issues/3985
| 223,427,350 |
MDU6SXNzdWUyMjM0MjczNTA=
| 3,985 |
No module named 'requests.packages.urllib3.packages.six' // requests on python3.6 Lambda
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/139987?v=4",
"events_url": "https://api.github.com/users/Miserlou/events{/privacy}",
"followers_url": "https://api.github.com/users/Miserlou/followers",
"following_url": "https://api.github.com/users/Miserlou/following{/other_user}",
"gists_url": "https://api.github.com/users/Miserlou/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Miserlou",
"id": 139987,
"login": "Miserlou",
"node_id": "MDQ6VXNlcjEzOTk4Nw==",
"organizations_url": "https://api.github.com/users/Miserlou/orgs",
"received_events_url": "https://api.github.com/users/Miserlou/received_events",
"repos_url": "https://api.github.com/users/Miserlou/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Miserlou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Miserlou/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Miserlou",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2017-04-21T16:02:19Z
|
2021-09-08T10:00:53Z
|
2017-04-21T16:52:20Z
|
NONE
|
resolved
|
Hey all!
Note: This issue _may_ be out of scope for the requests project, I'm not sure, but honestly I'd really appreciate your help in explaining the rationale behind [some of the behavior](https://github.com/kennethreitz/requests/blob/master/requests/packages/__init__.py#L26-L30) related to this section of the codebase ([related PR discussion](https://github.com/kennethreitz/requests/pull/2375)). Let me know if this ticket is out of scope and we can move the discussion back downstream.
I'm currently working on adding [Python3.6 support to Zappa](https://github.com/Miserlou/Zappa/issues/793) - it pretty much works, _except_ for the requests library. Any time `import requests` is called, this happens:
```javascript
'message':'An uncaught exception happened while servicing this request. You can investigate this with the `zappa tail` command.',
'traceback':[
'Traceback (most recent call last):\n',
' File "/var/task/requests/packages/__init__.py", line 27, in <module>\n from . import urllib3\n',
' File "/var/task/requests/packages/urllib3/__init__.py", line 8, in <module>\n from .connectionpool import (\n',
' File "/var/task/requests/packages/urllib3/connectionpool.py", line 11, in <module>\n from .exceptions import (\n',
' File "/var/task/requests/packages/urllib3/exceptions.py", line 2, in <module>\n from .packages.six.moves.http_client import (\n',
"ModuleNotFoundError: No module named 'requests.packages.urllib3.packages.six'\n",
'\nDuring handling of the above exception, another exception occurred:\n\n',
'Traceback (most recent call last):\n',
' File "/var/task/handler.py", line 471, in handler\n response = Response.from_app(self.wsgi_app, environ)\n',
' File "/var/task/werkzeug/wrappers.py", line 903, in from_app\n return cls(*_run_wsgi_app(app, environ, buffered))\n',
' File "/var/task/werkzeug/test.py", line 884, in run_wsgi_app\n app_rv = app(environ, start_response)\n',
' File "/var/task/zappa/middleware.py", line 66, in __call__\n response = self.application(environ, encode_response)\n',
' File "/var/task/flask/app.py", line 1997, in __call__\n return self.wsgi_app(environ, start_response)\n',
' File "/var/task/flask/app.py", line 1985, in wsgi_app\n response = self.handle_exception(e)\n',
' File "/var/task/flask/app.py", line 1540, in handle_exception\n reraise(exc_type, exc_value, tb)\n',
' File "/var/task/flask/_compat.py", line 33, in reraise\n raise value\n',
' File "/var/task/flask/app.py", line 1982, in wsgi_app\n response = self.full_dispatch_request()\n',
' File "/var/task/flask/app.py", line 1614, in full_dispatch_request\n rv = self.handle_user_exception(e)\n',
' File "/var/task/flask/app.py", line 1517, in handle_user_exception\n reraise(exc_type, exc_value, tb)\n',
' File "/var/task/flask/_compat.py", line 33, in reraise\n raise value\n',
' File "/var/task/flask/app.py", line 1612, in full_dispatch_request\n rv = self.dispatch_request()\n',
' File "/var/task/flask/app.py", line 1598, in dispatch_request\n return self.view_functions[rule.endpoint](**req.view_args)\n',
' File "/var/task/app3.py", line 10, in home\n import requests\n',
' File "/var/task/requests/__init__.py", line 60, in <module>\n from .packages.urllib3.exceptions import DependencyWarning\n',
' File "/var/task/requests/packages/__init__.py", line 29, in <module>\n import urllib3\n',
' File "/var/task/urllib3/__init__.py", line 8, in <module>\n from .connectionpool import (\n',
' File "/var/task/urllib3/connectionpool.py", line 11, in <module>\n from .exceptions import (\n',
' File "/var/task/urllib3/exceptions.py", line 2, in <module>\n from .packages.six.moves.http_client import (\n',
"ModuleNotFoundError: No module named 'urllib3.packages.six'\n"
```
It looks to me like the vendoring of requests's urllib3 is conflicting with the urllib3 in the Lambda environment.
I'd really prefer not to add a special case for the import of requests - is there a recommended way of packaging or configuring requests that avoids this problem?
(For some context, for those who don't know how Zappa works, we are creating a virtual environment on one system, packaging it up into a zip and deploying it to another system, which may or may not be on the same operating system. For pure-python projects, this isn't usually a problem, but since there is some operating-system-dependent logic here, maybe it is here, although it hasn't been a problem for the 2.7 environment.)
Any ideas what's happening here?
Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/139987?v=4",
"events_url": "https://api.github.com/users/Miserlou/events{/privacy}",
"followers_url": "https://api.github.com/users/Miserlou/followers",
"following_url": "https://api.github.com/users/Miserlou/following{/other_user}",
"gists_url": "https://api.github.com/users/Miserlou/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Miserlou",
"id": 139987,
"login": "Miserlou",
"node_id": "MDQ6VXNlcjEzOTk4Nw==",
"organizations_url": "https://api.github.com/users/Miserlou/orgs",
"received_events_url": "https://api.github.com/users/Miserlou/received_events",
"repos_url": "https://api.github.com/users/Miserlou/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Miserlou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Miserlou/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Miserlou",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3985/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3985/timeline
| null |
completed
| null | null | false |
[
"It looks extremely likely that urllib3 has been torn apart in some way. Where are you getting that urllib3 from? Does it reproduce with just the code `import urllib3`?",
"Yes, the error is recreated with just `import urllib3`, if `urllib3` is included in the Lambda package. Without urllib3 in the package, I get `\"ModuleNotFoundError: No module named 'urllib3'\"`. With both in the package, I get `No module named 'urllib3.packages.six'\\`. It's difficult to test the urllib3 import without requests being in the package since we're quite dependent on requests for other reasons.",
"So this strongly suggests that your packaging tool is not noticing a vendored package inside urllib3 (specifically, six).",
"Yep, `six.py` is missing in my package. Sorry for wasting your time, handy to braindump this out loud. I'll close the ticket once I can confirm this end to end.",
"No problem at all, happy to help. :smile:",
"Major fail on my part. Bug origin: https://github.com/Miserlou/Zappa/pull/581",
"TL;DR - we were making some optimizations for the python2.7 environment that turned into bugs in the new python3.6 environment.",
"I know this might not be the most correct place to ask this (maybe open another issue in zappa?) but I'm having the exact same error in python 2.7, with Zappa. Any clues?\r\n\r\nEdit: the error is the same, but this time it's raised from inside elasticsearch, then urllib3",
"@jonathanglima this is absolutely not the correct place to look for Zappa support. It would seem @Miserlou fixed Zappa though so you may try using a new version or reporting a bug as that fix may have regressed. Cheers!",
"Yeah, I know. I'm not looking for Zappa support. Just looking for some fix that solved this specific bug, since I already tried an update without any success.\r\n\r\nI'll open a ticket @ Zappa. Sorry for the disturbance."
] |
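One quick way to confirm the failure mode diagnosed above is to list the deployment archive and check that the vendored `six` module survived packaging (the archive name is a placeholder; the second path only applies when requests still vendors urllib3):

```python
import zipfile

with zipfile.ZipFile('lambda-package.zip') as zf:
    names = set(zf.namelist())
    for needed in ('urllib3/packages/six.py',
                   'requests/packages/urllib3/packages/six.py'):
        print(needed, 'present' if needed in names else 'MISSING')
```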
https://api.github.com/repos/psf/requests/issues/3984
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3984/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3984/comments
|
https://api.github.com/repos/psf/requests/issues/3984/events
|
https://github.com/psf/requests/pull/3984
| 223,389,060 |
MDExOlB1bGxSZXF1ZXN0MTE2OTg5MjU3
| 3,984 |
Fix the additional newline generated by iter_lines() caused by a '\r\n' pair being split across two different chunks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/46543?v=4",
"events_url": "https://api.github.com/users/PCMan/events{/privacy}",
"followers_url": "https://api.github.com/users/PCMan/followers",
"following_url": "https://api.github.com/users/PCMan/following{/other_user}",
"gists_url": "https://api.github.com/users/PCMan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/PCMan",
"id": 46543,
"login": "PCMan",
"node_id": "MDQ6VXNlcjQ2NTQz",
"organizations_url": "https://api.github.com/users/PCMan/orgs",
"received_events_url": "https://api.github.com/users/PCMan/received_events",
"repos_url": "https://api.github.com/users/PCMan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/PCMan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PCMan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/PCMan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2017-04-21T13:48:45Z
|
2021-09-06T00:07:01Z
|
2017-04-26T15:07:46Z
|
NONE
|
resolved
|
When a "\r\n" CRLF pair is accidentally split across two chunks, iter_lines() generates two line breaks instead of one. This patch fixes the incorrect behavior and closes issue #3980.
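A minimal sketch of the failure mode and of the carry-over idea, assuming chunked text input; this is illustrative only and is not the code in this PR:

```python
# Why a CRLF pair split across chunks yields a spurious blank line:
# str.splitlines() treats a trailing '\r' as a complete line ending, so the
# '\n' arriving at the start of the next chunk is counted as a second break.
assert "line1\r".splitlines() == ["line1"]
assert "\nline2".splitlines() == ["", "line2"]

def iter_lines_sketch(chunks):
    """Illustrative generator: remember that the previous chunk ended in
    '\\r' so a leading '\\n' in the next chunk is not counted twice."""
    pending = None
    ended_in_cr = False
    for chunk in chunks:
        if ended_in_cr and chunk.startswith("\n"):
            chunk = chunk[1:]
        ended_in_cr = chunk.endswith("\r")
        if pending is not None:
            chunk = pending + chunk
        lines = chunk.splitlines()
        if lines and chunk and chunk[-1] not in "\r\n":
            pending = lines.pop()     # incomplete last line, keep for later
        else:
            pending = None
        for line in lines:
            yield line
    if pending is not None:
        yield pending

assert list(iter_lines_sketch(["line1\r", "\nline2"])) == ["line1", "line2"]
```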
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3984/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3984/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3984.diff",
"html_url": "https://github.com/psf/requests/pull/3984",
"merged_at": "2017-04-26T15:07:46Z",
"patch_url": "https://github.com/psf/requests/pull/3984.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3984"
}
| true |
[
"@Lukasa Thanks for the review and I already modified the code by following your instructions.\r\nHowever there seems to be some issues with the unit tests. I'll take a look on them.",
"@Lukasa While trying to fix the unit tests, I noted that treating \"\\r\\n\" as two line breaks is actually the intended behavior based on the current unit tests. While this behavior is incompatible with the default of python (universal newline) and is not explained in the requests API doc either, what `delimiters=None` does should be well defined and documented before we try to fix this. The unit tests should be fixed to reflect the change as well.\r\n\r\n* If treating \"\\r\\n\" as two line breaks is intended, then the fix is avoiding the use of splitlines() and split the lines on either '\\r' or '\\n' outselves. Then, make this documented in the API doc.\r\n* Otherwise, we should simulate the behavior of splitlines() and follow python universal newline handling, and fix all the unit tests and API doc to reflect this behavior change.\r\n\r\nEither way, the current code needs some fix since different chunk size can yield different results.\r\n\r\nAnother issue with the current uint tests is it only works with python2. For working with python3, delimiter should be bytes rather than str if decode_unicode is not True, which is not addressed in the unit tests.\r\n@Lukasa Your suggestions are needed. Thank you.",
"The unit tests should be updated: `splitlines` is clearly intended to be part of the behaviour of the code and so the unit tests as written should be considered as codifying the v2 behaviour, rather than guiding the v3. We can change them for v3.\r\n\r\n> For working with python3, delimiter should be bytes rather than str if decode_unicode is not True, which is not addressed in the unit tests.\r\n\r\nYup, that should change as well.",
"It seems sensible to have the test data default be in bytes since that's what we should be receiving from urllib3. We can have a `parametrize` toggle if we want to test both True and False for `decode_unicode`.",
"@Lukasa @nateprewitt Just fixed all of the unit tests and the mentioned chunk[1:] problems. All related tests should pass now. Also provide a small test program to demonstrate the bug.\r\n```\r\nimport requests\r\n\r\ntest_content = b\"line1\\r\\nline2\"\r\n\r\ndef mock_iter_content(chunk_size=1, decode_unicode=None):\r\n for i in range(0, len(test_content), chunk_size):\r\n yield test_content[i: i + chunk_size]\r\n\r\nr = requests.Response()\r\nr._content_consumed = True\r\nr.iter_content = mock_iter_content\r\n\r\nassert list(r.iter_lines(chunk_size=6)) == test_content.splitlines()\r\n```\r\nThanks!",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=h1) Report\n> Merging [#3984](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=desc) into [proposed/3.0.0](https://codecov.io/gh/kennethreitz/requests/commit/73456b00485a59c6318b36724e3a1ef663e07b85?src=pr&el=desc) will **increase** coverage by `0.04%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## proposed/3.0.0 #3984 +/- ##\n==================================================\n+ Coverage 89.44% 89.48% +0.04% \n==================================================\n Files 15 15 \n Lines 1932 1941 +9 \n==================================================\n+ Hits 1728 1737 +9 \n Misses 204 204\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/models.py](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=tree#diff-cmVxdWVzdHMvbW9kZWxzLnB5) | `94.33% <100%> (+0.11%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=footer). Last update [73456b0...64ff760](https://codecov.io/gh/kennethreitz/requests/pull/3984?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"@Lukasa Thanks for the detailed review. All problems are cleared now.",
"@Lukasa Done! :)",
"> So this is now looking good except for my possible concern about the readability of our two alternative approaches.\r\n\r\nI think I'm missing something. I've looked through the PR but I don't see alternative approaches or readability issues. This could just be the woefully immature GitHub Review system hiding it from me though.",
"Yeah, it is. See [this discussion](https://github.com/kennethreitz/requests/pull/3984#discussion_r112929801).",
"@Lukasa Fixed as suggested by @sigmavirus24 .\r\n",
"@Lukasa @sigmavirus24 git squashed. Thanks for your patience."
] |
https://api.github.com/repos/psf/requests/issues/3983
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3983/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3983/comments
|
https://api.github.com/repos/psf/requests/issues/3983/events
|
https://github.com/psf/requests/pull/3983
| 223,352,676 |
MDExOlB1bGxSZXF1ZXN0MTE2OTYyNjAz
| 3,983 |
Add appveyor testing of Requests.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-04-21T11:09:03Z
|
2021-09-06T00:07:01Z
|
2017-04-24T16:49:26Z
|
MEMBER
|
resolved
|
In general it'd be good to have more testing, but it's also possible that via PR #3979 we'll get unexpectedly different behaviour on Windows than on other platforms. For this reason, we should start testing on Windows as well.
This PR takes the configuration used for urllib3, updates it for Requests, and then provides it. We'll still need to actually enable Appveyor so that it'll try to build, which I cannot do as I don't have admin on the repo. @kennethreitz, mind giving that a try?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3983/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3983/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3983.diff",
"html_url": "https://github.com/psf/requests/pull/3983",
"merged_at": "2017-04-24T16:49:25Z",
"patch_url": "https://github.com/psf/requests/pull/3983.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3983"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=h1) Report\n> Merging [#3983](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/f6c58ec5820e29c1214c8e3ff887f571c32ffd48?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3983 +/- ##\n=======================================\n Coverage 89.72% 89.72% \n=======================================\n Files 15 15 \n Lines 1947 1947 \n=======================================\n Hits 1747 1747 \n Misses 200 200\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=footer). Last update [f6c58ec...fae7530](https://codecov.io/gh/kennethreitz/requests/pull/3983?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"i'm very skeptical, but let's give it a shot",
"doesn't travis have a windows option?",
"build failed, looks like it's because of 2.6 maybe. https://ci.appveyor.com/project/kennethreitz/requests",
"Travis does not have a Windows option and does not include one in their roadmap. ",
"So the reason those builds are failing is because I have to work out where the hell GNU make is on those systems. Given that our CI builds use make, I need to find where it is. If you're happy to be a bit patient, I'll open a PR to fix it in the next few hours "
] |
https://api.github.com/repos/psf/requests/issues/3982
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3982/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3982/comments
|
https://api.github.com/repos/psf/requests/issues/3982/events
|
https://github.com/psf/requests/pull/3982
| 223,347,491 |
MDExOlB1bGxSZXF1ZXN0MTE2OTU4NzUx
| 3,982 |
configure sudo: false on travis
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/413772?v=4",
"events_url": "https://api.github.com/users/graingert/events{/privacy}",
"followers_url": "https://api.github.com/users/graingert/followers",
"following_url": "https://api.github.com/users/graingert/following{/other_user}",
"gists_url": "https://api.github.com/users/graingert/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/graingert",
"id": 413772,
"login": "graingert",
"node_id": "MDQ6VXNlcjQxMzc3Mg==",
"organizations_url": "https://api.github.com/users/graingert/orgs",
"received_events_url": "https://api.github.com/users/graingert/received_events",
"repos_url": "https://api.github.com/users/graingert/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/graingert/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/graingert/subscriptions",
"type": "User",
"url": "https://api.github.com/users/graingert",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-21T10:44:14Z
|
2021-09-06T00:07:02Z
|
2017-04-21T10:54:33Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3982/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3982/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3982.diff",
"html_url": "https://github.com/psf/requests/pull/3982",
"merged_at": "2017-04-21T10:54:33Z",
"patch_url": "https://github.com/psf/requests/pull/3982.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3982"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=h1) Report\n> Merging [#3982](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/52a3526a99e0129ef239ec811a125272398a9f7c?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3982 +/- ##\n=======================================\n Coverage 89.72% 89.72% \n=======================================\n Files 15 15 \n Lines 1947 1947 \n=======================================\n Hits 1747 1747 \n Misses 200 200\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=footer). Last update [52a3526...047056c](https://codecov.io/gh/kennethreitz/requests/pull/3982?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"Cool, thanks @graingert!"
] |
|
https://api.github.com/repos/psf/requests/issues/3981
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3981/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3981/comments
|
https://api.github.com/repos/psf/requests/issues/3981/events
|
https://github.com/psf/requests/pull/3981
| 223,342,580 |
MDExOlB1bGxSZXF1ZXN0MTE2OTU1MTc0
| 3,981 |
enable travis pip cache
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/413772?v=4",
"events_url": "https://api.github.com/users/graingert/events{/privacy}",
"followers_url": "https://api.github.com/users/graingert/followers",
"following_url": "https://api.github.com/users/graingert/following{/other_user}",
"gists_url": "https://api.github.com/users/graingert/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/graingert",
"id": 413772,
"login": "graingert",
"node_id": "MDQ6VXNlcjQxMzc3Mg==",
"organizations_url": "https://api.github.com/users/graingert/orgs",
"received_events_url": "https://api.github.com/users/graingert/received_events",
"repos_url": "https://api.github.com/users/graingert/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/graingert/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/graingert/subscriptions",
"type": "User",
"url": "https://api.github.com/users/graingert",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-21T10:23:04Z
|
2021-09-06T00:07:03Z
|
2017-04-21T10:32:41Z
|
CONTRIBUTOR
|
resolved
| null |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3981/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3981/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3981.diff",
"html_url": "https://github.com/psf/requests/pull/3981",
"merged_at": "2017-04-21T10:32:41Z",
"patch_url": "https://github.com/psf/requests/pull/3981.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3981"
}
| true |
[
"Thanks @graingert!",
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=h1) Report\n> Merging [#3981](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/524627ef21818e06a33bf4b54cd9c93a7dc35b21?src=pr&el=desc) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3981 +/- ##\n=======================================\n Coverage 89.72% 89.72% \n=======================================\n Files 15 15 \n Lines 1947 1947 \n=======================================\n Hits 1747 1747 \n Misses 200 200\n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=footer). Last update [524627e...af383b3](https://codecov.io/gh/kennethreitz/requests/pull/3981?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n"
] |
https://api.github.com/repos/psf/requests/issues/3980
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3980/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3980/comments
|
https://api.github.com/repos/psf/requests/issues/3980/events
|
https://github.com/psf/requests/issues/3980
| 223,287,665 |
MDU6SXNzdWUyMjMyODc2NjU=
| 3,980 |
iter_lines() sometimes yields broken results when chunk_size is small
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/46543?v=4",
"events_url": "https://api.github.com/users/PCMan/events{/privacy}",
"followers_url": "https://api.github.com/users/PCMan/followers",
"following_url": "https://api.github.com/users/PCMan/following{/other_user}",
"gists_url": "https://api.github.com/users/PCMan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/PCMan",
"id": 46543,
"login": "PCMan",
"node_id": "MDQ6VXNlcjQ2NTQz",
"organizations_url": "https://api.github.com/users/PCMan/orgs",
"received_events_url": "https://api.github.com/users/PCMan/received_events",
"repos_url": "https://api.github.com/users/PCMan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/PCMan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/PCMan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/PCMan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 14 |
2017-04-21T06:15:38Z
|
2021-09-04T00:06:13Z
|
2017-05-01T14:09:50Z
|
NONE
|
resolved
|
Here is a small test program to demonstrate the bug:
```
import requests
url = "http://lohas.pixnet.net/blog"
r = requests.get(url)
iter_lines = [line for line in r.iter_lines(chunk_size=7, decode_unicode=False)]
split_lines = r.content.splitlines()
for index, (iline, sline) in enumerate(zip(iter_lines, split_lines)):
    if iline != sline:
        print("line {} is broken".format(index))
        print(iline, len(iline))
        print(sline, len(sline))
        break
```
Expected behavior:
r.iter_lines() should give the same result as r.content.splitlines()
Actual behavior:
The test program generates the following output:
```
line 275 is broken
b'' 0
b'\t\t\t<tbody>' 10
```
Changing chunk_size can break different lines.
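A network-independent variant of the reproducer (illustrative only: the payload, chunk size, and the patched `iter_content` are made up so that the `\r\n` pair is split across two chunks; on affected versions the extra empty line shows up in the first list):

```python
import requests

payload = b"line1\r\nline2\r\nline3"

def fake_iter_content(chunk_size=1, decode_unicode=False):
    # Yield the payload in fixed-size pieces, like a streamed response would.
    for i in range(0, len(payload), chunk_size):
        yield payload[i:i + chunk_size]

r = requests.Response()
r._content_consumed = True        # let iter_lines() run on a synthetic response
r.iter_content = fake_iter_content

print(list(r.iter_lines(chunk_size=6)))  # boundary falls between b'\r' and b'\n'
print(payload.splitlines())              # what the result should look like
```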
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3980/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3980/timeline
| null |
completed
| null | null | false |
[
"It's not caused by incorrect unicode decoding.\r\nI tested this with both `decode_unicode=True` and `decode_unicode=False`. In both cases the bug can be reproduced if chunk_size is small enough.",
"I currently use the following code snip as a drop-in replacement.\r\n```\r\ndef my_iter_lines(result):\r\n buf = b''\r\n for chunk in result.iter_content(chunk_size=64 * 1024, decode_unicode=False):\r\n buf += chunk\r\n pos = 0\r\n while True:\r\n eol = buf.find(b'\\n', pos)\r\n if eol != -1:\r\n yield buf[pos:eol]\r\n pos = eol + 1\r\n else:\r\n buf = buf[pos:]\r\n break\r\n if buf:\r\n yield buf\r\n```\r\n",
"This is not a breakage, it's entirely expected behaviour. `iter_lines` takes a `chunk_size` argument that limits the size of the chunk it will return, which means it will occasionally yield before a line delimiter is reached. This is the behaviour `iter_lines` has always had and is expected to have by the vast majority of requests users.\n\nTo avoid this issue, you can set the chunk_size to be very large indeed. Alternatively, you'll need to process it by wrapping the `read` method on `response.raw` yourself and handling the line splitting using the IO module.",
"@Lukasa no this is not. See the API doc.\r\n```\r\niter_lines(chunk_size=512, decode_unicode=None, delimiter=None)[source]\r\nIterates over the response data, one line at a time. When stream=True is set on the request, this avoids reading the content at once into memory for large responses.\r\n```\r\nchunk_size is not used to limit the max line size. It's pass into iter_content to limit the buffer size used to read the stream. In the source code of iter_lines(), you can see it actually tries to concatenate the data from different chunks rather than limiting max line size.\r\nFYI: http://docs.python-requests.org/en/master/_modules/requests/models/#Response.iter_lines\r\nSee the `pending` stuff.",
"Sorry, you're quite right. That's what I get for working while jetlagged.\n\nMy guess is that this is likely a duplicate of #2431, and so would have been fixed by #3923. That fix was landed into the 3.0 branch because it unfortunately does break compatibility. However, it should be fixed in the future 3.0 release. Would you like to investigate by trying the 3.0 branch with your example?",
"@Lukasa This looks like a duplicate of #2431 indeed. However, I just tested the 3.0 branch and the above problem still exists. I'll look into the 3.0 branch so see what I can do here.",
"Provide another simpler test:\r\n```\r\nimport requests\r\n\r\nurl = \"http://lohas.pixnet.net/blog\"\r\nr = requests.get(url)\r\niter_lines = r.iter_lines(chunk_size=7, decode_unicode=False, delimiter=b'\\n')\r\nsplit_lines = r.content.split(b'\\n')\r\nfor index, (iline, sline) in enumerate(zip(iter_lines, split_lines)):\r\n\tif iline != sline:\r\n\t\tprint(\"line {} is broken\".format(index))\r\n\t\tprint(iline, len(iline))\r\n\t\tprint(sline, len(sline))\r\n\t\tbreak\r\n```\r\nEven if I call split() and specify delimiter b'\\n' explicitly, their results remain different.\r\n```\r\n[pcman@pcman-arch-laptop requests]$ python test_requests.py \r\nline 281 is broken\r\nb'' 0\r\nb' style=\"display:inline-block;width:300px;height:250px\"' 58\r\n```",
"My bad. Tested the wrong code. Specifying b'\\n' as delimiter for iter_lines() and split() works.\r\nHowever, the test case with delimiter=None still yield different result from that of content.splitlines().\r\n",
"Yup, that inconsistency seems to be present still. Let me dig into what exactly is going on here.",
"Ah, yes, I can see the problem. It boils down to this:\n\n```python\n>>> 'test\\r\\n'.splitlines()\n['test']\n>>> 'test\\r'.splitlines()\n['test']\n```\n\nBasically, the code as written has an implicit assumption that there will only be UNIX-style newline characters in the document, but that's not guaranteed to be true. `splitlines` handles `\\r\\n` equivalently to `\\n`. That means that if a chunk ends up looking like `this\\r`, our code will assume that this was a complete line and eat the carriage return. If the first character of the next chunk is a `\\n` then the code will have *forgotten* that the last character was a `\\r` and so will treat it as a *new* line terminating character, and so will emit a bonus blank line.\n\nThis is a moderately tricky thing to handle. Most of the simplistic solutions have nasty side effects: for example, we cannot attempt to over-read because that will in some cases cause us to block and prevent us emitting a line we do actually have. We cannot simply try to memorize what kind of line terminator we have, partly because it will expose us to this failure mode on the *first* line and partly because line termination characters can be mixed throughout a document.\n\nI think there are two approaches, in the end. The first is to fall back to using `split` on `\\r` and `\\n`, and then to manually transform to universal newlines (that is, to spot empty lines and check what's going on). This is strictly more complex than the solution I'd actually propose, which is to just remember when we have treated `\\r` as a final line terminator.\n\nBasically, we'd update the code to add a check when using splitlines:\n\n```python\nlines = chunk.splitlines()\nlast_split_on_cr = lines[-1] and lines[-1] == '\\r'\n```\n\nThen, when we come through the loop again, if `last_split_on_cr` is `True` we'd check whether the *first* character of the next line is a line feed and, if it is, we'd strip it off.\n\n```python\nif lines[0] and lines[0][0] == '\\n':\n lines[0] = lines[0][1:]\n```\n\nI believe this would resolve the issue.\n\nAre you interested in supplying the necessary PR @PCMan, along with an appropriate test?",
"@Lukasa Yes, I'm interested in digging into the issue. I'll see if I can provide a patch. :)",
"@Lukasa Here is the pull request fixing the issue.\r\nhttps://github.com/kennethreitz/requests/pull/3984\r\nThanks for your review :-)",
"This should be resolved in 3.0.0 with #3984. Thanks again @PCMan!",
"This bug is still present in the latest release. Are there plans to release the fix soon?"
] |
https://api.github.com/repos/psf/requests/issues/3979
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3979/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3979/comments
|
https://api.github.com/repos/psf/requests/issues/3979/events
|
https://github.com/psf/requests/pull/3979
| 223,071,801 |
MDExOlB1bGxSZXF1ZXN0MTE2Nzc0MTUw
| 3,979 |
improve proxy bypass on Windows
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/238652?v=4",
"events_url": "https://api.github.com/users/schlamar/events{/privacy}",
"followers_url": "https://api.github.com/users/schlamar/followers",
"following_url": "https://api.github.com/users/schlamar/following{/other_user}",
"gists_url": "https://api.github.com/users/schlamar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/schlamar",
"id": 238652,
"login": "schlamar",
"node_id": "MDQ6VXNlcjIzODY1Mg==",
"organizations_url": "https://api.github.com/users/schlamar/orgs",
"received_events_url": "https://api.github.com/users/schlamar/received_events",
"repos_url": "https://api.github.com/users/schlamar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/schlamar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/schlamar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/schlamar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 19 |
2017-04-20T13:42:43Z
|
2021-09-06T00:06:59Z
|
2017-05-04T14:18:30Z
|
CONTRIBUTOR
|
resolved
|
See #3976
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3979/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3979/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3979.diff",
"html_url": "https://github.com/psf/requests/pull/3979",
"merged_at": "2017-05-04T14:18:30Z",
"patch_url": "https://github.com/psf/requests/pull/3979.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3979"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=h1) Report\n> Merging [#3979](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/26ab951300d192ed3ab62fdb1f0548c961731e6a?src=pr&el=desc) will **decrease** coverage by `0.02%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3979 +/- ##\n==========================================\n- Coverage 89.76% 89.74% -0.03% \n==========================================\n Files 15 15 \n Lines 1955 1940 -15 \n==========================================\n- Hits 1755 1741 -14 \n+ Misses 200 199 -1\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/structures.py](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=tree#diff-cmVxdWVzdHMvc3RydWN0dXJlcy5weQ==) | `100% <ø> (ø)` | :arrow_up: |\n| [requests/compat.py](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=tree#diff-cmVxdWVzdHMvY29tcGF0LnB5) | `100% <100%> (ø)` | :arrow_up: |\n| [requests/utils.py](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=tree#diff-cmVxdWVzdHMvdXRpbHMucHk=) | `86.25% <100%> (+1.26%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=footer). Last update [26ab951...5c389cf](https://codecov.io/gh/kennethreitz/requests/pull/3979?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"As first step I added tests to check the current behavior (they take around 30 seconds to run on my system) so we can detect regressions in the new implementation. \r\n\r\nTested with Python 2.7 and 3.5.",
"Well coverage drops obviously as these tests are only run on Windows :-/\r\n\r\nFrom my POV this is ready to get merged, please review.",
"The code is copy pasted from Python 3.5's urlib/request.py with the lines doing DNS lookups (and appending to `host`) removed. I would recommend leaving it as is (maybe add a clarifying comment about this).\r\n\r\n",
"I'm a bit disinclined to do that unless we think this is going to be temporary code (which it won't be until we drop 2.7 support at least). If we are going to support it I'd like the code to be sensible and comprehensible. 😃",
"@Lukasa Please give me a note when AppVeyor tests are working. Then I'll udpate this PR.",
"@Lukasa Should I revert c121b98c4eb1e4ee8927995d9b9f1d6bdccd9349? With this PR there is no reason anymore for caching the results...",
"@schlamar Yeah, might be worth doing.",
"close/reopened to get appveyor to re-run",
"It'll need to be rebased on top of master. ",
"@Lukasa @nateprewitt Updated, please review again.",
"BTW, coverage drops because reverting 8e6e47af43d1d89a0201bd58e97c700bebbddd5e and c121b98c4eb1e4ee8927995d9b9f1d6bdccd9349 removes a lot of covered lines. My changes alone do not negatively affect coverage.",
"@schlamar Based on codecov's output the actual reason you're reducing coverage is that the diff isn't covered enough to avoid regressing it. To match coverage you'd need to cover 90% of the diff, but only 86% is covered. You can see the lines we're missing [here](https://codecov.io/gh/kennethreitz/requests/compare/26ab951300d192ed3ab62fdb1f0548c961731e6a...32a34722be08aecb6af10ff249480bb255831ff3/diff): what are the odds that we can write tests that appropriately cover them?",
"@Lukasa Updated, please review again :)",
"When was the proxy bypass cache introduced? I'd like to make a reference but can't find anything in the changelog :)",
"@schlamar, I believe it was only a couple months ago (#3885).",
"Ah yes, this wasn't even released yet. @Lukasa in this case I would ignore this?!",
"Yeah, you would. =)",
":tada:"
] |
https://api.github.com/repos/psf/requests/issues/3978
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3978/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3978/comments
|
https://api.github.com/repos/psf/requests/issues/3978/events
|
https://github.com/psf/requests/pull/3978
| 223,049,688 |
MDExOlB1bGxSZXF1ZXN0MTE2NzU4NDgw
| 3,978 |
remove seemingly redundant pyflakes references
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/413772?v=4",
"events_url": "https://api.github.com/users/graingert/events{/privacy}",
"followers_url": "https://api.github.com/users/graingert/followers",
"following_url": "https://api.github.com/users/graingert/following{/other_user}",
"gists_url": "https://api.github.com/users/graingert/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/graingert",
"id": 413772,
"login": "graingert",
"node_id": "MDQ6VXNlcjQxMzc3Mg==",
"organizations_url": "https://api.github.com/users/graingert/orgs",
"received_events_url": "https://api.github.com/users/graingert/received_events",
"repos_url": "https://api.github.com/users/graingert/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/graingert/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/graingert/subscriptions",
"type": "User",
"url": "https://api.github.com/users/graingert",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2017-04-20T12:23:41Z
|
2021-09-06T00:07:03Z
|
2017-04-20T13:44:20Z
|
CONTRIBUTOR
|
resolved
| null |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3978/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3978/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3978.diff",
"html_url": "https://github.com/psf/requests/pull/3978",
"merged_at": "2017-04-20T13:44:20Z",
"patch_url": "https://github.com/psf/requests/pull/3978.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3978"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3977
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3977/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3977/comments
|
https://api.github.com/repos/psf/requests/issues/3977/events
|
https://github.com/psf/requests/issues/3977
| 223,012,125 |
MDU6SXNzdWUyMjMwMTIxMjU=
| 3,977 |
No Content-Length header when POSTing or PUTing an empty Django InMemoryUploadedFile
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3957868?v=4",
"events_url": "https://api.github.com/users/CianciuStyles/events{/privacy}",
"followers_url": "https://api.github.com/users/CianciuStyles/followers",
"following_url": "https://api.github.com/users/CianciuStyles/following{/other_user}",
"gists_url": "https://api.github.com/users/CianciuStyles/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CianciuStyles",
"id": 3957868,
"login": "CianciuStyles",
"node_id": "MDQ6VXNlcjM5NTc4Njg=",
"organizations_url": "https://api.github.com/users/CianciuStyles/orgs",
"received_events_url": "https://api.github.com/users/CianciuStyles/received_events",
"repos_url": "https://api.github.com/users/CianciuStyles/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CianciuStyles/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CianciuStyles/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CianciuStyles",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2017-04-20T09:57:12Z
|
2021-09-08T11:00:30Z
|
2017-04-22T12:45:50Z
|
NONE
|
resolved
|
I'm trying to send a PUT request passing as the data kwarg an instance of an InMemoryUploadedFile (which is the model Django uses for uploaded files, see for reference https://docs.djangoproject.com/en/1.10/ref/files/uploads/#django.core.files.uploadedfile.InMemoryUploadedFile), like this:
```
files = request.FILES.getlist('file')
for uploaded_file in files:
    [...]
    requests.put(upload_url, data=uploaded_file)
```
The problem is that the resulting request will not have the expected Content-Length header, but the Transfer-Encoding one. I think the problem is at https://github.com/kennethreitz/requests/blob/master/requests/models.py#L492, where requests sets one header or the other based on the `if length` check. In the case of a stream with 0 length, the check fails and the Transfer-Encoding header gets set.
The snippet above works as intended when the file is non-empty (because the `if length` check passes), and also `requests.put(upload_url, data=uploaded_file.read())` works (as this is not a stream anymore, but it could cause problems when reading a huge file in memory).
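For reference, a sketch of one possible workaround using the PreparedRequest flow to force a Content-Length header for the zero-length case; the URL and file are placeholders and the header juggling is illustrative, not a documented requests guarantee:

```python
import requests

upload_url = "https://example.com/upload"   # placeholder

with open("empty.bin", "wb"):
    pass                                     # create a zero-byte stand-in file
uploaded_file = open("empty.bin", "rb")      # stands in for the Django file object

session = requests.Session()
prepared = session.prepare_request(
    requests.Request("PUT", upload_url, data=uploaded_file)
)

# If the body was prepared as a chunked stream, switch it back to a
# plain zero-length body before sending.
prepared.headers.pop("Transfer-Encoding", None)
prepared.headers["Content-Length"] = "0"

response = session.send(prepared)
print(response.status_code)
```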
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3977/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3977/timeline
| null |
completed
| null | null | false |
[
"Why is the transfer encoding header a problem? HTTP/1.1 servers are required to understand it. ",
"The problem is not in the Transfer-Encoding header, but the lack of the Content-Length header: if I try to upload the file to Amazon S3 (for example), it will reply with a 411 Content Length Missing response.",
"Ok, so this boils down to an S3 behaviour, as S3 requires the content length. \r\n\r\nRequests does not in general consider this behaviour a bug: I consider S3 at fault here. However, it's understandably a problem. You should be able to get the behaviour you want by writing a small proxy object that sends all `read` calls to the Django object and that implements `__len__`, which will be enough for us to work out its length. ",
"Oh I apologise. You're posting an empty file. Requests really does not provide an easy way to override our logic here, which is very deliberate (plenty of users pass us file objects that return a length of zero but which nonetheless return data from `read()`). For this reason the easiest thing to do is likely to use the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) to tweak the headers of the prepared request to suit this case. ",
"Thank you for opening this @CianciuStyles. I believe if you follow @Lukasa's advice you'll be able to solve your problem nicely. As a result, I'm closing this since it's not a bug and you now have a suitable workaround. Cheers!"
] |
https://api.github.com/repos/psf/requests/issues/3976
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3976/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3976/comments
|
https://api.github.com/repos/psf/requests/issues/3976/events
|
https://github.com/psf/requests/issues/3976
| 222,988,467 |
MDU6SXNzdWUyMjI5ODg0Njc=
| 3,976 |
Don't use `urllib.proxy_bypass` on Windows
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/238652?v=4",
"events_url": "https://api.github.com/users/schlamar/events{/privacy}",
"followers_url": "https://api.github.com/users/schlamar/followers",
"following_url": "https://api.github.com/users/schlamar/following{/other_user}",
"gists_url": "https://api.github.com/users/schlamar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/schlamar",
"id": 238652,
"login": "schlamar",
"node_id": "MDQ6VXNlcjIzODY1Mg==",
"organizations_url": "https://api.github.com/users/schlamar/orgs",
"received_events_url": "https://api.github.com/users/schlamar/received_events",
"repos_url": "https://api.github.com/users/schlamar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/schlamar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/schlamar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/schlamar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2017-04-20T08:30:28Z
|
2021-09-08T10:00:58Z
|
2017-05-04T16:26:48Z
|
CONTRIBUTOR
|
resolved
|
The `urllib.proxy_bypass` implementation on Windows uses forward and reverse DNS requests. This can result in major performance issues (I guess this depends on the DNS server; right now I have seen delays of up to 15 seconds!). Additionally, this behavior is contrary to all major browser implementations.
This could even be a security issue: people might rely on the proxy as a privacy feature, and in that case the client shouldn't make any DNS requests, as they could leak private information.
I would recommend using a custom implementation that doesn't make any DNS requests.
References:
* https://github.com/kennethreitz/requests/issues/2988
* http://bugs.python.org/issue29533
If you are in favor of this proposal I might be able to do a PR.
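To make the proposal concrete, here is a rough sketch of a registry-only bypass check (this is not the eventual implementation; the `<local>` and prefix-matching semantics are approximated from the Internet Settings `ProxyOverride` value):

```python
import fnmatch
import winreg

INTERNET_SETTINGS = r"Software\Microsoft\Windows\CurrentVersion\Internet Settings"

def proxy_bypass_no_dns(host):
    """Return True if `host` matches the Windows ProxyOverride list,
    without performing any forward or reverse DNS lookups."""
    try:
        with winreg.OpenKey(winreg.HKEY_CURRENT_USER, INTERNET_SETTINGS) as key:
            proxy_enable = winreg.QueryValueEx(key, "ProxyEnable")[0]
            proxy_override = winreg.QueryValueEx(key, "ProxyOverride")[0]
    except OSError:
        return False
    if not proxy_enable or not proxy_override:
        return False
    host = host.split(":", 1)[0]                 # drop any port
    for pattern in proxy_override.split(";"):
        pattern = pattern.strip()
        if not pattern:
            continue
        if pattern == "<local>":
            if "." not in host:                  # plain hostnames only
                return True
        elif fnmatch.fnmatch(host, pattern) or host.startswith(pattern.rstrip("*")):
            return True
    return False
```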
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/238652?v=4",
"events_url": "https://api.github.com/users/schlamar/events{/privacy}",
"followers_url": "https://api.github.com/users/schlamar/followers",
"following_url": "https://api.github.com/users/schlamar/following{/other_user}",
"gists_url": "https://api.github.com/users/schlamar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/schlamar",
"id": 238652,
"login": "schlamar",
"node_id": "MDQ6VXNlcjIzODY1Mg==",
"organizations_url": "https://api.github.com/users/schlamar/orgs",
"received_events_url": "https://api.github.com/users/schlamar/received_events",
"repos_url": "https://api.github.com/users/schlamar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/schlamar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/schlamar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/schlamar",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3976/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3976/timeline
| null |
completed
| null | null | false |
[
"In principle we are not attached to the specific implementation of `proxy_bypass`, only the net result. Any change away would be required to reproduce all relevant functionality and to provide complete test coverage. If you're open to doing that work we'll happily consider the PR. ",
"Basically, this would be the copy/pasted Windows versions of [ `urllib.proxy_bypass`](https://github.com/python/cpython/blob/98b1c82/Lib/urllib.py#L1633) and [`urllib.proxy_bypass_registry`](https://github.com/python/cpython/blob/98b1c82/Lib/urllib.py#L1581), the latter without the DNS lookups.\r\n\r\nNot quite sure how to test that in a sensible way. Maybe mocking/monkeypatching registry access (`_winreg`)? ",
"Mainly note for myself: `urllib.proxy_bypass_registry` with host `172.16.1.12` and proxyOverride `172.16.1.1` evaluates to True. Not sure if this is a bug or implementation detail (matching start of string).",
"proxyOverride `192.168.` (without star) and host `192.168.0.1` also evaluates to True. \r\n\r\nIn the config dialog it says \"do not use proxy server for addresses **beginning with**\" (screenshot [here](https://blogs.msdn.microsoft.com/askie/2015/10/12/how-to-configure-proxy-settings-for-ie10-and-ie11-as-iem-is-not-available/)). So I assume the urllib matching behavior is correct and matches the Window behavior.",
"#3979"
] |
https://api.github.com/repos/psf/requests/issues/3975
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3975/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3975/comments
|
https://api.github.com/repos/psf/requests/issues/3975/events
|
https://github.com/psf/requests/issues/3975
| 222,780,431 |
MDU6SXNzdWUyMjI3ODA0MzE=
| 3,975 |
requests.packages.urllib3.connection.HTTPConnection object at 0x00007effa13b1ec0>: Failed to establish a new connection: [Errno -11] System error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5113718?v=4",
"events_url": "https://api.github.com/users/steelliberty/events{/privacy}",
"followers_url": "https://api.github.com/users/steelliberty/followers",
"following_url": "https://api.github.com/users/steelliberty/following{/other_user}",
"gists_url": "https://api.github.com/users/steelliberty/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/steelliberty",
"id": 5113718,
"login": "steelliberty",
"node_id": "MDQ6VXNlcjUxMTM3MTg=",
"organizations_url": "https://api.github.com/users/steelliberty/orgs",
"received_events_url": "https://api.github.com/users/steelliberty/received_events",
"repos_url": "https://api.github.com/users/steelliberty/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/steelliberty/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/steelliberty/subscriptions",
"type": "User",
"url": "https://api.github.com/users/steelliberty",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 25 |
2017-04-19T15:38:09Z
|
2021-09-08T11:00:31Z
|
2017-04-21T14:56:12Z
|
NONE
|
resolved
|
While using grequests I get the following error after running a program which repeatedly checks a batch of urls to determine if the sites are up.
For simplicity say I do this :
I have 50 urls.
I created an httpmonitor module that is handed the urls and then the module is sent to a background/thread-with-Queue.
The httpmonitor checks all the urls using a standard grequests structure such as this:
max_workers is 10 now, but I have tried values from 50 down to 3:
r = (grequests.head(url, callback=timeme, timeout=3, allow_redirects=False, stream = False) for url in urls)
results = grequests.map(r, size=max_workers, exception_handler=handle_errors)
I deal with the results, update a database and then the thread finishes and independently a new one is added to the Queue (for this example, with the same 50 urls) -- and this goes on and on..
Note: I explicitly close connections after each url is processed with either a Response.close() or an AsyncRequest.session.close(), so I believe all connections are closed after each block of urls is processed.
Then I set it running and it runs until approximately 1500 requests are processed (30 blocks of 50 urls), and then for a period of time all of the grequests fail with the error above, Errno -11, System error (which I believe means OUT OF RESOURCES). Sometimes it will return to normal operation for a period; most of the time it fails on all further attempts to open a connection.
1) Is there anymore information on this error that someone might share ?
2) Is there a way to ensure all resources are released outside of grequests?
3) Is there something else I have might have missed?
Note: I was able to increase the number of times the blocks of urls could be checked by explicitly closing connections -- gaining an additional 400+ url checks -- but then the same error returned. It would seem some resource is not being closed and is accumulating over the batch runs.
Here is the exact error (url does not matter)
HTTPConnectionPool(host='www.yahoo.com', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x00007effa13b1ec0>: Failed to establish a new connection: [Errno -11] System error',))"
Any help or ideas would be appreciated, Thank you
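One quick way to check whether a per-process file-descriptor limit is being hit (Linux-specific; the values printed are examples only):

```python
import os
import resource

soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("fd limit: soft=%d hard=%d" % (soft, hard))

# Raise the soft limit up to the hard limit for this process only.
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))

# Count descriptors currently open by this process (Linux /proc).
print("open fds:", len(os.listdir("/proc/self/fd")))

# Sockets stuck in CLOSE_WAIT can be inspected from a shell with, e.g.:
#   lsof -a -p <pid> | grep CLOSE_WAIT
```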
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3975/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3975/timeline
| null |
completed
| null | null | false |
[
"It seems likely that this is an FD limit. If you raise the ulimit of the process does it delay the error?",
"Thank you for the response .. I have not changed the ulimit -- currently at 1024 -- Is there a standard procedure for how to do this and have it remain across reboots.. sorry I have not ventured down that rabbit hole yet .. I will try it though .. However if it is a out-of-resources issue, won't increasing the ulimit simply delay the response ( as you suggested) -- what might I do to prevent opening and locking up too many File Descriptors ? ",
"No need to retain it across reboots, I'm simply interested in ruling that cause in or out. So it's enough to have it last for the process only.",
"Yes .. I upped the ulimit to 64000 soft and hard and it has been running for a long while now.. into the 4000s instead of 1500. ",
"Ok, is it possible to get a small reproducer for this bug?",
"please explain small reproducer? I have built a rather large project based on this general idea, but I should be able to provide you with a snapshot version .. meaning it does not require the 5 modules interacting which exist now to update the urls for testing and manage the background ... I'll think about it -- The database becomes available when the program uses up the available FDs -- so I am not sure how important keeping the DB in the example might be .. ",
"A small reproducer would be the minimal amount of code from your project required to reproduce the same behaviour, removing all extraneous logic. You'd try to slowly delete code until the problem goes away. ",
"Sounds good .. I am working a minimal version of just the portion that\nincludes grequests and the repetition of checking sites .. Thank you for\nhelping with this issue .. the hours spent have been very anxious :)\n\n\nOn Wed, Apr 19, 2017 at 5:31 PM, Cory Benfield <[email protected]>\nwrote:\n\n> A small reproducer would be the minimal amount of code from your project\n> required to reproduce the same behaviour, removing all extraneous logic.\n> You'd try to slowly delete code until the problem goes away.\n>\n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/kennethreitz/requests/issues/3975#issuecomment-295452369>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/AE4HdnHD5onYU54X-mZyfHh0T7kU9vFAks5rxn1OgaJpZM4NB3Ql>\n> .\n>\n",
"Cory I am completing a simple version of the software, which I believe will demonstrate the issue i have -- but while i was writing this, I have been monitoring the current running version and I see open FDs building up in real time -- the same blocks of urls I check remain open with http (CLOSE_WAIT) descriptions. My setup is on a Ubuntu 16.x version of linux using pypy and the latest versions of all software. When I run lsof -a -p PID_NUMBER I am seeing hundreds of (1800) versions of the same open connections waiting to be closed -- I assume. I could send you a print out privately if you would like to see the information .. \r\n\r\nlines of data looking like 47664->www.yahoo.com:http (CLOSE_WAIT)\r\n\r\nIn the time it took to write this I counted the FD for the this PID .. \r\n\r\nI actually have 1 gig of memory .. and only one processor .. \r\n\r\n(nexusrt) root@lemp-512mb-nyc3-01:/nexusrt/www/nexusrt.com# lsof -a -p 6453 |wc -l\r\n1894\r\n(nexusrt) root@lemp-512mb-nyc3-01:/nexusrt/www/nexusrt.com# lsof -a -p 6453 |wc -l\r\n2227\r\n\r\n\r\n",
"So how many different hosts are you hitting? Connection pooling should be preventing the CLOSE_WAIT getting out of hand.",
"I am only hitting 50 at the moment - but expect that to go up (as the limits of the resources of the box permit and then adding other machines - I would expect that 300 would a typical number. However, I wanted to make this scaleable. Now there are 3000 open connections with CLOSE_WAIT. ",
"So what code do you have dealing with sessions? Do you consistently use just one? Do you use any?",
"The portions of the code that deal with sessions are only contained within grequests. I do explicitly close() the session (if there is an error, I am using an AsyncRequest.session.close() ) and Response.close() for returning response objects from the map function. I do not use requests or request.Session() explicitly at all in this application. My code looks basically like this -- \r\n\r\nr = (grequests.head(url, callback=timeme, timeout=3, allow_redirects=False, stream = False) for url in urls)\r\nresults = grequests.map(r, size=max_workers, exception_handler=handle_errors)",
"i have a simple generator of the issue. It takes 50 urls, and repeatedly does a grequest.head() request (every 15 seconds) . The number of open CLOSE_WAIT stabilizes at around 380 and rises to 481 and back to 309 and this continues. The difference in this app is that I am not using a background process to start the same type of code. I use a background thread which manages a Queue -- batches of jobs just like the one running standing alone are queued and started when the one before it ends -- Not sure how that can make a difference.. But the Close_Waits clear faster in this test case. I dropped the sleep time to 5 seconds and it still stabilizes, albeit with a higher number of Close_Waits on avg. ",
"interestingly -- I run it on my mac and it works with no residual open connections .. ",
"@steelliberty Is your Mac running PyPy or CPython?",
"The reason I ask is that the likely problem here is that you're letting the Session leak out of scope and therefore are causing all the CPython connections to get closed by the refcounting GC, while PyPy's non-refcounting GC hasn't yet encountered enough memory pressure to evict those sockets from memory (and so close them).\n\nYou can resolve this by changing your code to this:\n\n```python\nwith requests.Session() as s:\n r = (grequests.head(url, callback=timeme, timeout=3, allow_redirects=False, stream = False, session=s) for url in urls)\n results = grequests.map(r, size=max_workers, exception_handler=handle_errors)\n```\n\nThis will force all the requests to use the same `Session` object, which will ensure that you don't leak the connections. Additionally, when you exit the `with` block, all the `Session` connections will be destroyed eagerly, rather than relying on the GC to clean them up.",
"This sounds like what is happening is exactly what I think can solve the problem. I failed to mention to you I act on the sessions right away, and after the code below if there are status_codes which warrant retesting, I basically hand those urls off to a similar block of code. So I am not sure how they are staying open. Thank you .. I will quickly try this. \r\n\r\nr = (grequests.head(url, callback=timeme, timeout=5, allow_redirects=False, stream = False) for url in urls)\r\n results = grequests.map(r, size=max_workers, exception_handler=handle_errors)\r\n\r\n if results:\r\n for x in list(results): #list of request objects for each url\r\n if x == None: #if none then url error_handled (where I close socket during processing)\r\n continue\r\n try:\r\n url = x.url.rstrip('/') \r\n if x.history:\r\n print \"HISTORY \",url, x.history\r\n sc = x.status_code #results status_code\r\n x.close() #explicitly closing the socket here \r\n ....\r\n further processing. \r\n\r\n \r\n",
"@steelliberty `Response.close` does *not* close the underlying socket, it just stops making it available to the response and returns it to the connection pool. You have to use `Session` objects to get the control over lifetime you want.",
"I see .. makes sense .. in the error_handler I have access at AsyncRequest.session.close() . .. \r\n\r\nLet me test this and I'll let you know. Thank you. \r\n",
"I *strongly* recommend using the `with` statement and `Session` instead.",
"seems more efficient then creating separate sessions -- and clearly easy to eliminate. ",
"That did the trick ! .. Thank you Cory. You saved me many more hours of searching for a solution. \r\n",
"Happy to help! I hope you find that your script now behaves much better.",
"It is running perfectly now .. other tiny issues vanished with that fix :)\n\n\nOn Fri, Apr 21, 2017 at 10:56 AM, Cory Benfield <[email protected]>\nwrote:\n\n> Closed #3975 <https://github.com/kennethreitz/requests/issues/3975>.\n>\n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/kennethreitz/requests/issues/3975#event-1052580401>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/AE4Hdv-LcMEy-Ys_6NO9QAUTQkoNBi9sks5ryMOhgaJpZM4NB3Ql>\n> .\n>\n"
] |
https://api.github.com/repos/psf/requests/issues/3974
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3974/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3974/comments
|
https://api.github.com/repos/psf/requests/issues/3974/events
|
https://github.com/psf/requests/issues/3974
| 222,688,600 |
MDU6SXNzdWUyMjI2ODg2MDA=
| 3,974 |
'headers' field does not allow multiple headers with the same name
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9937444?v=4",
"events_url": "https://api.github.com/users/ugultopu/events{/privacy}",
"followers_url": "https://api.github.com/users/ugultopu/followers",
"following_url": "https://api.github.com/users/ugultopu/following{/other_user}",
"gists_url": "https://api.github.com/users/ugultopu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ugultopu",
"id": 9937444,
"login": "ugultopu",
"node_id": "MDQ6VXNlcjk5Mzc0NDQ=",
"organizations_url": "https://api.github.com/users/ugultopu/orgs",
"received_events_url": "https://api.github.com/users/ugultopu/received_events",
"repos_url": "https://api.github.com/users/ugultopu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ugultopu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ugultopu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ugultopu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-04-19T10:16:37Z
|
2021-09-08T11:00:32Z
|
2017-04-19T10:56:59Z
|
NONE
|
resolved
|
A response returns multiple `Set-Cookie` headers. When I try to see these using `response.headers`, only a single `Set-Cookie` header is shown. I expected all `Set-Cookie` headers to be shown.
Is it intentional to show only one header key-value pair per header key?
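A minimal sketch of the behaviour in question (the URL is hypothetical; the point is the difference between the joined header value and the cookie jar):

```python
import requests

# Hypothetical endpoint that sends two Set-Cookie headers in one response.
r = requests.get("https://example.com/login")

# All Set-Cookie values are folded into a single comma-joined string here.
print(r.headers.get("Set-Cookie"))

# The individual cookies are still available through the cookie jar.
for cookie in r.cookies:
    print(cookie.name, cookie.value)
```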
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3974/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3974/timeline
| null |
completed
| null | null | false |
[
"Requests does show multiples, but it joins them into a single header field. This allows users to get a slightly more intuitive behaviour than returning lists, though it does reduce fidelity slightly. "
] |
https://api.github.com/repos/psf/requests/issues/3973
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3973/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3973/comments
|
https://api.github.com/repos/psf/requests/issues/3973/events
|
https://github.com/psf/requests/issues/3973
| 222,489,923 |
MDU6SXNzdWUyMjI0ODk5MjM=
| 3,973 |
ETA on 2.14? + docs request on release schedule
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4",
"events_url": "https://api.github.com/users/jvanasco/events{/privacy}",
"followers_url": "https://api.github.com/users/jvanasco/followers",
"following_url": "https://api.github.com/users/jvanasco/following{/other_user}",
"gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jvanasco",
"id": 204779,
"login": "jvanasco",
"node_id": "MDQ6VXNlcjIwNDc3OQ==",
"organizations_url": "https://api.github.com/users/jvanasco/orgs",
"received_events_url": "https://api.github.com/users/jvanasco/received_events",
"repos_url": "https://api.github.com/users/jvanasco/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jvanasco",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2017-04-18T17:31:40Z
|
2021-09-08T11:00:31Z
|
2017-04-19T16:35:33Z
|
CONTRIBUTOR
|
resolved
|
1. Is there an ETA on 2.14? I couldn't find any milestone info and there seems to be a few dozen PRs merged since January.
2. There should probably be a line about the release schedule in the contributors guide or elsewhere, so people like me don't ask annoying questions.
I don't mean to prod for a release, just to get a general idea of when one will likely be made. I've got a couple of tiny open source projects that rely on a handful of the forthcoming changes and just want to know when I should schedule an afternoon for packaging and updates.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3973/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3973/timeline
| null |
completed
| null | null | false |
[
"Heh, there is no schedule. The schedule is \"whenever the core devs feel that a release is warranted\". In this case, I have an OmniFocus task for this but am currently on holiday, so would expect that we'll release next week. \r\n\r\nGenerally speaking I'd say that a schedule doesn't work for a project like this one: we're sufficiently small in scope that it's pretty easy to coordinate the work. However, if you disagree I'd be interested to hear it. ",
"Well, I'm not talking about an \"official\" release schedule... just that a lot of projects tend to have something general like \"we aim for a release every X months or Y pull requests/core changes\".\r\n\r\nI had actually looked through the history to try and predict when a release happens, and it tends to be around 3 months (sometimes 2, sometimes 4). Looking at the commit history though, it seems like there were a lot more PRs merged than usual since 2.13.0... so I expected 2.14 to be on the earlier side. \r\n\r\n2.4 - 2014/08/29\r\n2.5 - 2014/12/01\r\n2.6 - 2015/03/14\r\n2.7 - 2015/05/03\r\n2.8 - 2015/10/08\r\n2.9 - 2015/12/15\r\n2.10 - 2016/04/29\r\n2.11 - 2016/08/06\r\n2.12 - 2016/11/15\r\n2.12 - 2017/01/24\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n",
"Heh, frankly it's looking like another 3 month period basically on the nose. \r\n\r\nI have no issue with people doing averages of our release cycle, but I'm nervous about providing that analysis ourselves. It changes the questions from \"when is the next release?\" to \"why is there no release yet?\", which isn't really a net improvement. ",
"We have no release cadence other than coincidentally. "
] |
https://api.github.com/repos/psf/requests/issues/3972
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3972/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3972/comments
|
https://api.github.com/repos/psf/requests/issues/3972/events
|
https://github.com/psf/requests/issues/3972
| 222,472,324 |
MDU6SXNzdWUyMjI0NzIzMjQ=
| 3,972 |
Custom Handler
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5131271?v=4",
"events_url": "https://api.github.com/users/achapkowski/events{/privacy}",
"followers_url": "https://api.github.com/users/achapkowski/followers",
"following_url": "https://api.github.com/users/achapkowski/following{/other_user}",
"gists_url": "https://api.github.com/users/achapkowski/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/achapkowski",
"id": 5131271,
"login": "achapkowski",
"node_id": "MDQ6VXNlcjUxMzEyNzE=",
"organizations_url": "https://api.github.com/users/achapkowski/orgs",
"received_events_url": "https://api.github.com/users/achapkowski/received_events",
"repos_url": "https://api.github.com/users/achapkowski/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/achapkowski/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/achapkowski/subscriptions",
"type": "User",
"url": "https://api.github.com/users/achapkowski",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2017-04-18T16:24:53Z
|
2021-09-08T11:00:32Z
|
2017-04-19T12:32:41Z
|
NONE
|
resolved
|
How do you extend requests to use a custom handler for a security scheme?
I'm looking for documentation on how to extend the API basically.
Thanks
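A minimal sketch of one way to do this by subclassing `requests.auth.AuthBase` (the token endpoint, payload fields, response format, and expiry window here are all hypothetical):

```python
import time

import requests
from requests.auth import AuthBase


class TokenAuth(AuthBase):
    """Fetches a token and re-fetches it shortly before it expires."""

    def __init__(self, token_url, username, password, lifetime=3600):
        self.token_url = token_url      # hypothetical token endpoint
        self.username = username
        self.password = password
        self.lifetime = lifetime        # seconds the token stays valid
        self._token = None
        self._expires_at = 0

    def _refresh(self):
        resp = requests.post(self.token_url, data={
            "username": self.username,
            "password": self.password,
        })
        resp.raise_for_status()
        self._token = resp.json()["token"]   # hypothetical response field
        self._expires_at = time.time() + self.lifetime

    def __call__(self, r):
        # Refresh a minute before expiry so in-flight requests don't race it.
        if self._token is None or time.time() >= self._expires_at - 60:
            self._refresh()
        r.headers["Authorization"] = "Bearer %s" % self._token
        return r


# usage
# session = requests.Session()
# session.auth = TokenAuth("https://example.com/token", "user", "secret")
```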
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5131271?v=4",
"events_url": "https://api.github.com/users/achapkowski/events{/privacy}",
"followers_url": "https://api.github.com/users/achapkowski/followers",
"following_url": "https://api.github.com/users/achapkowski/following{/other_user}",
"gists_url": "https://api.github.com/users/achapkowski/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/achapkowski",
"id": 5131271,
"login": "achapkowski",
"node_id": "MDQ6VXNlcjUxMzEyNzE=",
"organizations_url": "https://api.github.com/users/achapkowski/orgs",
"received_events_url": "https://api.github.com/users/achapkowski/received_events",
"repos_url": "https://api.github.com/users/achapkowski/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/achapkowski/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/achapkowski/subscriptions",
"type": "User",
"url": "https://api.github.com/users/achapkowski",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3972/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3972/timeline
| null |
completed
| null | null | false |
[
"What specific kind of API extension do you have in mind? Generally speaking you'd do it by subclassing the Session object.",
"I have a site that is token based security, and I want to create a handler to work with this. So I know the username, password, expiration, and token url, and I want the authentication handler to handle the creation and update of the token based on a given time frame. ",
"It's token based, but not oauth? You may find Requests-Oauthlib's design instructive of how to handle a complex token based auth system. ",
"Thanks, I think I figured it out. So you inherit from AuthBase, then you write your logic in ```__call__```\r\n\r\nMy issue now is how do I insert the token parameter into a multi-part request. For regular POST and GET calls it's easy. Just append the token to the body or parameters off of the object ```r```.\r\n\r\nIf I insert the form-data into the r.body of the form request, I a message from server saying it's not valid:\r\n\r\nReal quick and dirty method:\r\n\r\n```\r\ntoken = \"abc123\r\nitem = '--%s\\r\\nContent-Disposition: form-data; name=\"token\"\\r\\n\\r\\n%s\\r\\n' % \\\r\n (r.headers['Content-type'].split(';')[1].strip().split('=')[1], token)\r\nr.body = str.encode(item) + r.body\r\n```\r\n\r\nIt looks right from Fiddler, but apparently it isn't",
"There is a body terminating boundary (`--uuid--`) that comes last, after which data will usually be ignored. You'd also need to update the content-length, which you may not have done.\r\n\r\nEssentially, you'll want to insert your data *before* the terminating boundary, not after.",
"ahh thanks for that! that fixed it. \r\n\r\nThank you for all the help"
] |
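Following on from the exchange above, a minimal sketch of inserting an extra form field before the terminating boundary of a prepared multipart body and fixing up `Content-Length` (the field name and the way the token is obtained are hypothetical; `r` is the `PreparedRequest` an auth handler receives):

```python
def add_token_field(r, token):
    # The boundary is carried in the multipart Content-Type header.
    boundary = r.headers["Content-Type"].split("boundary=")[1]

    field = (
        '--%s\r\nContent-Disposition: form-data; name="token"\r\n\r\n%s\r\n'
        % (boundary, token)
    ).encode()

    # Insert the new part just before the terminating boundary (--boundary--),
    # not after it, otherwise the server will usually ignore the data.
    terminator = ("--%s--" % boundary).encode()
    index = r.body.rfind(terminator)
    r.body = r.body[:index] + field + r.body[index:]

    # The body grew, so Content-Length has to be updated to match.
    r.headers["Content-Length"] = str(len(r.body))
    return r
```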
https://api.github.com/repos/psf/requests/issues/3971
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3971/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3971/comments
|
https://api.github.com/repos/psf/requests/issues/3971/events
|
https://github.com/psf/requests/issues/3971
| 222,419,006 |
MDU6SXNzdWUyMjI0MTkwMDY=
| 3,971 |
Requests.Get Response Includes Unexpected Values
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/26935905?v=4",
"events_url": "https://api.github.com/users/skyfxl/events{/privacy}",
"followers_url": "https://api.github.com/users/skyfxl/followers",
"following_url": "https://api.github.com/users/skyfxl/following{/other_user}",
"gists_url": "https://api.github.com/users/skyfxl/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/skyfxl",
"id": 26935905,
"login": "skyfxl",
"node_id": "MDQ6VXNlcjI2OTM1OTA1",
"organizations_url": "https://api.github.com/users/skyfxl/orgs",
"received_events_url": "https://api.github.com/users/skyfxl/received_events",
"repos_url": "https://api.github.com/users/skyfxl/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/skyfxl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/skyfxl/subscriptions",
"type": "User",
"url": "https://api.github.com/users/skyfxl",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-18T13:35:31Z
|
2021-09-08T10:00:44Z
|
2017-05-22T15:34:23Z
|
NONE
|
resolved
|
Hi,
I encountered an issue when using the requests.get command with Python 3.5.1. I described the issue in [this Stack Overflow post](http://stackoverflow.com/questions/43218259/python-requests-get-response-includes-unexpected-values). The solution was to use Python 2.7.13, so I'm not 100% sure whether the issue is related to the base Python package or how the requests library works with the base packages, but I thought I would post the issue just in case.
Hope that helps!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3971/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3971/timeline
| null |
completed
| null | null | false |
[
"That looks very much like a bug in httplib. If it's possible for us to recreate the bad request from our own machines we can work out exactly what's going on: otherwise we'll need to wait for a good repro. ",
"LUCKIEST HUMAN BEING NOW!!!. After setting the HTTP on the same server, Requests was able to return results from the server instead waiting infinitely. Here is what response.content has:\r\n\r\n`b'<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.0 Transitional//EN\">\\r\\n<html>\\r\\n<head>\\r\\n<title>504</title>\\r\\n<meta http-equiv=\"Cache-Control\" content=\"no-cache\"/>\\r\\n</head>\\r\\n<body>\\r\\n<p>\\r\\nGateway Timeout \\r\\n</p>\\r\\n</body>\\r\\n</html>\\r\\n'`\r\n\r\nAfter five days, solution is near. Have to hunt down the timeout setting on the server that is doing this havoc"
] |
https://api.github.com/repos/psf/requests/issues/3970
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3970/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3970/comments
|
https://api.github.com/repos/psf/requests/issues/3970/events
|
https://github.com/psf/requests/issues/3970
| 222,361,541 |
MDU6SXNzdWUyMjIzNjE1NDE=
| 3,970 |
requests.exceptions.ConnectionError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/26059420?v=4",
"events_url": "https://api.github.com/users/srikanth-chandaluri/events{/privacy}",
"followers_url": "https://api.github.com/users/srikanth-chandaluri/followers",
"following_url": "https://api.github.com/users/srikanth-chandaluri/following{/other_user}",
"gists_url": "https://api.github.com/users/srikanth-chandaluri/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/srikanth-chandaluri",
"id": 26059420,
"login": "srikanth-chandaluri",
"node_id": "MDQ6VXNlcjI2MDU5NDIw",
"organizations_url": "https://api.github.com/users/srikanth-chandaluri/orgs",
"received_events_url": "https://api.github.com/users/srikanth-chandaluri/received_events",
"repos_url": "https://api.github.com/users/srikanth-chandaluri/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/srikanth-chandaluri/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/srikanth-chandaluri/subscriptions",
"type": "User",
"url": "https://api.github.com/users/srikanth-chandaluri",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2017-04-18T09:53:10Z
|
2021-02-11T14:22:11Z
|
2017-04-18T16:05:04Z
|
NONE
|
off-topic
|
`response = requests.post(url, data=payload, verify=False, timeout=(3.05, 6.05))`
works fine in the plain Python shell but not in `python manage.py shell` (Django).
I am getting the error below:
'requests.exceptions.ConnectionError: HTTPConnectionPool(host='ggk-wrl-win1', port=9090): Max retries exceeded with url: /home/trusted/token (Caused by NewConnectionError('<requests.packages.urllib3.connec)'
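A quick check for whether the host name resolves from inside the failing environment (essentially the diagnostic suggested later in the thread):

```python
import socket

# If this raises socket.gaierror, the problem is name resolution in this
# environment rather than anything requests is doing.
s = socket.create_connection(("ggk-wrl-win1", 80))
s.close()
```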
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3970/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3970/timeline
| null |
completed
| null | null | false |
[
"You have truncated the exception message so I can't see what it is, do you mind providing the full exception string?",
"\r\n\r\n",
"You're encountering a DNS error for the host name `ggk-wrl-win1`. Can you confirm your network is set up correctly?",
"yes, setup is good\r\n",
"From inside `manage.py`, can you run:\n\n```python\nimport socket\ns = socket.create_connection(('ggk-wrl-win1', 80))\n```\n\nand tell me what happens please?",
"\r\n",
"So, as you can see, this is a problem with your environment and not requests. Inside your environment that host name is not resolvable. We cannot help you work out why that is.",
"but, its works in python shell\r\n\r\n",
"Yes, which strongly suggests that your specific environment is at fault. You need to diagnose why the manage.py shell won't make outbound network connections. ",
"I think the issue with my docker setup, thank you",
"@srikanth-chandaluri are you find solution ?",
"> I think the issue with my docker setup, thank you\r\n\r\nare you find solution\r\n"
] |
https://api.github.com/repos/psf/requests/issues/3969
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3969/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3969/comments
|
https://api.github.com/repos/psf/requests/issues/3969/events
|
https://github.com/psf/requests/pull/3969
| 222,195,637 |
MDExOlB1bGxSZXF1ZXN0MTE2MTc3NTM4
| 3,969 |
fix handle of non-ascii location on redirects issue #3888
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23416546?v=4",
"events_url": "https://api.github.com/users/shmuelamar/events{/privacy}",
"followers_url": "https://api.github.com/users/shmuelamar/followers",
"following_url": "https://api.github.com/users/shmuelamar/following{/other_user}",
"gists_url": "https://api.github.com/users/shmuelamar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shmuelamar",
"id": 23416546,
"login": "shmuelamar",
"node_id": "MDQ6VXNlcjIzNDE2NTQ2",
"organizations_url": "https://api.github.com/users/shmuelamar/orgs",
"received_events_url": "https://api.github.com/users/shmuelamar/received_events",
"repos_url": "https://api.github.com/users/shmuelamar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shmuelamar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shmuelamar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shmuelamar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2017-04-17T18:34:45Z
|
2021-09-06T00:07:03Z
|
2017-04-18T16:20:23Z
|
CONTRIBUTOR
|
resolved
|
fix for #3888
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3969/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3969/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3969.diff",
"html_url": "https://github.com/psf/requests/pull/3969",
"merged_at": "2017-04-18T16:20:23Z",
"patch_url": "https://github.com/psf/requests/pull/3969.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3969"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=h1) Report\n> Merging [#3969](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/fd7dcd278557c517a0df6f6a4415c4922e6bd0d5?src=pr&el=desc) will **increase** coverage by `0.01%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3969 +/- ##\n==========================================\n+ Coverage 89.72% 89.73% +0.01% \n==========================================\n Files 15 15 \n Lines 1946 1949 +3 \n==========================================\n+ Hits 1746 1749 +3 \n Misses 200 200\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.75% <100%> (+0.06%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=footer). Last update [fd7dcd2...722b1da](https://codecov.io/gh/kennethreitz/requests/pull/3969?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).\n",
"thanks for the quick reply, if you think we should make the test parameterized with some more location headers on it, let me know.",
"Yep, looks good to me too!",
"Alright let's do it!",
"great thanks for merging and the fast responses! 😃 "
] |
https://api.github.com/repos/psf/requests/issues/3968
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3968/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3968/comments
|
https://api.github.com/repos/psf/requests/issues/3968/events
|
https://github.com/psf/requests/issues/3968
| 222,060,438 |
MDU6SXNzdWUyMjIwNjA0Mzg=
| 3,968 |
URLs with Square Brackets []
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3072436?v=4",
"events_url": "https://api.github.com/users/apollodatasolutions/events{/privacy}",
"followers_url": "https://api.github.com/users/apollodatasolutions/followers",
"following_url": "https://api.github.com/users/apollodatasolutions/following{/other_user}",
"gists_url": "https://api.github.com/users/apollodatasolutions/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/apollodatasolutions",
"id": 3072436,
"login": "apollodatasolutions",
"node_id": "MDQ6VXNlcjMwNzI0MzY=",
"organizations_url": "https://api.github.com/users/apollodatasolutions/orgs",
"received_events_url": "https://api.github.com/users/apollodatasolutions/received_events",
"repos_url": "https://api.github.com/users/apollodatasolutions/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/apollodatasolutions/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/apollodatasolutions/subscriptions",
"type": "User",
"url": "https://api.github.com/users/apollodatasolutions",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2017-04-17T03:55:19Z
|
2021-09-08T11:00:33Z
|
2017-04-17T22:27:51Z
|
NONE
|
resolved
|
We were using Requests with FileStack without issues until we ran into this URL format:
**https://process.filestackapi.com/store=l:S3,p:"/watermarked/"/watermark=f:wUaFDoPfRAiT50LQZWs9,p:[bottom,right],s:63/p2aaZ8pzS8md1vLzY7V2**
the **p:[bottom,right]** part seems to be causing issues (the request doesn't properly make it to their end). FileStack's documentation suggests using curl with the -g option:
> -g, --globoff
>
> This option switches off the "URL globbing parser". When you set this option, you can specify URLs that contain the letters {}[] without having them being interpreted by curl itself. Note that these letters are not normal legal URL contents but they should be encoded according to the URI standard.
> https://curl.haxx.se/docs/manpage.html
Is there a similar option for Requests? I saw that it's not a valid URI, so I am also talking to FileStack to resolve this, but I am wondering whether there is a temporary solution while I wait for them.
Looking at the response URL, it seems the quotes are encoded but not the brackets?
**https://process.filestackapi.com/store=l:S3,p:%22/watermarked/%22/watermark=f:9shDBUAMSMm58i5r3Ujr,p:[bottom,right],s:17/uj16GiSQa2bzDCVY34vw**
but the same happens in the p:center case which works just fine.
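One temporary workaround, pending a fix on the FileStack side, is to percent-encode the bracket characters yourself before handing the URL to Requests; a minimal sketch using the URL from this report (Requests will not re-encode the existing percent-escapes):

```python
import requests

url = ('https://process.filestackapi.com/store=l:S3,p:"/watermarked/"'
       '/watermark=f:wUaFDoPfRAiT50LQZWs9,p:[bottom,right],s:63/p2aaZ8pzS8md1vLzY7V2')

# Encode only the characters the server is tripping over.
encoded = url.replace('[', '%5B').replace(']', '%5D')

response = requests.get(encoded)
```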
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3072436?v=4",
"events_url": "https://api.github.com/users/apollodatasolutions/events{/privacy}",
"followers_url": "https://api.github.com/users/apollodatasolutions/followers",
"following_url": "https://api.github.com/users/apollodatasolutions/following{/other_user}",
"gists_url": "https://api.github.com/users/apollodatasolutions/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/apollodatasolutions",
"id": 3072436,
"login": "apollodatasolutions",
"node_id": "MDQ6VXNlcjMwNzI0MzY=",
"organizations_url": "https://api.github.com/users/apollodatasolutions/orgs",
"received_events_url": "https://api.github.com/users/apollodatasolutions/received_events",
"repos_url": "https://api.github.com/users/apollodatasolutions/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/apollodatasolutions/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/apollodatasolutions/subscriptions",
"type": "User",
"url": "https://api.github.com/users/apollodatasolutions",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3968/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3968/timeline
| null |
completed
| null | null | false |
[
"Are you sure you need it? Requests won't treat them as globs, so that's not a concern. We will encode them, but the snippet of documentation you provided suggested that that is fine. Have you actually tried the request at all?",
"Yes, I tried it without the square brackets, no luck. And yes, I tried the request. I get a 400 error back from FileStack only when the brackets are present.\r\n\r\nThe last URL I posted above is from the response.url property after the request is made. Looks like the quotes are encoded but the brackets are not. I'll see if FileStack has any news on this. Not sure if this is a requests or FileStack issue at this point, will update when I know more. Might be good leaving this issue just in case anyone runs into a similar problem.",
"Looks like the solution was simple, just manually encode the bracket characters like this:\r\n\r\n**(https://process.filestackapi.com/store=l:S3,p:%22/watermarked/%22/watermark=f:wUaFDoPfRAiT60LQZWs9,p:%5Bbottom,right%5D,s:17/nPh71zWTTCmQXD9ho2ZF)**"
] |
https://api.github.com/repos/psf/requests/issues/3967
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3967/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3967/comments
|
https://api.github.com/repos/psf/requests/issues/3967/events
|
https://github.com/psf/requests/issues/3967
| 222,057,659 |
MDU6SXNzdWUyMjIwNTc2NTk=
| 3,967 |
waiting for a very long time for the reponse
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10825713?v=4",
"events_url": "https://api.github.com/users/etiaxie/events{/privacy}",
"followers_url": "https://api.github.com/users/etiaxie/followers",
"following_url": "https://api.github.com/users/etiaxie/following{/other_user}",
"gists_url": "https://api.github.com/users/etiaxie/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/etiaxie",
"id": 10825713,
"login": "etiaxie",
"node_id": "MDQ6VXNlcjEwODI1NzEz",
"organizations_url": "https://api.github.com/users/etiaxie/orgs",
"received_events_url": "https://api.github.com/users/etiaxie/received_events",
"repos_url": "https://api.github.com/users/etiaxie/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/etiaxie/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/etiaxie/subscriptions",
"type": "User",
"url": "https://api.github.com/users/etiaxie",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-17T03:24:43Z
|
2021-09-08T10:00:50Z
|
2017-05-15T16:15:42Z
|
NONE
|
resolved
|
python version:2.7
requests version:2.11.1
When the response is expected to be a redirect (301), it takes a long time to get the response after I send out the requests.get call.
However, in Wireshark I can see that the response is returned immediately; it seems the time is spent somewhere in the code after the response has been received.
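One way to narrow down where the time goes is to disable redirect following and look at `Response.elapsed`, which only measures the time until the headers are parsed (the URL is a placeholder):

```python
import requests

r = requests.get("http://example.com/old-path", allow_redirects=False, timeout=10)

print(r.status_code)                  # expect 301 here
print(r.elapsed)                      # time from sending the request until headers arrived
print(r.headers.get("Location"))      # where the redirect points
```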
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3967/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3967/timeline
| null |
completed
| null | null | false |
[
"Is the response body complete immediately? Can you provide a URL to reproduce this error?",
"This has been stale for nearly a month. We can reopen this if we get more information and can determine that this is actually a bug in Request."
] |
https://api.github.com/repos/psf/requests/issues/3966
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3966/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3966/comments
|
https://api.github.com/repos/psf/requests/issues/3966/events
|
https://github.com/psf/requests/issues/3966
| 222,045,805 |
MDU6SXNzdWUyMjIwNDU4MDU=
| 3,966 |
_ssl.c:510: error:14077438
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13785009?v=4",
"events_url": "https://api.github.com/users/Elias-gg/events{/privacy}",
"followers_url": "https://api.github.com/users/Elias-gg/followers",
"following_url": "https://api.github.com/users/Elias-gg/following{/other_user}",
"gists_url": "https://api.github.com/users/Elias-gg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Elias-gg",
"id": 13785009,
"login": "Elias-gg",
"node_id": "MDQ6VXNlcjEzNzg1MDA5",
"organizations_url": "https://api.github.com/users/Elias-gg/orgs",
"received_events_url": "https://api.github.com/users/Elias-gg/received_events",
"repos_url": "https://api.github.com/users/Elias-gg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Elias-gg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Elias-gg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Elias-gg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-04-17T01:10:09Z
|
2021-09-08T11:00:33Z
|
2017-04-17T12:20:27Z
|
NONE
|
resolved
|
I am currently running python 2.7.6 on C9.io
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3966/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3966/timeline
| null |
completed
| null | null | false |
[
"This is nowhere near enough information. Please let us know the installed version of Requests, all other packages installed in your environment, what host you're connecting to, and what the Python startup version data is."
] |
https://api.github.com/repos/psf/requests/issues/3965
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3965/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3965/comments
|
https://api.github.com/repos/psf/requests/issues/3965/events
|
https://github.com/psf/requests/issues/3965
| 221,865,041 |
MDU6SXNzdWUyMjE4NjUwNDE=
| 3,965 |
OpenSSL.SSL.SysCallError: (-1, 'Unexpected EOF')
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17933013?v=4",
"events_url": "https://api.github.com/users/ShepherdTsai/events{/privacy}",
"followers_url": "https://api.github.com/users/ShepherdTsai/followers",
"following_url": "https://api.github.com/users/ShepherdTsai/following{/other_user}",
"gists_url": "https://api.github.com/users/ShepherdTsai/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ShepherdTsai",
"id": 17933013,
"login": "ShepherdTsai",
"node_id": "MDQ6VXNlcjE3OTMzMDEz",
"organizations_url": "https://api.github.com/users/ShepherdTsai/orgs",
"received_events_url": "https://api.github.com/users/ShepherdTsai/received_events",
"repos_url": "https://api.github.com/users/ShepherdTsai/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ShepherdTsai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ShepherdTsai/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ShepherdTsai",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2017-04-14T18:35:36Z
|
2021-09-08T11:00:31Z
|
2017-04-14T18:56:30Z
|
NONE
|
resolved
|
I'm on Windows 10 with OpenSSL 1.0.1l 15 Jan 2015, python 3.4.3, Requests 2.13.10, and when attempting to connect to a website with the following code:
```
import requests
s = requests.Session()
res = s.get('https://ma.mohw.gov.tw/masearch/SearchBAS-101-1.aspx', verify=True)
```
It throws the following error:
```
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\packages\urllib3\contrib\pyopenssl.py", line 436, in wrap_socket
cnx.do_handshake()
File "C:\Python34\lib\site-packages\OpenSSL\SSL.py", line 1426, in do_handshake
self._raise_ssl_error(self._ssl, result)
File "C:\Python34\lib\site-packages\OpenSSL\SSL.py", line 1167, in _raise_ssl_error
raise SysCallError(-1, "Unexpected EOF")
OpenSSL.SSL.SysCallError: (-1, 'Unexpected EOF')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 600, in urlopen
chunked=chunked)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 345, in _make_request
self._validate_conn(conn)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 844, in _validate_conn
conn.connect()
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connection.py", line 326, in connect
ssl_context=context)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\util\ssl_.py", line 324, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "C:\Python34\lib\site-packages\requests\packages\urllib3\contrib\pyopenssl.py", line 443, in wrap_socket
raise ssl.SSLError('bad handshake: %r' % e)
ssl.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python34\lib\site-packages\requests\adapters.py", line 423, in send
timeout=timeout
File "C:\Python34\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 630, in urlopen
raise SSLError(e)
requests.packages.urllib3.exceptions.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:/Users/XXX/Desktop/MOHWcrawler/demo2.py", line 14, in <module>
res = s.get('https://ma.mohw.gov.tw/masearch/SearchBAS-101-1.aspx', verify=True)
File "C:\Python34\lib\site-packages\requests\sessions.py", line 501, in get
return self.request('GET', url, **kwargs)
File "C:\Python34\lib\site-packages\requests\sessions.py", line 488, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python34\lib\site-packages\requests\sessions.py", line 609, in send
r = adapter.send(request, **kwargs)
File "C:\Python34\lib\site-packages\requests\adapters.py", line 497, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: SysCallError(-1, 'Unexpected EOF')",)
```
I have tried the following method:
```
import requests
# import ssl
s = requests.Session()
# s.mount('https://', SSLAdapter(ssl.PROTOCOL_TLSv1_2))
res = s.get('https://ma.mohw.gov.tw/masearch/SearchBAS-101-1.aspx', verify=True)
```
and have already installed pyasn1 and ndg-httpsclient, but it doesn't work.
but if I use urllib package, then I can connect to the server
```
from urllib.request import urlopen
res = urlopen('https://ma.mohw.gov.tw/masearch/SearchBAS-101-1.aspx')
```
but I want to use the requests package to do further work.
Is there any way I can solve this problem?
thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3965/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3965/timeline
| null |
completed
| null | null | false |
[
"As you can see from the warnings [here](https://www.ssllabs.com/ssltest/analyze.html?d=ma.mohw.gov.tw&s=203.65.42.166&hideResults=on), this server has a pretty bad TLS config. In particular, the only cipher suite it supports was removed from the default Requests config because it is insecure in some uses. \r\n\r\nYou can resolve this by using the adapter discussed in [this comment](https://github.com/kennethreitz/requests/issues/3774#issuecomment-267871876). ",
"Using your method, i get 'super' object has no attribute 'init_poolmanager' error\r\nso I use bigbagboom's method, but it doesn't work too...",
"Can you show me exactly the code you used please?",
"```\r\nimport requests\r\nimport ssl\r\nfrom requests.adapters import HTTPAdapter\r\nfrom requests.packages.urllib3.poolmanager import PoolManager\r\nfrom requests.packages.urllib3.util.ssl_ import create_urllib3_context\r\n\r\nCIPHERS = (\r\n 'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:'\r\n 'DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:'\r\n '!eNULL:!MD5'\r\n)\r\n\r\nrequests.packages.urllib3.disable_warnings()\r\n\r\nclass DESAdapter(HTTPAdapter):\r\n def init_poolmanager(self, connections, maxsize, block=False,*args, **kwargs):\r\n context = create_urllib3_context(ciphers=CIPHERS)\r\n kwargs['ssl_context'] = context\r\n self.poolmanager = PoolManager(\r\n num_pools=connections, maxsize=maxsize,\r\n block=block, ssl_version=ssl.PROTOCOL_TLSv1_2,*args, **kwargs)\r\n\r\ns = requests.Session()\r\ns.mount('https://ma.mohw.gov.tw/', DESAdapter())\r\nres = s.get('https://ma.mohw.gov.tw/masearch/SearchBAS-101-1.aspx', verify=False)\r\n```\r\n\r\nsorry for my poor knowledge of this field, please give me some advice.",
"Remove the `ssl_version=ssl.PROTOCOL_TLSv1_2` and try again.",
"Tried, but got the same error",
"Hrm. Are you familiar with wireshark or tcpdump? It'd be interesting to see what the TLS handshake looks like with the adapter in place.",
"I'm not familiar but i try to use wireshark to grab the package and here I show a part of the handshake, which its protocol is SSL(Does this help me?) :\r\n\r\n```\r\ntime Source Destination Protocol Length Info\r\n2.048765\t192.168.2.100\t203.65.42.166\tSSL\t 321 \t Client Hello\r\n```\r\n```\r\nFrame 27: 321 bytes on wire (2568 bits), 321 bytes captured (2568 bits) on interface 0\r\n Interface id: 0 (\\Device\\NPF_{D2AC6833-3422-46BF-BB65-5386B5A4A82A})\r\n Encapsulation type: Ethernet (1)\r\n Arrival Time: Apr 16, 2017 00:59:27.606346000 Taipei Standard Time\r\n [Time shift for this packet: 0.000000000 seconds]\r\n Epoch Time: 1492275567.606346000 seconds\r\n [Time delta from previous captured frame: 0.000406000 seconds]\r\n [Time delta from previous displayed frame: 0.000406000 seconds]\r\n [Time since reference or first frame: 2.048765000 seconds]\r\n Frame Number: 27\r\n Frame Length: 321 bytes (2568 bits)\r\n Capture Length: 321 bytes (2568 bits)\r\n [Frame is marked: False]\r\n [Frame is ignored: False]\r\n [Protocols in frame: eth:ethertype:ip:tcp:ssl]\r\n [Coloring Rule Name: TCP]\r\n [Coloring Rule String: tcp]\r\nEthernet II, Src: Giga-Byt_7f:2a:ff (94:de:80:7f:2a:ff), Dst: SmcNetwo_f1:d7:9c (b8:9b:c9:f1:d7:9c)\r\n Destination: SmcNetwo_f1:d7:9c (b8:9b:c9:f1:d7:9c)\r\n Address: SmcNetwo_f1:d7:9c (b8:9b:c9:f1:d7:9c)\r\n .... ..0. .... .... .... .... = LG bit: Globally unique address (factory default)\r\n .... ...0 .... .... .... .... = IG bit: Individual address (unicast)\r\n Source: Giga-Byt_7f:2a:ff (94:de:80:7f:2a:ff)\r\n Address: Giga-Byt_7f:2a:ff (94:de:80:7f:2a:ff)\r\n .... ..0. .... .... .... .... = LG bit: Globally unique address (factory default)\r\n .... ...0 .... .... .... .... = IG bit: Individual address (unicast)\r\n Type: IPv4 (0x0800)\r\nInternet Protocol Version 4, Src: 192.168.2.100, Dst: 203.65.42.166\r\n 0100 .... = Version: 4\r\n .... 0101 = Header Length: 20 bytes (5)\r\n Differentiated Services Field: 0x00 (DSCP: CS0, ECN: Not-ECT)\r\n 0000 00.. = Differentiated Services Codepoint: Default (0)\r\n .... ..00 = Explicit Congestion Notification: Not ECN-Capable Transport (0)\r\n Total Length: 307\r\n Identification: 0x73b7 (29623)\r\n Flags: 0x02 (Don't Fragment)\r\n 0... .... = Reserved bit: Not set\r\n .1.. .... = Don't fragment: Set\r\n ..0. .... = More fragments: Not set\r\n Fragment offset: 0\r\n Time to live: 128\r\n Protocol: TCP (6)\r\n Header checksum: 0xcd19 [validation disabled]\r\n [Header checksum status: Unverified]\r\n Source: 192.168.2.100\r\n Destination: 203.65.42.166\r\n [Source GeoIP: Unknown]\r\n [Destination GeoIP: Unknown]\r\nTransmission Control Protocol, Src Port: 63602, Dst Port: 443, Seq: 1, Ack: 1, Len: 267\r\n Source Port: 63602\r\n Destination Port: 443\r\n [Stream index: 5]\r\n [TCP Segment Len: 267]\r\n Sequence number: 1 (relative sequence number)\r\n [Next sequence number: 268 (relative sequence number)]\r\n Acknowledgment number: 1 (relative ack number)\r\n Header Length: 20 bytes\r\n Flags: 0x018 (PSH, ACK)\r\n 000. .... .... = Reserved: Not set\r\n ...0 .... .... = Nonce: Not set\r\n .... 0... .... = Congestion Window Reduced (CWR): Not set\r\n .... .0.. .... = ECN-Echo: Not set\r\n .... ..0. .... = Urgent: Not set\r\n .... ...1 .... = Acknowledgment: Set\r\n .... .... 1... = Push: Set\r\n .... .... .0.. = Reset: Not set\r\n .... .... ..0. = Syn: Not set\r\n .... .... 
...0 = Fin: Not set\r\n [TCP Flags: ·······AP···]\r\n Window size value: 256\r\n [Calculated window size: 65536]\r\n [Window size scaling factor: 256]\r\n Checksum: 0x1b1c [unverified]\r\n [Checksum Status: Unverified]\r\n Urgent pointer: 0\r\n [SEQ/ACK analysis]\r\n [iRTT: 0.008987000 seconds]\r\n [Bytes in flight: 267]\r\n [Bytes sent since last PSH flag: 267]\r\nSecure Sockets Layer\r\n SSL Record Layer: Handshake Protocol: Client Hello\r\n Content Type: Handshake (22)\r\n Version: TLS 1.0 (0x0301)\r\n Length: 262\r\n Handshake Protocol: Client Hello\r\n Handshake Type: Client Hello (1)\r\n Length: 258\r\n Version: TLS 1.2 (0x0303)\r\n Random\r\n GMT Unix Time: May 31, 2083 19:49:57.000000000 Taipei Standard Time\r\n Random Bytes: 95a171c46ceb889817142eb99fdd07b4e07bec92dba53c46...\r\n Session ID Length: 0\r\n Cipher Suites Length: 124\r\n Cipher Suites (62 suites)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 (0xc02c)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 (0xc030)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 (0xc02b)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 (0xc02f)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_256_GCM_SHA384 (0x00a3)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 (0x009f)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_128_GCM_SHA256 (0x00a2)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 (0x009e)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_256_CCM_8 (0xc0af)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_256_CCM (0xc0ad)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 (0xc024)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 (0xc028)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA (0xc00a)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA (0xc014)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_256_CCM_8 (0xc0a3)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_256_CCM (0xc09f)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 (0x006b)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_256_CBC_SHA256 (0x006a)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_256_CBC_SHA (0x0039)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_256_CBC_SHA (0x0038)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_128_CCM_8 (0xc0ae)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_128_CCM (0xc0ac)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 (0xc023)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 (0xc027)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA (0xc009)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA (0xc013)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_128_CCM_8 (0xc0a2)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_128_CCM (0xc09e)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 (0x0067)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_128_CBC_SHA256 (0x0040)\r\n Cipher Suite: TLS_DHE_RSA_WITH_AES_128_CBC_SHA (0x0033)\r\n Cipher Suite: TLS_DHE_DSS_WITH_AES_128_CBC_SHA (0x0032)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 (0xcca9)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 (0xcca8)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_CAMELLIA_256_CBC_SHA384 (0xc073)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_CAMELLIA_256_CBC_SHA384 (0xc077)\r\n Cipher Suite: TLS_ECDHE_ECDSA_WITH_CAMELLIA_128_CBC_SHA256 (0xc072)\r\n Cipher Suite: TLS_ECDHE_RSA_WITH_CAMELLIA_128_CBC_SHA256 (0xc076)\r\n Cipher Suite: TLS_DHE_RSA_WITH_CHACHA20_POLY1305_SHA256 (0xccaa)\r\n Cipher Suite: TLS_DHE_RSA_WITH_CAMELLIA_256_CBC_SHA256 (0x00c4)\r\n Cipher Suite: TLS_DHE_DSS_WITH_CAMELLIA_256_CBC_SHA256 (0x00c3)\r\n Cipher Suite: 
TLS_DHE_RSA_WITH_CAMELLIA_128_CBC_SHA256 (0x00be)\r\n Cipher Suite: TLS_DHE_DSS_WITH_CAMELLIA_128_CBC_SHA256 (0x00bd)\r\n Cipher Suite: TLS_DHE_RSA_WITH_CAMELLIA_256_CBC_SHA (0x0088)\r\n Cipher Suite: TLS_DHE_DSS_WITH_CAMELLIA_256_CBC_SHA (0x0087)\r\n Cipher Suite: TLS_DHE_RSA_WITH_CAMELLIA_128_CBC_SHA (0x0045)\r\n Cipher Suite: TLS_DHE_DSS_WITH_CAMELLIA_128_CBC_SHA (0x0044)\r\n Cipher Suite: TLS_RSA_WITH_AES_256_GCM_SHA384 (0x009d)\r\n Cipher Suite: TLS_RSA_WITH_AES_128_GCM_SHA256 (0x009c)\r\n Cipher Suite: TLS_RSA_WITH_AES_256_CCM_8 (0xc0a1)\r\n Cipher Suite: TLS_RSA_WITH_AES_256_CCM (0xc09d)\r\n Cipher Suite: TLS_RSA_WITH_AES_128_CCM_8 (0xc0a0)\r\n Cipher Suite: TLS_RSA_WITH_AES_128_CCM (0xc09c)\r\n Cipher Suite: TLS_RSA_WITH_AES_256_CBC_SHA256 (0x003d)\r\n Cipher Suite: TLS_RSA_WITH_AES_128_CBC_SHA256 (0x003c)\r\n Cipher Suite: TLS_RSA_WITH_AES_256_CBC_SHA (0x0035)\r\n Cipher Suite: TLS_RSA_WITH_AES_128_CBC_SHA (0x002f)\r\n Cipher Suite: TLS_RSA_WITH_CAMELLIA_256_CBC_SHA256 (0x00c0)\r\n Cipher Suite: TLS_RSA_WITH_CAMELLIA_128_CBC_SHA256 (0x00ba)\r\n Cipher Suite: TLS_RSA_WITH_CAMELLIA_256_CBC_SHA (0x0084)\r\n Cipher Suite: TLS_RSA_WITH_CAMELLIA_128_CBC_SHA (0x0041)\r\n Cipher Suite: TLS_EMPTY_RENEGOTIATION_INFO_SCSV (0x00ff)\r\n Compression Methods Length: 1\r\n Compression Methods (1 method)\r\n Compression Method: null (0)\r\n Extensions Length: 93\r\n Extension: server_name\r\n Type: server_name (0x0000)\r\n Length: 19\r\n Server Name Indication extension\r\n Server Name list length: 17\r\n Server Name Type: host_name (0)\r\n Server Name length: 14\r\n Server Name: ma.mohw.gov.tw\r\n Extension: ec_point_formats\r\n Type: ec_point_formats (0x000b)\r\n Length: 4\r\n EC point formats Length: 3\r\n Elliptic curves point formats (3)\r\n EC point format: uncompressed (0)\r\n EC point format: ansiX962_compressed_prime (1)\r\n EC point format: ansiX962_compressed_char2 (2)\r\n Extension: elliptic_curves\r\n Type: elliptic_curves (0x000a)\r\n Length: 10\r\n Elliptic Curves Length: 8\r\n Elliptic curves (4 curves)\r\n Elliptic curve: ecdh_x25519 (0x001d)\r\n Elliptic curve: secp256r1 (0x0017)\r\n Elliptic curve: secp521r1 (0x0019)\r\n Elliptic curve: secp384r1 (0x0018)\r\n Extension: SessionTicket TLS\r\n Type: SessionTicket TLS (0x0023)\r\n Length: 0\r\n Data (0 bytes)\r\n Extension: signature_algorithms\r\n Type: signature_algorithms (0x000d)\r\n Length: 32\r\n Signature Hash Algorithms Length: 30\r\n Signature Hash Algorithms (15 algorithms)\r\n Signature Hash Algorithm: 0x0601\r\n Signature Hash Algorithm Hash: SHA512 (6)\r\n Signature Hash Algorithm Signature: RSA (1)\r\n Signature Hash Algorithm: 0x0602\r\n Signature Hash Algorithm Hash: SHA512 (6)\r\n Signature Hash Algorithm Signature: DSA (2)\r\n Signature Hash Algorithm: 0x0603\r\n Signature Hash Algorithm Hash: SHA512 (6)\r\n Signature Hash Algorithm Signature: ECDSA (3)\r\n Signature Hash Algorithm: 0x0501\r\n Signature Hash Algorithm Hash: SHA384 (5)\r\n Signature Hash Algorithm Signature: RSA (1)\r\n Signature Hash Algorithm: 0x0502\r\n Signature Hash Algorithm Hash: SHA384 (5)\r\n Signature Hash Algorithm Signature: DSA (2)\r\n Signature Hash Algorithm: 0x0503\r\n Signature Hash Algorithm Hash: SHA384 (5)\r\n Signature Hash Algorithm Signature: ECDSA (3)\r\n Signature Hash Algorithm: 0x0401\r\n Signature Hash Algorithm Hash: SHA256 (4)\r\n Signature Hash Algorithm Signature: RSA (1)\r\n Signature Hash Algorithm: 0x0402\r\n Signature Hash Algorithm Hash: SHA256 (4)\r\n Signature Hash Algorithm Signature: DSA (2)\r\n Signature 
Hash Algorithm: 0x0403\r\n Signature Hash Algorithm Hash: SHA256 (4)\r\n Signature Hash Algorithm Signature: ECDSA (3)\r\n Signature Hash Algorithm: 0x0301\r\n Signature Hash Algorithm Hash: SHA224 (3)\r\n Signature Hash Algorithm Signature: RSA (1)\r\n Signature Hash Algorithm: 0x0302\r\n Signature Hash Algorithm Hash: SHA224 (3)\r\n Signature Hash Algorithm Signature: DSA (2)\r\n Signature Hash Algorithm: 0x0303\r\n Signature Hash Algorithm Hash: SHA224 (3)\r\n Signature Hash Algorithm Signature: ECDSA (3)\r\n Signature Hash Algorithm: 0x0201\r\n Signature Hash Algorithm Hash: SHA1 (2)\r\n Signature Hash Algorithm Signature: RSA (1)\r\n Signature Hash Algorithm: 0x0202\r\n Signature Hash Algorithm Hash: SHA1 (2)\r\n Signature Hash Algorithm Signature: DSA (2)\r\n Signature Hash Algorithm: 0x0203\r\n Signature Hash Algorithm Hash: SHA1 (2)\r\n Signature Hash Algorithm Signature: ECDSA (3)\r\n Extension: encrypt then mac\r\n Type: encrypt then mac (0x0016)\r\n Length: 0\r\n Data (0 bytes)\r\n Extension: Extended Master Secret\r\n Type: Extended Master Secret (0x0017)\r\n Length: 0\r\n```",
"Yup, so that shows that the ciphers we wanted to add aren't being used. Out of interest, are you being redirected? (Set `allow_redirects=False`) to see what's happening.",
"oops, I set allow_redirects=False, but showed the same thing...",
"So when I copy/paste your sample code onto my machine and run it, I see 3DES in the TLS handshake (specifically, one of the ciphers is called `TLS_RSA_WITH_3DES_EDE_CBC_SHA (0x000a)`. Are you *sure* you were running exactly the sample code above?",
"That's strange, I have not dealt with this matter for a few days, but now it seems work well suddenly\r\nThank you very much for your assistance"
] |
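For context on the cipher-suite thread above: the usual way to widen the cipher list for a single host is a transport adapter that builds its own SSL context. The sketch below is illustrative only, not the exact adapter from the thread; it assumes a urllib3 new enough to expose `create_urllib3_context` (on older installs that vendor urllib3 the import path is `requests.packages.urllib3.util.ssl_`), and the cipher string is a placeholder. The hostname comes from the SNI field in the capture above.

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.ssl_ import create_urllib3_context

# Illustrative cipher string: roughly the defaults plus 3DES re-enabled.
CIPHERS = "ECDHE+AESGCM:DHE+AESGCM:ECDHE+AES:DHE+AES:RSA+AESGCM:RSA+AES:3DES:!aNULL:!eNULL"


class DESAdapter(HTTPAdapter):
    """Transport adapter that re-enables 3DES for the hosts it is mounted on."""

    def init_poolmanager(self, *args, **kwargs):
        # Build a context with the widened cipher list and hand it to urllib3.
        kwargs["ssl_context"] = create_urllib3_context(ciphers=CIPHERS)
        return super(DESAdapter, self).init_poolmanager(*args, **kwargs)

    def proxy_manager_for(self, *args, **kwargs):
        kwargs["ssl_context"] = create_urllib3_context(ciphers=CIPHERS)
        return super(DESAdapter, self).proxy_manager_for(*args, **kwargs)


s = requests.Session()
s.mount("https://ma.mohw.gov.tw", DESAdapter())
r = s.get("https://ma.mohw.gov.tw", allow_redirects=False)
print(r.status_code)
```

If the handshake capture still shows the default cipher list after mounting an adapter like this, the request is usually not going through the mounted prefix (for example because of a redirect to another host), which is what the exchange above is probing for.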
https://api.github.com/repos/psf/requests/issues/3964
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3964/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3964/comments
|
https://api.github.com/repos/psf/requests/issues/3964/events
|
https://github.com/psf/requests/issues/3964
| 221,757,931 |
MDU6SXNzdWUyMjE3NTc5MzE=
| 3,964 |
When I import requests, I get UnicodeDecodeError: 'ascii' codec can't decode byte 0xcf in position 1: ordinal not in range(128). How do I solve this?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16998719?v=4",
"events_url": "https://api.github.com/users/BigFishhhh/events{/privacy}",
"followers_url": "https://api.github.com/users/BigFishhhh/followers",
"following_url": "https://api.github.com/users/BigFishhhh/following{/other_user}",
"gists_url": "https://api.github.com/users/BigFishhhh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/BigFishhhh",
"id": 16998719,
"login": "BigFishhhh",
"node_id": "MDQ6VXNlcjE2OTk4NzE5",
"organizations_url": "https://api.github.com/users/BigFishhhh/orgs",
"received_events_url": "https://api.github.com/users/BigFishhhh/received_events",
"repos_url": "https://api.github.com/users/BigFishhhh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/BigFishhhh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/BigFishhhh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/BigFishhhh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-04-14T07:26:33Z
|
2021-09-08T11:00:33Z
|
2017-04-14T10:32:43Z
|
NONE
|
resolved
|
>>> import requests
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python2713\lib\site-packages\requests\__init__.py", line 52, in <module>
from .packages.urllib3.contrib import pyopenssl
File "C:\Python2713\lib\site-packages\requests\packages\urllib3\contrib\pyopenssl.py", line 47, in <module>
from cryptography import x509
File "C:\Python2713\lib\site-packages\cryptography\x509\__init__.py", line 7, in <module>
from cryptography.x509.base import (
File "C:\Python2713\lib\site-packages\cryptography\x509\base.py", line 16, in <module>
from cryptography.x509.extensions import Extension, ExtensionType
File "C:\Python2713\lib\site-packages\cryptography\x509\extensions.py", line 14, in <module>
from asn1crypto.keys import PublicKeyInfo
File "C:\Python2713\lib\site-packages\asn1crypto\keys.py", line 22, in <module>
from ._elliptic_curve import (
File "C:\Python2713\lib\site-packages\asn1crypto\_elliptic_curve.py", line 51, in <module>
from ._int import inverse_mod
File "C:\Python2713\lib\site-packages\asn1crypto\_int.py", line 56, in <module>
from ._perf._big_num_ctypes import libcrypto
File "C:\Python2713\lib\site-packages\asn1crypto\_perf\_big_num_ctypes.py", line 31, in <module>
libcrypto_path = find_library('crypto')
File "C:\Python2713\lib\ctypes\util.py", line 53, in find_library
fname = os.path.join(directory, name)
File "C:\Python2713\lib\ntpath.py", line 85, in join
result_path = result_path + p_path
UnicodeDecodeError: 'ascii' codec can't decode byte 0xcf in position 1: ordinal not in range(128)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3964/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3964/timeline
| null |
completed
| null | null | false |
[
"Hey there. You have raised this issue before, [here](https://github.com/kennethreitz/requests/issues/1252#issuecomment-291438569), and so I will give the same advice again: this appears to be a problem with asn1crypto, not Requests: please open an issue on their repository."
] |
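Issue #3964 above fails inside asn1crypto's call to `ctypes.util.find_library`, not in requests itself. The snippet below is a minimal Python 2 sketch of the mechanism visible in the traceback: `os.path.join` combines a byte-string `PATH` entry containing non-ASCII bytes with a unicode library name, and the implicit ASCII decode raises. The GBK-encoded directory is an assumed stand-in for such a `PATH` entry.

```python
# -*- coding: utf-8 -*-
# Python 2 only: reproduces the class of error shown in the #3964 traceback.
import os

path_entry = u'C:\\程序'.encode('gbk')  # byte-string PATH entry with non-ASCII bytes (assumed example)
library_name = u'crypto'               # unicode name, as passed down to find_library in the traceback

try:
    os.path.join(path_entry, library_name)  # bytes + unicode -> implicit ASCII decode
except UnicodeDecodeError as exc:
    print('reproduced: %s' % exc)
```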
https://api.github.com/repos/psf/requests/issues/3963
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3963/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3963/comments
|
https://api.github.com/repos/psf/requests/issues/3963/events
|
https://github.com/psf/requests/pull/3963
| 221,324,803 |
MDExOlB1bGxSZXF1ZXN0MTE1NTg3NDIy
| 3,963 |
fix unicode decode error on py2 when handling redirect without scheme
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23416546?v=4",
"events_url": "https://api.github.com/users/shmuelamar/events{/privacy}",
"followers_url": "https://api.github.com/users/shmuelamar/followers",
"following_url": "https://api.github.com/users/shmuelamar/following{/other_user}",
"gists_url": "https://api.github.com/users/shmuelamar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shmuelamar",
"id": 23416546,
"login": "shmuelamar",
"node_id": "MDQ6VXNlcjIzNDE2NTQ2",
"organizations_url": "https://api.github.com/users/shmuelamar/orgs",
"received_events_url": "https://api.github.com/users/shmuelamar/received_events",
"repos_url": "https://api.github.com/users/shmuelamar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shmuelamar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shmuelamar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shmuelamar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2017-04-12T16:49:28Z
|
2021-09-06T00:07:04Z
|
2017-04-14T20:13:50Z
|
CONTRIBUTOR
|
resolved
|
As discussed in #3959, this fixes the UnicodeDecodeError caused by implicit decoding of a str into unicode on Python 2.
The issue occurs when handling a redirect without a scheme (i.e. Location: //URL) on Python 2, and only if the location is unicode.
This is my first PR on GitHub, so let me know if there is anything I can do to make it better :).
Thanks,
Shmulik
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3963/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3963/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3963.diff",
"html_url": "https://github.com/psf/requests/pull/3963",
"merged_at": "2017-04-14T20:13:50Z",
"patch_url": "https://github.com/psf/requests/pull/3963.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3963"
}
| true |
[
"# [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=h1) Report\n> Merging [#3963](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=desc) into [master](https://codecov.io/gh/kennethreitz/requests/commit/84a60407628e83c004e3ea24a2d856a55d7a5c59?src=pr&el=desc) will **increase** coverage by `0.1%`.\n> The diff coverage is `100%`.\n\n[](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=tree)\n\n```diff\n@@ Coverage Diff @@\n## master #3963 +/- ##\n=========================================\n+ Coverage 89.61% 89.72% +0.1% \n=========================================\n Files 15 15 \n Lines 1946 1946 \n=========================================\n+ Hits 1744 1746 +2 \n+ Misses 202 200 -2\n```\n\n\n| [Impacted Files](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=tree) | Coverage Δ | |\n|---|---|---|\n| [requests/sessions.py](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=tree#diff-cmVxdWVzdHMvc2Vzc2lvbnMucHk=) | `93.68% <100%> (+0.74%)` | :arrow_up: |\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=continue).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta)\n> `Δ = absolute <relative> (impact)`, `ø = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=footer). Last update [84a6040...a3e597c](https://codecov.io/gh/kennethreitz/requests/pull/3963?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments).",
"agree, its probably better - changed"
] |
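PR #3963 above touches the scheme-relative redirect path in `Session.resolve_redirects`. The snippet below is a minimal Python 2 sketch of the general failure class it describes, namely mixing a non-ASCII byte string with unicode while rebuilding the redirect URL; it is not the actual requests code, and the Location value is an assumed example.

```python
# -*- coding: utf-8 -*-
# Python 2 only: illustrative reconstruction of the UnicodeDecodeError class
# addressed by PR #3963. Values are placeholders, not taken from the bug report.
scheme = u'https'                            # scheme taken from a unicode request URL
location = '//example.invalid/\xe8\xb7\xaf'  # raw Location header bytes containing non-ASCII

try:
    '%s:%s' % (scheme, location)             # unicode + non-ASCII bytes -> implicit ASCII decode
except UnicodeDecodeError as exc:
    print('reproduced: %s' % exc)
```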
https://api.github.com/repos/psf/requests/issues/3962
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3962/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3962/comments
|
https://api.github.com/repos/psf/requests/issues/3962/events
|
https://github.com/psf/requests/issues/3962
| 220,832,727 |
MDU6SXNzdWUyMjA4MzI3Mjc=
| 3,962 |
Set-Cookie header not handled
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4250691?v=4",
"events_url": "https://api.github.com/users/tutorial0/events{/privacy}",
"followers_url": "https://api.github.com/users/tutorial0/followers",
"following_url": "https://api.github.com/users/tutorial0/following{/other_user}",
"gists_url": "https://api.github.com/users/tutorial0/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tutorial0",
"id": 4250691,
"login": "tutorial0",
"node_id": "MDQ6VXNlcjQyNTA2OTE=",
"organizations_url": "https://api.github.com/users/tutorial0/orgs",
"received_events_url": "https://api.github.com/users/tutorial0/received_events",
"repos_url": "https://api.github.com/users/tutorial0/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tutorial0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tutorial0/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tutorial0",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2017-04-11T03:56:52Z
|
2021-09-08T11:00:34Z
|
2017-04-11T06:30:29Z
|
NONE
|
resolved
|
Test code:
```python
In [2]: rq = requests.get('http://httpbin.org/cookies/delete?k2=&k1=', cookies={'k2':'1'})
In [3]: rq.content
Out[3]: '{\n "cookies": {\n "k2": "1"\n }\n}\n'
```
Requests should delete `k2` and return `"cookies": {}`, but it doesn't.
It should behave like:
```
➜ ~ http http://httpbin.org/cookies/delete\?k1\=\&k1\=2 'Cookie: k1=1' -p Hb
GET /cookies/delete?k1=&k1=2 HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate
Connection: keep-alive
Cookie: k1=1
Host: httpbin.org
User-Agent: HTTPie/0.9.4
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>Redirecting...</title>
<h1>Redirecting...</h1>
<p>You should be redirected automatically to target URL: <a href="/cookies">/cookies</a>. If not click the link.
➜ ~ http -F http://httpbin.org/cookies/delete\?k1\=\&k1\=2 'Cookie: k1=1' -p Hb
GET /cookies HTTP/1.1
Accept: */*
Accept-Encoding: gzip, deflate
Connection: keep-alive
Host: httpbin.org
User-Agent: HTTPie/0.9.4
{
"cookies": {}
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3962/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3962/timeline
| null |
completed
| null | null | false |
[
"Yes, this is somewhat intended behaviour. For setting cookies like this Requests isn't really sure what the scope of the cookie is: is it only the scope of the single request domain and path? The whole domain? The domain and subdomains? All hosts?\r\n\r\nFor that reason, given that the remote site didn't set it, we ignore the deletion. As you can see from this, Requests does handle cookies properly:\r\n\r\n```python\r\n>>> s = requests.Session()\r\n>>> s.cookies\r\n<RequestsCookieJar[]>\r\n>>> r = s.get('http://httpbin.org/cookies')\r\n>>> r.content\r\nb'{\\n \"cookies\": {}\\n}\\n'\r\n>>> r = s.get('http://httpbin.org/cookies/set', params={'k1': '1'})\r\n>>> r.content\r\nb'{\\n \"cookies\": {\\n \"k1\": \"1\"\\n }\\n}\\n'\r\n>>> s.cookies\r\n<RequestsCookieJar[Cookie(version=0, name='k1', value='1', port=None, port_specified=False, domain='httpbin.org', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=False, expires=None, discard=True, comment=None, comment_url=None, rest={}, rfc2109=False)]>\r\n>>> r = s.get('http://httpbin.org/cookies/delete', params={'k1': ''})\r\n>>> r.content\r\nb'{\\n \"cookies\": {}\\n}\\n'\r\n```\r\n\r\nThe way to get cookies that behave the way you want is to use a `Session` and the cookiejar available at `Session.cookies`. Alternatively, you can do what httpie is doing, which is to not treat cookies specially at all and just set them as a regular header:\r\n\r\n```python\r\n>>> rq = requests.get('http://httpbin.org/cookies/delete?k2=&k1=', headers={'Cookie': 'k1=1'})\r\n>>> rq.content\r\nb'{\\n \"cookies\": {}\\n}\\n'\r\n```"
] |