url
stringlengths 50
53
| repository_url
stringclasses 1
value | labels_url
stringlengths 64
67
| comments_url
stringlengths 59
62
| events_url
stringlengths 57
60
| html_url
stringlengths 38
43
| id
int64 597k
2.65B
| node_id
stringlengths 18
32
| number
int64 1
6.83k
| title
stringlengths 1
296
| user
dict | labels
listlengths 0
5
| state
stringclasses 2
values | locked
bool 2
classes | assignee
dict | assignees
listlengths 0
4
| milestone
dict | comments
int64 0
211
| created_at
stringlengths 20
20
| updated_at
stringlengths 20
20
| closed_at
stringlengths 20
20
⌀ | author_association
stringclasses 3
values | active_lock_reason
stringclasses 4
values | body
stringlengths 0
65.6k
⌀ | closed_by
dict | reactions
dict | timeline_url
stringlengths 59
62
| performed_via_github_app
null | state_reason
stringclasses 3
values | draft
bool 2
classes | pull_request
dict | is_pull_request
bool 2
classes | issue_comments
listlengths 0
30
|
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/psf/requests/issues/3182
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3182/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3182/comments
|
https://api.github.com/repos/psf/requests/issues/3182/events
|
https://github.com/psf/requests/issues/3182
| 153,781,683 |
MDU6SXNzdWUxNTM3ODE2ODM=
| 3,182 |
Sending too much personal information in User-Anget. Risky
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6367897?v=4",
"events_url": "https://api.github.com/users/Zerokami/events{/privacy}",
"followers_url": "https://api.github.com/users/Zerokami/followers",
"following_url": "https://api.github.com/users/Zerokami/following{/other_user}",
"gists_url": "https://api.github.com/users/Zerokami/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Zerokami",
"id": 6367897,
"login": "Zerokami",
"node_id": "MDQ6VXNlcjYzNjc4OTc=",
"organizations_url": "https://api.github.com/users/Zerokami/orgs",
"received_events_url": "https://api.github.com/users/Zerokami/received_events",
"repos_url": "https://api.github.com/users/Zerokami/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Zerokami/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Zerokami/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Zerokami",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-05-09T13:54:41Z
|
2021-09-08T18:00:46Z
|
2016-05-09T13:56:25Z
|
NONE
|
resolved
|
```
>>> r.request.headers["User-Agent"]
'python-requests/2.x.x CPython/x.x.x Linux/3.xx.xx-generic'
```
I think requests is sending too much info.
The bot name `"python-requests" would be okay.
But my **Python version which may be vulnerable** ?
My Linux **kernel version which may be vulnerable**?
I consider this a big problem. Many people using requests may not be aware of this and inadvertently sending this info.
Please just change the the `User-Agent` to something like
`"python-requests"` or `"Requests"`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3182/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3182/timeline
| null |
completed
| null | null | false |
[
"@Logmytech We did. Note our [changelog](http://docs.python-requests.org/en/master/community/updates/#id6), which says that we updated in 2.8.0, released 7 months ago. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3181
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3181/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3181/comments
|
https://api.github.com/repos/psf/requests/issues/3181/events
|
https://github.com/psf/requests/pull/3181
| 153,544,539 |
MDExOlB1bGxSZXF1ZXN0NjkyMTUwMDA=
| 3,181 |
Potential fix for #3066 (Transfer-Encoding and Content-Length headers both being set)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4",
"events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}",
"followers_url": "https://api.github.com/users/davidsoncasey/followers",
"following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}",
"gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/davidsoncasey",
"id": 3794108,
"login": "davidsoncasey",
"node_id": "MDQ6VXNlcjM3OTQxMDg=",
"organizations_url": "https://api.github.com/users/davidsoncasey/orgs",
"received_events_url": "https://api.github.com/users/davidsoncasey/received_events",
"repos_url": "https://api.github.com/users/davidsoncasey/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions",
"type": "User",
"url": "https://api.github.com/users/davidsoncasey",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-05-06T21:46:07Z
|
2021-09-08T04:00:58Z
|
2016-05-24T01:42:27Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4",
"events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}",
"followers_url": "https://api.github.com/users/davidsoncasey/followers",
"following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}",
"gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/davidsoncasey",
"id": 3794108,
"login": "davidsoncasey",
"node_id": "MDQ6VXNlcjM3OTQxMDg=",
"organizations_url": "https://api.github.com/users/davidsoncasey/orgs",
"received_events_url": "https://api.github.com/users/davidsoncasey/received_events",
"repos_url": "https://api.github.com/users/davidsoncasey/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions",
"type": "User",
"url": "https://api.github.com/users/davidsoncasey",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3181/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3181/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3181.diff",
"html_url": "https://github.com/psf/requests/pull/3181",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3181.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3181"
}
| true |
[
"This looks good @davidsoncasey, thanks so much! Do you want to open a parallel pull request with your idea about refactoring `prepare_body`? I'd love to see both side-by-side.\n",
"@Lukasa sounds good. I'll work on getting something put together in the next couple of days.\n",
"I'm going close this PR out since these changes and discussion have been moved to #3184. \n"
] |
|
https://api.github.com/repos/psf/requests/issues/3180
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3180/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3180/comments
|
https://api.github.com/repos/psf/requests/issues/3180/events
|
https://github.com/psf/requests/issues/3180
| 153,225,463 |
MDU6SXNzdWUxNTMyMjU0NjM=
| 3,180 |
OSError: [Errno 22] in self.sock.sendall(data)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1974558?v=4",
"events_url": "https://api.github.com/users/AstinCHOI/events{/privacy}",
"followers_url": "https://api.github.com/users/AstinCHOI/followers",
"following_url": "https://api.github.com/users/AstinCHOI/following{/other_user}",
"gists_url": "https://api.github.com/users/AstinCHOI/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/AstinCHOI",
"id": 1974558,
"login": "AstinCHOI",
"node_id": "MDQ6VXNlcjE5NzQ1NTg=",
"organizations_url": "https://api.github.com/users/AstinCHOI/orgs",
"received_events_url": "https://api.github.com/users/AstinCHOI/received_events",
"repos_url": "https://api.github.com/users/AstinCHOI/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/AstinCHOI/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AstinCHOI/subscriptions",
"type": "User",
"url": "https://api.github.com/users/AstinCHOI",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-05-05T12:50:56Z
|
2016-05-05T17:11:37Z
|
2016-05-05T13:09:58Z
|
NONE
| null |
env: Mac OS X El Capitan / python 3.5.1 / requests 2.10.0
I want to upload file which is about 3GB size.
``` py
def read_in_chunks(file_object, chunk_size=4096):
while True:
data = file_object.read(chunk_size)
if not data:
break
yield data
with open('3GB.mov', 'br') as f:
data = b''.join([chunk for chunk in read_in_chunks(f)])
requests.put(url, headers=headers, data=data)
```
```
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/urllib/request.py", line 1240, in do_open
h.request(req.get_method(), req.selector, req.data, headers)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1083, in request
self._send_request(method, url, body, headers)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1128, in _send_request
self.endheaders(body)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 1079, in endheaders
self._send_output(message_body)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 913, in _send_output
self.send(message_body)
File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py", line 885, in send
self.sock.sendall(data)
OSError: [Errno 22] Invalid argument
```
Could you give me some advice?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3180/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3180/timeline
| null |
completed
| null | null | false |
[
"Hi @AstinCHOI,\n\nThis is a defect tracker, not a question and answer portion of the requests project. If you're looking for help (which it appears you are) please pose questions on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). When posting there, please be prepared to answer questions such as:\n- Why are you using a generator and then joining it into a single byte string in memory?\n- Why does your traceback not include any code from the requests library?\n",
"Hi, @sigmavirus24 \n1) http://stackoverflow.com/questions/11662960/ioerror-errno-22-invalid-argument-when-reading-writing-large-bytestring and it's my fault to use like that.\n2) Sorry, I couldn't include the traceback with requests due to some logic.. but there is problem when you use data=data which is above 2GB size binary in OS X. If you use OSX, you can test it.\n\nNaver mind. I will post in StackOverflow.\n\nThanks.\n",
"In OS X, alternative way (for some versions): http://stackoverflow.com/questions/2502596/python-http-post-a-large-file-with-streaming and it can post/put with big files.\n\nIt was not requests module fault.\n\nThanks.\n"
] |
https://api.github.com/repos/psf/requests/issues/3179
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3179/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3179/comments
|
https://api.github.com/repos/psf/requests/issues/3179/events
|
https://github.com/psf/requests/pull/3179
| 153,160,783 |
MDExOlB1bGxSZXF1ZXN0Njg5NjIwMDk=
| 3,179 |
Fix TypeError when get json-encoded content of a response
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1556054?v=4",
"events_url": "https://api.github.com/users/messense/events{/privacy}",
"followers_url": "https://api.github.com/users/messense/followers",
"following_url": "https://api.github.com/users/messense/following{/other_user}",
"gists_url": "https://api.github.com/users/messense/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/messense",
"id": 1556054,
"login": "messense",
"node_id": "MDQ6VXNlcjE1NTYwNTQ=",
"organizations_url": "https://api.github.com/users/messense/orgs",
"received_events_url": "https://api.github.com/users/messense/received_events",
"repos_url": "https://api.github.com/users/messense/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/messense/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/messense/subscriptions",
"type": "User",
"url": "https://api.github.com/users/messense",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-05-05T03:17:47Z
|
2021-09-08T04:01:01Z
|
2016-05-06T12:59:01Z
|
CONTRIBUTOR
|
resolved
|
[`self.content`](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L728) could be `None`, so `len(self.content)` may raise `TypeError: object of type 'NoneType' has no len()`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3179/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3179/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3179.diff",
"html_url": "https://github.com/psf/requests/pull/3179",
"merged_at": "2016-05-06T12:59:01Z",
"patch_url": "https://github.com/psf/requests/pull/3179.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3179"
}
| true |
[
"Huh. I feel like maybe we should just change `Response.content` to never be `None`. It's certainly surprising to me that it can be. @kennethreitz @sigmavirus24?\n",
"@Lukasa so we set it to `None` in [the descriptor](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L728) in (2) [error](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L739) [cases](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L744). \n\nI don't think we can just change it to never be `None` before 3.0 but that's my hesitancy to break user-level code speaking.\n",
"@sigmavirus24 Yeah, that seems reasonable. In that case, clearly the `json()` code needs to handle it too.\n\nThat means I think this patch is fine. @messense are you interested in adding a test to validate that this works as expected?\n",
"This came up yesterday in a work context. The content of a response CAN BE None, and that would be correct, in the case of a 204 response code.\nhttps://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html\n\nI agree it would be better if response.json() handled that gracefully rather than dying of a JSON parsing error.\n",
"I just realized that fix this will make it raise `ValueError: No JSON object could be decoded` instead because of `json.loads('')`\n",
"Hurrah, this looks great to me! I'd argue the previous behaviour was a bug so we have no API compatibility concerns to worry about here, but I want @sigmavirus24 to ACK that before merging.\n",
"This looks fine to me :)\n"
] |
https://api.github.com/repos/psf/requests/issues/3178
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3178/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3178/comments
|
https://api.github.com/repos/psf/requests/issues/3178/events
|
https://github.com/psf/requests/pull/3178
| 153,145,372 |
MDExOlB1bGxSZXF1ZXN0Njg5NTI4MjM=
| 3,178 |
Encoding JSON requests to bytes for urllib3 to handle
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7773758?v=4",
"events_url": "https://api.github.com/users/haikuginger/events{/privacy}",
"followers_url": "https://api.github.com/users/haikuginger/followers",
"following_url": "https://api.github.com/users/haikuginger/following{/other_user}",
"gists_url": "https://api.github.com/users/haikuginger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/haikuginger",
"id": 7773758,
"login": "haikuginger",
"node_id": "MDQ6VXNlcjc3NzM3NTg=",
"organizations_url": "https://api.github.com/users/haikuginger/orgs",
"received_events_url": "https://api.github.com/users/haikuginger/received_events",
"repos_url": "https://api.github.com/users/haikuginger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/haikuginger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haikuginger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/haikuginger",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-05-05T00:26:23Z
|
2021-09-08T03:01:04Z
|
2016-05-22T16:02:09Z
|
CONTRIBUTOR
|
resolved
|
Implements #3177.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3178/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3178/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3178.diff",
"html_url": "https://github.com/psf/requests/pull/3178",
"merged_at": "2016-05-22T16:02:09Z",
"patch_url": "https://github.com/psf/requests/pull/3178.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3178"
}
| true |
[
"@haikuginger Thanks for this! I've left some notes inline.\n",
"Bump! 🙂\n",
"Sorry @haikuginger! GitHub doesn't ping me when you push new changes. :(\n",
"I'm :+1: on this if @Lukasa is =D\n",
"Let's do it! Thanks @haikuginger, you're doing awesome work! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3177
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3177/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3177/comments
|
https://api.github.com/repos/psf/requests/issues/3177/events
|
https://github.com/psf/requests/issues/3177
| 153,145,294 |
MDU6SXNzdWUxNTMxNDUyOTQ=
| 3,177 |
Encode JSON to bytes per urllib3 expectations
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7773758?v=4",
"events_url": "https://api.github.com/users/haikuginger/events{/privacy}",
"followers_url": "https://api.github.com/users/haikuginger/followers",
"following_url": "https://api.github.com/users/haikuginger/following{/other_user}",
"gists_url": "https://api.github.com/users/haikuginger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/haikuginger",
"id": 7773758,
"login": "haikuginger",
"node_id": "MDQ6VXNlcjc3NzM3NTg=",
"organizations_url": "https://api.github.com/users/haikuginger/orgs",
"received_events_url": "https://api.github.com/users/haikuginger/received_events",
"repos_url": "https://api.github.com/users/haikuginger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/haikuginger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haikuginger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/haikuginger",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-05-05T00:25:43Z
|
2021-09-08T18:00:43Z
|
2016-05-22T16:45:20Z
|
CONTRIBUTOR
|
resolved
|
urllib3 expects to receive request bodies as bytes-like objects. While sending Unicode strings may work, it's suboptimal and can result in unexpected behavior (see shazow/urllib3#855). While we shouldn't encode user-supplied request bodies, we should encode requests-produced JSON strings.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7773758?v=4",
"events_url": "https://api.github.com/users/haikuginger/events{/privacy}",
"followers_url": "https://api.github.com/users/haikuginger/followers",
"following_url": "https://api.github.com/users/haikuginger/following{/other_user}",
"gists_url": "https://api.github.com/users/haikuginger/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/haikuginger",
"id": 7773758,
"login": "haikuginger",
"node_id": "MDQ6VXNlcjc3NzM3NTg=",
"organizations_url": "https://api.github.com/users/haikuginger/orgs",
"received_events_url": "https://api.github.com/users/haikuginger/received_events",
"repos_url": "https://api.github.com/users/haikuginger/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/haikuginger/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haikuginger/subscriptions",
"type": "User",
"url": "https://api.github.com/users/haikuginger",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3177/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3177/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/3176
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3176/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3176/comments
|
https://api.github.com/repos/psf/requests/issues/3176/events
|
https://github.com/psf/requests/pull/3176
| 153,064,975 |
MDExOlB1bGxSZXF1ZXN0Njg4OTY3Njc=
| 3,176 |
add ability to control the headers that will be deleted in case of redirect
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-05-04T17:01:34Z
|
2021-09-08T01:21:44Z
|
2016-11-15T09:47:36Z
|
NONE
|
resolved
|
In some cases the redirect is done to a different host.
when the {"Host": "www.example.com"} is set in the headers and the request is redirected to another host, let's say www.another-example.com, the next request will be sent to a new URLwith a wrong Host header.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3176/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3176/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3176.diff",
"html_url": "https://github.com/psf/requests/pull/3176",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3176.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3176"
}
| true |
[
"This is up to @kennethreitz, but I'm disinclined to want to add this: I think I'd rather factor out the logic in `SessionRedirectMixin` as we've done with a few other bits to allow easier subclassing, rather than add further keyword arguments.\n",
"I agree with @Lukasa. I don't think we need this as an extra parameter to this method. I also don't think we need to keep track of which headers we're sanitizing for a user. I also agree that we might be able to better serve you by refactoring things so you can override these with subclassing.\n",
"👍 \n",
"@sigmavirus24, @Lukasa thanks for the comments.\nInheriting `SessionRedirectMixin` was my first idea but then I've seen so much logic in the `resolve_redirects` function so it became a sort of code duplication.\nBy saying this, I agree with you guys, factoring out the logic might come very handy.\nSo how should we proceed?\n",
"@adaronen If you'd like to supply a PR that factors out this logic to a method (`resolve_*`) then we'll happily review that. =)\n",
"Cool, I'll do my best.\n",
"Closing due to inactivity. Please feel free to reopen if you have the time to make more forward progress.\n"
] |
https://api.github.com/repos/psf/requests/issues/3175
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3175/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3175/comments
|
https://api.github.com/repos/psf/requests/issues/3175/events
|
https://github.com/psf/requests/issues/3175
| 152,846,729 |
MDU6SXNzdWUxNTI4NDY3Mjk=
| 3,175 |
Gzipping requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1063219?v=4",
"events_url": "https://api.github.com/users/kuraga/events{/privacy}",
"followers_url": "https://api.github.com/users/kuraga/followers",
"following_url": "https://api.github.com/users/kuraga/following{/other_user}",
"gists_url": "https://api.github.com/users/kuraga/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kuraga",
"id": 1063219,
"login": "kuraga",
"node_id": "MDQ6VXNlcjEwNjMyMTk=",
"organizations_url": "https://api.github.com/users/kuraga/orgs",
"received_events_url": "https://api.github.com/users/kuraga/received_events",
"repos_url": "https://api.github.com/users/kuraga/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kuraga/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kuraga/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kuraga",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-05-03T18:47:33Z
|
2021-09-08T18:00:48Z
|
2016-05-04T11:04:11Z
|
NONE
|
resolved
|
Good day!
Does `requests` functionality of encoding POST requests with `gzip`?
If no, will you accept a pull request? Some notes, maybe?
Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1063219?v=4",
"events_url": "https://api.github.com/users/kuraga/events{/privacy}",
"followers_url": "https://api.github.com/users/kuraga/followers",
"following_url": "https://api.github.com/users/kuraga/following{/other_user}",
"gists_url": "https://api.github.com/users/kuraga/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kuraga",
"id": 1063219,
"login": "kuraga",
"node_id": "MDQ6VXNlcjEwNjMyMTk=",
"organizations_url": "https://api.github.com/users/kuraga/orgs",
"received_events_url": "https://api.github.com/users/kuraga/received_events",
"repos_url": "https://api.github.com/users/kuraga/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kuraga/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kuraga/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kuraga",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3175/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3175/timeline
| null |
completed
| null | null | false |
[
"@kuraga Requests does not currently support doing this automatically. =) However, doing so is fairly simple. I don't think we'd accept a pull request adding this functionality (it's not likely to be generally useful), but we'd certainly be happy to add an example to our documentation.\n",
"Ok, thanks!\n\nP.S.\n\n> it's not likely to be generally useful\n\nDisagree =) But yes, one more dependency...\n"
] |
https://api.github.com/repos/psf/requests/issues/3174
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3174/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3174/comments
|
https://api.github.com/repos/psf/requests/issues/3174/events
|
https://github.com/psf/requests/issues/3174
| 152,765,686 |
MDU6SXNzdWUxNTI3NjU2ODY=
| 3,174 |
Recent update of urllib3 broke stream requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/104093?v=4",
"events_url": "https://api.github.com/users/ducu/events{/privacy}",
"followers_url": "https://api.github.com/users/ducu/followers",
"following_url": "https://api.github.com/users/ducu/following{/other_user}",
"gists_url": "https://api.github.com/users/ducu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ducu",
"id": 104093,
"login": "ducu",
"node_id": "MDQ6VXNlcjEwNDA5Mw==",
"organizations_url": "https://api.github.com/users/ducu/orgs",
"received_events_url": "https://api.github.com/users/ducu/received_events",
"repos_url": "https://api.github.com/users/ducu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ducu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ducu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ducu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2016-05-03T12:51:49Z
|
2021-09-08T09:00:35Z
|
2017-06-09T19:14:02Z
|
NONE
|
resolved
|
I suppose you can reproduce this easily, take this url for example: http://bit.ly/1rQJ1CL
```
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/summary/__init__.py", line 280, in extract
body = self._get_tag(response, tag_name="body", encoding=encoding)
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/summary/__init__.py", line 203, in _get_tag
for chunk in response.iter_content(config.CHUNK_SIZE): # , decode_unicode=True
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/requests/models.py", line 664, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 349, in stream
for line in self.read_chunked(amt, decode_content=decode_content):
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 501, in read_chunked
chunk = self._handle_chunk(amt)
File "/home/ducu/summarizer/env/lib/python2.7/site-packages/requests/packages/urllib3/response.py", line 461, in _handle_chunk
value = self._fp._safe_read(amt)
File "/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/httplib.py", line 658, in _safe_read
chunk = self.fp.read(min(amt, MAXAMOUNT))
AttributeError: 'NoneType' object has no attribute 'read'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3174/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3174/timeline
| null |
completed
| null | null | false |
[
"I'm sorry, I actually _can't_ reproduce this. Can you show me slightly more about how you invoked requests?\n",
"Just a sec, not that easy to reproduce indeed..\nI get it via https://github.com/svven/summary with `requests` 2.10.0. No problem with 2.9.2 but I couldn't reproduce it directly with `requests`, still investigating\n",
"So, to be clear, this is definitely a bug. My suspicion is that the bug is actually in urllib3, not requests, but we'll see if we can get a better repro before concluding that.\n",
"Sorry but can't reproduce it directly, looks like it's in urllib3 indeed.\nIf you feel like it you can `pip install summary-extraction`, then `pip install requests --upgrade` so you get ver 2.10.0. Then\n\n```\n>>> import summary\n>>> url = 'http://bit.ly/1rQJ1CL'\n>>> s = summary.Summary(url)\n>>> s.extract()\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/summary/__init__.py\", line 251, in extract\n body = self._get_tag(response, tag_name=\"body\")\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/summary/__init__.py\", line 184, in _get_tag\n for chunk in response.iter_content(CHUNK_SIZE, decode_unicode=True):\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/requests/utils.py\", line 368, in stream_decode_response_unicode\n for chunk in iterator:\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/requests/models.py\", line 664, in generate\n for chunk in self.raw.stream(chunk_size, decode_content=True):\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 349, in stream\n for line in self.read_chunked(amt, decode_content=decode_content):\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 501, in read_chunked\n chunk = self._handle_chunk(amt)\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/site-packages/requests/packages/urllib3/response.py\", line 461, in _handle_chunk\n value = self._fp._safe_read(amt)\n File \"/home/ducu/.pyenv/versions/2.7.8/lib/python2.7/httplib.py\", line 658, in _safe_read\n chunk = self.fp.read(min(amt, MAXAMOUNT))\nAttributeError: 'NoneType' object has no attribute 'read'\n>>> \n\n```\n\nI'm using Python 2.7.8 (default, Sep 4 2015, 14:01:33) [GCC 4.8.4] on linux2\nI downgraded requests to 2.9.2 for now and it works fine, not sure when I'd get the time to look into this.\nThanks anyway\n",
"Ok, I've diagnosed this. Took a bit of digging and a few leaps, but I've got it. This is a urllib3 bug, so I'll open an appropriate report over there.\n",
"Actually, before we conclude this is a urllib3 bug, we need to have a discussion about what the API promises. So let me explain what the problem is.\n\nurllib3 attempts to ensure that connections are thrown away if a problem occurs. This is implemented using a context manager in urllib3 (`_error_catcher`) that checks certain errors and then determines whether the block terminated cleanly. If it did not, it forcibly closes the connection before returning the connection object to the pool to be reopened.\n\nThat object treats _any_ exception as an error case that leaves the connection in an indeterminate state. That is mostly a good thing, except for one particularly awkward exception: [`GeneratorExit`](https://docs.python.org/2/library/exceptions.html#exceptions.GeneratorExit). This exception is raised when the `close()` method on a generator is called, and is thrown to allow things like `finally` blocks to execute properly.\n\n_One_ case where this is called is when a generator is garbage collected. That happens in your code, because you call `iter_content` multiple times, once for each tag you search for. When that happens you spin up several generators (`iter_content` returns a generator, and `stream` returns a generator, and `read_chunked` also returns a generator, so there's a chain of at least three generators in this case). Because you don't save the return value from `iter_content`, that generator chain gets leaked. This causes the `read_chunked` generator to throw `GeneratorExit`, which causes the urllib3 `_error_catcher` to conclude that the connection was not left in a clean state and terminates it.\n\nThere are therefore a few questions:\n1. Should `_error_catcher` consider `GeneratorExit` an error case, or special case it? I'm not sure: the question is whether the connection will get cleaned up properly in situations where the generator really is leaked. Currently my assumption is that it won't, and so `GeneratorExit` really is an error case.\n2. Is it safe to open multiple versions of the generators `iter_content`, `stream`, and `read_chunked`? They don't make it clear. `iter_lines` in requests is clear that it is _not_ safe to do that, but the rest are left ambiguous. We need to make a call, where I suspect the answer boils down to whether `decode_content` is `True`: if it is, there is a state object that gets lost when that generator leaks. Given that requests essentially always sets `decode_content` to `True`, I think that means it's also not safe to repeatedly call `iter_content`.\n\n@ducu In the short term, you can fix this by saving off the result of `iter_content` somewhere on your `Summary` object and then re-using that, rather than repeatedly re-calling that method. That logic will _definitely_ work, and we can work out whether or not the alternative _should_ work.\n\nCan I get input from @kennethreitz, @shazow, and @sigmavirus24 on this please?\n",
"> Is it safe to open multiple versions of the generators iter_content, stream, and read_chunked? \n\nDo you mean multiple instances? If iterating/streaming/reading is supposed to be 1:1 with what the socket does underneath, then no it's not safe unless you preload the response and cache it.\n\nNot sure about the generator stuff.\n",
"This is about stream requests, so it means repeatedly calling `iter_content`.\nPreloading the whole response defies the purpose, that's not a stream request anymore, right? I do want to load it chunk by chunk and process it iteratively because I may not need all of it. \nOr am I getting something wrong? I definitely don't understand what's with the 3 generators and how `iter_content` is different from `stream`.. so sorry but I can't help much on this issue\n",
"@ducu `iter_content` builds on top of `stream`. When I say `stream` here I don't mean the `stream` keyword argument as exposed by requests, I mean the `HTTPResponse.stream()` method exposed by urllib3. Sorry for the confusion there!\n\nSo I totally agree you want to load it chunk by chunk and process it iteratively. What I'm asking is whether we think this _is_ safe, and then whether it _should be_ safe:\n\n``` python\nr = requests.get(some_url, stream=True)\niter_one = r.iter_content()\niter_two = r.iter_content()\n\nwhile True:\n print next(iter_one)\n print next(iter_two)\n```\n\nPut another way, is it safe to repeatedly call `iter_content` and throw away the generator it returns (which is what your code does)?\n",
"Ah gottit, I'll try using a single generator then and see if that works.\nThanks for the clarification, brb\n\nOn Wed, May 4, 2016 at 10:55 AM Cory Benfield [email protected]\nwrote:\n\n> @ducu https://github.com/ducu iter_content builds on top of stream.\n> When I say stream here I don't mean the stream keyword argument as\n> exposed by requests, I mean the HTTPResponse.stream() method exposed by\n> urllib3. Sorry for the confusion there!\n> \n> So I totally agree you want to load it chunk by chunk and process it\n> iteratively. What I'm asking is whether we think this _is_ safe, and then\n> whether it _should be_ safe:\n> \n> r = requests.get(some_url, stream=True)\n> iter_one = r.iter_content()\n> iter_two = r.iter_content()\n> while True:\n> print next(iter_one)\n> print next(iter_two)\n> \n> Put another way, is it safe to repeatedly call iter_content and throw\n> away the generator it returns (which is what your code does)?\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3174#issuecomment-216770410\n",
"Yep it works with a single `iter_content` generator, I'll go with this workaround.\nThanks a lot for your input guys, it's your choice whether you want to fix this urllib3 issue, otherwise I can close this.\n",
"This issue has also affected other projects: [Velociraptor](https://bitbucket.org/yougov/vr.server/commits/59f4753c6270c587f9c546f4e4a4e9274601a96e) and [sseclient](https://bitbucket.org/btubbs/sseclient/commits/857724844866adb2068a9d2b5c8c1ba06775796a). It does seem that the user's expectation is that the `iter_content` should return the same or compatible iterator on each call. It's somewhat awkward to have to store both the response and the response iterator. If for nothing else than compatibility, I suggest requests should cache the iterator per response.\n",
"Would that apply to `iter_lines` too?\n",
"@Lukasa if we decide to cache the iterators, it would. I'm not sure we should be caching the iterators though.\n",
"I can't think of an example of a project or class that provides you with an iterator that is cached. I mean, if we just look at built-ins:\n\n``` py\na = {'1': '2', '2': '3'}\nnext(a.items())\nnext(a.items())\n```\n\nWho would expect that to work? I don't think we should be enabling this behaviour, frankly.\n",
"@sigmavirus24 You're right. I wouldn't expect your example to work, and I can see why the surprise with `iter_content()`, because it's a function, which one expects to have side effects. But it still seems valuable to have a way to hook into the stream at whatever point it's currently at. This was a feature, however undesirable, that Requests 2.9 had which Requests 2.10 does not have... and the workaround is clumsy, requiring code in two different points in the code to be coordinated.\n\nIt really would be nice if the response object exposed some sort of interface such that one doesn't have to carry both the iterator and the response around anytime streaming is in play.\n"
] |
https://api.github.com/repos/psf/requests/issues/3173
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3173/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3173/comments
|
https://api.github.com/repos/psf/requests/issues/3173/events
|
https://github.com/psf/requests/pull/3173
| 152,713,061 |
MDExOlB1bGxSZXF1ZXN0Njg2NjU1NzU=
| 3,173 |
Add section on SOCKS proxies.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-05-03T07:08:00Z
|
2021-09-08T04:01:02Z
|
2016-05-05T17:18:28Z
|
MEMBER
|
resolved
|
Resolves #3172.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3173/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3173/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3173.diff",
"html_url": "https://github.com/psf/requests/pull/3173",
"merged_at": "2016-05-05T17:18:28Z",
"patch_url": "https://github.com/psf/requests/pull/3173.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3173"
}
| true |
[
":tada: \n",
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3172
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3172/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3172/comments
|
https://api.github.com/repos/psf/requests/issues/3172/events
|
https://github.com/psf/requests/issues/3172
| 152,709,636 |
MDU6SXNzdWUxNTI3MDk2MzY=
| 3,172 |
how to use socks proxy ?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13533742?v=4",
"events_url": "https://api.github.com/users/iliul/events{/privacy}",
"followers_url": "https://api.github.com/users/iliul/followers",
"following_url": "https://api.github.com/users/iliul/following{/other_user}",
"gists_url": "https://api.github.com/users/iliul/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/iliul",
"id": 13533742,
"login": "iliul",
"node_id": "MDQ6VXNlcjEzNTMzNzQy",
"organizations_url": "https://api.github.com/users/iliul/orgs",
"received_events_url": "https://api.github.com/users/iliul/received_events",
"repos_url": "https://api.github.com/users/iliul/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/iliul/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iliul/subscriptions",
"type": "User",
"url": "https://api.github.com/users/iliul",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-05-03T06:37:48Z
|
2021-09-08T15:00:48Z
|
2016-05-03T07:08:18Z
|
NONE
|
resolved
|
The new feature socks proxy is supported,but how to use it ?
no example In [http://www.python-requests.org/en/master/user/advanced/#proxies](http://www.python-requests.org/en/master/user/advanced/#proxies) .
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3172/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3172/timeline
| null |
completed
| null | null | false |
[
"Yup, that's a good spot.\n\nTo do it, you need to `pip install requests[socks]`, and then you can set the proxy as normal: `proxies={'http': 'socks5://host:port', 'https': 'socks5://host:port'}`.\n",
"Thanks and I got it @Lukasa \n",
"The pull request #3173 should add the needed section. Thanks for the report @iliul! :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3171
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3171/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3171/comments
|
https://api.github.com/repos/psf/requests/issues/3171/events
|
https://github.com/psf/requests/pull/3171
| 152,645,964 |
MDExOlB1bGxSZXF1ZXN0Njg2MjE3OTY=
| 3,171 |
docs: Add a note about SSL c_rehash
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] | null | 1 |
2016-05-02T21:08:34Z
|
2016-05-03T07:44:43Z
|
2016-05-03T07:44:42Z
|
NONE
| null |
Just adding a quick note to inform users that it's not enough to point to a directory with the certificates but the directory needs to be "indexed". It took me some time to figure this out ... this simple note might save time to others in the future :).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3171/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3171/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3171.diff",
"html_url": "https://github.com/psf/requests/pull/3171",
"merged_at": "2016-05-03T07:44:42Z",
"patch_url": "https://github.com/psf/requests/pull/3171.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3171"
}
| true |
[
"Thanks for this @luv! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3170
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3170/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3170/comments
|
https://api.github.com/repos/psf/requests/issues/3170/events
|
https://github.com/psf/requests/issues/3170
| 152,544,034 |
MDU6SXNzdWUxNTI1NDQwMzQ=
| 3,170 |
Unexpected redirect behavior
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7309679?v=4",
"events_url": "https://api.github.com/users/coursera-ds/events{/privacy}",
"followers_url": "https://api.github.com/users/coursera-ds/followers",
"following_url": "https://api.github.com/users/coursera-ds/following{/other_user}",
"gists_url": "https://api.github.com/users/coursera-ds/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/coursera-ds",
"id": 7309679,
"login": "coursera-ds",
"node_id": "MDQ6VXNlcjczMDk2Nzk=",
"organizations_url": "https://api.github.com/users/coursera-ds/orgs",
"received_events_url": "https://api.github.com/users/coursera-ds/received_events",
"repos_url": "https://api.github.com/users/coursera-ds/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/coursera-ds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coursera-ds/subscriptions",
"type": "User",
"url": "https://api.github.com/users/coursera-ds",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-05-02T12:31:47Z
|
2021-09-08T18:00:48Z
|
2016-05-02T12:39:07Z
|
NONE
|
resolved
|
requests is getting (incorrectly) stuck in a redirect loop for an image.
If I turn off redirects and manually step through the requests, each time pointing the new url to the value in the Location header, it works as expected (below is the response.headers + the response.status_code):
```
{'Content-Length': '178', 'Server': 'nginx', 'Connection': 'keep-alive', 'Location': 'https://wiki.wildberries.ru/img/2016/03/dffgfgdf-150x150.jpg', 'Date': 'Sun, 01 May 2016 01:27:25 GMT', 'Content-Type': 'text/html'} 301
{'Content-Length': '178', 'Set-Cookie': '__utmp=1', 'Server': 'nginx', 'Connection': 'close', 'Location': 'http://abenz.ru/wt', 'Date': 'Sun, 01 May 2016 01:27:26 GMT', 'Content-Type': 'text/html'} 301
{'Content-Length': '0', 'Expires': 'Thu, 21 Jul 1977 07:30:00 GMT', 'Keep-Alive': 'timeout=5, max=100', 'Server': 'Apache/2.4.10 (Debian)', 'Last-Modified': 'Sun, 01 May 2016 01:27:26 GMT', 'Connection': 'Keep-Alive', 'LOCATION': 'https://wiki.wildberries.ru/img/2016/03/dffgfgdf-150x150.jpg', 'Pragma': 'no-cache', 'Cache-Control': 'max-age=0', 'Date': 'Sun, 01 May 2016 01:27:26 GMT', 'Content-Type': 'text/html; charset=utf-8'} 302
{'Test-Head': '1', 'Accept-Ranges': 'bytes', 'Content-Length': '9753', 'Server': 'nginx', 'Last-Modified': 'Thu, 17 Mar 2016 15:14:49 GMT', 'Connection': 'close', 'ETag': '"56eac9e9-2619"', 'Date': 'Sun, 01 May 2016 01:27:27 GMT', 'Content-Type': 'image/jpeg'} 200
```
If I turn on logging while redirects is enabled, I see this. Notice that once it hits the abenz[.]ru domain, the location is never updated.
```
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): wiki.wildberries.ru
send: 'GET /img/2016/03/dffgfgdf-150x150.jpg HTTP/1.1\r\nHost: wiki.wildberries.ru\r\nConnection: keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nuser-agent: Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)\r\n\r\n'
reply: 'HTTP/1.1 301 Moved Permanently\r\n'
header: Server: nginx
header: Date: Sun, 01 May 2016 01:50:21 GMT
header: Content-Type: text/html
header: Content-Length: 178
header: Connection: keep-alive
header: Location: https://wiki.wildberries.ru/img/2016/03/dffgfgdf-150x150.jpg
DEBUG:requests.packages.urllib3.connectionpool:"GET /img/2016/03/dffgfgdf-150x150.jpg HTTP/1.1" 301 178
INFO:requests.packages.urllib3.connectionpool:Starting new HTTPS connection (1): wiki.wildberries.ru
send: 'GET /img/2016/03/dffgfgdf-150x150.jpg HTTP/1.1\r\nHost: wiki.wildberries.ru\r\nConnection: keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nuser-agent: Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)\r\n\r\n'
reply: 'HTTP/1.1 301 Moved Permanently\r\n'
header: Server: nginx
header: Date: Sun, 01 May 2016 01:50:22 GMT
header: Content-Type: text/html
header: Content-Length: 178
header: Connection: close
header: Location: http://abenz.ru/wt
header: Set-Cookie: __utmp=1
DEBUG:requests.packages.urllib3.connectionpool:"GET /img/2016/03/dffgfgdf-150x150.jpg HTTP/1.1" 301 178
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): abenz.ru
send: 'GET /wt HTTP/1.1\r\nHost: abenz.ru\r\nConnection: keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nuser-agent: Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)\r\n\r\n'
reply: 'HTTP/1.1 302 Found\r\n'
header: Date: Sun, 01 May 2016 01:50:23 GMT
header: Server: Apache/2.4.10 (Debian)
header: Expires: Thu, 21 Jul 1977 07:30:00 GMT
header: Last-Modified: Sun, 01 May 2016 01:50:23 GMT
header: Cache-Control: max-age=0
header: Pragma: no-cache
header: LOCATION: https://wiki.wildberries.ru/img/2016/03/dffgfgdf-150x150.jpg
header: Content-Length: 0
header: Keep-Alive: timeout=5, max=100
header: Connection: Keep-Alive
header: Content-Type: text/html; charset=utf-8
DEBUG:requests.packages.urllib3.connectionpool:"GET /wt HTTP/1.1" 302 0
send: 'GET /wt HTTP/1.1\r\nHost: abenz.ru\r\nConnection: keep-alive\r\nCookie: __utmp=1\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nuser-agent: Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)\r\n\r\n'
reply: 'HTTP/1.1 302 Found\r\n'
header: Date: Sun, 01 May 2016 01:50:24 GMT
header: Server: Apache/2.4.10 (Debian)
header: Expires: Thu, 21 Jul 1977 07:30:00 GMT
header: Last-Modified: Sun, 01 May 2016 01:50:24 GMT
header: Cache-Control: max-age=0
header: Pragma: no-cache
header: LOCATION: https://wiki.wildberries.ru/img/2016/03/dffgfgdf-150x150.jpg
header: Content-Length: 0
header: Keep-Alive: timeout=5, max=99
header: Connection: Keep-Alive
header: Content-Type: text/html; charset=utf-8
DEBUG:requests.packages.urllib3.connectionpool:"GET /wt HTTP/1.1" 302 0
send: 'GET /wt HTTP/1.1\r\nHost: abenz.ru\r\nConnection: keep-alive\r\nCookie: __utmp=1\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nuser-agent: Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)\r\n\r\n'
...
```
The calls above are done via:
```
h = {'user-agent': 'Mozilla/5.0 (compatible; MSIE 8.0; Windows NT 6.1; Trident/5.0)'}
with requests.Session() as s:
r = s.get(url, headers = h, timeout = HTTP_TIMEOUT_SECONDS) #, allow_redirects = False)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7309679?v=4",
"events_url": "https://api.github.com/users/coursera-ds/events{/privacy}",
"followers_url": "https://api.github.com/users/coursera-ds/followers",
"following_url": "https://api.github.com/users/coursera-ds/following{/other_user}",
"gists_url": "https://api.github.com/users/coursera-ds/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/coursera-ds",
"id": 7309679,
"login": "coursera-ds",
"node_id": "MDQ6VXNlcjczMDk2Nzk=",
"organizations_url": "https://api.github.com/users/coursera-ds/orgs",
"received_events_url": "https://api.github.com/users/coursera-ds/received_events",
"repos_url": "https://api.github.com/users/coursera-ds/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/coursera-ds/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/coursera-ds/subscriptions",
"type": "User",
"url": "https://api.github.com/users/coursera-ds",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3170/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3170/timeline
| null |
completed
| null | null | false |
[
"Was running older versions of urllib3 and requests. Upgrading fixed it.\n",
"I am still getting the following error: _ssl.c:2580 or _ssl.c:2825\n\nbut when I am trying to post through SOAP UI on the same URL with basic Auth and SSL certificate , connection is going through and getting response.\n\nNeed your expert help , I am so stuck ...\n\nattaching the basic auth 2 options which I am trying so far and the error message:\n",
"these are errors which I am encountering \nOption # 1\n\n> > > import requests\n> > > from requests.auth import HTTPBasicAuth\n> > > auth = HTTPBasicAuth('auth_user', 'auth_pass')\n> > > requests.post('https://my-site.com/rest_service', cert=('/Users/my_user/Desktop/auth_user.pfx', '/path/key'), auth=auth)\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n> > > return request('post', url, data=data, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n> > > return session.request(method=method, url=url, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n> > > resp = self.send(prep, *_send_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n> > > r = adapter.send(request, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n> > > raise SSLError(e, request=request)\n> > > requests.exceptions.SSLError: [SSL] PEM lib (_ssl.c:2580)\n\nOption # 2\n\n> > > r = requests.post(\"https://my-site.com/rest_service\", verify='/Users/my_user/Desktop/auth_user.pfx', data={}, auth=('auth_user', 'auth_pass'))\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n> > > return request('post', url, data=data, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n> > > return session.request(method=method, url=url, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n> > > resp = self.send(prep, *_send_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n> > > r = adapter.send(request, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n> > > raise SSLError(e, request=request)\n> > > requests.exceptions.SSLError: unknown error (_ssl.c:2825)\n",
"@nahonnyate Please don't piggyback on unrelated issues. This is particularly problematic as you _already_ have an open issue (#3137) that has us asking questions of you that you have not yet answered.\n",
"@Lukasa Sorry to miss my original thread, Let me respond the questions in my thread. Thanks for your help\n"
] |
https://api.github.com/repos/psf/requests/issues/3139
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3139/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3139/comments
|
https://api.github.com/repos/psf/requests/issues/3139/events
|
https://github.com/psf/requests/pull/3139
| 151,901,856 |
MDExOlB1bGxSZXF1ZXN0NjgzOTI4MTk=
| 3,139 |
Initialize hash_utf8 to None, preventing NameError. Fixes #3138.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9448417?v=4",
"events_url": "https://api.github.com/users/markshannon/events{/privacy}",
"followers_url": "https://api.github.com/users/markshannon/followers",
"following_url": "https://api.github.com/users/markshannon/following{/other_user}",
"gists_url": "https://api.github.com/users/markshannon/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/markshannon",
"id": 9448417,
"login": "markshannon",
"node_id": "MDQ6VXNlcjk0NDg0MTc=",
"organizations_url": "https://api.github.com/users/markshannon/orgs",
"received_events_url": "https://api.github.com/users/markshannon/received_events",
"repos_url": "https://api.github.com/users/markshannon/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/markshannon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/markshannon/subscriptions",
"type": "User",
"url": "https://api.github.com/users/markshannon",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-29T15:50:39Z
|
2021-09-08T04:01:03Z
|
2016-04-29T16:05:20Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3139/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3139/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3139.diff",
"html_url": "https://github.com/psf/requests/pull/3139",
"merged_at": "2016-04-29T16:05:20Z",
"patch_url": "https://github.com/psf/requests/pull/3139.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3139"
}
| true |
[
"Thanks @markshannon! :tada: \n"
] |
|
https://api.github.com/repos/psf/requests/issues/3138
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3138/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3138/comments
|
https://api.github.com/repos/psf/requests/issues/3138/events
|
https://github.com/psf/requests/issues/3138
| 151,835,971 |
MDU6SXNzdWUxNTE4MzU5NzE=
| 3,138 |
Possible NameError if unexpected encryption algorithm encountered.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9448417?v=4",
"events_url": "https://api.github.com/users/markshannon/events{/privacy}",
"followers_url": "https://api.github.com/users/markshannon/followers",
"following_url": "https://api.github.com/users/markshannon/following{/other_user}",
"gists_url": "https://api.github.com/users/markshannon/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/markshannon",
"id": 9448417,
"login": "markshannon",
"node_id": "MDQ6VXNlcjk0NDg0MTc=",
"organizations_url": "https://api.github.com/users/markshannon/orgs",
"received_events_url": "https://api.github.com/users/markshannon/received_events",
"repos_url": "https://api.github.com/users/markshannon/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/markshannon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/markshannon/subscriptions",
"type": "User",
"url": "https://api.github.com/users/markshannon",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-04-29T10:22:12Z
|
2021-09-08T18:00:49Z
|
2016-04-29T16:05:22Z
|
NONE
|
resolved
|
`HTTPDigestAuth.build_digest_header` may fail with a NameError if encryption algorithm is not one of
'MD5', 'MD5-SESS' or 'SHA.
https://github.com/kennethreitz/requests/blob/master/requests/auth.py#L117
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3138/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3138/timeline
| null |
completed
| null | null | false |
[
"Yup, that looks wrong. Looks like that name needs to be initialised to `None`. Want to provide a patch?\n",
"I thought I'd leave that to those of you who know the code base, in case there was something more subtle going on. I'm happy to make a PR if you want.\n",
"PRs always appreciated!\n"
] |
https://api.github.com/repos/psf/requests/issues/3137
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3137/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3137/comments
|
https://api.github.com/repos/psf/requests/issues/3137/events
|
https://github.com/psf/requests/issues/3137
| 151,807,285 |
MDU6SXNzdWUxNTE4MDcyODU=
| 3,137 |
Python ssl certificate connectivity issue from mac, ssl.c:2825
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12782102?v=4",
"events_url": "https://api.github.com/users/nahonnyate/events{/privacy}",
"followers_url": "https://api.github.com/users/nahonnyate/followers",
"following_url": "https://api.github.com/users/nahonnyate/following{/other_user}",
"gists_url": "https://api.github.com/users/nahonnyate/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nahonnyate",
"id": 12782102,
"login": "nahonnyate",
"node_id": "MDQ6VXNlcjEyNzgyMTAy",
"organizations_url": "https://api.github.com/users/nahonnyate/orgs",
"received_events_url": "https://api.github.com/users/nahonnyate/received_events",
"repos_url": "https://api.github.com/users/nahonnyate/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nahonnyate/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nahonnyate/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nahonnyate",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2016-04-29T07:19:30Z
|
2018-06-25T11:53:47Z
|
2016-05-04T19:26:39Z
|
NONE
|
resolved
|
Hi,
I am trying to post the following from my mac device, also I installed the .pfx file in my keychain and trying to post the same.
``` py
r = requests.post("https://url", verify='/Users/my_user/Desktop/cert.pfx', data={}, auth=('user', 'password'))
```
it is throwing the following error 👍
```
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py", line 88, in post
return request('post', url, data=data, **kwargs)
File "/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py", line 44, in request
return session.request(method=method, url=url, **kwargs)
File "/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/Users/my_user/anaconda/lib/python2.7/site-packages/requests/adapters.py", line 382, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: unknown error (_ssl.c:2825
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3137/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3137/timeline
| null |
completed
| null | null | false |
[
"What's the format of `cert.pfx`?\n",
"I am not sure but maybe a look at this may help (from source code)\n\n :param verify: (optional) whether the SSL cert will be verified. A CA_BUNDLE path can also be provided. Defaults to `True`.\n :param stream: (optional) if `False`, the response content will be immediately downloaded.\n :param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.\n\nDid you try cert= instead of verify?\nI think cert param is used for authentication purposes?\n",
"@Lukasa cert.pfx is the SSL certificate which I need to mention in my POST/GET along with basic auth.\n@baptistapedro I tried both the option and it is throwing me the following error.\n\nI am attaching the error I encountered in both the option.\n\nOption # 1\n\n> > > import requests\n> > > from requests.auth import HTTPBasicAuth\n> > > auth = HTTPBasicAuth('auth_user', 'auth_pass')\n> > > requests.post('https://my-site.com/rest_service', cert=('/Users/my_user/Desktop/auth_user.pfx', '/path/key'), auth=auth)\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n> > > return request('post', url, data=data, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n> > > return session.request(method=method, url=url, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n> > > resp = self.send(prep, *_send_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n> > > r = adapter.send(request, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n> > > raise SSLError(e, request=request)\n> > > requests.exceptions.SSLError: [SSL] PEM lib (_ssl.c:2580)\n\nOption # 2\n\n> > > r = requests.post(\"https://my-site.com/rest_service\", verify='/Users/my_user/Desktop/auth_user.pfx', data={}, auth=('auth_user', 'auth_pass'))\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n> > > return request('post', url, data=data, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n> > > return session.request(method=method, url=url, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n> > > resp = self.send(prep, *_send_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n> > > r = adapter.send(request, *_kwargs)\n> > > File \"/Users/my_user/anaconda/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n> > > raise SSLError(e, request=request)\n> > > requests.exceptions.SSLError: unknown error (_ssl.c:2825)\n",
"@nahonnyate My question was what is the _format_ of that certificate? How is it stored?\n\nThe reason I ask is that generally speaking a .pfx file is a Windows representation of a PKCS#12 format certificate. OpenSSL won't handle that format well, at least not as it's used here, so you'll need to convert it to PEM in order to use it. Find some instructions on [this page](https://www.sslshopper.com/article-most-common-openssl-commands.html).\n",
"@Lukasa thanks for the referred page, I just did info on that SSL .pfx file and here is the output: if it helps.\n\npkcs12 -info -in '/Users/my_user/Desktop/auth_user.pfx'\nEnter Import Password:\nMAC Iteration 2000\nMAC verified OK\nPKCS7 Data\nShrouded Keybag: pb***_SHA1**_-Key*****_-**_**, Iteration 2000\nBag Attributes\n localKeyID: 0\\* *\\* 00 00 \n friendlyName: le-*****_-**_-***_-**__-_********\n Microsoft CSP Name: Microsoft Enhanced Cryptographic Provider v1.0\nKey Attributes\n X509v3 Key Usage: 80 \n",
"@Lukasa tried the following as you have suggested , not sure why it is complaining about the certificate now, I am using the same certificate (.pfx version) through SOAP UI in POST request and getting proper response back\n\n## Converted .pfx file to .pem through this\n\nopenssl pkcs12 -in '/Users/auth_user/Desktop/cert.pfx' -out cert.pem -nodes\n\n## use the same .pem in my POST\n\nr = requests.post('https://my-site.com/rest_service', verify='/Users/auth_user/Desktop/cert.pem', data={}, auth=('auth_user', 'auth_pass'))\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/auth_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n return request('post', url, data=data, *_kwargs)\n File \"/Users/auth_user/anaconda/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/Users/auth_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/Users/auth_user/anaconda/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, *_kwargs)\n File \"/Users/auth_user/anaconda/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)\n",
"Don't pass the cert to the `verify` parameter, pass it to `cert`.\n",
"thanks for your help on the same , it is going through right now but throwing me 401\n\nr = requests.post('https://my-site.com/rest_service', verify=False, cert='/Users/auth_user/Desktop/cert.pem', data={}, auth=('auth_user', 'auth_pass'))\nr.status_code = 401\n",
"401 suggests you're failing to authenticate. How does your site expect you to log in?\n",
"my site expects basic header authentication and I was thinking the argument auth=('auth_user', 'auth_pass') what I am paying in my POST will take care of that. Do I need to mention something else to do confirm its BasicAuth\n",
"Nope, that should do basic auth. Do you have logs from your site to indicate what it's 401-ing for?\n",
"@Lukasa I also tried with urllib2 with the following option and getting the same error as before:\n\n---\n\nimport requests\nimport urllib2\nimport base64\n\nchimpConfig = {\n \"headers\" : {\n \"Authorization\": \"Basic \" + base64.encodestring(\"auth_user:auth_pass\").replace('\\n', '')\n },\n \"cert\": \"/Users/user/Desktop/ssl_suth_cert.pem\",\n \"url\": 'https://url.com/'}\n\n#perform authentication\ndatas = None\ntimeout = 2\ncert = \"/Users/user/Desktop/ssl_suth_cert.pem\"\nrequest = urllib2.Request(chimpConfig[\"url\"], datas, chimpConfig[\"headers\"])\nresult = urllib2.urlopen(request)\n\nprint \"Response:\",result\nprint result.code\n\n---\n\nit is giving me the following error\nurllib2.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)>\n\nnow if I add \"cert\" parameter in my call I am getting the following error:\nonly change in the above code is: added cert param:\nrequest = urllib2.Request(chimpConfig[\"url\"], datas, **cert,** chimpConfig[\"headers\"])\n\nerror:\n\nTraceback (most recent call last):\n File \"urlpost.py\", line 16, in <module>\n request = urllib2.Request(chimpConfig[\"url\"], datas, cert, chimpConfig[\"headers\"])\n File \"/Users/summukhe/anaconda/lib/python2.7/urllib2.py\", line 238, in **init**\n for key, value in headers.items():\nAttributeError: 'str' object has no attribute 'items'\n",
"I'm not really interested in what urllib2 is getting wrong (though the second traceback indicates a problem with your code, probably that the cert is being passed in the wrong place).\n\nRegardless, I'm interested in seeing either logs from the server or some kind of reference for what the server is expecting.\n",
"@Lukasa , thanks for your direction, after analyzing the log it seems the url was having some '\\n' character which was causing the later issue.\nsimple rstrip() worked and Verify= False was the key as you have mentioned before. \n\nOne more question, how do I post message through this ? if I am adding another param as \"data\", this request is throwing error \"TypeError: post() got multiple values for keyword argument 'data'\"\n\nI really appreciate your help and the community to provide the support.\n\n## POST started working\n\nheaders = {'content-type': 'application/json'}\n\n> > > r = requests.post(url, headers, verify=False, cert='/Users/user/Desktop/auth_user.pem', auth=('user', 'pass'))\n> > > r.content\n> > > 'POST REQUEST PING'\n> > > r.status_code\n> > > 200\n> > > ///\n",
"@Lukasa \n\n## I think we are all clean now, able to post with data-file as well.\n\nThanks everyone for the help, we can close the issue\n\npayload='/Users/user/Desktop/a.xml'\nr = requests.post(url,auth=('auth_user', 'pass'), data=payload, verify=False, cert='/Users/user/Desktop/sinri2683c.pem',headers=headers)\n",
"@nahonnyate I'm facing the same issue on my Mac. I use conda as my package manager. Please tell me how to resolve"
] |
https://api.github.com/repos/psf/requests/issues/3136
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3136/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3136/comments
|
https://api.github.com/repos/psf/requests/issues/3136/events
|
https://github.com/psf/requests/pull/3136
| 151,605,950 |
MDExOlB1bGxSZXF1ZXN0NjgxOTkwNzY=
| 3,136 |
Update readthedocs links.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/229453?v=4",
"events_url": "https://api.github.com/users/Natim/events{/privacy}",
"followers_url": "https://api.github.com/users/Natim/followers",
"following_url": "https://api.github.com/users/Natim/following{/other_user}",
"gists_url": "https://api.github.com/users/Natim/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Natim",
"id": 229453,
"login": "Natim",
"node_id": "MDQ6VXNlcjIyOTQ1Mw==",
"organizations_url": "https://api.github.com/users/Natim/orgs",
"received_events_url": "https://api.github.com/users/Natim/received_events",
"repos_url": "https://api.github.com/users/Natim/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Natim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Natim/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Natim",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-28T10:39:52Z
|
2021-09-08T04:01:03Z
|
2016-04-28T10:43:14Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3136/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3136/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3136.diff",
"html_url": "https://github.com/psf/requests/pull/3136",
"merged_at": "2016-04-28T10:43:14Z",
"patch_url": "https://github.com/psf/requests/pull/3136.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3136"
}
| true |
[
"Thanks so much @Natim!\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3135
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3135/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3135/comments
|
https://api.github.com/repos/psf/requests/issues/3135/events
|
https://github.com/psf/requests/issues/3135
| 151,215,181 |
MDU6SXNzdWUxNTEyMTUxODE=
| 3,135 |
requests.cookies hardcodes a root path attribute
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/110020?v=4",
"events_url": "https://api.github.com/users/evnm/events{/privacy}",
"followers_url": "https://api.github.com/users/evnm/followers",
"following_url": "https://api.github.com/users/evnm/following{/other_user}",
"gists_url": "https://api.github.com/users/evnm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/evnm",
"id": 110020,
"login": "evnm",
"node_id": "MDQ6VXNlcjExMDAyMA==",
"organizations_url": "https://api.github.com/users/evnm/orgs",
"received_events_url": "https://api.github.com/users/evnm/received_events",
"repos_url": "https://api.github.com/users/evnm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/evnm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evnm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/evnm",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-04-26T19:09:17Z
|
2021-09-08T18:00:49Z
|
2016-04-26T19:24:33Z
|
NONE
|
resolved
|
While trying to figure out how to set specific [path attributes](https://en.wikipedia.org/wiki/HTTP_cookie#Domain_and_Path) in cookies when using requests, I noticed that `RequestsCookieJar` [hardcodes the root path](https://github.com/kennethreitz/requests/blob/bbeb0001cdc657ac8c7fef98e154229bc392db0e/requests/cookies.py#L400) when cookies are created and doesn't seem to allow this default to be overridden.
Is there a reason for this? If not, would the maintainers be open to a PR modifying `create_cookie` to heed a `path` specified in `kwargs`?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/110020?v=4",
"events_url": "https://api.github.com/users/evnm/events{/privacy}",
"followers_url": "https://api.github.com/users/evnm/followers",
"following_url": "https://api.github.com/users/evnm/following{/other_user}",
"gists_url": "https://api.github.com/users/evnm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/evnm",
"id": 110020,
"login": "evnm",
"node_id": "MDQ6VXNlcjExMDAyMA==",
"organizations_url": "https://api.github.com/users/evnm/orgs",
"received_events_url": "https://api.github.com/users/evnm/received_events",
"repos_url": "https://api.github.com/users/evnm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/evnm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evnm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/evnm",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3135/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3135/timeline
| null |
completed
| null | null | false |
[
"@evnm That reading of the code is not right. =)\n\nYou'll notice [here](https://github.com/kennethreitz/requests/blob/bbeb0001cdc657ac8c7fef98e154229bc392db0e/requests/cookies.py#L414) that we call `update` with the kwargs. That will override the path if you provide it. Try it: `create_cookie('key', 'value', path='/some/path')` should work just fine.\n",
"Ah, cool. An object lesson on careful code-reading. Thanks for taking the time to point that out!\n",
"No problem, thanks for checking with us! :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3134
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3134/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3134/comments
|
https://api.github.com/repos/psf/requests/issues/3134/events
|
https://github.com/psf/requests/issues/3134
| 151,214,794 |
MDU6SXNzdWUxNTEyMTQ3OTQ=
| 3,134 |
Extensions Guide
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 5 |
2016-04-26T19:07:15Z
|
2021-09-08T07:00:41Z
|
2017-07-30T00:16:45Z
|
CONTRIBUTOR
|
resolved
|
We currently have a list of _recommended_ (e.g. softly endorsed) extensions in the documentation (this needs some love).
In addition, we should have a document that lists all community extensions, grouped by category.
---
Prompted by https://github.com/kennethreitz/requests/issues/3131
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3134/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3134/timeline
| null |
completed
| null | null | false |
[
"Feel free to paste extensions you're aware of here.\n",
"PyPI catalogs these for us see:\n- [searching for requests on PyPI](https://pypi.python.org/pypi?:action=search&term=requests&submit=search)\n- [searching for requests on Warehouse](https://pypi.io/search/?q=requests)\n\nJudging by the number of these packages, I'm not sure we want to keep a list of _every_ community extension. That seems... extreme.\n",
"Yeah, no API clients (maybe one or two exceptions; doubtful). They have to look at least somewhat generally useful.\n",
"I forsee this being a fairly large list, intended to be for discovering new toys, or looking for something specific via ctrl-f.\n",
"you can start with some well used n promising ones first\n"
] |
https://api.github.com/repos/psf/requests/issues/3133
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3133/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3133/comments
|
https://api.github.com/repos/psf/requests/issues/3133/events
|
https://github.com/psf/requests/issues/3133
| 151,192,406 |
MDU6SXNzdWUxNTExOTI0MDY=
| 3,133 |
requests cannot fetch this website (only in Python 3)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/350846?v=4",
"events_url": "https://api.github.com/users/jribbens/events{/privacy}",
"followers_url": "https://api.github.com/users/jribbens/followers",
"following_url": "https://api.github.com/users/jribbens/following{/other_user}",
"gists_url": "https://api.github.com/users/jribbens/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jribbens",
"id": 350846,
"login": "jribbens",
"node_id": "MDQ6VXNlcjM1MDg0Ng==",
"organizations_url": "https://api.github.com/users/jribbens/orgs",
"received_events_url": "https://api.github.com/users/jribbens/received_events",
"repos_url": "https://api.github.com/users/jribbens/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jribbens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jribbens/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jribbens",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-04-26T17:27:15Z
|
2021-09-08T18:00:49Z
|
2016-04-26T17:52:49Z
|
NONE
|
resolved
|
`requests.get("https://www.northwarks.gov.uk/")` hangs.
`urllib.request.urlopen("https://www.northwarks.gov.uk/").read()` works fine.
I am using Python 3.4 and requests 2.9.1.
If I use Python 2.7.10 and requests 2.9.1 then it works fine.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/350846?v=4",
"events_url": "https://api.github.com/users/jribbens/events{/privacy}",
"followers_url": "https://api.github.com/users/jribbens/followers",
"following_url": "https://api.github.com/users/jribbens/following{/other_user}",
"gists_url": "https://api.github.com/users/jribbens/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jribbens",
"id": 350846,
"login": "jribbens",
"node_id": "MDQ6VXNlcjM1MDg0Ng==",
"organizations_url": "https://api.github.com/users/jribbens/orgs",
"received_events_url": "https://api.github.com/users/jribbens/received_events",
"repos_url": "https://api.github.com/users/jribbens/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jribbens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jribbens/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jribbens",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3133/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3133/timeline
| null |
completed
| null | null | false |
[
"So, first things first. If we set `allow_redirects=False` and `stream=True`, then the first request works. With anything else set, it hangs. The headers the server sends sends seem to be the problem here. Here's the header block curl is seeing:\n\n```\nHTTP/1.1 302 Moved Temporarily\nContent-Type: text/html; charset=UTF-8\nLocation: https://www.northwarks.gov.uk/site/\nServer: Microsoft-IIS/8.5\nX-Powered-By: ASP.NET\nX-FRAME-OPTIONS : SAMEORIGIN\nDate: Tue, 26 Apr 2016 17:38:24 GMT\nContent-Length: 158\n```\n\nGiven that header block, Python 3's `http.client` module (which is responsible for header parsing), only parses up to `X-Powered-By`. The reason is that the header after (`X-FRAME_OPTIONS`, weirdly all-capped) is not formatted correctly. RFC 7230 forbids a space between the header name and the colon. This breaks the header parser, which therefore doesn't read the Content-Length header field. That means that requests expects that the connection will be framed by connection close, so it barfs.\n\nThe overly-strict header parsing you're seeing here is the fault of the Python standard library. You can raise the issue with them: however, they handled a similar one in [bug 25539](https://bugs.python.org/issue25539) and didn't end up moving forward with a fix in Python.\n\nMore realistically, you should contact the server operator and tell them that their server is emitting invalid HTTP. This appears to be a local government server, so it should be possible to get in touch with the operators.\n\nUnfortunately, requests can't do much about this: the standard library is just getting this wrong.\n",
"Thanks for the excellent analysis. I will look at the `http.client` issue and I agree that it looks like this is not something requests can fix itself.\n",
"Thanks for the report @jribbens! :sparkles:\n",
"`X-FRAME-OPTIONS%20` :trollface: \n"
] |
https://api.github.com/repos/psf/requests/issues/3132
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3132/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3132/comments
|
https://api.github.com/repos/psf/requests/issues/3132/events
|
https://github.com/psf/requests/pull/3132
| 151,097,486 |
MDExOlB1bGxSZXF1ZXN0Njc4NjEyMTc=
| 3,132 |
utils: let select_proxy not raise an exception when url has no hostname
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1543430?v=4",
"events_url": "https://api.github.com/users/chipaca/events{/privacy}",
"followers_url": "https://api.github.com/users/chipaca/followers",
"following_url": "https://api.github.com/users/chipaca/following{/other_user}",
"gists_url": "https://api.github.com/users/chipaca/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chipaca",
"id": 1543430,
"login": "chipaca",
"node_id": "MDQ6VXNlcjE1NDM0MzA=",
"organizations_url": "https://api.github.com/users/chipaca/orgs",
"received_events_url": "https://api.github.com/users/chipaca/received_events",
"repos_url": "https://api.github.com/users/chipaca/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chipaca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chipaca/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chipaca",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-04-26T10:24:54Z
|
2021-09-08T04:01:04Z
|
2016-04-26T10:35:23Z
|
CONTRIBUTOR
|
resolved
|
Fixes #3131.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3132/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3132/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3132.diff",
"html_url": "https://github.com/psf/requests/pull/3132",
"merged_at": "2016-04-26T10:35:23Z",
"patch_url": "https://github.com/psf/requests/pull/3132.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3132"
}
| true |
[
"While we're here, we should probably also handle the possibility that there's no scheme in the URL as well.\n",
"scheme defaults to `''`, though. Well, more or less:\n\n> The scheme argument gives the default addressing scheme, to be used only if the URL does not specify one. It should be the same type (text or bytes) as urlstring, except that the default value '' is always allowed, and is automatically converted to b'' if appropriate.\n",
"Urgh, `urlparse` is such an inconsistent function. Good spot.\n",
"Thanks! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3131
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3131/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3131/comments
|
https://api.github.com/repos/psf/requests/issues/3131/events
|
https://github.com/psf/requests/issues/3131
| 151,097,407 |
MDU6SXNzdWUxNTEwOTc0MDc=
| 3,131 |
utils.select_proxy raises an exception when URL has no hostname
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1543430?v=4",
"events_url": "https://api.github.com/users/chipaca/events{/privacy}",
"followers_url": "https://api.github.com/users/chipaca/followers",
"following_url": "https://api.github.com/users/chipaca/following{/other_user}",
"gists_url": "https://api.github.com/users/chipaca/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chipaca",
"id": 1543430,
"login": "chipaca",
"node_id": "MDQ6VXNlcjE1NDM0MzA=",
"organizations_url": "https://api.github.com/users/chipaca/orgs",
"received_events_url": "https://api.github.com/users/chipaca/received_events",
"repos_url": "https://api.github.com/users/chipaca/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chipaca/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chipaca/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chipaca",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-04-26T10:24:27Z
|
2021-09-08T16:00:40Z
|
2016-04-26T10:35:23Z
|
CONTRIBUTOR
|
resolved
|
URLs sometimes have no hostname (e.g. file URLs, or unix socket URLs). The parsed URL `hostname` attribute returns `None` in this case, making `utils.select_proxy` complain about concat'ing a `str` with `None`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3131/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3131/timeline
| null |
completed
| null | null | false |
[
"@chipaca keep in mind that whatever adapter you're using will have to deal with older versions of requests and if it's not meant to touch the network should not be inheriting from the HTTPAdapter.\n",
"@sigmavirus24 I'm using https://github.com/msabramo/requests-unixsocket/ which does inherit from `HTTPAdapter`. I'm not sure what you mean, though; where does \"touching the network\" become a factor?\n",
"If you're using a `file://` URI, you're never going to talk to something over the network using HTTP (for example). I'll talk to @msabramo about fixing unixsocket as this problem will persist otherwise for several people who will not get this fix.\n",
"I'm curious how you saw this exception while using that library. Looking at unixsocket's code, it overrides the method which is the one place where [`select_proxy` is used in requests](https://github.com/kennethreitz/requests/blob/60aee145b96a4cf13a7daefd8578d538fe8459a8/requests/adapters.py#L250).\n",
"it's used in one more place: `HTTPAdapter.request_url`.\n\nOn Tue, Apr 26, 2016 at 3:56 PM, Ian Cordasco [email protected]\nwrote:\n\n> I'm curious how you saw this exception there. Looking at unixsocket's\n> code, it overrides the method which is the one place where select_proxy\n> is used in requests\n> https://github.com/kennethreitz/requests/blob/60aee145b96a4cf13a7daefd8578d538fe8459a8/requests/adapters.py#L250\n> .\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3131#issuecomment-214772376\n\n## \n\nJohn Lenton ([email protected]) ::: http://chipaca.com\n",
"As to where I'm using it, I'm using it to connect to a unix socket :-)\nI packaged httpie as a snap (originally for snappy ubuntu core systems, now on ubuntu classic itself), and added support for unix sockets so that it could talk to the snappy daemon. That's on https://github.com/chipaca/httpie-snap fwiw.\n",
"Ah so @msabramo should be overriding that too if he continues to subclass the adapter. (I missed the usage in the `request_url` method because GitHub search hasn't indexed that yet apparently.)\n",
"I didn't know about requests-unixsocket! We must catalogue these extensions somewhere. \n",
"Hmmm, but don't unixsocket URLs always have a \"hostname\" Because the \"hostname\" is a URL-encoded path to the socket on the filesystem. E.g.:\n\n```\n[38] > /Users/marca/dev/git-repos/requests/requests/utils.py(592)select_proxy()\n-> proxy = proxies.get(urlparts.scheme+'://'+urlparts.hostname)\n(Pdb++) urlparts\nParseResult(scheme='http+unix', netloc='%2Ftmp%2Ftest_requests.132820227_21127_b4f4da9f', path='/path/to/page', params='', query='', fragment='')\n(Pdb++) urlparts.scheme\n'http+unix'\n(Pdb++) urlparts.hostname\n'%2ftmp%2ftest_requests.132820227_21127_b4f4da9f'\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/3109
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3109/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3109/comments
|
https://api.github.com/repos/psf/requests/issues/3109/events
|
https://github.com/psf/requests/pull/3109
| 150,186,966 |
MDExOlB1bGxSZXF1ZXN0Njc0MTk3NTY=
| 3,109 |
HTTPAdapter now updates its PoolManager connection_pool_kw
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 13 |
2016-04-21T20:47:49Z
|
2021-09-08T02:10:28Z
|
2016-09-29T20:30:31Z
|
CONTRIBUTOR
|
resolved
|
This is not particularly polished yet, and depends on the as-yet unreleased
urllib3 version 1.16. Still, I'd greatly appreciate some early feedback on how
(in)sane this approach is.
With the addition of https://github.com/shazow/urllib3/pull/830 requests
should update the connection_pool_kw on the PoolManager so that new
ConnectionPools get created when TLS/SSL settings change. This ensures
that users can update the CA certificates used to verify servers as well
as the client certificate and key it uses to authenticate with servers.
This fixes issue #2863
---
Closes #2863
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3109/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3109/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3109.diff",
"html_url": "https://github.com/psf/requests/pull/3109",
"merged_at": "2016-09-29T20:30:31Z",
"patch_url": "https://github.com/psf/requests/pull/3109.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3109"
}
| true |
[
"Thanks for this @jeremycline!\n\nIn principle this is fine. However, it changes the interface of the HTTPAdapter which is IMO part of our public interface, so I'd want to sit on this until 3.0.0. Happily, 3.0.0 shouldn't be too far away now so we can progress with this work. =)\n",
"I'm confused why we would remove `cert_verify` as part of this.\n",
"@sigmavirus24 Because it sets things on the connection pool directly, which is Not Right. We want to get a different pool if those are different.\n",
"I thought about leaving the method name as `cert_verify`, but did it this way because the behaviour changed a great deal. I am, of course, not apposed to alternate approaches. I just wanted to make sure it was clear the way certificates are paired up with connections changed.\n",
"Since urllib-1.16 is out now, I suppose I should get this ready to go :smile:\n\nOther than rebasing once 1.16 is pulled in, is there anything that jumps out as lacking?\n",
"Nothing leaps out, no. =)\n",
"I've rebased onto 2.11 and all the tests are passing now.\n",
"@jeremycline can you rebase once more? I promise we won't lose track of this again. =(\n",
"@sigmavirus24 no problem, rebased!\n",
"wait a second, @Lukasa can this not go into a pre-3.0 release?\n",
"@sigmavirus24 Nope, we removed cert_verify which is public on the HTTPAdapter.\n",
"@sigmavirus24 Should I add a 3.0 changelog entry or is that something you'd prefer to handle?\n",
"Looks good to me then!\n"
] |
https://api.github.com/repos/psf/requests/issues/3108
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3108/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3108/comments
|
https://api.github.com/repos/psf/requests/issues/3108/events
|
https://github.com/psf/requests/pull/3108
| 150,106,470 |
MDExOlB1bGxSZXF1ZXN0NjczNzI4OTI=
| 3,108 |
Flip conditional in session.send()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-04-21T15:21:26Z
|
2021-09-08T04:01:04Z
|
2016-04-21T15:30:22Z
|
CONTRIBUTOR
|
resolved
|
Previously we checked that the `request` being sent was an instance of a
PreparedRequest. If a user somehow created a PreparedRequest using a different
Requests library instance, this check makes the request un-sendable.
(This happened recently - unbeknownst to me, my server was running an outdated
version of pip, vulnerable to this issue - pypa/pip#1489, which creates
multiple subdirectories (src/requests, src/requests/requests) when you rerun
pip install --target. So the PreparedRequest was being created in one version
of the library and compared against the other version of the library, and
throwing this exception, even though they were both PreparedRequest instances!)
It would probably be preferable to check the object's behavior (instead of
its type), but a PreparedRequest has a lot of behavior, and it wouldn't be
really feasible or allow us to provide a helpful error message to check all
of it here. Instead flip the conditional to guard against the user sending an
unprepared Request, which should still give us most of the benefits of the
better error message.
Fixes #3102
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3108/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3108/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3108.diff",
"html_url": "https://github.com/psf/requests/pull/3108",
"merged_at": "2016-04-21T15:30:22Z",
"patch_url": "https://github.com/psf/requests/pull/3108.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3108"
}
| true |
[
"I'm happy with this! Go for it @kennethreitz, merge if you'd like to. =D\n",
"Not if I merge it first ;)\n"
] |
https://api.github.com/repos/psf/requests/issues/3107
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3107/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3107/comments
|
https://api.github.com/repos/psf/requests/issues/3107/events
|
https://github.com/psf/requests/issues/3107
| 149,621,611 |
MDU6SXNzdWUxNDk2MjE2MTE=
| 3,107 |
Redirected request.post turns into a GET
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2030556?v=4",
"events_url": "https://api.github.com/users/wontonst/events{/privacy}",
"followers_url": "https://api.github.com/users/wontonst/followers",
"following_url": "https://api.github.com/users/wontonst/following{/other_user}",
"gists_url": "https://api.github.com/users/wontonst/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/wontonst",
"id": 2030556,
"login": "wontonst",
"node_id": "MDQ6VXNlcjIwMzA1NTY=",
"organizations_url": "https://api.github.com/users/wontonst/orgs",
"received_events_url": "https://api.github.com/users/wontonst/received_events",
"repos_url": "https://api.github.com/users/wontonst/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/wontonst/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wontonst/subscriptions",
"type": "User",
"url": "https://api.github.com/users/wontonst",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-20T00:48:15Z
|
2021-09-08T18:00:50Z
|
2016-04-20T07:22:32Z
|
NONE
|
resolved
|
```
SITE = 'HTTP@{}'.format(API_ENDPOINT)
w_, krb_context = kerberos.authGSSClientInit(SITE)
kerberos.authGSSClientStep(krb_context, '')
AUTH = HTTPKerberosAuth(mutual_authentication=requests_kerberos.DISABLED)
r = requests.post(url, json = {'key':'value'}, auth=AUTH)
print 'MY REQUEST'
pprint.pprint(vars(r.request))
r = requests.post('http://httpbin.org/post', json = {'key':'value'})
print 'THEIR REQUEST'
pprint.pprint(vars(r.request))
```
```
MY REQUEST
{'_cookies': <RequestsCookieJar[Cookie(version=0, name='xxx', value='xx', port=None, port_specified=False, domain='xxx', d\
omain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=True, expires=1461145561, discard=False, comment=None, comment_url=None, rest={'HttpOnly\
': None}, rfc2109=False), Cookie(version=0, name='xxx', value='xxxx', port=None,\
port_specified=False, domain='xxx', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=True, expires=146114\
5561, discard=False, comment=None, comment_url=None, rest={'HttpOnly': None}, rfc2109=False)]>,
'body': None,
'headers': {'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.9.1', 'Connection': 'keep-alive', 'Cookie': 'xxx', 'Content-Type': 'application/json'},
'hooks': {'response': [<bound method HTTPKerberosAuth.handle_response of <requests_kerberos.kerberos_.HTTPKerberosAuth object at 0x10c0a6310>>]},
'method': 'GET',
'url': 'xxx'}
THEIR REQUEST
{'_cookies': <RequestsCookieJar[]>,
'body': '{"key": "value"}',
'headers': {'Content-Length': '16', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.9.1', 'Connection': 'keep-alive', 'Content-Type': 'a\
pplication/json'},
'hooks': {'response': []},
'method': 'POST',
'url': 'http://httpbin.org/post'}
```
As you can see, body becomes empty and the method is POST.
Looking at the response history shows that it gets redirected from the authenticator, then the request becomes a GET.
Shouldn't the POST request propagate all the way to the end?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3107/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3107/timeline
| null |
completed
| null | null | false |
[
"Nope. =) Not all redirects preserve the verb: if the website in question wants to preserve the verb, they should use 307 or 308 instead.\n\nRequests behaves like a browser when redirected. That means that it does the following:\n- on 303, changes everything except HEAD/GET to GET, as specified in RFC 7231 Section 6.4.4\n- on 302, changes everything except HEAD/GET to GET, as allowed by the note in RFC 7231 Section 6.4.3 and as done by browsers\n- on 301, changes POST to GET, as allowed by the note in RFC 7231 Section 6.4.2 and as done by browsers\n"
] |
https://api.github.com/repos/psf/requests/issues/3106
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3106/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3106/comments
|
https://api.github.com/repos/psf/requests/issues/3106/events
|
https://github.com/psf/requests/issues/3106
| 149,542,630 |
MDU6SXNzdWUxNDk1NDI2MzA=
| 3,106 |
Congrats on worst naming standards for Python modules 2015+
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2856049?v=4",
"events_url": "https://api.github.com/users/mossarelli/events{/privacy}",
"followers_url": "https://api.github.com/users/mossarelli/followers",
"following_url": "https://api.github.com/users/mossarelli/following{/other_user}",
"gists_url": "https://api.github.com/users/mossarelli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mossarelli",
"id": 2856049,
"login": "mossarelli",
"node_id": "MDQ6VXNlcjI4NTYwNDk=",
"organizations_url": "https://api.github.com/users/mossarelli/orgs",
"received_events_url": "https://api.github.com/users/mossarelli/received_events",
"repos_url": "https://api.github.com/users/mossarelli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mossarelli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mossarelli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mossarelli",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-04-19T18:24:26Z
|
2016-04-19T18:44:34Z
|
2016-04-19T18:24:42Z
|
NONE
| null |
`
pkg_resources.VersionConflict: (requests 2.2.1 (/usr/lib/python2.7/dist-packages), Requirement.parse('requests>=2.3.0'))
`
> Reasons for using a naming convention (as opposed to allowing programmers to choose any character sequence) include the following:
> - to reduce the effort needed to read and understand source code;
> - to enable code reviews to focus on more important issues than arguing over syntax and naming standards.
> - to enable code quality review tools to focus their reporting mainly on significant issues other than syntax and style preferences.
> - to enhance source code appearance (for example, by disallowing overlong names or unclear abbreviations).
> [https://en.wikipedia.org/wiki/Naming_convention_%28programming%29#cite_ref-1](https://en.wikipedia.org/wiki/Naming_convention_%28programming%29)
Without prior knowledge into Python syntax, Bash syntax and the name of the module, error fixing took several extra hours to solve. Another nail in the coffin.
Thank you for a great module regardless of name. It provides one of the strongest features I need in a bot that checks URLs. Too bad I have to be a wizard to swish and flick my wand to fix the errors.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2856049?v=4",
"events_url": "https://api.github.com/users/mossarelli/events{/privacy}",
"followers_url": "https://api.github.com/users/mossarelli/followers",
"following_url": "https://api.github.com/users/mossarelli/following{/other_user}",
"gists_url": "https://api.github.com/users/mossarelli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mossarelli",
"id": 2856049,
"login": "mossarelli",
"node_id": "MDQ6VXNlcjI4NTYwNDk=",
"organizations_url": "https://api.github.com/users/mossarelli/orgs",
"received_events_url": "https://api.github.com/users/mossarelli/received_events",
"repos_url": "https://api.github.com/users/mossarelli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mossarelli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mossarelli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mossarelli",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3106/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3106/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/3105
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3105/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3105/comments
|
https://api.github.com/repos/psf/requests/issues/3105/events
|
https://github.com/psf/requests/issues/3105
| 149,336,538 |
MDU6SXNzdWUxNDkzMzY1Mzg=
| 3,105 |
Sticker Design [Kenneth]
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 7 |
2016-04-19T03:17:38Z
|
2021-09-08T17:05:34Z
|
2016-07-01T18:39:31Z
|
CONTRIBUTOR
|
resolved
|
Using this as a place to throw things i find that I like.
Going to use StickerMule, not StickerRobot. Undecided on size or design, yet.
I like the idea of using 99designs to come up with the sticker. Would be far more expensive than doing it myself, but I'd rather be handing out _amazing_ stickers at a conference than mediocre ones (for an amazing project).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3105/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3105/timeline
| null |
completed
| null | null | false |
[
"http://99designs.com\n",
"Turtles (evolutions of Rezzy the sea turtle, possibly):\n- http://graphicriver.net/item/turtle/10194964?s_rank=6 (cute, but not a sea turtle)\n- http://graphicriver.net/item/cartoon-turtle/12865148?s_rank=38 (running, meh)\n- http://graphicriver.net/item/cartoon-turtle/9991241?s_rank=49 (i really like this one, very similar to current Rezzy)\n- http://graphicriver.net/item/vector-turtle-tattoo/8725608?s_rank=157 (super dope, very different)\n\nUndecided if \"sea turtle\" is important. I feel like it is. \n",
"Experimented with using the current Rezzy, of course. This could work, but I think the stickers are an opportunity to play with Rezzy as a concept, not as that specific image. \n",
"> Would be far more expensive than doing it myself, but I'd rather be handing out amazing stickers at a conference than mediocre ones (for an amazing project).\n\nI'll be handing out mediocre ones. =D\n",
"99design's prices have gone up, won't be using them. \n",
"$ pip install requests\n",
"Shipped!\n"
] |
https://api.github.com/repos/psf/requests/issues/3104
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3104/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3104/comments
|
https://api.github.com/repos/psf/requests/issues/3104/events
|
https://github.com/psf/requests/issues/3104
| 149,209,398 |
MDU6SXNzdWUxNDkyMDkzOTg=
| 3,104 |
SSLError exception leak while reading content stream
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/880064?v=4",
"events_url": "https://api.github.com/users/fredthomsen/events{/privacy}",
"followers_url": "https://api.github.com/users/fredthomsen/followers",
"following_url": "https://api.github.com/users/fredthomsen/following{/other_user}",
"gists_url": "https://api.github.com/users/fredthomsen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fredthomsen",
"id": 880064,
"login": "fredthomsen",
"node_id": "MDQ6VXNlcjg4MDA2NA==",
"organizations_url": "https://api.github.com/users/fredthomsen/orgs",
"received_events_url": "https://api.github.com/users/fredthomsen/received_events",
"repos_url": "https://api.github.com/users/fredthomsen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fredthomsen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fredthomsen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fredthomsen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2016-04-18T16:55:07Z
|
2021-09-08T15:00:48Z
|
2016-04-19T06:19:23Z
|
NONE
|
resolved
|
This system is on requests 2.7.0. Traceback below:
```
.....File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 508, in post return self.request(\'POST\', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 465, in request resp = self.send(prep, **send_kwargs),
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/sessions.py", line 605, in send r.content,
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/models.py", line 750, in content self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes(),
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/models.py", line 673, in generate for chunk in self.raw.stream(chunk_size, decode_content=True):,
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/response.py", line 303, in stream for line in self.read_chunked(amt, decode_content=decode_content):,
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/response.py", line 450, in read_chunked chunk = self._handle_chunk(amt),
File "/usr/local/lib/python2.7/site-packages/requests-2.7.0-py2.7.egg/requests/packages/urllib3/response.py", line 420, in _handle_chunk returned_chunk = self._fp._safe_read(self.chunk_left),
File "/usr/local/lib/python2.7/httplib.py", line 647, in _safe_read chunk = self.fp.read(min(amt, MAXAMOUNT)),
File "/usr/local/lib/python2.7/socket.py", line 380, in read data = self._sock.recv(left),
File "/usr/local/lib/python2.7/ssl.py", line 241, in recv return self.read(buflen),
File "/usr/local/lib/python2.7/ssl.py", line 160, in read return self._sslobj.read(len)
SSLError: The read operation timed out
```
Seems similar to #1826
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/880064?v=4",
"events_url": "https://api.github.com/users/fredthomsen/events{/privacy}",
"followers_url": "https://api.github.com/users/fredthomsen/followers",
"following_url": "https://api.github.com/users/fredthomsen/following{/other_user}",
"gists_url": "https://api.github.com/users/fredthomsen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fredthomsen",
"id": 880064,
"login": "fredthomsen",
"node_id": "MDQ6VXNlcjg4MDA2NA==",
"organizations_url": "https://api.github.com/users/fredthomsen/orgs",
"received_events_url": "https://api.github.com/users/fredthomsen/received_events",
"repos_url": "https://api.github.com/users/fredthomsen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fredthomsen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fredthomsen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fredthomsen",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3104/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3104/timeline
| null |
completed
| null | null | false |
[
"@fredthomsen Can you please check with 2.9.0? I believe we fixed this already.\n",
"On the system I am able to reproduce this on, it's only occurring occasionally, but has almost always been happening daily. I put 2.9.0 on there, and I'll let it run overnight.\n",
"Determined a better way of reproducing the issue consistently. @Lukasa looks fine on 2.9.0. Move along nothing to see here.\n",
"Thanks for the report and for clearing it up @fredthomsen! :sparkles:\n",
"I'm having this problem on 2.9.1 (with eventlet monkey-patched urllib, if it matters). Backtrace:\n\n```\nTraceback (most recent call last):\n <snip...>\n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/api.py\", line 107, in post \n return request('post', url, data=data, json=json, **kwargs) \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/api.py\", line 53, in request \n return session.request(method=method, url=url, **kwargs) \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/sessions.py\", line 468, in request \n resp = self.send(prep, **send_kwargs) \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/sessions.py\", line 608, in send \n r.content \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/models.py\", line 737, in content \n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes() \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/models.py\", line 660, in generate \n for chunk in self.raw.stream(chunk_size, decode_content=True): \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/packages/urllib3/response.py\", line 344, in stream \n data = self.read(amt=amt, decode_content=decode_content) \n File \"/app/.heroku/python/lib/python3.3/site-packages/requests/packages/urllib3/response.py\", line 301, in read \n data = self._fp.read(amt) \n File \"/app/.heroku/python/lib/python3.3/http/client.py\", line 507, in read \n return super(HTTPResponse, self).read(amt) \n File \"/app/.heroku/python/lib/python3.3/http/client.py\", line 546, in readinto \n n = self.fp.readinto(b) \n File \"/app/.heroku/python/lib/python3.3/socket.py\", line 297, in readinto \n return self._sock.recv_into(b) \n File \"/app/.heroku/python/lib/python3.3/site-packages/eventlet/green/ssl.py\", line 221, in recv_into \n timeout_exc=timeout_exc('timed out')) \n File \"/app/.heroku/python/lib/python3.3/site-packages/eventlet/hubs/__init__.py\", line 162, in trampoline \n return hub.switch() \n File \"/app/.heroku/python/lib/python3.3/site-packages/eventlet/hubs/hub.py\", line 294, in switch \n return self.greenlet.switch() \nssl.SSLError: ('timed out',) \n```\n",
"The eventlet monkeypatching does appear to be a problem, mostly because it seems to be causing a `from` import to be binding the wrong object. If you move the monkeypatch to ensure that it occurs _before_ `requests` is imported your problem should go away.\n",
"Edit: thanks for the awesomely fast response!\n\nUnfortunately, this is running under celery and gunicorn which both automatically monkey patch, so we don't have control over when that happens :(\n",
"Do you have control over when the import occurs? If so, you should be able to delay it until a point well after the monkeypatch.\n",
"Oh, duh, sorry.\n\nWhen I run this locally (having edited eventlet to print when monkey patched and requests to print when imported), I see that eventlet _is_ monkey patching before requests is imported... It's possible that the import order is different on prod, but I don't have a good way to check the import order on prod since I can't edit the libraries and they don't log.\n\nDo you have a suggestion for a requests call that would reliably repro the `ssl.SSLError` if the monkey patching was happening in the wrong order? That way I can test locally if the problem was persisting despite the fact that locally my imports are in the correct order. (The request that's failing on prod unfortunately works fine from my localhost, so I can't use it to test.)\n",
"Any idea of how I could get a consistent repro on this so that I can test the import-order stuff? I'm not sure what's causing the SSLError, so I'm not sure how to set up a request that consistently raises it.\n",
"@benkuhn The SSL error seems to be caused by a failure to deliver the response, so a server that sends a content-length but then stops sending halfway through the body (without closing the socket) should be enough to trigger this.\n",
"OK, sorry for the extremely long delay, but I have a minimal repro that shows the problem occurs even when `eventlet.monkey_patch()` is called before `requests` is imported. Sorry for the copy-paste; Github apparently doesn't want to upload my .py files?\n\n``` python\n# server.py\nfrom flask import Flask, Response\nimport time\n\napp = Flask(__name__)\n\[email protected]('/')\ndef root():\n return 'hello world'\n\[email protected]('/fail')\ndef fail():\n def generate():\n yield 'hello\\n'\n time.sleep(5)\n yield 'world'\n return Response(generate())\n\napp.run('0.0.0.0', debug=True, port=8100, ssl_context='adhoc')\n```\n\n``` python\n# client.py\nimport eventlet\neventlet.monkey_patch()\n\nimport requests\n\nrequests.get('https://localhost:8100/fail', timeout=4, verify=False)\n```\n\nInstall `flask`, `requests==2.11.1` and `pyopenssl`, then run `python server.py` and `python client.py`. You should see the following output:\n\n``````\n/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py:838: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/security.html\n InsecureRequestWarning)\nTraceback (most recent call last):\n File \"client.py\", line 6, in <module>\n requests.get('https://localhost:8100/fail', timeout=4, verify=False)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/api.py\", line 70, in get\n return request('get', url, params=params, **kwargs)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/api.py\", line 56, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/sessions.py\", line 628, in send\n r.content\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/models.py\", line 755, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/models.py\", line 676, in generate\n for chunk in self.raw.stream(chunk_size, decode_content=True):\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 357, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 314, in read\n data = self._fp.read(amt)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py\", line 433, in read\n n = self.readinto(b)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/http/client.py\", line 473, in readinto\n n = self.fp.readinto(b)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/socket.py\", line 575, in readinto\n return self._sock.recv_into(b)\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/eventlet/green/ssl.py\", line 224, in recv_into\n timeout_exc=timeout_exc('timed out'))\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/eventlet/hubs/__init__.py\", line 162, in trampoline\n return hub.switch()\n File \"/Users/ben/.virtualenvs/remit/lib/python3.5/site-packages/eventlet/hubs/hub.py\", line 294, in switch\n return self.greenlet.switch()\nssl.SSLError: ('timed out',)```\n``````\n"
] |
https://api.github.com/repos/psf/requests/issues/3103
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3103/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3103/comments
|
https://api.github.com/repos/psf/requests/issues/3103/events
|
https://github.com/psf/requests/pull/3103
| 149,092,070 |
MDExOlB1bGxSZXF1ZXN0NjY4MjAyOTk=
| 3,103 |
Implement #2149
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/31764?v=4",
"events_url": "https://api.github.com/users/weiwei/events{/privacy}",
"followers_url": "https://api.github.com/users/weiwei/followers",
"following_url": "https://api.github.com/users/weiwei/following{/other_user}",
"gists_url": "https://api.github.com/users/weiwei/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/weiwei",
"id": 31764,
"login": "weiwei",
"node_id": "MDQ6VXNlcjMxNzY0",
"organizations_url": "https://api.github.com/users/weiwei/orgs",
"received_events_url": "https://api.github.com/users/weiwei/received_events",
"repos_url": "https://api.github.com/users/weiwei/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/weiwei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/weiwei/subscriptions",
"type": "User",
"url": "https://api.github.com/users/weiwei",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-04-18T09:10:17Z
|
2021-09-08T04:01:05Z
|
2016-04-18T13:42:50Z
|
NONE
|
resolved
|
Though #2149 is marked as closed, I had to do this because 1) I can't just give up requests and 2) I can't change the non-standard URLs I'm using. So I did a little hack. Now it goes like this:
```
>>> from urllib.parse import quote
>>> requests.models.DEFAULT_QUOTE_VIA = quote
>>> requests.models.DEFAULT_SAFE = '/?$=%'
>>> r = requests.get('http://reddit.com/', params={'$foo': 'bar baz'})
>>> r.url
'https://www.reddit.com/?$foo=bar%20baz'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/31764?v=4",
"events_url": "https://api.github.com/users/weiwei/events{/privacy}",
"followers_url": "https://api.github.com/users/weiwei/followers",
"following_url": "https://api.github.com/users/weiwei/following{/other_user}",
"gists_url": "https://api.github.com/users/weiwei/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/weiwei",
"id": 31764,
"login": "weiwei",
"node_id": "MDQ6VXNlcjMxNzY0",
"organizations_url": "https://api.github.com/users/weiwei/orgs",
"received_events_url": "https://api.github.com/users/weiwei/received_events",
"repos_url": "https://api.github.com/users/weiwei/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/weiwei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/weiwei/subscriptions",
"type": "User",
"url": "https://api.github.com/users/weiwei",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3103/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3103/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3103.diff",
"html_url": "https://github.com/psf/requests/pull/3103",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3103.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3103"
}
| true |
[
"I'm -1 on this: we have an appropriate hook to do this already (the PreparedRequest), and I'm disinclined to want to add configurability here, _particularly_ in the form of global singleton state: this is a recipe for really tricky to debug issues down the road. \n",
"@Lukasa Thanks for pointing to the right direction.\n"
] |
https://api.github.com/repos/psf/requests/issues/3102
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3102/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3102/comments
|
https://api.github.com/repos/psf/requests/issues/3102/events
|
https://github.com/psf/requests/issues/3102
| 149,045,770 |
MDU6SXNzdWUxNDkwNDU3NzA=
| 3,102 |
PreparedRequest check: use duck typing?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-04-18T05:25:57Z
|
2021-09-08T18:00:50Z
|
2016-04-21T15:30:22Z
|
CONTRIBUTOR
|
resolved
|
Okay, this is admittedly an edge case.
I was attempting to deploy to AWS from a CI server, and `boto3` was returning this stack trace:
```
Traceback (most recent call last):
File "deploy.py", line 43, in <module>
main(args.realm)
File "deploy.py", line 29, in main
results = [client.update_function_code(FunctionName=arn, ZipFile=data) for arn in arns]
File "/home/ubuntu/webhooks/src/botocore/botocore/client.py", line 236, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/home/ubuntu/webhooks/src/botocore/botocore/client.py", line 489, in _make_api_call
operation_model, request_dict)
File "/home/ubuntu/webhooks/src/botocore/endpoint.py", line 117, in make_request
return self._send_request(request_dict, operation_model)
File "/home/ubuntu/webhooks/src/botocore/endpoint.py", line 146, in _send_request
success_response, exception):
File "/home/ubuntu/webhooks/src/botocore/endpoint.py", line 219, in _needs_retry
caught_exception=caught_exception)
File "/home/ubuntu/webhooks/src/botocore/botocore/hooks.py", line 226, in emit
return self._emit(event_name, kwargs)
File "/home/ubuntu/webhooks/src/botocore/botocore/hooks.py", line 209, in _emit
response = handler(**kwargs)
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 250, in __call__
caught_exception)
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 265, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 313, in __call__
caught_exception)
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 222, in __call__
return self._check_caught_exception(attempt_number, caught_exception)
File "/home/ubuntu/webhooks/src/botocore/botocore/retryhandler.py", line 355, in _check_caught_exception
raise caught_exception
ValueError: You can only send PreparedRequests.
```
The related code in Requests is here:
``` python
# It's possible that users might accidentally send a Request object.
# Guard against that specific failure case.
if not isinstance(request, PreparedRequest):
raise ValueError('You can only send PreparedRequests.')
```
This was failing because unbeknownst to me, the server was running an outdated version of `pip`, vulnerable to this issue, which creates multiple subdirectories (`src/requests`, `src/requests/requests`) when you rerun `pip install --target`: https://github.com/pypa/pip/issues/1489. So the PreparedRequest was being created in one version of the library and compared against the other version of the library, and throwing this exception, even though they were both PreparedRequest instances!
I'm not sure of the best reference here, but as I understand it, it's preferable to check an object's behavior rather than its type, see for example [Alex Martelli's message here](https://groups.google.com/forum/?hl=en#!msg/comp.lang.python/CCs2oJdyuzc/NYjla5HKMOIJ):
> In other words, don't check whether it IS-a duck: check
> whether it QUACKS-like-a duck, WALKS-like-a duck,
> etc, etc, depending on exactly what subset of duck-like
> behaviour you need to play your language-games with.
I may be the only human in the history of existence to hit this exact edge case, but it seems like Requests could still perform the request. Is there a better way to check for the necessary behaviour?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3102/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3102/timeline
| null |
completed
| null | null | false |
[
"That's a good question.\n\nGenerally speaking it seems like a bad idea to pass objects from one form of a library to another form (it's not even really possible to do in Python unless the namespacing has gone all to hell. One way we could resolve this problem is to flip the check to catch the specific error case we're explicitly worried about (`isinstance(request, Request)`). That might be the least destructive way to avoid this problem.\n",
"Sounds good!\n"
] |
https://api.github.com/repos/psf/requests/issues/3101
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3101/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3101/comments
|
https://api.github.com/repos/psf/requests/issues/3101/events
|
https://github.com/psf/requests/issues/3101
| 149,024,007 |
MDU6SXNzdWUxNDkwMjQwMDc=
| 3,101 |
How to find what proxies Requests is using?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8064242?v=4",
"events_url": "https://api.github.com/users/thuzhf/events{/privacy}",
"followers_url": "https://api.github.com/users/thuzhf/followers",
"following_url": "https://api.github.com/users/thuzhf/following{/other_user}",
"gists_url": "https://api.github.com/users/thuzhf/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/thuzhf",
"id": 8064242,
"login": "thuzhf",
"node_id": "MDQ6VXNlcjgwNjQyNDI=",
"organizations_url": "https://api.github.com/users/thuzhf/orgs",
"received_events_url": "https://api.github.com/users/thuzhf/received_events",
"repos_url": "https://api.github.com/users/thuzhf/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/thuzhf/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thuzhf/subscriptions",
"type": "User",
"url": "https://api.github.com/users/thuzhf",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-18T02:36:12Z
|
2021-09-08T18:00:52Z
|
2016-04-18T05:38:28Z
|
NONE
|
resolved
|
I set no proxies in environment variables, but I set proxies in system configuration on my MacOS. And I find that Requests use it automatically (by checking what responses I get), but it seems this feature is not pointed out in your documentation. So is there any way to see what proxies are Requests is using? Will it automatically use what urllib.request.getproxies() return?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3101/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3101/timeline
| null |
completed
| null | null | false |
[
"You can call `requests.utils.get_environ_proxies(url)`, where the URL is the URL you're accessing. That will show you what proxies we found. \n"
] |
https://api.github.com/repos/psf/requests/issues/3100
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3100/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3100/comments
|
https://api.github.com/repos/psf/requests/issues/3100/events
|
https://github.com/psf/requests/pull/3100
| 148,838,782 |
MDExOlB1bGxSZXF1ZXN0NjY3MzM3NjY=
| 3,100 |
Remove stale sentence in philosophy.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15012549?v=4",
"events_url": "https://api.github.com/users/hitstergtd/events{/privacy}",
"followers_url": "https://api.github.com/users/hitstergtd/followers",
"following_url": "https://api.github.com/users/hitstergtd/following{/other_user}",
"gists_url": "https://api.github.com/users/hitstergtd/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hitstergtd",
"id": 15012549,
"login": "hitstergtd",
"node_id": "MDQ6VXNlcjE1MDEyNTQ5",
"organizations_url": "https://api.github.com/users/hitstergtd/orgs",
"received_events_url": "https://api.github.com/users/hitstergtd/received_events",
"repos_url": "https://api.github.com/users/hitstergtd/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hitstergtd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hitstergtd/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hitstergtd",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 2 |
2016-04-16T09:59:53Z
|
2021-09-08T04:01:05Z
|
2016-04-17T17:26:25Z
|
CONTRIBUTOR
|
resolved
|
Sentence contains reference to version 1.0.0.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3100/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3100/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3100.diff",
"html_url": "https://github.com/psf/requests/pull/3100",
"merged_at": "2016-04-17T17:26:25Z",
"patch_url": "https://github.com/psf/requests/pull/3100.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3100"
}
| true |
[
"Agreed. I saw this a few weeks ago and was thinking the same thing. \n",
"@kennethreitz, @sigmavirus24 Thanks! 👍 \n"
] |
https://api.github.com/repos/psf/requests/issues/3099
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3099/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3099/comments
|
https://api.github.com/repos/psf/requests/issues/3099/events
|
https://github.com/psf/requests/issues/3099
| 148,779,712 |
MDU6SXNzdWUxNDg3Nzk3MTI=
| 3,099 |
overall timeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5225146?v=4",
"events_url": "https://api.github.com/users/emgerner-msft/events{/privacy}",
"followers_url": "https://api.github.com/users/emgerner-msft/followers",
"following_url": "https://api.github.com/users/emgerner-msft/following{/other_user}",
"gists_url": "https://api.github.com/users/emgerner-msft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/emgerner-msft",
"id": 5225146,
"login": "emgerner-msft",
"node_id": "MDQ6VXNlcjUyMjUxNDY=",
"organizations_url": "https://api.github.com/users/emgerner-msft/orgs",
"received_events_url": "https://api.github.com/users/emgerner-msft/received_events",
"repos_url": "https://api.github.com/users/emgerner-msft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/emgerner-msft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/emgerner-msft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/emgerner-msft",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 38 |
2016-04-15T22:00:39Z
|
2021-09-08T18:00:47Z
|
2016-04-26T17:09:54Z
|
NONE
|
resolved
|
We already make great use of the timeout parameter which allows setting per TCP transaction timeouts. This is very helpful! However, we also need to support an overall timeout across the connection. Reading the [docs on timeouts](http://docs.python-requests.org/en/master/user/quickstart/#timeouts) I see this isn't currently supported, and searching through the issues at least a bit back I didn't see another request for this feature -- excuse me if there is.
I realize we can set timers in our library to accomplish this, but I'm concerned about the additional overhead (one per thread, and we may have many) as well as any adverse effects to connection pooling if we end up needing to abort a request. Is there a good way to abort a request in the first place? I didn't see anything obvious in the docs.
So: Long term, it would be great if we could add overall timeout to the requests library. Short term, is there a recommended way of implementing this on my end?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3099/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3099/timeline
| null |
completed
| null | null | false |
[
"Hi @emgerner-msft,\n\nFor reference, the following are all variations on this theme if not this exact feature request:\n- https://github.com/kennethreitz/requests/issues/2327\n- https://github.com/kennethreitz/requests/issues/2685\n- https://github.com/kennethreitz/requests/issues/1928\n- (and I'm sure there are more)\n\nWe've also discussed this over on https://github.com/sigmavirus24/requests-toolbelt/issues/51\n\nYou'll notice the last link discusses [this package](https://github.com/pnpnpn/timeout-decorator) which should handle this for you without adding it to requests. The reality, is that there's no need for requests to do this when another package already does it very well.\n",
"The package you reference does it by forking a separate process to run the web request. That is a very heavyweight way of achieving the simple goal of a timeout, and in my view is not in any way a substitute for requests itself having a native timeout feature.\n",
"@jribbens If you can come up with a way that uses neither threads nor processes, that would be amazing. Until then, if you want a wall clock timeout your best bet is that package as it's the most reliable way of achieving that at the moment.\n",
"I don't think @jribbens is saying no threads nor processes. Just that a process _per_ web request is excessive. Many languages have a way of multiple timers sharing a single additional thread or process. I'm just not aware of how to do that best in Python. \n\nIt seems like #1928 has the most discussion of alternatives, but most come with a lot of caveats (this won't work for your use case, etc). I'm fine with having some custom code in my library and writing my own custom solution if this really doesn't belong in requests but I think I need a little more information on what that would look like. The whole reason we use requests is to get away from the low level TCP connection pooling logic but it seems like reading that thread that in order to write this custom code I need to know that logic, and that's what I'm having some trouble with.\n",
"@emgerner-msft is correct. I am bit confused by @sigmavirus24's comment, having a \"total timeout\" without using threads or processes seems quite pedestrian and not at all \"amazing\". Just calculate the deadline at the start of the whole process (e.g. `deadline = time.time() + total_timeout`) and then on any individual operation set the timeout to be `deadline - time.time()`.\n",
"> having a \"total timeout\" without using threads or processes seems quite pedestrian and not at all \"amazing\". \n\nAnd your solution is rather primitive. The reason _most_ people want a total (or wall clock) timeout is to prevent a read from \"hanging\", in other words a case like the following:\n\n``` py\nr = requests.get(url, stream=True)\nfor chunk in r.iter_content(chunksize):\n process_data(chunk)\n```\n\nWhere each read takes a long time in the middle of `iter_content` but it's less than the read timeout (I'm assuming we apply that when streaming, but it still may be the case that we don't) they specified. Certainly it would seem like this should be simply handled by your solution @jribbens until you remember how clocks drift and daylight savings time works and them `time.time()` is woefully insufficient.\n\nFinally, it's important to keep in mind that Requests' API is frozen. There is no good or consistent API for specifying a total timeout. And if we implemented a timeout like you suggest, we would have countless bugs that they specified a one minute long total timeout but it took longer because the last time we checked we were under a minute but their configured read timeout was long enough that their timeout error was raised around a minute and a half. That's a _very_ rough wall timeout that would be slightly better for people looking for this, but no different from the person implementing this themselves.\n",
"Apologies if I was unclear @sigmavirus24 , you seem to have critiqued my pseudocode illustration of principle as if you thought it was a literal patch. I should point out though that `time.time()` does not work the way you apparently think - daylight savings time is not relevant, and neither is clock skew on the timescales we're talking about here. Also you have misunderstood the suggestion if you think the bug you describe would occur. Finally I am not sure what you mean by the Requests API being \"frozen\" as the API was changed as recently as version 2.9.0 so clearly whatever you mean it's not what I would normally understand by the word.\n",
"Just to separate my discussion: I'm actually not arguing this is easy. If it were totally simple, I would just write it and stop bugging you. :) \n\nMy problems are:\n1) Everything on the threads you listed was monkey patches. That's fine, but I'm using this in a production quality library and can't take the caveat of internal changes breaking everything.\n2) The timeout decorator in the link you gave is great, but I'm not clear on how that affects the connection. Even if we accept that the only good way of doing timeouts is with a bunch of threads, how does this library enforce that the socket gets shut down, the connection dropped, etc. We're doing a lot of connections and this seems potentially quite leak prone. requests doesn't have an 'abort' method that I can find (correct me if I'm wrong) so how is the shutdown of the connection happening?\n\nAll I'm looking for is a clear 'blessed' version of how to solve this problem on my own, or if there's not a perfect solution, a couple solutions with the caveats discussed. Does that make sense?\n",
"@emgerner-msft Assuming you're using CPython, connection shutdown will happen when the request is no longer continuing. At that point all references to the underlying connection will be lost and the socket will be closed and disposed of.\n",
"@Lukasa Okay, thanks! How does the library determine the request is no longer continuing? For example, if I used the timeout decorator route and cut off in the middle of the download, when would the download actually stop? Do I need to do anything special with the streaming options?\n",
"If you use the timeout decorator, the download will stop when the timeout fires. This is because signals interrupt syscalls, which means that there will be no further calls into the socket. Once the request is no longer in scope (e.g. the stack has unwound to outside of your `requests.*` function), that's in: CPython will clean up the connection object and tear the connection down. No special streaming options are required there.\n",
"Perfect. I'm good to close the thread then, unless others have more to say.\n",
"Actually, sorry, one more concern. Was looking at the timeout decorator code more closely since you said that it uses signals was relevant, as opposed to something like Python Timers (presumably). It looks like it [calls signal with SIGALRM](https://github.com/pnpnpn/timeout-decorator/blob/master/timeout_decorator/timeout_decorator.py#L73) which is documented in [Python Signal](https://docs.python.org/3.5/library/signal.html#signal.signal) not to work on Windows. I need this to work in both Unix and Windows environments, as well as in Python 2.7 and 3.3+ (much like requests itself). I'll poke around a bit more and see if this will actually work given that.\n",
"@emgerner-msft That's frustrating. =( \n",
"@Lukasa Yup, tried the basic [usage snippet](https://github.com/pnpnpn/timeout-decorator#usage) and it doesn't work on Windows. I read some more of the code/examples and fiddled and it looks like if we don't use signals the package might work, but everything has to be pickable which is not the case for my application. So as far as I can tell, timeout decorator won't solve my problem. Any other ideas?\n",
"@emgerner-msft Are you confident that none of the Windows-specific signals are suitable?\n",
"@Lukasa To be blunt, I simply don't know. I haven't used signals before, and much like I didn't realize until you told me that they'd interrupt the request I'm not sure what's appropriate. I'm also not trying to get this just to work on Windows. I need full crossplat support (Windows and Unix) and both Python 2 and Python 3 support. So much of signals looks platform specific it's throwing me. [Timer](https://docs.python.org/2/library/threading.html#timer-objects) was one of the solutions I was looking at that looked less low level and thus might take care of my constraints, but I'm not sure then how I could close the connection. I can do more reading, but this is why I was hoping to get additional guidance from you guys. :)\n",
"So this is a really tricky place to be.\n\nThe reality is that there is more-or-less no cross-platform way to kill a thread except by interrupting it, which is basically what a signal is. That means, I think, that signals are the only route you really have to making this work across platforms. I'm inclined to try to ping in a Windowsy Pythony expert: @brettcannon, do you have a good suggestion here?\n",
"Out of interest, is there a reason to not implement \"total timeout\" in Requests other than that implementing and testing it requires work? I mean, if a patch to implement it magically appeared today would it in theory be rejected or accepted? I appreciate and agree with the \"eliminate unnecessary complexity\" point of view, but \"you can do it by forking a separate process\" does not make this feature unnecessary in my opinion.\n",
"@jribbens There are a few problems with this.\n\nPart 1 is that the complexity of such a patch is very high. To get it to behave correctly you need to repeatedly change timeouts at the socket level. This means that the patch needs to be passed pervasively though httplib, which we've already patched more than we'd like to. Essentially, we'd need to be reaching into httplib and reimplementing about 50% of its more complex methods in order to achieve this functional change.\n\nPart 2 is that the maintenance of such a patch is relatively burdensome. We'd likely need to start maintaining what amounts to a parallel fork of httplib (more properly http.client at this time) in order to successfully do it. Alternatively, we'd need to take on the maintenance burden of a different HTTP stack that is more amenable to this kind of change. This part is, I suspect, commonly missed by those who wish to have such a feature: the cost of implementing it is high, but that is _nothing_ compared to the ongoing maintenance costs of supporting such a feature on all platforms.\n\nPart 3 is that the advantage of such a patch is unclear. It has been my experience that most people who want a total timeout patch are not thinking entirely clearly about what they want. In most cases, total timeout parameters end up having the effect of killing perfectly good requests for no reason.\n\nFor example, suppose you've designed a bit of code that downloads files, and you'd like to handle hangs. While it's initially tempting to want to set a flat total timeout (\"no request may take more than 30 seconds!\"), such a timeout misses the point. For example, if a file changes from being 30MB to being 30GB in size, such a file can _never_ download in that kind of time interval, even though the download may be entirely healthy.\n\nPut another way, total timeouts are an attractive nuisance: they appear to solve a problem, but they don't do it effectively. A more useful approach, in my opinion, is to take advantage of the per-socket-action timeout, combined with `stream=True` and `iter_content`, and assign yourself timeouts for chunks of data. The way `iter_content` works, flow of control will be returned to your code in a somewhat regular interval. That means that you can set yourself socket-level timeouts (e.g. 5s) and then `iter_content` over fairly small chunks (e.g. 1KB of data) and be relatively confident that unless you're being actively attacked, no denial of service is possible here. If you're really worried about denial of service, set your socket-level timeout much lower and your chunk size smaller (0.5s and 512 bytes) to ensure that you're regularly having control flow handed back to you.\n\nThe upshot of all this is that I believe that total timeouts are a misfeature in a library like this one. The best kind of timeout is one that is tuned to allow large responses enough time to download in peace, and such a timeout is best served by socket-level timeouts and `iter_content`.\n",
"Maybe @zooba has an idea as he actually knows how Windows works. :)\n",
"(Unrelatedly, one of my favourite things to do is to set up a daisy-chain of experts in a GitHub issue.)\n",
"Haha, I already know @zooba and @brettcannon. I can discuss with them here or internally as a solution to this would probably help them too.\n",
"@emgerner-msft I figured you might, but didn't want to presume: MSFT is a big organisation!\n",
"@Lukasa Just reading through the wall of text you just wrote above -- interesting! On the discussion of stream=True and iter_content to time downloads, what is the equivalent way of handling larger uploads?\n\n_PS_: The paragraph above starting with 'Put another way,..' is the kind of guidance I looked for in the docs. Given the number of requests you get for maximum timeout (and your valid reasons for not doing it), maybe the best thing to do is add some of that information in the [timeout docs](http://docs.python-requests.org/en/master/user/advanced/#timeouts)?\n",
"lol @lukasa I take your point about maintenance, which was already in my mind, but on \"feature vs misfeature\" I'm afraid I'm completely opposite to you. I think anyone who _doesn't_ want a total timeout isn't thinking clearly about what they want, and I'm having difficulty imagining a situation where what you describe as a bug \"30MB download changes to 30GB and therefore fails\" isn't in fact a beneficial feature!\n\nYou can as you say do something a bit similar (but I suspect without most of the benefits of a total timeout) using `stream=True` but I thought the point of requests was that it handled things for you...\n",
"> I thought the point of requests was that it handled things for you\n\nIt handles HTTP for you. The facts that we already handle connection and read timeouts and that we have had a couple exemptions to our feature freeze of several years are tangential to the discussion of utility, desirability, consistency (across multiple platforms), and maintainability. We appreciate your feedback and your opinion. If you have new information to present, we'd appreciate that.\n\nIt may also be telling that requests doesn't handle everything, by the number of rejected feature requests on this project and the fact that there's a separate project implementing common usage patterns for users (the requests toolbelt). If a total timeout belongs anywhere, it would be there, but again, it would have to work on Windows, BSD, Linux, and OSX with excellent test coverage and without it being a nightmare to maintain.\n",
"> On the discussion of stream=True and iter_content to time downloads, what is the equivalent way of handling larger uploads?\n\nDefine a generator for your upload, and pass that to `data`. Or, if chunked encoding isn't a winner for you, define a file-like object with a magic `read` method and pass _that_ to `data`.\n\nLet me elaborate a bit. If you pass a generator to `data`, requests will iterate over it, and will send each chunk in turn. This means that to send data we'll necessarily have to hand flow of control to your code for each chunk. This lets you do whatever you want in that time, including throw exceptions to abort the request altogether.\n\nIf for some reason you can't use chunked transfer encoding for your uploads (unlikely, but possible if the server in question is real bad), you can do the same by creating a file-like object that has a length and then doing your magic in the `read` call, which will be repeatedly called for 8192-byte chunks. Again, this ensures the flow of control goes through your code intermittently, which lets you use your own logic.\n\n> PS: The paragraph above starting with 'Put another way,..' is the kind of guidance I looked for in the docs. Given the number of requests you get for maximum timeout (and your valid reasons for not doing it), maybe the best thing to do is add some of that information in the timeout docs?\n\nI _suppose_. Generally speaking, though, I'm always nervous about putting somewhat defensive text into documentation. It could go into an FAQ I guess, but text that explains why we _don't_ have something is rarely useful in documentation. The space in the docs would be better served, I suspect, by a recipe for doing something.\n\n> I think anyone who doesn't want a total timeout isn't thinking clearly about what they want, and I'm having difficulty imagining a situation where what you describe as a bug \"30MB download changes to 30GB and therefore fails\" isn't in fact a beneficial feature!\n\nHeh, I'm not:\n- package manager (e.g. pip, which uses requests), where packages can vary wildly in data size\n- web scraper, which may run against multiple sites that vary wildly in size\n- a log aggregator that downloads log files from hosts which have wildly varying levels of us (and therefore log file sizes)\n- video downloader (videos can vary wildly in size)\n\nIn actuality, I think the case that the developer knows to within one order of magnitude what file sizes they'll be dealing with is the uncommon case. In most cases developers have no idea. And generally I'd say that making assumptions about those sizes is unwise. If you have constraints on download size then your code should deliberately encode those assumptions (e.g. in the form of checks on content-length), rather than implicitly encode them and mix them in with the bandwidth of the user's network so that other people reading the code can see them clearly.\n\n> but I thought the point of requests was that it handled things for you...\n\nRequests very deliberately does not handle everything for users. Trying to do everything is an impossible task, and it's impossible to build a good library that does that. We regularly tell users to drop down to urllib3 in order to achieve something.\n\nWe only put code into requests if we can do it better or cleaner than most users will be able to do. If not, there's no value. I'm really not yet sold on total timeout being one of those things, especially given what I perceive to be the relatively marginal utility when aggregated across our user-base.\n\nThat said, I'm open to being convinced I'm wrong: I just haven't seen a convincing argument for it yet (and, to head you off at the pass, \"I need it!\" is not a convincing argument: gotta give some reasons!).\n",
"@sigmavirus24 \n\n> If a total timeout belongs anywhere, it would be there, but again, it would have to work on Windows, BSD, Linux, and OSX with excellent test coverage and without it being a nightmare to maintain.\n\nAgreed!\n",
"@lukasa I suppose my thinking is that not only do I want it, in fact nearly all users would want it if they thought about it (or they don't realise it's not already there). Half of your usage scenarios above where you say it should be avoided I would say it's vital (web scraper and log aggregator) - the other two it's less necessary as there's likely to be a user waiting for the result who can cancel the download manually if they want. Anything that runs in the background without a UI and doesn't use an overall timeout is buggy in my view!\n"
] |
https://api.github.com/repos/psf/requests/issues/3098
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3098/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3098/comments
|
https://api.github.com/repos/psf/requests/issues/3098/events
|
https://github.com/psf/requests/issues/3098
| 148,578,221 |
MDU6SXNzdWUxNDg1NzgyMjE=
| 3,098 |
HeaderParsingError: Failed to parse headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6367792?v=4",
"events_url": "https://api.github.com/users/guyskk/events{/privacy}",
"followers_url": "https://api.github.com/users/guyskk/followers",
"following_url": "https://api.github.com/users/guyskk/following{/other_user}",
"gists_url": "https://api.github.com/users/guyskk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/guyskk",
"id": 6367792,
"login": "guyskk",
"node_id": "MDQ6VXNlcjYzNjc3OTI=",
"organizations_url": "https://api.github.com/users/guyskk/orgs",
"received_events_url": "https://api.github.com/users/guyskk/received_events",
"repos_url": "https://api.github.com/users/guyskk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/guyskk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/guyskk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/guyskk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 23 |
2016-04-15T07:22:15Z
|
2021-08-27T00:08:26Z
|
2016-04-15T08:12:58Z
|
NONE
|
resolved
|
```
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): 127.0.0.1
DEBUG:requests.packages.urllib3.connectionpool:"POST /kkblog/ HTTP/1.1" 201 None
WARNING:requests.packages.urllib3.connectionpool:Failed to parse headers (url=http://127.0.0.1:5984/kkblog/): [MissingHeaderBodySeparatorDefect()], unparsed data: '³é\x97\xad\r\nETag: "1-967a00dff5e02add41819138abb3284d"\r\nDate: Fri, 15 Apr 2016 14:45:18 GMT\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 69\r\nCache-Control: must-revalidate\r\n\r\n'
Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 390, in _make_request
assert_header_parsing(httplib_response.msg)
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/util/response.py", line 59, in assert_header_parsing
raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
requests.packages.urllib3.exceptions.HeaderParsingError: [MissingHeaderBodySeparatorDefect()], unparsed data: '³é\x97\xad\r\nETag: "1-967a00dff5e02add41819138abb3284d"\r\nDate: Fri, 15 Apr 2016 14:45:18 GMT\r\nContent-Type: text/plain; charset=utf-8\r\nContent-Length: 69\r\nCache-Control: must-revalidate\r\n\r\n'
```
here is the same request with curl:
```
curl -v -X POST 127.0.0.1:5984/kkblog/ -H "Content-Type: application/json" -d '{"_id": "关闭"}'
Note: Unnecessary use of -X or --request, POST is already inferred.
* Trying 127.0.0.1...
* Connected to 127.0.0.1 (127.0.0.1) port 5984 (#0)
> POST /kkblog/ HTTP/1.1
> Host: 127.0.0.1:5984
> User-Agent: curl/7.47.1
> Accept: */*
> Content-Type: application/json
> Content-Length: 17
>
* upload completely sent off: 17 out of 17 bytes
< HTTP/1.1 201 Created
< Server: CouchDB/1.6.1 (Erlang OTP/18)
< Location: http://127.0.0.1:5984/kkblog/关闭
< ETag: "3-bc27b6930ca514527d8954c7c43e6a09"
< Date: Fri, 15 Apr 2016 15:13:14 GMT
< Content-Type: text/plain; charset=utf-8
< Content-Length: 69
< Cache-Control: must-revalidate
<
{"ok":true,"id":"关闭","rev":"3-bc27b6930ca514527d8954c7c43e6a09"}
* Connection #0 to host 127.0.0.1 left intact
```
the problem is `Location: http://127.0.0.1:5984/kkblog/关闭` in the response header, I tried other Chinese chars but they didn't cause Exception.
```
>>> '关闭'.encode('utf-8')
b'\xe5\x85\xb3\xe9\x97\xad'
>>>
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3098/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3098/timeline
| null |
completed
| null | null | false |
[
"The header parsing is done by httplib, in the Python standard library; that is the part that failed to parse. The failure to parse is understandable though: servers should not be shoving arbitrary bytes into headers. \n\nUsing UTF-8 for your headers is extremely unwise, as discussed by RFC 7230:\n\n> Historically, HTTP has allowed field content with text in the ISO-8859-1 charset [ISO-8859-1], supporting other charsets only through use of [RFC2047] encoding. In practice, most HTTP header field values use only a subset of the US-ASCII charset [USASCII].\n\nIn this instance it's not really possible for us to resolve the problem. The server should instead be sending urlencoded URLs, or RFC 2047-encoded header fields. Either way, httplib is getting confused here, and we can't really step in and stop it. \n",
"thanks\n",
"> In this instance it's not really possible for us to resolve the problem. The server should instead be sending urlencoded URLs, or RFC 2047-encoded header fields. Either way, httplib is getting confused here, and we can't really step in and stop it.\n\nSo..... what about the thousands and thousands of potential servers that I _can't_ just SSH into and fix?\n\nBasically, \"just follow the RFC\" is a complete non-answer, because I don't control the world (I'm taking minion applications, though!). \nThe fact is, servers out there serve UTF-8 headers. This is not something fixable, because they're not my server. My web browser handles this situation just fine, so it's clearly possible to make this situation work.\n\nAs it is, requests fails on these servers. This _is_ fixable, because I control the code on my local machine.\n",
"Additionally, the RFC only even ever states `In practice, most HTTP header\n field values use only a subset of the US-ASCII charset [USASCII].\n Newly defined header fields SHOULD limit their field values to\n US-ASCII octets. A recipient SHOULD treat other octets in field\n content (obs-text) as opaque data.`, so UTF-8 header values are even RFC compliant!\n",
"US-ASCII is a sub-set of UTF-8, not the other way around. Characters like `关闭` are UTF-8, but are NOT US-ASCII. So no, UTF-8 is NOT RFC compliant.\n",
"[Deleted comments about RFC stuff, since it's irrelevant, due to broad use of UTF-8 headers]\n\n---\n\nAnyways, the issue of RFC compliance is **irrelevant**. There are servers out there that **are** serving UTF-8 headers, which is then broadly supported by all browser vendors. \n",
"Content-Disposition is defined in RFC2183. It states that \"Short\" parameters should be US-ASCII. Long Parameters should be encoded as per RFC-2184. I can't see where in RFC-2184 it says that UTF-8 encoding is valid. (But I may be missing that particular line)\n\n@Lukasa knows much more about the relevant RFCs than I do though.\n",
"And again, whether it's compliant is irrelevant. There are servers out that that act like this. And my browser, and cURL are _completely_ happy talking to them, yet requests explodes.\n\n---\n\nHell, I'm interacting with **cloudflare**, and it's serving UTF-8 headers. So basically unicode header support is massively, MASSIVELY deployed and available, standards be damned.\n\nIf you're dead set on being puritanical about RFC support, the only people who are harmed are people who want to use the requests library.\n",
"@fake-name I don't recall being puritanical about _anything_. Here is, word for word, what I said (literally quoting myself from this thread):\n\n> The header parsing is done by httplib, in the Python standard library; that is the part that failed to parse. The failure to parse is understandable though: servers should not be shoving arbitrary bytes into headers.\n> \n> Using UTF-8 for your headers is extremely unwise, as discussed by RFC 7230:\n> \n> > Historically, HTTP has allowed field content with text in the ISO-8859-1 charset [ISO-8859-1], supporting other charsets only through use of [RFC2047] encoding. In practice, most HTTP header field values use only a subset of the US-ASCII charset [USASCII].\n> \n> In this instance it's not really possible for us to resolve the problem. The server should instead be sending urlencoded URLs, or RFC 2047-encoded header fields. Either way, httplib is getting confused here, and we can't really step in and stop it.\n\nNote the key part of this comment: \"Either way, httplib is getting confused here, and we can't really step in and stop it.\".\n\nThis is what I mean when I say \"it's not really possible for us to resolve the problem\". The issue here is in a helper library that sits in the Python standard library. Changing the header parsing logic of that standard library module, while _possible_, is something that needs to be done as part of the standard CPython development process. Requests already carries more subclasses and monkeypatches to httplib than we're happy with, and we're strongly disinclined to carry more.\n\nSo here are the options for resolving this issue:\n1. File a bug with the CPython development team, get the bug fixed and into Python 3.6 or 3.7, upgrade to that Python version.\n2. Work out what the minimal possible invasive change is to httplib in order to allow parsing UTF-8 headers, propose that patch to us to see what we think of it.\n3. Write a patch that removes Requests' requirement to use httplib at all so that the next time we have a problem that boils down to \"httplib is stupid\", we don't have to have this argument again.\n\nNow, _you_ are welcome to pursue any of those options, but I've been to this rodeo a few times so I'm pursuing (3), which is the only one that actually makes this problem go away for good. Unfortunately, it turns out that replacing our low-level HTTP stack that we have spent 7 years integrating with takes quite a lot of work, and I can't just vomit out the code to fix this on demand.\n\nTo sum up: I didn't say I didn't think this was a problem or a bug, I said it was a problem that the Requests team couldn't fix, at least not on a timescale that was going to be helpful to this user. If you disagree, by all means, provide a patch to prove me wrong.\n\n---\n\nAnd let me make something clear. For the last 9 months or so I have been the most active Requests maintainer by a long margin. Requests is not all I do with my time. I maintain 15 other libraries and actively contribute to more. I have quite a lot of stuff I am supposed to be doing. So I have to _prioritise_ my bug fixing.\n\nTrust me when I say that a bug where the effort required to fix it is extremely high and the flaw comes from a server that is emitting non-RFC-compliant output, that's not a bug that screams out \"must be fixed this second\". Any time there is a bug predicated on the notion that our peer isn't spec compliant that bug drops several spaces down my priority list. [Postel was wrong](https://tools.ietf.org/html/draft-thomson-postel-was-wrong-00).\n\nBrowsers are incentivised to support misbehaving servers because they are in a competitive environment, and users only blame them when things go wrong. If Chrome doesn't support ${CRAPPY_WEBSITE_X} then Chrome users will just go to a browser that does when they need access.\n\nThat's all fine and good, but the reason that Requests doesn't do this is because we have _two_ regular developers. That's it. There are only so many things two developers can do in a day. Neither of us work on just Requests. Compare this to Chrome, which has tens of full-time developers and hundreds of part-time ones. If you want Requests to work on every site where Chrome does, then I have bad news for you my friend because it's just _never_ going to happen.\n\nI say all of this to say: please don't _berate_ the Requests team because we didn't think your particular pet bug was important. We prioritise bugs and close ones we don't think we'll fix any time soon. If you would like to see this bug fixed, a much better option is to _write the patch yourself_. Shouting at me does not make me look fondly on your request for assistance.\n",
"I apologize. I assumed you were holding an opinion that you were not, and proceeded to be a complete ass.\n\nIn any event, I don't disagree that this is a issue with the core library, but, well, in my experience complaining about encoding issues in the core library is non-productive (I have an issue with the built-in `ftplib` where it decodes some messages as `iso-8859-1` _even when in utf-8 mode_, which I was only able to solve by [monkey-patching the stdlib](https://github.com/fake-name/MangaCMS/blob/6cf8b59db0326aa0f67ef76127884577c16c25cd/UploadPlugins/Madokami/uploader.py#L33-L104)).\n\nAnyways, Assuming you're OK with monkey patching, here's a simple snippet that monkey-patches `http.client` to make it much, MUCH more robust to arbitrary header encodings:\n\n```\n\n\ntry:\n import cchardet as chardet\nexcept ImportError:\n import chardet as chardet\n\nimport http.client\nimport email.parser\n\ndef parse_headers(fp, _class=http.client.HTTPMessage):\n \"\"\"Parses only RFC2822 headers from a file pointer.\n\n email Parser wants to see strings rather than bytes.\n But a TextIOWrapper around self.rfile would buffer too many bytes\n from the stream, bytes which we later need to read as bytes.\n So we read the correct bytes here, as bytes, for email Parser\n to parse.\n\n Note: Monkey-patched version to try to more intelligently determine\n header encoding\n\n \"\"\"\n headers = []\n while True:\n line = fp.readline(http.client._MAXLINE + 1)\n if len(line) > http.client._MAXLINE:\n raise http.client.LineTooLong(\"header line\")\n headers.append(line)\n if len(headers) > http.client._MAXHEADERS:\n raise HTTPException(\"got more than %d headers\" % http.client._MAXHEADERS)\n if line in (b'\\r\\n', b'\\n', b''):\n break\n\n\n hstring = b''.join(headers)\n inferred = chardet.detect(hstring)\n if inferred and inferred['confidence'] > 0.8:\n print(\"Parsing headers!\", hstring)\n hstring = hstring.decode(inferred['encoding'])\n else:\n hstring = hstring.decode('iso-8859-1')\n\n return email.parser.Parser(_class=_class).parsestr(hstring)\n\nhttp.client.parse_headers = parse_headers\n\n```\n\nNote: This _does_ require `cchardet`, or `chardet`. I'm open to better ways to determine the encoding. It simply overrides the `http.client.parse_headers()` member of the stdlib, which _is_ kind of squicky.\n\nSplatting the above into a file, and then just importing it at the beginning of the `requests/__init__.py` file seems to solve the problem. \n",
"So, that patch looks somewhat plausible. I should note for others skimming this thread that it only works on Python 3, but that's ok because this problem only exists on Python 3 (Python 2 doesn't insist on decoding the headers).\n\nThere are a few things we'd need to address if we wanted to move forward on this patch:\n1. It's slow. Requests includes chardet, and already cops quite a lot of flak for using it to try to guess the encoding of body content because of how slow it is. Doing a chardet check for every response we get is likely to be pretty unpleasant.\n2. It makes the assertion that every header is in the response is encoded the same way. Unfortunately I can confidently tell you that that's not true. It's extremely common to have a situation involving reverse or caching proxies where multiple entities add headers to a response, often using different character encodings. Of course, this patch isn't any _worse_ in that case than the current behaviour, but it doesn't necessarily gracefully handle it well either.\n3. Monkeypatching the stdlib is asking for trouble. It hasn't entirely stopped us in the past, but that doesn't make me happy about it.\n\nPart of me wonders whether we can address this issue in a more sledgehammery way. For example, can we avoid the chardet requirement by asserting that we only want to try UTF-8 and ISO-8859-1? That will fail in the rare instance that someone sends a header block encoded in UTF-16, but frankly I'd argue that in that instance the server is beyond help and needs to be put out of its misery.\n\nThe other advantage of doing it that way around is that it would let us drop down and put the patch into urllib3. In addition to ensuring that more people get the opportunity to benefit from the patch, it is the more natural place to put it: urllib3 is where we do most of our interfacing with httplib. The other advantage of putting it there is that I'm currently involved in trying to _replace_ httplib in urllib3, and having the patch in urllib3 makes me much more likely to _remember_ to remove the patch once we've resolved the issue. ;)\n\nWhat do you think @fake-name? Would you be open to a more reduced version of this patch getting added to urllib3 as a (hopefully) temporary measure until we can remove httplib entirely?\n",
"It's definitely a extremely clumsy fix. \n\n> It's slow. Requests includes chardet, and already cops quite a lot of flak for using it to try to guess the encoding of body content because of how slow it is. Doing a chardet check for every response we get is likely to be pretty unpleasant.\n\nIf requests is using chardet, have you looked at cChardet? It's a Cython-based api-compatible replacement for chardet, which according to it's [readme page](https://github.com/PyYoshi/cChardet) is about 1000-4000x faster then plain chardet. \n\n> It makes the assertion that every header is in the response is encoded the same way. Unfortunately I can confidently tell you that that's not true. It's extremely common to have a situation involving reverse or caching proxies where multiple entities add headers to a response, often using different character encodings. Of course, this patch isn't any worse in that case than the current behaviour, but it doesn't necessarily gracefully handle it well either.\n\nUgh. Why can't everything just be UTF-8?\n\nI wonder if it's possible to only bother doing a chardet call if the string contains characters outside of the normal ASCII printable chars? Something like `if any([char > 127 for char in rawheader])`. That'd let the system avoid the call entirely when the string is definitively entirely plain ascii.\n\nI can add per-header decoding after I'm home from work.\n\n> Monkeypatching the stdlib is asking for trouble. It hasn't entirely stopped us in the past, but that doesn't make me happy about it.\n\nIt's certainly pretty ugly, though this is one of the cleaner monkey-patches I've seen (they were nice enough to break out the entire header decode thing into a single function, rather then having to replace parts of a class or something). Since it looks like `parse_headers` _is_ part of the httplib public API (albeit an undocumented part), the function signature _should_ be pretty fixed. \n\nI'm not too enthused about it either, but I think trying to convince python core to include cchardet in their dependencies for it is probably not exactly viable.\n\n> Part of me wonders whether we can address this issue in a more sledgehammery way. For example, can we avoid the chardet requirement by asserting that we only want to try UTF-8 and ISO-8859-1? That will fail in the rare instance that someone sends a header block encoded in UTF-16, but frankly I'd argue that in that instance the server is beyond help and needs to be put out of its misery.\n\nThe problem with putting servers out of their misery is the people who own datacenters tend to get kind of irritable when you break in and hit someone's server with a baseball-bat. And _then_ you have to deal with the person who actually owns the server. \n\n> The other advantage of doing it that way around is that it would let us drop down and put the patch into urllib3. In addition to ensuring that more people get the opportunity to benefit from the patch, it is the more natural place to put it: urllib3 is where we do most of our interfacing with httplib. The other advantage of putting it there is that I'm currently involved in trying to replace httplib in urllib3, and having the patch in urllib3 makes me much more likely to remember to remove the patch once we've resolved the issue. ;)\n> \n> What do you think @fake-name? Would you be open to a more reduced version of this patch getting added to urllib3 as a (hopefully) temporary measure until we can remove httplib entirely?\n\nI don't know the requests architecture well enough to comment coherently, though you should treat my little hack above as basically public domain (or WTFPL if you really want a actual license). If it's useful, do what you want with it. \n",
"> If requests is using chardet, have you looked at cChardet?\n\nUnfortunately, we can't depend on third-party modules, especially not if they require compilation. That rules out cython. =(\n\nI'm pretty strongly inclined to want to ship this as a much more minor patch that tries UTF-8 and then ISO-8859-1 in that order. I suspect that will cover 98% of the problems, and will let us get a better feel for how problematic the remaining cases are.\n",
"> Unfortunately, we can't depend on third-party modules, especially not if they require compilation. That rules out cython. =(\n\nIsn't `chardet` a third-party module? Also, cChardet has available windows builds in PyPi for 2.7, 3.4 and 3.5. \n\nAnyways, cchardet and chardet are API-compatible. Try to install cchardet, fall back to chardet on failure.\n",
"chardet is vendored into the requests repository.\n\nI'm open to having cchardet as a fast-path for regular chardet though, via optional import.\n",
"So, chardet has been vastly improved (but not yet released) to significantly improve its speed. Once they get around to it, the new release should improve things for us.\n",
"Kinda hackier chardet-less version. Also does per-header decoding:\n\n```\nimport sys\nimport codecs\n\nimport http.client\nimport email.parser\n\ncchardet = False\n\ntry:\n import cchardet\nexcept ImportError:\n pass\n\ndef isUTF8Strict(data):\n '''\n Check if all characters in a bytearray are decodable\n using UTF-8.\n '''\n try:\n decoded = data.decode('UTF-8')\n except UnicodeDecodeError:\n return False\n else:\n for ch in decoded:\n if 0xD800 <= ord(ch) <= 0xDFFF:\n return False\n return True\n\ndef decode_headers(header_list):\n '''\n Decode a list of headers.\n\n Takes a list of bytestrings, returns a list of unicode strings.\n The character set for each bytestring is individually decoded.\n '''\n\n decoded_headers = []\n for header in header_list:\n if cchardet:\n inferred = cchardet.detect(header)\n if inferred and inferred['confidence'] > 0.8:\n print(\"Parsing headers!\", header)\n decoded_headers.append(header.decode(inferred['encoding']))\n else:\n decoded_headers.append(header.decode('iso-8859-1'))\n else:\n # All bytes are < 127 (e.g. ASCII)\n if all([char & 0x80 == 0 for char in header]):\n decoded_headers.append(header.decode(\"us-ascii\"))\n elif isUTF8Strict(header):\n decoded_headers.append(header.decode(\"utf-8\"))\n else:\n decoded_headers.append(header.decode('iso-8859-1'))\n\n return decoded_headers\n\n\ndef parse_headers(fp, _class=http.client.HTTPMessage):\n \"\"\"Parses only RFC2822 headers from a file pointer.\n\n email Parser wants to see strings rather than bytes.\n But a TextIOWrapper around self.rfile would buffer too many bytes\n from the stream, bytes which we later need to read as bytes.\n So we read the correct bytes here, as bytes, for email Parser\n to parse.\n\n Note: Monkey-patched version to try to more intelligently determine\n header encoding\n\n \"\"\"\n headers = []\n while True:\n line = fp.readline(http.client._MAXLINE + 1)\n if len(line) > http.client._MAXLINE:\n raise http.client.LineTooLong(\"header line\")\n headers.append(line)\n if len(headers) > http.client._MAXHEADERS:\n raise HTTPException(\"got more than %d headers\" % http.client._MAXHEADERS)\n if line in (b'\\r\\n', b'\\n', b''):\n break\n\n decoded_headers = decode_headers(headers)\n\n hstring = ''.join(decoded_headers)\n\n return email.parser.Parser(_class=_class).parsestr(hstring)\n\nhttp.client.parse_headers = parse_headers\n\n```\n\nI'm curious how much of a performance improvement this'd result in. It might be a good idea to work up a testing scaffold (or just tell people doing high-throughput stuff to install cchardet).\n",
"I suspect not much of a perf improvement: there's a lot of time spent looking at each byte in Python code, which is always going to be slow. The way to get it to be faster is to just say: \"try to decode the header block with UTF-8; if that fails, try with latin-1\". That's the barest minimum of better, but it might be good enough.\n",
"I have the same problem. Code works on Python 2.7 but was failed with Python 3.6.4\r\n\r\n```\r\nimport requests\r\nimport logging\r\n\r\nfrom requests.auth import HTTPBasicAuth\r\n\r\nlogging.basicConfig(level=logging.DEBUG)\r\n\r\nr = requests.get('http://test.local',\r\n auth=HTTPBasicAuth('USER', 'PASSWORD'))\r\n\r\nprint(r)\r\n```\r\n_______________\r\n\r\n```\r\nDEBUG:urllib3.connectionpool:Starting new HTTP connection (1): test-silhouette.3shape.local\r\nDEBUG:urllib3.connectionpool:http://test.local:80 \"GET / HTTP/1.1\" 200 None\r\nWARNING:urllib3.connectionpool:Failed to parse headers (url=http://test.local:80/): [MissingHeaderBodySeparatorDefect()], \r\nunparsed data: 'Access-Control-Expose-Headers : WWW-Authenticate\\r\\nAccess-Control-Allow-Origin: *\\r\\nAccess-Control-Allow-Methods: \r\nGET, POST, OPTIONS, PUT, PATCH, DELETE\\r\\nAccess-Control-Allow-Credentials: true\\r\\nAccess-Control-Allow-Headers: accept, authorization, \r\nContent-Type\\r\\nDate: Thu, 12 Apr 2018 14:06:25 GMT\\r\\nContent-Length: 2692\\r\\n\\r\\n'\r\nTraceback (most recent call last):\r\n File \"D:\\Python3.6.4\\lib\\site-packages\\urllib3\\connectionpool.py\", line 399, in _make_request\r\n assert_header_parsing(httplib_response.msg)\r\n File \"D:\\Python3.6.4\\lib\\site-packages\\urllib3\\util\\response.py\", line 66, in assert_header_parsing\r\n raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)\r\nurllib3.exceptions.HeaderParsingError: [MissingHeaderBodySeparatorDefect()], unparsed data: 'Access-Control-Expose-Headers : \r\nWWW-Authenticate\\r\\nAccess-Control-Allow-Origin: *\\r\\nAccess-Control-Allow-Methods: GET, POST, OPTIONS, PUT, PATCH, \r\nDELETE\\r\\nAccess-Control-Allow-Credentials: true\\r\\nAccess-Control-Allow-Headers: accept, authorization, Content-Type\\r\\nDate: Thu, \r\n12 Apr 2018 14:06:25 GMT\\r\\nContent-Length: 2692\\r\\n\\r\\n'\r\nTraceback (most recent call last):\r\n File \"D:\\Python3.6.4\\lib\\site-packages\\urllib3\\response.py\", line 302, in _error_catcher\r\n yield\r\n File \"D:\\Python3.6.4\\lib\\site-packages\\urllib3\\response.py\", line 384, in read\r\n data = self._fp.read(amt)\r\n File \"D:\\Python3.6.4\\lib\\http\\client.py\", line 449, in read\r\n n = self.readinto(b)\r\n File \"D:\\Python3.6.4\\lib\\http\\client.py\", line 493, in readinto\r\n n = self.fp.readinto(b)\r\n File \"D:\\Python3.6.4\\lib\\socket.py\", line 586, in readinto\r\n return self._sock.recv_into(b)\r\nConnectionResetError: [WinError 10054] An existing connection was forcibly closed by the remote host\r\n```",
"Hi @safo-bora, the issues you’re encountering is because the server is returning an illegal header.\r\n\r\nThe first header: \r\n`'Access-Control-Expose-Headers : WWW-Authenticate\\r\\n`\r\n\r\nThere is a space between the header name and the colon which violates RFC 7230 section 3.2.4:\r\n\r\n> No whitespace is allowed between the header field-name and colon. In\r\n the past, differences in the handling of such whitespace have led to\r\n security vulnerabilities in request routing and response handling. A\r\n server MUST reject any received request message that contains\r\n whitespace between a header field-name and colon with a response code\r\n of 400 (Bad Request).",
"Thank you! \r\n\r\nWe have found the problem on our side.\r\n\r\nNow works!\r\n\r\n\r\n\r\n",
"For folks who need to alter how requests ( and urllib3 as a whole ) parse headers, if changing the response isn't an option — here's a hint; hope it might help:\r\n\r\nhttps://gist.github.com/bahoo/bdfedfc47cb971840cae489a844a2408",
"How can you solve this problem"
] |
https://api.github.com/repos/psf/requests/issues/3097
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3097/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3097/comments
|
https://api.github.com/repos/psf/requests/issues/3097/events
|
https://github.com/psf/requests/issues/3097
| 148,503,885 |
MDU6SXNzdWUxNDg1MDM4ODU=
| 3,097 |
Be able to set _MAXHEADERS in httplib.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1246420?v=4",
"events_url": "https://api.github.com/users/sloh/events{/privacy}",
"followers_url": "https://api.github.com/users/sloh/followers",
"following_url": "https://api.github.com/users/sloh/following{/other_user}",
"gists_url": "https://api.github.com/users/sloh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sloh",
"id": 1246420,
"login": "sloh",
"node_id": "MDQ6VXNlcjEyNDY0MjA=",
"organizations_url": "https://api.github.com/users/sloh/orgs",
"received_events_url": "https://api.github.com/users/sloh/received_events",
"repos_url": "https://api.github.com/users/sloh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sloh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sloh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sloh",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
},
{
"color": "777777",
"default": false,
"description": null,
"id": 162780722,
"name": "Question/Not a bug",
"node_id": "MDU6TGFiZWwxNjI3ODA3MjI=",
"url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug"
},
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 1 |
2016-04-14T22:41:44Z
|
2021-09-08T18:00:51Z
|
2016-04-19T17:14:10Z
|
NONE
|
resolved
|
I am making a GET request which returns a response with 100+ headers because I am printing debug statements in it, so when I use requests to get the URL, I am a similar error like below:
Traceback (most recent call last):
File "......", line ....., in get_response
response = requests.get(url)
File "/Users/loh/myvirtualenvs/auto_ssai2.7/lib/python2.7/site-packages/requests/sessions.py", line 480, in get
return self.request('GET', url, *_kwargs)
File "/Users/loh/myvirtualenvs/auto_ssai2.7/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, *_send_kwargs)
File "/Users/loh/myvirtualenvs/auto_ssai2.7/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/Users/loh/myvirtualenvs/auto_ssai2.7/lib/python2.7/site-packages/requests/adapters.py", line 426, in send
raise ConnectionError(err, request=request)
ConnectionError: ('Connection aborted.', HTTPException('got more than 100 headers',))
Need a way to change _MAXHEADERS in httplib.py as suggested here:
http://stackoverflow.com/questions/23055378/http-client-httpexception-got-more-than-100-headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3097/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3097/timeline
| null |
completed
| null | null | false |
[
"I'm sorry. I don't understand. Does something like\n\n``` py\nimport httplib # or http.client if you're on Python 3\n\nhttplib._MAXHEADERS = 1000\n```\n\nNot work for you? Given that this seems to be exceptionally rare, I'm not sure we need a way to do this (or override this).\n"
] |
https://api.github.com/repos/psf/requests/issues/3096
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3096/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3096/comments
|
https://api.github.com/repos/psf/requests/issues/3096/events
|
https://github.com/psf/requests/pull/3096
| 148,412,460 |
MDExOlB1bGxSZXF1ZXN0NjY1MTIzNTM=
| 3,096 |
Change _store of CaseInsensitiveDict to OrderedDict
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2237894?v=4",
"events_url": "https://api.github.com/users/piotrjurkiewicz/events{/privacy}",
"followers_url": "https://api.github.com/users/piotrjurkiewicz/followers",
"following_url": "https://api.github.com/users/piotrjurkiewicz/following{/other_user}",
"gists_url": "https://api.github.com/users/piotrjurkiewicz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/piotrjurkiewicz",
"id": 2237894,
"login": "piotrjurkiewicz",
"node_id": "MDQ6VXNlcjIyMzc4OTQ=",
"organizations_url": "https://api.github.com/users/piotrjurkiewicz/orgs",
"received_events_url": "https://api.github.com/users/piotrjurkiewicz/received_events",
"repos_url": "https://api.github.com/users/piotrjurkiewicz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/piotrjurkiewicz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/piotrjurkiewicz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/piotrjurkiewicz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 13 |
2016-04-14T16:15:41Z
|
2021-09-08T03:01:06Z
|
2016-04-29T21:47:35Z
|
CONTRIBUTOR
|
resolved
|
This will preserve order of request headers when passed to request method as `OrderedDict`.
Closes #3038.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3096/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3096/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3096.diff",
"html_url": "https://github.com/psf/requests/pull/3096",
"merged_at": "2016-04-29T21:47:35Z",
"patch_url": "https://github.com/psf/requests/pull/3096.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3096"
}
| true |
[
"Thanks for this! I'd love a test that proves that this works: do you feel like you're up to writing one?\n",
"OK, now tests should pass on Python 2.6 as well.\n",
"@piotrjurkiewicz the tests fail on 3.3 and 3.4 now http://ci.kennethreitz.org/job/requests-pr/974/\n\nThis reminds me, I need to add Python 3.5 to the CI server. I didn't realize it was missing. \n",
"@kennethreitz the tests here seem to have hung in Jenkins. Is there anyway to add a timeout to test runs?\n",
"@sigmavirus24 just added a build timeout plugin, configured for 5 minutes for PR builds. \n\nDon't worry, I'm getting increasingly frustrated with maintaining Jenkins at the moment. \n",
"> Don't worry, I'm getting increasingly frustrated with maintaining Jenkins at the moment. \n\nI've been maintaining my own for a while. I feel your pain and I'm happy to help out more.\n",
"I mean, I've been managing Jenkins servers since the Hudson days, but this is the only one I've had to deal with in the past three years, and my annoyances with how it requires at least some level of upkeep are growing :)\n\nI do think it's an overall less frustrating experience for this project than using Travis would be, so worth the effort. Travis has seen much improvements over the past two years, though, so at some point, this may no longer be true. \n",
":sparkles: :cake: :sparkles:\n",
"Is this fixed in version 2.10.0 ?\n\n```\nimport requests\nfrom collections import OrderedDict\n\n\n\nheaders = OrderedDict( [\n('User-Agent', 'Mozilla/5.0 (Windows NT 6.3; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0'),\n('Accept', 'application/json, text/javascript, */*; q=0.01'),\n('Accept-Language', 'cs,en-US;q=0.7,en;q=0.3'),\n('Accept-Encoding', 'gzip, deflate'),\n('Connection', 'keep-alive'),\n] )\n\nrsp = requests.get('http://api.wipmania.com',headers=headers)\n\n```\n\norder of headers not preserved on Windows 8.1 python 2.7.9 tested with Wireshark\n\nedit: it seems to work fine only if I use session like this\n\n```\nimport requests\nfrom collections import OrderedDict\n\nheaders = OrderedDict( [\n('User-Agent', 'Mozilla/5.0 (Windows NT 6.3; WOW64; rv:46.0) Gecko/20100101 Firefox/46.0'),\n('Accept', 'application/json, text/javascript, */*; q=0.01'),\n('Accept-Language', 'cs,en-US;q=0.7,en;q=0.3'),\n('Accept-Encoding', 'gzip, deflate'),\n('Connection', 'keep-alive'),\n] )\n\nses = requests.Session()\nses.headers = OrderedDict()\nses.get('http://website.com',headers=headers)\n```\n",
"The reason the header order is being overridden in your case is because of the way requests merges the two different dictionaries in the `Session` and the request kwargs.\n\nBy default, a requests `Session` already contains several keys:\n\n```\n>>> s = requests.Session()\n>>> s.headers\n{'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User-Agent': 'python-requests/2.10.0'}\n```\n\nYou'll note, then, that when you send headers using the `headers` kwarg, the order of the keys in the Session is preserved in priority to the order of the keys in the `headers` kwarg. This is the expected result of using the dict on the `Session` as the base into which the request dict is merged to update.\n\nTrying to get the entirely intuitive behaviour (where the request header defines the order in preference to the `Session` order) is somewhat frustrating. Right now the code looks like this:\n\n``` python\nactual_headers = CaseInsensitiveDict(to_key_val_list(session.headers))\nactual_headers.update(to_key_val_list(request.headers))\n```\n\nWe'd need to change the code to\n\n``` python\nactual_headers = CaseInsensitiveDict(to_key_val_list(request.headers))\nactual_headers.update(to_key_val_list(session.headers))\nactual_headers.update(to_key_val_list(request.headers))\n```\n\nThis would lead to the exact same result as we currently have but would prioritise the _order_ of the request dict rather than the `Session` dict. The cost is that we do substantially extra computation in order to achieve this relatively minor effect.\n\nI am open to making this change, but it does rather feel like using a sledgehammer to crack a nut. @kennethreitz @sigmavirus24?\n",
"@Lukasa I think this is something we need to document instead of work around. I'm not sure \"fixing\" that particular behaviour wouldn't introduce some other subtle bug.\n",
"I don't disagree with @sigmavirus24. While I do feel like this pattern should \"just work\" as requested, I feel like it's extremely uncommon for someone to want/need this, and we should not bend over backwards to accomplish this. \n",
"Ok, so I think we're in agreement: we'd rather document this than try to change the logic.\n"
] |
https://api.github.com/repos/psf/requests/issues/3095
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3095/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3095/comments
|
https://api.github.com/repos/psf/requests/issues/3095/events
|
https://github.com/psf/requests/issues/3095
| 148,285,989 |
MDU6SXNzdWUxNDgyODU5ODk=
| 3,095 |
SSLError: bad handshake (ssl3_get_server_certificate, certificate verify failed)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/158471?v=4",
"events_url": "https://api.github.com/users/djc/events{/privacy}",
"followers_url": "https://api.github.com/users/djc/followers",
"following_url": "https://api.github.com/users/djc/following{/other_user}",
"gists_url": "https://api.github.com/users/djc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/djc",
"id": 158471,
"login": "djc",
"node_id": "MDQ6VXNlcjE1ODQ3MQ==",
"organizations_url": "https://api.github.com/users/djc/orgs",
"received_events_url": "https://api.github.com/users/djc/received_events",
"repos_url": "https://api.github.com/users/djc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/djc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/djc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/djc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2016-04-14T08:13:04Z
|
2021-09-08T07:00:39Z
|
2016-04-14T08:23:51Z
|
NONE
|
resolved
|
After updating the Let's Encrypt certificate I use on my host (from one that uses one host to one that has multiple hostnames), a requests-using script I have started failing:
```
File "test.py", line 5, in get
rsp = requests.get(url)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 597, in send
history = [resp for resp in gen] if allow_redirects else []
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 195, in resolve_redirects
**adapter_kwargs
File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 433, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)
```
I have requests-2.8.1, certifi-2015.11.20, urllib3-1.14 and openssl-1.0.2g. The affected host is https://xavamedia.nl/.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3095/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3095/timeline
| null |
completed
| null | null | false |
[
"What OS and Python version do you have?\n",
"Oh, nevermind, I've spotted it. Your intermediate certificate is wrong.\n\nA quick check using OpenSSL shows that you've served the following certificate chain:\n\n```\n 0 s:/CN=xavamedia.nl\n i:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\n 1 s:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X1\n i:/O=Digital Signature Trust Co./CN=DST Root CA X3\n```\n\nYou'll notice that your leaf cert (cert 0) is issued by \"Let's Encrypt Authority X3\", but the intermediate cert you sent belongs to \"Let's Encrypt Authority X1\". That's not right at all. I don't know how you're setting up those certs, but you'll need to ensure that you're serving the appropriate intermediate certificate, because neither requests nor certifi ships intermediate certs in their bundles.\n",
"Doh, that's dumb. I guess I would have expected my browser (Firefox) to flag that sort of issue...\n",
"Sadly, most browsers _do_ ship a list of intermediate certificates for this very reason: misconfigured TLS servers are extremely common. We don't do it only because such a list would be really quite brutal to maintain.\n",
"@Lukasa How did you do the quick check with openssl? I've having the same problem, and suspect I may have the same cause.\n",
"@singletoned `openssl s_client -connect <host>:<port> -showcerts` should do the trick.\n",
"That's brilliant, thanks. And thanks for the blisteringly fast response!\n",
"No problem!\n",
" File \"download.py\", line 31, in <module>\r\n req = requests.get(zipURL, stream=True)#, verify=False)\r\n File \"C:\\Program Files\\Anaconda3\\lib\\site-packages\\requests\\api.py\", line 72, in get\r\n return request('get', url, params=params, **kwargs)\r\n File \"C:\\Program Files\\Anaconda3\\lib\\site-packages\\requests\\api.py\", line 58, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"C:\\Program Files\\Anaconda3\\lib\\site-packages\\requests\\sessions.py\", line 502, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"C:\\Program Files\\Anaconda3\\lib\\site-packages\\requests\\sessions.py\", line 612, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"C:\\Program Files\\Anaconda3\\lib\\site-packages\\requests\\adapters.py\", line 514, in send\r\n raise SSLError(e, request=request)\r\nrequests.exceptions.SSLError: (\"bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)\",)\r\n\r\nHi Lukasa, I believe I have the same problem. I've searched here and there and have not solved the problem. Do you know what's going wrong at my end? Thank you.",
"Seems we're failing to validate the cert. is the server you're contacting on the public internet?",
"@Lukasa : I'm having the same issue while interacting with vision.goolgeapis.com\r\n\r\nNo, my network is not public.\r\n\r\nDo you think think might be because of the firewall ?",
"If you have an intercepting proxy in between you and the endpoint then yes, frequently you'll fail to validate that certificate."
] |
https://api.github.com/repos/psf/requests/issues/3094
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3094/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3094/comments
|
https://api.github.com/repos/psf/requests/issues/3094/events
|
https://github.com/psf/requests/issues/3094
| 148,220,295 |
MDU6SXNzdWUxNDgyMjAyOTU=
| 3,094 |
session proxy not working with user/password
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5225146?v=4",
"events_url": "https://api.github.com/users/emgerner-msft/events{/privacy}",
"followers_url": "https://api.github.com/users/emgerner-msft/followers",
"following_url": "https://api.github.com/users/emgerner-msft/following{/other_user}",
"gists_url": "https://api.github.com/users/emgerner-msft/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/emgerner-msft",
"id": 5225146,
"login": "emgerner-msft",
"node_id": "MDQ6VXNlcjUyMjUxNDY=",
"organizations_url": "https://api.github.com/users/emgerner-msft/orgs",
"received_events_url": "https://api.github.com/users/emgerner-msft/received_events",
"repos_url": "https://api.github.com/users/emgerner-msft/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/emgerner-msft/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/emgerner-msft/subscriptions",
"type": "User",
"url": "https://api.github.com/users/emgerner-msft",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-04-14T00:50:11Z
|
2021-09-08T18:00:57Z
|
2016-04-14T18:09:30Z
|
NONE
|
resolved
|
All of the below testing was done with Fiddler. User and password proxy auth in Fiddler was enabled the default way by doing Rules->Require Proxy Authentication which requires user be '1' and password be '1'. The code below does not contain the https section of the proxy as it's largely identical.
If a proxy is set on the session without user and password and Fiddler auth is off, Fiddler is used correctly:
`
session.proxies['http'] = 'http://127.0.0.1:8888'.format(proxy_string)
session.request(....)
`
If a proxy is set on the session with user and password, Fiddler returns proxy auth failures:
`
session.proxies['http'] = 'http://1:[email protected]:8888'.format(proxy_string)
session.request(....)
`
If I do exactly the same thing but send the proxy with the request directly, both cases work.
`
proxies = {}
proxies['http'] = 'http://1:[email protected]:8888'.format(proxy_string)
session.request(...., proxies=proxies)
`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3094/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3094/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report @emgerner-msft!\n\nFirstly, can I check: the proxy string in your code doesn't actually separate the port from the host with an `@` symbol. Was that was a typo? If not, that could very really break our parsing of the URL.\n\nAssuming it's a typo, I'd like you to try something for me. Can you reconfigure Fiddler and then print the output of both `requests.utils.should_bypass_proxies(url)` and `requests.utils.get_environ_proxies(url)`, where `url` is the URL you requested when you had this problem.\n",
"That was a typo. Sorry about that! Updated the original issue.\n\nI turned on Fiddler with proxy auth and ran the commands as asked with the URL I'm using to make requests. `requests.utils.should_bypass_proxies(uri)` returned `False` and `requests.utils.get_environ_proxies(uri)` returned `{'http': 'http://127.0.0.1:8888', 'https': 'https://127.0.0.1:8888'}`.\n",
"Ok, so that's really interesting.\n\nThe reason I asked those questions is that we have a known issue at this time around proxies in sessions. Specifically, currently settings set on the Session are considered less authoritative than those in the environment (see #2018). That means that, if you haven't set `session.trust_env = False`, your setting of the proxies on the Session shouldn't have done anything at all. Clearly that's not happening, which is a bit of a concern.\n\nCan I get you to run all four cases (session + no auth, session + auth, request + no auth, request + auth) again, with Fiddler configured to require auth, but each time run it in a brand new Python shell? I want to exclude the effect of connection pooling here, which can play havoc with these kinds of results. \n",
"Read through the bug. I totally agree with the proposed ordering of precedence -- it's actually how I naively assumed it would work. :)\n\nSounds like to me that bug actually explains things pretty well though. If requests says it should not bypass proxies and the environment correctly gives the Fiddler proxy, that means it's simply using the environment setting instead of the Session proxy. The environment setting doesn't have auth, so it fails. If I set the proxy on the request, it overrides correctly and the auth works. Is this not the correct interpretation of that output and the bug?\n",
"I don't think so, not without that final set of tests. The problem is that the environment proxy strings don't have an auth part, which means we should have failed to auth _ever_ with the session only version. That's why I wanted you to re-test: to understand what the actual behaviour is here. \n",
"We did always fail to proxy with auth on with the proxy set on the session itself. The first case where I thought proxy was working with session was just hitting the environment it looks like -- note for that case Fiddler auth was off. Before you clarified the order of precedence I had misinterpreted this to mean the session proxy just didn't work with auth -- not that it didn't work at all. With auth ON, the only time I got the proxy to work was by setting it directly when making the request `session.request(...., proxies=proxies)`. Your bug explains this perfectly. :) Feel free to close as a duplicate.\n",
"Aha, ok, cool. We are scheduling to fix this for 3.0.0. Thanks for the report!\n"
] |
https://api.github.com/repos/psf/requests/issues/3093
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3093/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3093/comments
|
https://api.github.com/repos/psf/requests/issues/3093/events
|
https://github.com/psf/requests/issues/3093
| 148,200,495 |
MDU6SXNzdWUxNDgyMDA0OTU=
| 3,093 |
Token string add error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9083431?v=4",
"events_url": "https://api.github.com/users/Nuruddinjr/events{/privacy}",
"followers_url": "https://api.github.com/users/Nuruddinjr/followers",
"following_url": "https://api.github.com/users/Nuruddinjr/following{/other_user}",
"gists_url": "https://api.github.com/users/Nuruddinjr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Nuruddinjr",
"id": 9083431,
"login": "Nuruddinjr",
"node_id": "MDQ6VXNlcjkwODM0MzE=",
"organizations_url": "https://api.github.com/users/Nuruddinjr/orgs",
"received_events_url": "https://api.github.com/users/Nuruddinjr/received_events",
"repos_url": "https://api.github.com/users/Nuruddinjr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Nuruddinjr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Nuruddinjr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Nuruddinjr",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
},
{
"color": "777777",
"default": false,
"description": null,
"id": 162780722,
"name": "Question/Not a bug",
"node_id": "MDU6TGFiZWwxNjI3ODA3MjI=",
"url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug"
},
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 3 |
2016-04-13T22:24:56Z
|
2021-09-08T16:00:35Z
|
2016-08-05T07:53:54Z
|
NONE
|
resolved
|
``` python
url = "https://graph.facebook.com/v2.6/me/subscribed_apps?access_token="+self.token
data = {}
if url:
data['url'] = url
result = requests.post(url, data)
```
Above code generating following error, even though schema is defined as in url. how to fix?
```
requests.exceptions.MissingSchema: Invalid URL '%sreally long string , facebook token':
No schema supplied. Perhaps you meant http://%sreally long string , facebook token?
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3093/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3093/timeline
| null |
completed
| null | null | false |
[
"EAAHEisMO9u8BABL41ZCoUFeXRWzhhjxDb0jDNzyH7YdJeFVpUoYpRrqhp96YKkDrfzqFJfclbqhOZB4MMvihAOS2XBJLYpLDGLIyZAuXBuo8gpddNICEgc7Xo8mRGBH6F9q38ZALItZCaGfytrs59dPLRhi9ibEYeC2Pi0AbXXXXXXXX my token is as following\n",
"@Nuruddinjr Can you print the actual error message? It seems like the URL you've built is invalid, which suggests that the code you're using right now isn't correct in some way.\n",
"@Nuruddinjr please answer @Lukasa soon. Otherwise, I'm going to close this issue as it does not appear to be a bug.\n"
] |
https://api.github.com/repos/psf/requests/issues/3092
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3092/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3092/comments
|
https://api.github.com/repos/psf/requests/issues/3092/events
|
https://github.com/psf/requests/issues/3092
| 148,186,611 |
MDU6SXNzdWUxNDgxODY2MTE=
| 3,092 |
Problem in json.dump with headers dictionary
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/812863?v=4",
"events_url": "https://api.github.com/users/rodrigozanatta/events{/privacy}",
"followers_url": "https://api.github.com/users/rodrigozanatta/followers",
"following_url": "https://api.github.com/users/rodrigozanatta/following{/other_user}",
"gists_url": "https://api.github.com/users/rodrigozanatta/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rodrigozanatta",
"id": 812863,
"login": "rodrigozanatta",
"node_id": "MDQ6VXNlcjgxMjg2Mw==",
"organizations_url": "https://api.github.com/users/rodrigozanatta/orgs",
"received_events_url": "https://api.github.com/users/rodrigozanatta/received_events",
"repos_url": "https://api.github.com/users/rodrigozanatta/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rodrigozanatta/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rodrigozanatta/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rodrigozanatta",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-04-13T21:22:27Z
|
2021-09-08T18:00:58Z
|
2016-04-14T07:50:42Z
|
NONE
|
resolved
|
I really don't know what is happing. I make a `requests.get` and save the returned headers in a dictionary:
```
r = requests.get("http://myurl.com")
my_dic = {'my_header' : r.headers}
```
When I try to parse my_dic to JSON, I get an error.
```
with open('arq.txt', 'w') as fp:
json.dump(my_dic, fp)
```
For no special reason, it fail! I thought there was an illegal character or duplicate key, but no. It only work when I created this function:
```
def fix(old):
new = {}
for key, value in old.items():
new[key] = value
return new
```
So, I need to do this:
```
r = requests.get("http://myurl.com")
my_dic = {'my_header' : fix(r.headers)}
```
Why? Is this a bug? I am using the python 3.5 and believe all most update libraries.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3092/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3092/timeline
| null |
completed
| null | null | false |
[
"So you didn't actually print your error, but the reason it's a problem is that the json module really only accepts dictionaries. `r.headers` is not actually a dictionary:\n\n```\n>>> type(r.headers)\n<class 'requests.structures.CaseInsensitiveDict'>\n```\n\nYou can resolve the problem by simply calling `dict()` on it:\n\n``` python\nr = requests.get(\"http://myurl.com\")\nmy_dic = {'my_header' : dict(r.headers)}\n```\n",
"Yeah, now I understand it. But... Why not make it a dictionary? I am new in python and it take hours to understand WHY the JSON was having problem. My first try was some illegal character. Because the error print it like a dictionary. I have the same problem with cookies. \n\nSo I suggest you to make every \"like a dictionary\" to be a \"real dictionary\" I can imagine so many problems with it in others frameworks. \n",
"The reason it's not a dictionary is because generally speaking, types defined in Python should not subclass the built-in types. This is because the built-in Python types have various shortcuts in their C implementation that means that they don't call back into Python code, which means they don't see method overrides defined in Python.\n\nFor that reason, it is intended that modules use the [collections module's Abstract Base Classes](https://docs.python.org/2/library/collections.html#collections-abstract-base-classes): in our case, [Mutable Mapping](https://docs.python.org/2/library/collections.html#collections.MutableMapping). It is unfortunate that the JSON module doesn't like them, but I believe that's a flaw in the JSON module rather than the data type we've constructed.\n"
] |
https://api.github.com/repos/psf/requests/issues/3091
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3091/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3091/comments
|
https://api.github.com/repos/psf/requests/issues/3091/events
|
https://github.com/psf/requests/pull/3091
| 148,113,322 |
MDExOlB1bGxSZXF1ZXN0NjYzNTEwMTI=
| 3,091 |
Clear any pooled proxy connections
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/559268?v=4",
"events_url": "https://api.github.com/users/bodgit/events{/privacy}",
"followers_url": "https://api.github.com/users/bodgit/followers",
"following_url": "https://api.github.com/users/bodgit/following{/other_user}",
"gists_url": "https://api.github.com/users/bodgit/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bodgit",
"id": 559268,
"login": "bodgit",
"node_id": "MDQ6VXNlcjU1OTI2OA==",
"organizations_url": "https://api.github.com/users/bodgit/orgs",
"received_events_url": "https://api.github.com/users/bodgit/received_events",
"repos_url": "https://api.github.com/users/bodgit/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bodgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bodgit/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bodgit",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-04-13T16:25:31Z
|
2021-09-08T04:01:06Z
|
2016-04-15T12:50:06Z
|
CONTRIBUTOR
|
resolved
|
Based on the explanation of the bug in #3090 this PR closes any pooled connections for any active ProxyManager object associated with the session.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3091/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3091/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3091.diff",
"html_url": "https://github.com/psf/requests/pull/3091",
"merged_at": "2016-04-15T12:50:06Z",
"patch_url": "https://github.com/psf/requests/pull/3091.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3091"
}
| true |
[
"This looks great to me. @bodgit could you add some tests to ensure this doesn't regress?\n",
"I'm wondering what/how I can test this. I had a look through `tests/test_requests.py` but I could only see one proxy-related test. Do you have any suggestions?\n",
"Create a dictionary with values which are mocks and then assert that `clear()` was called once on each one?\n",
"To be clear, my code review is +1 on this too. @sigmavirus24 has the best testing approach here, I think we just need to mock it out.\n",
"The suggestion makes sense, however I don't know how to do this with pytest. I'm trying to find an existing example I can use as a starting point...\n",
"I've managed to come up with the following:\n\n``` python\n def test_session_close_proxy_clear(self, mocker):\n proxies = {\n 'one': mocker.Mock(autospec=True),\n 'two': mocker.Mock(autospec=True),\n }\n session = requests.Session()\n mocker.patch.dict(session.adapters['http://'].proxy_manager, proxies)\n session.close()\n proxies['one'].clear.assert_called_once_with()\n proxies['two'].clear.assert_called_once_with()\n```\n\nI had to pull in `pytest-mock`. It passes with my change and fails when I revert it so it looks like it does the job.\n",
"I think that looks fine to me. If you want to go ahead and update this pull request (you can just push your new commits to the branch), we'll review more formally. =)\n",
"Fixed.\n",
"Thanks @bodgit! :tada: \n"
] |
https://api.github.com/repos/psf/requests/issues/3090
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3090/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3090/comments
|
https://api.github.com/repos/psf/requests/issues/3090/events
|
https://github.com/psf/requests/issues/3090
| 148,079,336 |
MDU6SXNzdWUxNDgwNzkzMzY=
| 3,090 |
Session.close() doesn't work with proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/559268?v=4",
"events_url": "https://api.github.com/users/bodgit/events{/privacy}",
"followers_url": "https://api.github.com/users/bodgit/followers",
"following_url": "https://api.github.com/users/bodgit/following{/other_user}",
"gists_url": "https://api.github.com/users/bodgit/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bodgit",
"id": 559268,
"login": "bodgit",
"node_id": "MDQ6VXNlcjU1OTI2OA==",
"organizations_url": "https://api.github.com/users/bodgit/orgs",
"received_events_url": "https://api.github.com/users/bodgit/received_events",
"repos_url": "https://api.github.com/users/bodgit/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bodgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bodgit/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bodgit",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 4 |
2016-04-13T14:29:33Z
|
2021-09-08T18:00:57Z
|
2016-04-15T13:36:43Z
|
CONTRIBUTOR
|
resolved
|
I have an upstream server that supports both basic authentication and SSL client certificates for authenticating a client.
I have the following contrived requests client (with a session object) that hits the upstream server using basic auth and then switches to using SSL client certificates and repeats the request:
``` python
#!/usr/bin/env python
import requests
session = requests.Session()
session.verify = './cacert.pem'
session.auth = ('bob', 'password')
response = session.get('https://1.2.3.4/')
session.auth = None
session.cert = ('./cert.pem', './key.pem')
session.close()
response = session.get('https://1.2.3.4/')
```
Because the client certificates need to be presented at the beginning of the connection I use `session.close()` to tear down any active connections before attempting the second request. If I don't do this, the session connection pooling kicks in and reuses the existing connection from the first request which won't have presented any certificates and so the second request is unauthorised.
This is all understood however I then had a need for the client to access the server through a proxy so the code is updated to just add the proxy configuration:
``` python
#!/usr/bin/env python
import requests
session = requests.Session()
session.verify = False
session.auth = ('bob', 'password')
session.proxies = {
'https': 'http://alice:[email protected]:3128/',
}
response = session.get('https://1.2.3.4/')
session.auth = None
session.cert = ('./cert.pem', './key.pem')
session.close()
response = session.get('https://1.2.3.4/')
```
Now the second request is unauthorised; `session.close()` seems to have no effect and doesn't tear down the connections through the proxy; there are no TCP FIN packets initiated from the client which I do see when no proxy is used.
The upstream server has keepalives enabled so if I add a sleep for longer than the idle timeout then the server terminates the connection between itself and the proxy, then the proxy terminates the connection to the client and then the second request is successful.
I appreciate this is probably a corner case but any ideas here? I would expect `session.close()` to correctly tear down the connections to the proxy. I've reproduced this against version 2.9.1.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/559268?v=4",
"events_url": "https://api.github.com/users/bodgit/events{/privacy}",
"followers_url": "https://api.github.com/users/bodgit/followers",
"following_url": "https://api.github.com/users/bodgit/following{/other_user}",
"gists_url": "https://api.github.com/users/bodgit/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bodgit",
"id": 559268,
"login": "bodgit",
"node_id": "MDQ6VXNlcjU1OTI2OA==",
"organizations_url": "https://api.github.com/users/bodgit/orgs",
"received_events_url": "https://api.github.com/users/bodgit/received_events",
"repos_url": "https://api.github.com/users/bodgit/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bodgit/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bodgit/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bodgit",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3090/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3090/timeline
| null |
completed
| null | null | false |
[
"Great spot, this is definitely a bug!\n\nThe bug is in `HTTPAdapter.close`: this currently clears the basic `PoolManager`, but doesn't clear any instantiated `ProxyManager` objects. That means that they inadvertently get preserved, which makes using them for this use-case untenable.\n\nThis bug is easily fixed, though: clearing out `HTTPAdapter.proxy_manager` is going to be the way to go.\n\nIn the meantime @bodgit, to work around this problem you can not just close the Session but actively mount new `HTTPAdapters`:\n\n``` python\nsession.close()\n\nsession.mount('http://', requests.adapters.HTTPAdapter())\nsession.mount('https://', requests.adapters.HTTPAdapter())\n```\n\nThis will hopefully all become needless in a future version of requests which will include TLS information in the connection pooling, but we're not there yet.\n",
"Brilliant, I've just confirmed adding those two `session.mount()` calls gets things working again. I noticed it doesn't cause any previous connections to be immediately torn down but that shouldn't be a massive problem.\n",
"I've submitted the following diff as a PR which seems to fix the issue for me:\n\n``` diff\ndiff --git a/requests/adapters.py b/requests/adapters.py\nindex fe9f533..db62c09 100644\n--- a/requests/adapters.py\n+++ b/requests/adapters.py\n@@ -264,10 +264,12 @@ class HTTPAdapter(BaseAdapter):\n def close(self):\n \"\"\"Disposes of any internal state.\n\n- Currently, this just closes the PoolManager, which closes pooled\n- connections.\n+ Currently, this closes the PoolManager and any active ProxyManager,\n+ which closes any pooled connections.\n \"\"\"\n self.poolmanager.clear()\n+ for proxy in self.proxy_manager.values():\n+ proxy.clear()\n\n def request_url(self, request, proxies):\n \"\"\"Obtain the url to use when making the final request.\n\n```\n",
"Closed this as #3091 has been merged. Thanks @Lukasa for the workaround and the explanation of the bug.\n"
] |
https://api.github.com/repos/psf/requests/issues/3089
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3089/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3089/comments
|
https://api.github.com/repos/psf/requests/issues/3089/events
|
https://github.com/psf/requests/pull/3089
| 147,938,482 |
MDExOlB1bGxSZXF1ZXN0NjYyNjE4OTg=
| 3,089 |
add requests.util.parse funtions
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9625413?v=4",
"events_url": "https://api.github.com/users/sleshep/events{/privacy}",
"followers_url": "https://api.github.com/users/sleshep/followers",
"following_url": "https://api.github.com/users/sleshep/following{/other_user}",
"gists_url": "https://api.github.com/users/sleshep/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sleshep",
"id": 9625413,
"login": "sleshep",
"node_id": "MDQ6VXNlcjk2MjU0MTM=",
"organizations_url": "https://api.github.com/users/sleshep/orgs",
"received_events_url": "https://api.github.com/users/sleshep/received_events",
"repos_url": "https://api.github.com/users/sleshep/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sleshep/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sleshep/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sleshep",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-04-13T03:44:27Z
|
2021-09-08T04:00:58Z
|
2016-04-13T07:23:19Z
|
NONE
|
resolved
|
help things easy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3089/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3089/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3089.diff",
"html_url": "https://github.com/psf/requests/pull/3089",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3089.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3089"
}
| true |
[
"Thanks for this!\n\nHowever, I don't believe we need either of these.\n\n`parse_rawpostdata_to_dict` is a less general version of `urlparse.parse_qs` from the standard library, so users should simply use that. `parse_rawcookies_to_dict` is a less general buggy version of `cookielib.CookieJar.add_cookie_header`. Both of these have better versions available in the standard library, so I'm afraid that we're not going to merge this: they simply add no useful function that is not available elsewhere.\n\nThanks for the work though!\n",
"@Lukasa \nBut i can't use `cookielib.CookieJar.add_cookie_header` to add a raw cookie to requests\ncan you give me a example?\nthanks!\n",
"@sleshep What specifically are you trying to achieve? You appear to have a cookie header on hand and want to attach it to a request. Is that right?\n",
"@Lukasa \nYes, I copy a cookie from browser want to attach it to a request.\nHow to do this with `cookielib.CookieJar.add_cookie_header`?\n",
"Oh I'm sorry, I didn't fully understand your use case. You want to do this:\n\n``` python\nimport Cookie\nimport requests\n\nc = Cookie.SimpleCookie()\nc.load('user=a;a=b;c=1')\n\ns = requests.Session()\nfor v in c.values():\n s.cookies.set_cookie(requests.cookies.morsel_to_cookie(v))\n```\n\nThen, running requests through that session will use the cookies appropriately.\n\nNote that this won't tightly scope them to domains unless the domain is set in the cookie directly, but you can use the standard functionality in the Cookie module to manipulate the morsels appropriately.\n",
"@Lukasa \nThanks.\n but I think use `Cookie.SimpleCookie` is too complex,use this pull request to very easy to achive.\n",
"@sleshep As a project, we don't agree. This pull request does not implement the same functionality that the standard library does. It also implements only a portion of what can be found without this pull request regardless of one's perception of ease of use.\n\nIf these functions are working for you, you should keep using them, but we are not going to accept them into requests or maintain them since there are significantly more mature and robust solutions available to us that we do not have to maintain.\n",
"@sigmavirus24 \nOk.thank you\n",
"if found a simple way to achieve my target.\n\n``` python\nfrom urlparse import parse_qsl\nfrom requests import Session\ncookies='user=a;a=b;c=1'\nparams='user=a&a=b&c=1'\ns=requests.Session()\ns.cookies.update(dict(parse_qsl(cookies)))\ns.post('http://someurl',data=dict(parse_qsl(params)))\n```\n\nit work well.\nThanks all of you.\n@kennethreitz \n@Lukasa \n@sigmavirus24 \n"
] |
https://api.github.com/repos/psf/requests/issues/3088
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3088/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3088/comments
|
https://api.github.com/repos/psf/requests/issues/3088/events
|
https://github.com/psf/requests/issues/3088
| 147,487,047 |
MDU6SXNzdWUxNDc0ODcwNDc=
| 3,088 |
badTimezone: cannot clone with git 2.8.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/225102?v=4",
"events_url": "https://api.github.com/users/rillian/events{/privacy}",
"followers_url": "https://api.github.com/users/rillian/followers",
"following_url": "https://api.github.com/users/rillian/following{/other_user}",
"gists_url": "https://api.github.com/users/rillian/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rillian",
"id": 225102,
"login": "rillian",
"node_id": "MDQ6VXNlcjIyNTEwMg==",
"organizations_url": "https://api.github.com/users/rillian/orgs",
"received_events_url": "https://api.github.com/users/rillian/received_events",
"repos_url": "https://api.github.com/users/rillian/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rillian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rillian/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rillian",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-04-11T16:46:00Z
|
2021-08-31T00:07:02Z
|
2016-04-11T16:50:12Z
|
NONE
|
resolved
|
I can't clone this repo. It complains:
```
$ git clone https://github.com/kennethreitz/requests
Cloning into 'requests'...
remote: Counting objects: 17048, done.
remote: Compressing objects: 100% (39/39), done.
error: object 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8: badTimezone: invalid author/committer line - bad time zone
fatal: Error in object
fatal: index-pack failed
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3088/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3088/timeline
| null |
completed
| null | null | false |
[
"This is a known problem: see #3008 and #2690 for previous discussions. The problem can be easily resolved on your end either by performing a shallow clone (if you don't need the history) or turning off the fsck timestamp check.\n",
"Thanks for the quick response.\n",
"@rillian did you enable verification yourself or did another tool do that?\n",
"You're right, I have those options set in my ~/.gitconfig; it's nothing to do with the git version. I don't know if I did it manually or if one of our audit tools added it, because how is that not the default? :/\n\n``` ini\n[transfer]\n fsckobjects = true\n[fetch]\n fsckobjects = true\n[receive]\n fsckObjects = true \n```\n\nI've been trying to figure out how to get YCM to work around this, but it's tricky with submodules because an initial shallow clone generally fails. There doesn't seem to be a `fetch.fsckBadTimezone`. I couldn't get your `$ git config fsck.badTimezone ignore` suggestion to work, just disabling `fetch.fsckObjects` entirely. \n",
"```git config --global fetch.fsck.badTimezone ignore```\r\n\r\nWorks for me when cloning submodules, this just disables it for fetch."
] |
https://api.github.com/repos/psf/requests/issues/3087
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3087/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3087/comments
|
https://api.github.com/repos/psf/requests/issues/3087/events
|
https://github.com/psf/requests/issues/3087
| 147,332,696 |
MDU6SXNzdWUxNDczMzI2OTY=
| 3,087 |
Is there a mistake on requests/docs/user/quickstart.rst
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16648345?v=4",
"events_url": "https://api.github.com/users/gaoxinge/events{/privacy}",
"followers_url": "https://api.github.com/users/gaoxinge/followers",
"following_url": "https://api.github.com/users/gaoxinge/following{/other_user}",
"gists_url": "https://api.github.com/users/gaoxinge/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gaoxinge",
"id": 16648345,
"login": "gaoxinge",
"node_id": "MDQ6VXNlcjE2NjQ4MzQ1",
"organizations_url": "https://api.github.com/users/gaoxinge/orgs",
"received_events_url": "https://api.github.com/users/gaoxinge/received_events",
"repos_url": "https://api.github.com/users/gaoxinge/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gaoxinge/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gaoxinge/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gaoxinge",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-11T06:58:01Z
|
2021-09-08T18:01:00Z
|
2016-04-11T07:09:38Z
|
NONE
|
resolved
|
I am reading the official document of requests. There is one code in quickstart.rst
``` python
>>> payload = {'key1': 'value1', 'key2': ['value2', 'value3']}
>>> r = requests.get('http://httpbin.org/get', params=payload)
>>> print(r.url)
```
However, the output is
``` python
http://httpbin.org/get?key2=value2&key2=value3&key1=value1
```
instead of what is in rst?
Is this a small mistake?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3087/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3087/timeline
| null |
completed
| null | null | false |
[
"No. =)\n\nPython dictionaries are _unordered_: that is, when iterated over the order in which elements are returned is not well defined. That means that iterating over the dictionary above can give you two possible options: first `'key1'` then `'key2'`, or first `'key2'` then `'key1'`. The first case gives you the code from the documentation, the second one gives you the code you saw.\n\nIf you repeatedly re-run the same code, you should see both URLs. This is not a bug in requests or in Python, it's just an expected side effect of the way Python works.\n"
] |
https://api.github.com/repos/psf/requests/issues/3086
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3086/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3086/comments
|
https://api.github.com/repos/psf/requests/issues/3086/events
|
https://github.com/psf/requests/pull/3086
| 147,328,998 |
MDExOlB1bGxSZXF1ZXN0NjU5NDU5MTg=
| 3,086 |
add rtype in session.request,help pycharm easy.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9625413?v=4",
"events_url": "https://api.github.com/users/sleshep/events{/privacy}",
"followers_url": "https://api.github.com/users/sleshep/followers",
"following_url": "https://api.github.com/users/sleshep/following{/other_user}",
"gists_url": "https://api.github.com/users/sleshep/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sleshep",
"id": 9625413,
"login": "sleshep",
"node_id": "MDQ6VXNlcjk2MjU0MTM=",
"organizations_url": "https://api.github.com/users/sleshep/orgs",
"received_events_url": "https://api.github.com/users/sleshep/received_events",
"repos_url": "https://api.github.com/users/sleshep/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sleshep/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sleshep/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sleshep",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-04-11T06:35:18Z
|
2021-09-08T04:01:08Z
|
2016-04-11T07:14:39Z
|
NONE
|
resolved
|
add rtype in session.request,help pycharm easy.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3086/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3086/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3086.diff",
"html_url": "https://github.com/psf/requests/pull/3086",
"merged_at": "2016-04-11T07:14:39Z",
"patch_url": "https://github.com/psf/requests/pull/3086.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3086"
}
| true |
[
"Thanks! :sparkles: :cake: :sparkles:\n",
"Interesting! I didn't know about this feature. \n",
"We should add these all over. \n"
] |
https://api.github.com/repos/psf/requests/issues/3085
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3085/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3085/comments
|
https://api.github.com/repos/psf/requests/issues/3085/events
|
https://github.com/psf/requests/issues/3085
| 147,316,725 |
MDU6SXNzdWUxNDczMTY3MjU=
| 3,085 |
Is there a way to specify filename in the post data?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12237169?v=4",
"events_url": "https://api.github.com/users/imnisen/events{/privacy}",
"followers_url": "https://api.github.com/users/imnisen/followers",
"following_url": "https://api.github.com/users/imnisen/following{/other_user}",
"gists_url": "https://api.github.com/users/imnisen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/imnisen",
"id": 12237169,
"login": "imnisen",
"node_id": "MDQ6VXNlcjEyMjM3MTY5",
"organizations_url": "https://api.github.com/users/imnisen/orgs",
"received_events_url": "https://api.github.com/users/imnisen/received_events",
"repos_url": "https://api.github.com/users/imnisen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/imnisen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/imnisen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/imnisen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-04-11T05:04:01Z
|
2021-09-08T18:00:59Z
|
2016-04-11T07:13:31Z
|
NONE
|
resolved
|
Hi,
For example, I want to upload a video called "测试中文视频.mp4", which will be converted to`filename*=utf-8''%E6%B5%8B%E......`, the problem is that my working server does not recognize this format, so, is there a method to tell requests to convert the request message to like this: `filename=测试中文视频.mp4` or like this: `filename=u'\u6d4b\u8bd5\u4e2d\u6587\u89c6\u9891.mp4'`.
Thanks very much!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3085/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3085/timeline
| null |
completed
| null | null | false |
[
"I haven't worked on this, but a PreparedRequest I believe is what you want.\n\nOn Mon, 11 Apr 2016 15:04 imnisen [email protected] wrote:\n\n> Hi,\n> For example, I want to upload a image called \"测试中文视频.jpg\", which will be\n> converted tofilename*=utf-8''%E6%B5%8B%E......, the problem is that my\n> working server does not recognize this format, so, is there a method to\n> tell requests to convert the request message to like this:\n> filename=测试中文视频.jpg or like this:\n> filename=u'\\u6d4b\\u8bd5\\u4e2d\\u6587\\u89c6\\u9891.mp4'.\n> \n> Thanks very much!\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3085\n",
"There is no builtin method to make requests do this, nor will there be: servers that don't understand the format requests uses are old an non-standards-compliant. However, as @TetraEtc points out, you can use the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) to mutate the body as you wish to, which would allow you to change the multipart-encoded body to whatever form you like.\n\nYou can also manually set the filename yourself by using longer tuples in the [files parameter](http://docs.python-requests.org/en/master/api/#requests.request): in particular, if you use a _bytestring_ in the filename portion then requests will leave it alone.\n",
"@TetraEtc thanks, I have succeed using urllib2 to send the data which encoded manually, although it is a boring work. I will also try the PreparedRequest method.\n\n@Lukasa do you mean use something like `\"测试中文视频.mp4\".encode('utf-8')` in the filename portion?\n when I try this, because of `result.encode('ascii')` (at requests/packages/urllib3/fields line38) fail, the post content is also changed to something like this `filename*='%E6%B5%8B%E8%AF%95%E4%B8%AD%E6%96%87%E...`, which is not I want\n\nThank you.\n",
"Ah, yup, that'll do it. So yes, in that case I'm afraid you will have to use the prepared request flow to tweak the body.\n"
] |
https://api.github.com/repos/psf/requests/issues/3084
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3084/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3084/comments
|
https://api.github.com/repos/psf/requests/issues/3084/events
|
https://github.com/psf/requests/issues/3084
| 147,254,881 |
MDU6SXNzdWUxNDcyNTQ4ODE=
| 3,084 |
requests attempts to use IPv6 even when IPv6 is disabled
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1066494?v=4",
"events_url": "https://api.github.com/users/kz26/events{/privacy}",
"followers_url": "https://api.github.com/users/kz26/followers",
"following_url": "https://api.github.com/users/kz26/following{/other_user}",
"gists_url": "https://api.github.com/users/kz26/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kz26",
"id": 1066494,
"login": "kz26",
"node_id": "MDQ6VXNlcjEwNjY0OTQ=",
"organizations_url": "https://api.github.com/users/kz26/orgs",
"received_events_url": "https://api.github.com/users/kz26/received_events",
"repos_url": "https://api.github.com/users/kz26/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kz26/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kz26/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kz26",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-04-10T19:19:19Z
|
2021-09-08T18:01:01Z
|
2016-04-10T19:31:39Z
|
NONE
|
resolved
|
This is an issue when running on a server without IPv6 (must be disabled because the network does not support it). Example when connecting to https://graph.facebook.com and IPv4 happens to fail:
Python 3.4.3, requests 2.9.1, `ipv6.disabled=1` in kernel, `gai.conf` set to prefer IPv4
``` python
HTTPSConnectionPool(host='graph.facebook.com', port=443): Max retries exceeded with url: /v2.5/me/feed (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7f4dbd158518>: Failed to establish a new connection: [Errno 97] Address family not supported by protocol',))
Traceback (most recent call last):
File "/home/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 137, in _new_conn
(self.host, self.port), self.timeout, **extra_kw)
File "/home/lib/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 91, in create_connection
raise err
File "/home/lib/python3.4/site-packages/requests/packages/urllib3/util/connection.py", line 71, in create_connection
sock = socket.socket(af, socktype, proto)
File "/usr/lib/python3.4/socket.py", line 126, in __init__
_socket.socket.__init__(self, family, type, proto, fileno)
OSError: [Errno 97] Address family not supported by protocol
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3084/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3084/timeline
| null |
completed
| null | null | false |
[
"Requests doesn't do much special here. Requests calls `getaddrinfo` and then iterates through the entries in the order returned from `getaddrinfo` attempting to connect to each one in turn.\n\nThat exception only gets raised if we've attempted all our connections and they all fail. We then raise the final exception we saw. The bigger problem here seems to be that we're swallowing perfectly reasonable exceptions earlier in the chain that reveal the underlying problem in situations when IPv6 or IPv4 are enabled.\n\nThat code all lives in [urllib3](https://github.com/shazow/urllib3), would you like to open this issue over there so we can track it appropriately?\n",
"That's fair. I'll open a ticket on urllib3.\n"
] |
https://api.github.com/repos/psf/requests/issues/3083
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3083/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3083/comments
|
https://api.github.com/repos/psf/requests/issues/3083/events
|
https://github.com/psf/requests/issues/3083
| 147,238,144 |
MDU6SXNzdWUxNDcyMzgxNDQ=
| 3,083 |
Improvement to hook response.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1459670?v=4",
"events_url": "https://api.github.com/users/JackDandy/events{/privacy}",
"followers_url": "https://api.github.com/users/JackDandy/followers",
"following_url": "https://api.github.com/users/JackDandy/following{/other_user}",
"gists_url": "https://api.github.com/users/JackDandy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JackDandy",
"id": 1459670,
"login": "JackDandy",
"node_id": "MDQ6VXNlcjE0NTk2NzA=",
"organizations_url": "https://api.github.com/users/JackDandy/orgs",
"received_events_url": "https://api.github.com/users/JackDandy/received_events",
"repos_url": "https://api.github.com/users/JackDandy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JackDandy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JackDandy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JackDandy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-04-10T16:27:38Z
|
2021-09-08T18:00:59Z
|
2016-04-11T13:00:16Z
|
NONE
|
resolved
|
Hello,
It would be nice if the _response_ hook was called even if there is an error during the send() process flow.
Example, if a timeout occurs, the cb is not used and a requests constructed url and query data that can normally be saved via the cb is not available - that is, URL that caused a timeout.
Thanks for your consideration.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3083/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3083/timeline
| null |
completed
| null | null | false |
[
"@JackDandy If you pass `stream=True`, then the callback should fire once the response has been received, so that resolves the problem where timeouts are encountered downloading the response data.\n\nIf the timeout is encountered while actually downloading the response, we can't call the response hook because we don't have a response to call it with. If you need to intercept that data the best place to do it is likely to be in the HTTPAdapter itself (by overriding the `send` method, which gets a `PreparedRequest` object), or by using the [prepared request flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) which will give you access to those things, also on a `PreparedRequest` object.\n\nDo either of those work for you?\n",
"If there is no response to provide to the hook, I'm uncertain what we would call the hook with. I do not think we could safely construct a fake response object just so we can call the response hook (which wouldn't confuse existing response hooks).\n\nYou can introspect what requests constructs for the url, query data, etc. by using the prepared request flow.\n",
"The hook is a nice and cheap way to get the constructed URL that is used in the request for logs. To add the prepared request flow seems overkill for URL reporting. A server response has content or none so there is value to add by calling the hook in any outcome.\n\n[edit] If too much hassle to create response objects, then save the built URL/post data in the requests object like _headers_ in order to give a full picture of what requests sent after its pre-processing.\n\nSo, yeah, just suggestions.\n",
"Sounds like `PreparedRequest` is what you want — it was built for exactly this purpose, if I understand you correctly. \n"
] |
https://api.github.com/repos/psf/requests/issues/3082
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3082/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3082/comments
|
https://api.github.com/repos/psf/requests/issues/3082/events
|
https://github.com/psf/requests/pull/3082
| 146,989,195 |
MDExOlB1bGxSZXF1ZXN0NjU4MTk3ODg=
| 3,082 |
Added a test to show a faulty behaviour when posting binary data for an object with no __iter__
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7489847?v=4",
"events_url": "https://api.github.com/users/tzickel/events{/privacy}",
"followers_url": "https://api.github.com/users/tzickel/followers",
"following_url": "https://api.github.com/users/tzickel/following{/other_user}",
"gists_url": "https://api.github.com/users/tzickel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tzickel",
"id": 7489847,
"login": "tzickel",
"node_id": "MDQ6VXNlcjc0ODk4NDc=",
"organizations_url": "https://api.github.com/users/tzickel/orgs",
"received_events_url": "https://api.github.com/users/tzickel/received_events",
"repos_url": "https://api.github.com/users/tzickel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tzickel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tzickel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tzickel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-04-08T17:28:11Z
|
2021-09-08T04:01:07Z
|
2016-04-11T12:38:50Z
|
NONE
|
resolved
|
An object with read / seek / tell but without __iter__ triggers a code path in requests which causes the stream to seek to the start instead of the position it was when it was passed into the data of the request. The faulty code is in models.py in the function prepare_content_length where it seeks to the start of the stream.
The tests fail currently because this is a test to expose the bug.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3082/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3082/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3082.diff",
"html_url": "https://github.com/psf/requests/pull/3082",
"merged_at": "2016-04-11T12:38:50Z",
"patch_url": "https://github.com/psf/requests/pull/3082.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3082"
}
| true |
[
"Ok awesome, this is a great first step! Are you interested in providing a fix as well?\n",
"I have made a dummy test so it works both on python 2 and 3. I hope the added fix is ok with all possible interactions (first time looking at requests code), but it might still not work with the redirect bug.\n",
"I think this works well, at least in the basic case. As you correctly pointed out, there's a question about how this will work with redirects, but that's harder for us to test at this moment (though we'll get there shortly I hope).\n\n@kennethreitz, want a final review here?\n",
"This looks like a sane approach to me. \n",
"Ok @kennethreitz, you can hit the big green merge button whenever you're ready.\n",
"Gah, @sigmavirus24 beat me to it\n"
] |
https://api.github.com/repos/psf/requests/issues/3081
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3081/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3081/comments
|
https://api.github.com/repos/psf/requests/issues/3081/events
|
https://github.com/psf/requests/issues/3081
| 146,893,172 |
MDU6SXNzdWUxNDY4OTMxNzI=
| 3,081 |
Missing double quotation marks at "requests/packages/urllib3/fields.py" Line 46
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12237169?v=4",
"events_url": "https://api.github.com/users/imnisen/events{/privacy}",
"followers_url": "https://api.github.com/users/imnisen/followers",
"following_url": "https://api.github.com/users/imnisen/following{/other_user}",
"gists_url": "https://api.github.com/users/imnisen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/imnisen",
"id": 12237169,
"login": "imnisen",
"node_id": "MDQ6VXNlcjEyMjM3MTY5",
"organizations_url": "https://api.github.com/users/imnisen/orgs",
"received_events_url": "https://api.github.com/users/imnisen/received_events",
"repos_url": "https://api.github.com/users/imnisen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/imnisen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/imnisen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/imnisen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-08T10:57:31Z
|
2021-09-08T18:01:01Z
|
2016-04-08T11:03:41Z
|
NONE
|
resolved
|
value = '%s_=%s' % (name, value)
should not be
value = '%s_="%s"' % (name, value)
:)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3081/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3081/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report!\n\nNo double quotation mark is required here. All the characters that would require a quotation mark (e.g. spaces) are percent-encoded by `encode_rfc2231`. This means that, per that RFC, no quotation mark is required. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3080
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3080/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3080/comments
|
https://api.github.com/repos/psf/requests/issues/3080/events
|
https://github.com/psf/requests/pull/3080
| 146,840,541 |
MDExOlB1bGxSZXF1ZXN0NjU3NDY0Mjg=
| 3,080 |
Add 421 Misdirected Request.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-08T07:25:52Z
|
2021-09-08T04:01:08Z
|
2016-04-10T20:49:27Z
|
MEMBER
|
resolved
|
The 421 Misdirect Request status code was originally added in [RFC 7540](https://tools.ietf.org/html/rfc7540#section-11.7), and is going to be actively used in [RFC 7838](https://www.rfc-editor.org/rfc/rfc7838.txt). This adds it to our status code registry.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3080/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3080/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3080.diff",
"html_url": "https://github.com/psf/requests/pull/3080",
"merged_at": "2016-04-10T20:49:27Z",
"patch_url": "https://github.com/psf/requests/pull/3080.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3080"
}
| true |
[
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3079
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3079/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3079/comments
|
https://api.github.com/repos/psf/requests/issues/3079/events
|
https://github.com/psf/requests/issues/3079
| 146,747,996 |
MDU6SXNzdWUxNDY3NDc5OTY=
| 3,079 |
requests gets stuck on post redirect where data is readable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7489847?v=4",
"events_url": "https://api.github.com/users/tzickel/events{/privacy}",
"followers_url": "https://api.github.com/users/tzickel/followers",
"following_url": "https://api.github.com/users/tzickel/following{/other_user}",
"gists_url": "https://api.github.com/users/tzickel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tzickel",
"id": 7489847,
"login": "tzickel",
"node_id": "MDQ6VXNlcjc0ODk4NDc=",
"organizations_url": "https://api.github.com/users/tzickel/orgs",
"received_events_url": "https://api.github.com/users/tzickel/received_events",
"repos_url": "https://api.github.com/users/tzickel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tzickel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tzickel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tzickel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 18 |
2016-04-07T21:33:22Z
|
2021-09-02T00:07:16Z
|
2016-11-28T20:15:20Z
|
NONE
|
resolved
|
Checked on python 2.7. Unfortunately httpbin does not support 307 post redirects and thus it is not easy to test this issue in requests test suite currently.
Anyhow the issue seems to be with the stream being already over when the redirects tries to read it again. The response should be an exception if a stream is inputed and it's redirected (although this is known only in httplib).
example code:
``` python
from flask import Flask, request, redirect, url_for
import requests
import io
app = Flask('test')
@app.route('/redirect/', methods=['POST'])
def redirect():
response = app.make_response('')
response.status_code = 307
response.headers['Location'] = 'http://localhost:5000/'
return response
@app.route('/', methods=['POST'])
def index():
return request.get_data()
if __name__ == "__main__":
import threading
def loop():
app.run()
t = threading.Thread(target=loop)
t.daemon = True
t.start()
if requests.post('http://localhost:5000/redirect/', data='hey').text != 'hey':
raise Exception()
if requests.post('http://localhost:5000/redirect/', data=io.BytesIO('hey')).text != 'hey': # <-- This code gets stuck
raise Exception()
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3079/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3079/timeline
| null |
completed
| null | null | false |
[
"So this is a very real problem: when passing requests any file-like object it does not rewind the file pointer on redirect. That means that if the original POST consumed the data, the follow-on POST will not.\n\nWith normal files, that causes a small behavioural error, because we'll usually conclude that the file is zero-length and POST nothing. That's wrong, but program execution continues. With BytesIO we have a separate problem, which is that we use `len(BytesIO.getvalue())` to work out how much data we're going to send. `getvalue()` doesn't change with `read()`, which means that we tell the remote server that we're going to send 3 bytes of data but provide an empty body. This obviously causes the server to wait for the body that is never going to come.\n\nFixing this requires a degree of care: the PreparedRequest needs to make a note, for any object with a `tell()` method and a `read()` method, of what the current location of the file pointer is. It then needs to check, in `resolve_redirects`, whether it's keeping the request body, and if it is and has a saved location of the file pointer then it needs to restore that pointer.\n\nThis should be entirely do-able though. Are you interested in submitting a patch to fix this?\n",
"I don't like that BytesIO.getvalue() is called at super_len, because if i'm passing large data (let's say 100MB of data), the getvalue will allocate a new copy of it in memory, than super_len will get the length, than deallocate it...\n\nUnfortunately on python 2 I'm not sure if there is a better way for this (in python 3, I think you can get the object's buffer, and check it's length, to skip this allocation).\n\nMaybe we should use chunked transfer for objects like that ?\n\n---\n\nAlso If I understand correctly, if you pass files= instead of data=, all files will be read into memory in the message preparation phase, and they will not be streamed from disk like passing a single file in data, yes ?\n",
"@tzickel As far as I know, BytesIO implements `tell()` on all Python versions, so the simplest thing to do is to add logic that does `tell()`, `seek()`, `tell()`, `seek()`. Doing that, incidentally, would allow us to also record the current location in the file-like-object so that we can safely rewind.\n\n@sigmavirus24, you wrote the original `getvalue()` call: what are the odds you remember the rationale at the time?\n",
"Nice, something like prepare_content_length is doing. Maybe there is another bug (just by reading code, haven't tried to trigger it yet), that if you pass data that has 'read' / 'seek' / 'tell' but no '**iter**', it will go in prepare_body to the non-stream code path, and call prepare_content_length, which will than move the file pointer to the start of the file instead of to it's position when passed into requests (I assume this will be need to be a custom object, since most stuff with read/seek/tell have **iter** as well).\n\nI've also added another question in my previous comment, regarding passing real files to files= param, and it will copy them into memory instead of streaming them from disk ?\n",
"I suspect that there are some latent bugs in this general area, sadly. For each one, if you can provide a test case that repros the problem, we can look at fixing it. =)\n\n> Also If I understand correctly, if you pass files= instead of data=, all files will be read into memory in the message preparation phase, and they will not be streamed from disk like passing a single file in data, yes ?\n\nCorrect: files sends a multipart file upload. If you need to stream it, the requests toolbelt has an object for that.\n",
"1. I think that httpbin's /redirect and /redirect-to should both support all headers, not just get and should have an added optional parameter for the HTTP code (which 30X) since the current 302 is too limiting (requests changes them all to GET).\n2. While I don't use the tool belt, it too should be checked for this issue (when it streams).\n",
"For limitations with httpbin I'm afraid you'll have to take that up on their repository.\n",
"Here is a demo for the other bug (without redirect this time), is it too nit-picking ? should it have another issue report in your opinion ?\nhttps://gist.github.com/tzickel/acac6bf2faaaadac1245d8a2d8683516\n",
"I think keeping it all in the same issue is a good idea. =) Once we get httpbin working for this, we can write some tests and some fixes.\n",
"This other bug can actually work with httpbin, since it's using a regular post method.\n",
"In that case, feel free to open a pull request with a test case that reproduces the bug and we can go from there. =)\n\n(Make sure it fails, rather than just timing out, by setting a timeout parameter on the request.)\n",
"1. Shouldn't requests have an option to make 302 and 307 act the same (instead of as 303 as it does by default) ? The HTTP 1.1 spec actually meant that 302 should be like 307 by default (yet most browsers treat it like a 303).\n2. I'm still waiting for the input from @sigmavirus24 because that is important memory-wise.\n",
"> Shouldn't requests have an option to make 302 and 307 act the same (instead of as 303 as it does by default) ?\n\nI don't believe so, no. In general it's substantially more consistent to behave like a browser in this situation, given the fact that POST -> 302 -> GET is a _extremely_ common pattern. The advantage of having a switch isn't really there. If needed, you can turn redirects off and handle them yourself to get this behaviour.\n",
"Ok. https://github.com/kennethreitz/requests/pull/3082 is for the second issue. The first / main / bigger issue cannot be auto-tested with httpbin for now (I think).\n",
"I don't remember the reasoning for the `getvalue` call. I'm tempted to say that existed before I extracted it out into a function and I was using it as a basis for the new function to handle all things. It's also positive I just was very naive in how I implemented support for String/Bytes IO objects. I don't honestly remember.\n",
"So I believe that #3082, #3535, #3655, and Runscope/httpbin#284 in combination should have addressed all the pieces brought up in this issue. @tzickel, is there anything else you feel is missing? If not, I'm proposing we close this as resolved.\n",
"Alright, with 2.12 released and no further comments from @tzickel, I think we can consider this closed.",
"Hi guys,\r\n\r\nI'm having the same issue, request.post gets stuck with the following error.\r\nSharing the Traceback below.\r\n\r\nNOTE: This works fine on localhost,\r\nIt's not working on an instance.\r\n\r\nDEBUG:river.apps: RiverApp is loaded. (2020-02-19 15:52:08; apps.py:27)\r\nDEBUG:urllib3.connectionpool: Starting new HTTP connection (1): instance_ip_address:8000 (2020-02-19 15:52:16; connectionpool.py:225)\r\nERROR:django.request: Internal Server Error: /api/v1/user/login/ (2020-02-19 15:53:16; exception.py:124)\r\nTraceback (most recent call last):\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/core/handlers/exception.py\", line 39, in inner\r\n response = get_response(request)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/core/handlers/base.py\", line 249, in _legacy_get_response\r\n response = self._get_response(request)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/core/handlers/base.py\", line 187, in _get_response\r\n response = self.process_exception_by_middleware(e, request)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/core/handlers/base.py\", line 185, in _get_response\r\n response = wrapped_callback(request, *callback_args, **callback_kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/views/decorators/csrf.py\", line 58, in wrapped_view\r\n return view_func(*args, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/django/views/generic/base.py\", line 68, in view\r\n return self.dispatch(request, *args, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/rest_framework/views.py\", line 483, in dispatch\r\n response = self.handle_exception(exc)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/rest_framework/views.py\", line 443, in handle_exception\r\n self.raise_uncaught_exception(exc)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/rest_framework/views.py\", line 480, in dispatch\r\n response = handler(request, *args, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/rest_framework/decorators.py\", line 52, in handler\r\n return func(*args, **kwargs)\r\n File \"./gkAuth/views.py\", line 97, in authenticate_user\r\n resp = requests.post(url=url, data=\"username=%s&password=%s\" % (email, raw_password), headers={\"Content-Type\": \"application/x-www-\r\nform-urlencoded\"})\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/requests/api.py\", line 116, in post\r\n return request('post', url, data=data, json=json, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/requests/api.py\", line 60, in request\r\n return session.request(method=method, url=url, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/requests/sessions.py\", line 533, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/requests/sessions.py\", line 646, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/home/Projects/gk/env_cn/lib/python2.7/site-packages/requests/adapters.py\", line 498, in send\r\n raise ConnectionError(err, request=request)\r\nConnectionError: ('Connection aborted.', BadStatusLine('No status line received - the server has closed the connection',))"
] |
https://api.github.com/repos/psf/requests/issues/3078
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3078/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3078/comments
|
https://api.github.com/repos/psf/requests/issues/3078/events
|
https://github.com/psf/requests/issues/3078
| 146,573,413 |
MDU6SXNzdWUxNDY1NzM0MTM=
| 3,078 |
The request module redirects to a wrong URL if the 'Location' field of the response header contains utf-8 characters.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/441964?v=4",
"events_url": "https://api.github.com/users/odlbo/events{/privacy}",
"followers_url": "https://api.github.com/users/odlbo/followers",
"following_url": "https://api.github.com/users/odlbo/following{/other_user}",
"gists_url": "https://api.github.com/users/odlbo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/odlbo",
"id": 441964,
"login": "odlbo",
"node_id": "MDQ6VXNlcjQ0MTk2NA==",
"organizations_url": "https://api.github.com/users/odlbo/orgs",
"received_events_url": "https://api.github.com/users/odlbo/received_events",
"repos_url": "https://api.github.com/users/odlbo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/odlbo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/odlbo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/odlbo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-04-07T10:24:11Z
|
2021-09-08T02:09:53Z
|
2016-04-07T10:26:25Z
|
NONE
|
resolved
|
Hello!
The requests module redirects to a wrong URL if the 'Location' field of the response header contains UTF-8 characters (for example, https://picsrch.me/wCtd3uQF). The server typically does not return a response containing UTF-8 characters in the 'Location' field, but many HTTP clients (Firefox, Chrome, Wget, etc.) react correctly. As I can see, this module tries to solve this problem as well.
The main problem is that http.client module (which is used by requests module) decodes all fields in latin1, but requests module uses utf-8 encoding to prepare URL from 'Location' field (we can see this behavior in urllib.parse.quote function).
P.S.:
Python - 3.4.0
requests - 2.8.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3078/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3078/timeline
| null |
completed
| null | null | false |
[
"Hi there!\n\nThis is a known issue that has a fix prepared for the 3.0.0 release: see #2754. Unfortunately, this change will subtly change the logic of the code for a lot of people, which is why it's scheduled for 3.0.0. However, we're aiming to have the 3.0.0 release out in the first half of this year.\n",
"Ok. Thanks! :)\n",
"when will the 3.0 release be out? from the situation I have this issue persists and looks like its not just for utf-8 Location header.",
"@1600 we don't have a hard date for 3.0 at this time as we're waiting for some work on urllib3 to be completed before we can look at creating a release."
] |
https://api.github.com/repos/psf/requests/issues/3077
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3077/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3077/comments
|
https://api.github.com/repos/psf/requests/issues/3077/events
|
https://github.com/psf/requests/issues/3077
| 146,155,870 |
MDU6SXNzdWUxNDYxNTU4NzA=
| 3,077 |
%2e Being Decoded in URL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5316939?v=4",
"events_url": "https://api.github.com/users/correcthorsebatterystaple-/events{/privacy}",
"followers_url": "https://api.github.com/users/correcthorsebatterystaple-/followers",
"following_url": "https://api.github.com/users/correcthorsebatterystaple-/following{/other_user}",
"gists_url": "https://api.github.com/users/correcthorsebatterystaple-/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/correcthorsebatterystaple-",
"id": 5316939,
"login": "correcthorsebatterystaple-",
"node_id": "MDQ6VXNlcjUzMTY5Mzk=",
"organizations_url": "https://api.github.com/users/correcthorsebatterystaple-/orgs",
"received_events_url": "https://api.github.com/users/correcthorsebatterystaple-/received_events",
"repos_url": "https://api.github.com/users/correcthorsebatterystaple-/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/correcthorsebatterystaple-/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/correcthorsebatterystaple-/subscriptions",
"type": "User",
"url": "https://api.github.com/users/correcthorsebatterystaple-",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-04-06T01:12:48Z
|
2021-09-08T19:00:23Z
|
2016-04-06T07:43:44Z
|
NONE
|
resolved
|
Hi there,
When a url-encoded period is included in a url (in path directory, path file, param name, or param value), it gets permanently converted to a period before the request is made (sometime before get_connection() is called in a subclassed HTTPAdapter). Other url-encoded things like single ticks (%27), double quotes (%22), semicolons (%3f), and percent symbols (%25) don't get decoded.
This means you can't successfully request a url with %2e anywhere in it because it gets changed to a '.', and if you try to double-url-encode it with %252e, it stays as %252e.
I've also tried passing %2e in via params=dict(...) and params='...bytes...', and nothing works.
Examples:
`requests.get('http://www.example.com/test.x%252ey/test.%252ey.html?test.x%252ey=test.x%252ey').url`
Output: u'http://www.example.com/test.x%252ey/test.%252ey.html?test.x%252ey=test.x%252ey'
First Line of Request When Viewing Through Proxy: GET /test.x%252ey/test.%252ey.html?test.x%252ey=test.x%252ey HTTP/1.1
`requests.get('http://www.example.com/', params=dict(a='x%2ey')).url`
Output: u'http://www.example.com/?a=x%252ey'
First Line of Request When Viewing Through Proxy: GET /?a=x%252ey HTTP/1.1
`requests.get('http://www.example.com/', params='a=x%2ey').url`
Output: u'http://www.example.com/?a=x.y'
First Line of Request When Viewing Through Proxy: GET /?a=x.y HTTP/1.1
`requests.get('http://www.example.com/', params='a=x.y').url`
Output: u'http://www.example.com/?a=x.y'
First Line of Request When Viewing Through Proxy: GET /?a=x.y HTTP/1.1
I just happen to be facing a use case where I need to be able to make a request containing a %2e in one of the parameters, so I'm a bit stuck on how to workaround this within the context of the Requests lib.
Best Regards,
-Justin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3077/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3077/timeline
| null |
completed
| null | null | false |
[
"Thanks for this @correcthorsebatterystaple-!\n\nRequests has strong opinions about the correct form of a URL: specifically, it does not percent-encode things that don't normally need percent encoding. I should note that your server is in this instance misbehaving: periods do not need to be percent encoded in query string and there's no reason to do it, and even if there were, for characters that do not need percent encoding both the encoded and non-encoded character are to be treated _exactly identically_.\n\nRegardless, the pragmatic solution to this problem is the [prepared request flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests), where you construct a `Request` object, `prepare()` it, and then edit the `url` field on the `PreparedRequest` object you get. Any changes to the URL field will then be kept, so you can (for example) double-encode the period in the query string and then do a `prepped.url.replace()` call to swap it out with the single-encoded period.\n",
"Thanks for the response, @Lukasa.\n\nFair point regarding equivalency and the server misbehaving. However, one doesn't always have control over the misbehaving server, so sometimes you have to have a way to send what needs to be sent, even if it's technically wrong. :-)\n\nThanks for the info on how to get around this with a PreparedRequest!\n\nOut of curiosity, I understand a stance of not percent-encoding things that don't typically need it, but why percent-**de**code things (specifically the period in this case)? Does it benefit other parts of the codebase by normalizing encoded and decoded periods when performing operations?\n\nAlso, regarding Requests' stance on URL forms and encodings, is there documentation regarding what gets encoded/decoded where? For example, I noticed passing query string params as a dict results in encoding, but passing as bytes doesn't.\n\nI'm mainly asking because I'm writing a tool that walks through a series of requests (to and from servers I don't have any control over ;-) ), and part of the challenge is to ensure certain pieces of data pass through the process in their intended form. If I pull a URL out of a Location header, for example, do I need to do any manually encoding on certain characters before I pass it to Requests (like plus signs)?\n\nThank you for your time on this! I'm a bit of a Python noob (maybe I've graduated to noob+ at this point ;-) ), and I'm even more of a Requests noob, so I appreciate you taking the time to field my feedback.\n",
"> Why percent-decode things (specifically the period in this case)? Does it benefit other parts of the codebase by normalizing encoded and decoded periods when performing operations?\n\nThe short answer is because we try to normalise URLs. This is because users are often inconsistent with their percent encoding, leading to URLs that are partially encoded and partially not. That causes problems when we want to come to percent-encode a URL ourselves, because blindly percent-encoding a URL that may be either unencoded, encoded, or partially encoded can lead to all kinds of nasty results. The easiest way to resolve the problem is to do what we currently do, which is _decode_ the URL (ignoring unencoded values rather than treating them as errors) and then _re-encoding_ it according to the standard encoding rules.\n\nGenerally speaking, Requests reserves the right to transform your request into a form that is semantically identical to the original in order to provide a better user experience for the majority of users: essentially, we try to save users from themselves at the cost of making it harder to be really specific about what you want.\n\n> If I pull a URL out of a Location header, for example, do I need to do any manually encoding on certain characters before I pass it to Requests (like plus signs)?\n\nYou don't _need_ to, but you may want to manually handle it yourself if you need it to remain _identical_ to the Location header provided.\n",
"And this is exactly why PreparedRequests were created. \n",
"Right on. Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/3076
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3076/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3076/comments
|
https://api.github.com/repos/psf/requests/issues/3076/events
|
https://github.com/psf/requests/issues/3076
| 145,991,760 |
MDU6SXNzdWUxNDU5OTE3NjA=
| 3,076 |
requests.get returns a response with the text doubled
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/596935?v=4",
"events_url": "https://api.github.com/users/hachterberg/events{/privacy}",
"followers_url": "https://api.github.com/users/hachterberg/followers",
"following_url": "https://api.github.com/users/hachterberg/following{/other_user}",
"gists_url": "https://api.github.com/users/hachterberg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hachterberg",
"id": 596935,
"login": "hachterberg",
"node_id": "MDQ6VXNlcjU5NjkzNQ==",
"organizations_url": "https://api.github.com/users/hachterberg/orgs",
"received_events_url": "https://api.github.com/users/hachterberg/received_events",
"repos_url": "https://api.github.com/users/hachterberg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hachterberg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hachterberg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hachterberg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-04-05T13:37:32Z
|
2021-09-08T19:00:24Z
|
2016-04-05T14:56:20Z
|
NONE
|
resolved
|
In a rare instance, I managed to do a GET request to a server that should retrieve an xml document. However the response.text field contains the document twice (just concatenated, i.e. the first half of the data and second half of the data are exact duplicates):
``` python
>>> resp = requests.get('http://bigr-rad-xnat.erasmusmc.nl/schemas/xnat/xnat.xsd', auth=('username', 'password'))
>>> resp.text[:len(resp.text)/2] == resp.text[len(resp.text)/2:]
True
```
Now of course, it could happen that the document really is that, but it shouldn't and I tried the request using curl, postman (chromium extension), and chromium and it all returns just half the response of requests. As a final test I tried it using urllib2:
``` python
>>> import urllib2 base64
>>> req = urllib2.Request('http://bigr-rad-xnat.erasmusmc.nl/schemas/xnat/xnat.xsd')
>>> base64string = base64.encodestring(
'%s:%s' % ('password', 'password'))[:-1]
authheader = "Basic %s" % base64string
req.add_header("Authorization", authheader)
>>> handler = urllib2.urlopen(req)
>>> content = handler.read()
```
And that also gave me a single copy of the document. Hence I conclude something is probably going wrong in requests.
I am using requests version 2.9.1 on python 2.7.11 (and I also tried on 3.5.1) on debian stretch. The server is a tomcat server running XNAT (www.xnat.org). The problem is that the server serving this is located behind a hospital firewall and therefor I cannot give anyone from outside access to reproduce/test, but I can gather additional information.
I tried to look at the open bugs and did not find a similar report, but I could have missed it, if so I am sorry.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3076/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3076/timeline
| null |
completed
| null | null | false |
[
"@hachterberg Can you check whether the same problem exists for `resp.content`?\n",
"@Lukasa The result is the same for `resp.text` and `resp.content`\n",
"Ok, that's very interesting. Are you familiar with the tool Wireshark? If you are, I'd like to see the differences between the request and response headers for the transaction using requests and the transaction using curl (you'll obviously want to scrub the Authorization header).\n",
"I never used wireshark, but I gave it a go. When i copy the asci text it prepends some binary mess it seems (the TCP part?), should I clean it up? I left it in for now.\n\nThe CURL request header:\n\n```\n\fP)E@@\n\n2PVrv&P.\"GET /schemas/xnat/xnat.xsd HTTP/1.1\nHost: bigr-rad-xnat.erasmusmc.nl\nAuthorization: Basic SCRUB=\nUser-Agent: curl/7.47.0\nAccept: */*\n```\n\nthe CURL response header:\n\n```\nP)dBAE!~@=pb\n\nP2v&VrPNHTTP/1.1 200 OK\nDate: Tue, 05 Apr 2016 14:28:10 GMT\nSet-Cookie: JSESSIONID=SCRUB; Path=/\nSet-Cookie: SESSION_EXPIRATION_TIME=\"1459866490593,900000\"; Version=1; Path=/\nAccept-Ranges: bytes\nETag: W/\"154153-1452780004000\"\nLast-Modified: Thu, 14 Jan 2016 14:00:04 GMT\nContent-Type: text/xml\nContent-Length: 154153\nVary: Accept-Encoding\nConnection: close\n\n<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n[here the rest of the xml document]\n```\n\nThe requests request header:\n\n```\n\fP)E,d@@\"\n\nvPgslP.dGET /schemas/xnat/xnat.xsd HTTP/1.1\nHost: bigr-rad-xnat.erasmusmc.nl\nConnection: keep-alive\nAccept: */*\nAccept-Encoding: gzip, deflate\nAuthorization: Basic SCRUB=\nUser-Agent: python-requests/2.9.1\n```\n\nThe requests respone header:\n\n```\n)dBAE90DS@=/\n\nPvlgtPfhHTTP/1.1 200 OK\nDate: Tue, 05 Apr 2016 14:36:17 GMT\nSet-Cookie: JSESSIONID=SCRUB; Path=/\nSet-Cookie: SESSION_EXPIRATION_TIME=\"1459866977197,900000\"; Version=1; Path=/\nAccept-Ranges: bytes\nETag: W/\"154153-1452780004000\"\nLast-Modified: Thu, 14 Jan 2016 14:00:04 GMT\nContent-Type: text/xml\nVary: Accept-Encoding\nContent-Encoding: gzip\nConnection: close\nTransfer-Encoding: chunked\n\n3893\n[big mess of binary from here on]\n```\n",
"The problem here seems to be the server. The big difference is that requests sends `Accept-Encoding: gzip, deflate`, where curl does not. That causes the PHP server in this case to dramatically change the response it sends: rather than using a `Content-Length` framed response it uses a `Transfer-Encoding: chunked` header and a gzip-encoded body.\n\nThis means that almost certainly the server is getting this wrong.\n\nYou should be able to avoid it by changing your requests call to add `headers={'Accept-Encoding': None}`. Of course, I recommend you contact the web server operator to get them to fix their behaviour.\n"
] |
https://api.github.com/repos/psf/requests/issues/3075
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3075/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3075/comments
|
https://api.github.com/repos/psf/requests/issues/3075/events
|
https://github.com/psf/requests/issues/3075
| 145,858,738 |
MDU6SXNzdWUxNDU4NTg3Mzg=
| 3,075 |
Incorrect Request When URL Contains Uppercase Letters in "HTTPS" Protocol and When Using Proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5316939?v=4",
"events_url": "https://api.github.com/users/correcthorsebatterystaple-/events{/privacy}",
"followers_url": "https://api.github.com/users/correcthorsebatterystaple-/followers",
"following_url": "https://api.github.com/users/correcthorsebatterystaple-/following{/other_user}",
"gists_url": "https://api.github.com/users/correcthorsebatterystaple-/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/correcthorsebatterystaple-",
"id": 5316939,
"login": "correcthorsebatterystaple-",
"node_id": "MDQ6VXNlcjUzMTY5Mzk=",
"organizations_url": "https://api.github.com/users/correcthorsebatterystaple-/orgs",
"received_events_url": "https://api.github.com/users/correcthorsebatterystaple-/received_events",
"repos_url": "https://api.github.com/users/correcthorsebatterystaple-/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/correcthorsebatterystaple-/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/correcthorsebatterystaple-/subscriptions",
"type": "User",
"url": "https://api.github.com/users/correcthorsebatterystaple-",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2016-04-05T01:51:09Z
|
2021-09-08T19:00:24Z
|
2016-04-05T09:06:04Z
|
NONE
|
resolved
|
Hi there,
First off, Requests is awesome. Thank you for the great library!
I'm writing a script that places a series of requests and manually follows some redirects, and one of the redirects is to a URL that starts with "HTTPS" (that is, the protocol is uppercase, like HTTPS://www.example.com instead of https://www.example.com). It would seem that Requests generates the wrong request when the following three conditions are met:
1. The protocol is HTTPS.
2. The protocol contains one or more uppercase letters. (e.g. "HTTPS", "httpS", "hTTps", "Https", etc.)
3. Proxies are being used.
I wrote a test script that uses a Session to GET www.example.com with variations in case for both http and https protocols, and here are the results:
| proto | proxies | result |
| --- | --- | --- |
| http | no | [OK] |
| htTp | no | [OK] |
| HTTP | no | [OK] |
| https | no | [OK] |
| HTTPS | no | [OK] |
| htTps | no | [OK] |
| httpS | no | [OK] |
| Https | no | [OK] |
| http | YES | [OK] |
| htTp | YES | [OK] |
| HTTP | YES | [OK] |
| https | YES | [OK] |
| HTTPS | YES | [ERROR] |
| htTps | YES | [ERROR] |
| httpS | YES | [ERROR] |
| Https | YES | [ERROR] |
In the ERROR case, my proxy (Burp) was generating the following error:
**Error**
Invalid client request received: First line of request did not contain an absolute URL - try enabling invisible proxy support.
GET / HTTP/1.1
Host: localhost:8080
Connection: keep-alive
Accept-Encoding: gzip, deflate
Accept: _/_
User-Agent: python-requests/2.9.1
The issue can be replicated with the following couple lines of code:
`import requests`
`requests.request('GET', 'HTTPS://www.example.com/test', proxies=dict(http='localhost:8080', https='localhost:8080'))`
What happens will depend on your proxy. With Burp, when errors are given back as valid HTTP responses, the above will result in a 200 OK response object, and when you inspect the content of the response, you'll see Burp's error. However, if you suppress Burp errors, then Burp returns an empty response, causing requests to raise a ConnectionError exception:
requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine("''",))
Thank you for your time!
Best Regards,
-Justin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3075/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3075/timeline
| null |
completed
| null | null | false |
[
"Hey Justin! Good catch, this is definitely a bug.\n\nThe problem is that urllib3 expects only lowercase schemes in its proxy manager code. Requests could lowercase the scheme to urllib3 itself, but it seems like it'd be better to just make urllib3 tolerant of this kind of error.\n",
"Given that this problem is pretty pervasive in urllib3's code, I'm going to close this issue in favour of shazow/urllib3#833. Let's improve both projects at once.\n\nThanks for the report @correcthorsebatterystaple-, please feel free to subscribe to the issue linked above.\n",
"Hi @Lukasa, thanks for the quick response!\n\nIs the urllib3 issue specific to https? Because it seems to work with mixed case protocol when it's http, and only has issues with https.\n\nRegards,\n-Justin\n",
"Yup, this specific issue is HTTPS specific (urllib3 does a case-sensitive comparison to `https`, which obviously fails).\n",
"@Lukasa Gotcha. Luckily, this is relatively easy to workaround in my case. Though, I wonder what would happen if Requests were configured to auto-follow redirects (while using proxies) and it encountered an HTTPS URL in the Location header (which is what would happen in my case, except I'm manually following each redirect, so I can check for the case and adjust it).\n\nAnyway, thanks again for your help. :-)\n",
"@Lukasa Is there a way to workaround the issue in a custom HTTPAdapter subclass? I tried fixing the scheme in the add_headers method, but it looks like that doesn't catch the issue early enough in the flow (I haven't dug through the code enough yet to know exactly what's happening when).\n",
"@Lukasa Nvm, I overrode the get_connection method in my own HTTPAdapter child class to make any https schemes lower case before calling the parent class's get_connection method. :-)\n",
"@correcthorsebatterystaple- Yup, that'd be the best place to work around this for now. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3074
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3074/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3074/comments
|
https://api.github.com/repos/psf/requests/issues/3074/events
|
https://github.com/psf/requests/issues/3074
| 145,857,516 |
MDU6SXNzdWUxNDU4NTc1MTY=
| 3,074 |
proxy-scheme is case sensitivity or not?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18046327?v=4",
"events_url": "https://api.github.com/users/henry51/events{/privacy}",
"followers_url": "https://api.github.com/users/henry51/followers",
"following_url": "https://api.github.com/users/henry51/following{/other_user}",
"gists_url": "https://api.github.com/users/henry51/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/henry51",
"id": 18046327,
"login": "henry51",
"node_id": "MDQ6VXNlcjE4MDQ2MzI3",
"organizations_url": "https://api.github.com/users/henry51/orgs",
"received_events_url": "https://api.github.com/users/henry51/received_events",
"repos_url": "https://api.github.com/users/henry51/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/henry51/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/henry51/subscriptions",
"type": "User",
"url": "https://api.github.com/users/henry51",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-04-05T01:45:29Z
|
2021-09-08T19:00:24Z
|
2016-04-05T09:10:27Z
|
NONE
|
resolved
|
This question came out when I tried to validate some proxies.
If proxy-scheme is uppercase as follows, it will return 200. (The host of proxy is not a true one.)
import requests
proxy = {
'HTTP': 'HTTP://111.1.11.1:8000'
}
r = requests.get('http://www.sina.com',proxies = proxy)
print (r.status_code)
However, if changing 'HTTP' to 'http', it will raise an error about httpconnectionerror.
So, it is confused me that proxy-scheme is case sensitivity or not? Or something went wrong about my code ?
Hoping to receive your instruction.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3074/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3074/timeline
| null |
completed
| null | null | false |
[
"Thanks for this!\n\nThis is a real bug, but it's a duplicate of #3075. Please see shazow/urllib3#833 to track work on this problem.\n"
] |
https://api.github.com/repos/psf/requests/issues/3073
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3073/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3073/comments
|
https://api.github.com/repos/psf/requests/issues/3073/events
|
https://github.com/psf/requests/issues/3073
| 145,170,191 |
MDU6SXNzdWUxNDUxNzAxOTE=
| 3,073 |
Redacted
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-04-01T12:38:13Z
|
2016-04-02T15:52:49Z
|
2016-04-01T12:51:03Z
|
NONE
| null |
Redacted
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3073/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3073/timeline
| null |
completed
| null | null | false |
[
"did you try `OrderedDict`? Python dict has unordered keys by design\n",
"@arcan1s you don't need an OrderedDict, you only need to provide a list of tuples. And questions should be asked on [StackOverflow](https://stackoverflow.com).\n",
"@MrAureliusR The terseness comes from the fact that we get several new issues a day that require that the core maintainers interrupt their days to address. Each bug report costs at least one maintainer a few minutes and a context switch, and may cost more than that if multiple maintainers stop to address it. That's unfortunate when you consider that the majority of the maintainers are unpaid, and all of us are overstretched.\n\nThis is doubly problematic for questions. When you opened this issue you were shown this screen:\n\n\n\nNotice the yellow bar at the top. Had you clicked through and read that link you'd have seen [this document](https://github.com/kennethreitz/requests/blob/master/CONTRIBUTING.md), which contains in its very first section this text:\n\n> The GitHub issue tracker is for _bug reports_ and _feature requests_. Please do not use it to ask questions about how to use Requests. These questions should instead be directed to Stack Overflow. Make sure that your question is tagged with the `python-requests` tag when asking it on Stack Overflow, to ensure that it is answered promptly and accurately.\n\nThis is why the interaction was strained: you demonstrated that you had not read that note. Those notes are important to ensure that everyone interacts as efficiently as possible and to ensure that the limited resources of the project are allocated in the most useful way available.\n\nEach issue opened on this bug tracker costs the core team at least 5 minutes of time. We're at issue 3073 right now, meaning that if these had all been _just questions_ we'd have spent nearly _11 days_ just swapping to handle those questions.\n\nI say all of this not to chastise you, but to remind you that open source is a predominantly volunteer operation where the primary resource is time. Asking questions on bug trackers is a common source of wasted time on the part of maintainers, which is why maintainers can be terse when questions are asked in inappropriate forums.\n",
"It's triply problematic here because our documentation describes how to handle this exact scenario.\n",
"Read through the original thread, and I want it to be known that I think @MrAureliusR's submission was perfectly fine. Not every interaction here has to be handled solely by the maintainers (in this instance, a community member was already stepping in to help).\n\nThe \"be cordial or be on your way\" rule applies to maintainers too :) These interactions were fairly cordial, but far from friendly.\n",
"No further comment needed. Sorry for your bad experience, @MrAureliusR\n"
] |
https://api.github.com/repos/psf/requests/issues/3072
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3072/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3072/comments
|
https://api.github.com/repos/psf/requests/issues/3072/events
|
https://github.com/psf/requests/issues/3072
| 144,787,225 |
MDU6SXNzdWUxNDQ3ODcyMjU=
| 3,072 |
find_no_duplicates raise KeyError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4315746?v=4",
"events_url": "https://api.github.com/users/jackyliang/events{/privacy}",
"followers_url": "https://api.github.com/users/jackyliang/followers",
"following_url": "https://api.github.com/users/jackyliang/following{/other_user}",
"gists_url": "https://api.github.com/users/jackyliang/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jackyliang",
"id": 4315746,
"login": "jackyliang",
"node_id": "MDQ6VXNlcjQzMTU3NDY=",
"organizations_url": "https://api.github.com/users/jackyliang/orgs",
"received_events_url": "https://api.github.com/users/jackyliang/received_events",
"repos_url": "https://api.github.com/users/jackyliang/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jackyliang/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackyliang/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jackyliang",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-03-31T04:38:31Z
|
2021-09-08T19:00:25Z
|
2016-03-31T19:58:15Z
|
NONE
|
resolved
|
Hi there!
I am currently on `requests version: 2.9.1` I have getting the following error in my code:
```
Traceback (most recent call last):
File "./add.py", line 98, in <module>
'SESSID':term.cookies['SESSID'],
File "/usr/local/lib/python2.7/site-packages/requests/cookies.py", line 276, in __getitem__
return self._find_no_duplicates(name)
File "/usr/local/lib/python2.7/site-packages/requests/cookies.py", line 331, in _find_no_duplicates
raise KeyError('name=%r, domain=%r, path=%r' % (name, domain, path))
KeyError: "name='SESSID', domain=None, path=None"
```
Line 98 in `add.py` refers to the following:
```
# Grab the Add/Drop Class page cookies
ad_cookie = {
'SESSID':term.cookies['SESSID'],
'IDMSESSID':username
}
```
Just to make sure that the particular cookie in that page was not removed or missing, I checked the cURL for when I hit that page, and surely enough, both `SESSID` and `IDMSESSID` were in that cURL request.
Wondering what the issue is?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3072/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3072/timeline
| null |
completed
| null | null | false |
[
"Don't check curl, check requests. =) Can you print out:\n\n```\nfor resp in response.history:\n print resp.headers\nprint response.headers\n```\n\nThat should provide information about what cookies got set.\n",
"@Lukasa I feel dumb! Turns out my `username` and `password` were incorrect, and thus I was not authenticated, and thus no cookie. My bad. Thanks for the super quick reply though, Lukasa!\n",
"No problem, glad you got it sorted out!\n"
] |
https://api.github.com/repos/psf/requests/issues/3071
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3071/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3071/comments
|
https://api.github.com/repos/psf/requests/issues/3071/events
|
https://github.com/psf/requests/issues/3071
| 144,347,300 |
MDU6SXNzdWUxNDQzNDczMDA=
| 3,071 |
allow certs.where to return a file like object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/24433?v=4",
"events_url": "https://api.github.com/users/cburroughs/events{/privacy}",
"followers_url": "https://api.github.com/users/cburroughs/followers",
"following_url": "https://api.github.com/users/cburroughs/following{/other_user}",
"gists_url": "https://api.github.com/users/cburroughs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cburroughs",
"id": 24433,
"login": "cburroughs",
"node_id": "MDQ6VXNlcjI0NDMz",
"organizations_url": "https://api.github.com/users/cburroughs/orgs",
"received_events_url": "https://api.github.com/users/cburroughs/received_events",
"repos_url": "https://api.github.com/users/cburroughs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cburroughs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cburroughs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cburroughs",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-29T18:23:49Z
|
2021-09-08T19:00:25Z
|
2016-03-29T18:31:11Z
|
NONE
|
resolved
|
Currently `certs.where` expects a file name that is then passed along and eventually opened. This means it is up to the caller to deal with eggs, wheels, install locations, zip_safe vs not etc, to eventually figure out where something is going to end up on the file system.. None of these issues are particularly fun. Instead of certs.where could return a file like object and a custom implantation could be as simple as `StringIO.StringIO(pkgutil.get_data(__name__, 'cacert.pem'))`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3071/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3071/timeline
| null |
completed
| null | null | false |
[
"Unfortunately, `certs.where` cannot return a file-like object: due to several limitations in the way OpenSSL's APIs behave across versions of Python the lowest common denominator is that we need to provide a _path on the filesystem_ to OpenSSL.\n\nThis makes it impossible for us to do that. Sorry, it's otherwise a good idea!\n"
] |
https://api.github.com/repos/psf/requests/issues/3070
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3070/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3070/comments
|
https://api.github.com/repos/psf/requests/issues/3070/events
|
https://github.com/psf/requests/issues/3070
| 144,324,837 |
MDU6SXNzdWUxNDQzMjQ4Mzc=
| 3,070 |
Consider making Timeout option required or have a default
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/236970?v=4",
"events_url": "https://api.github.com/users/mlissner/events{/privacy}",
"followers_url": "https://api.github.com/users/mlissner/followers",
"following_url": "https://api.github.com/users/mlissner/following{/other_user}",
"gists_url": "https://api.github.com/users/mlissner/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mlissner",
"id": 236970,
"login": "mlissner",
"node_id": "MDQ6VXNlcjIzNjk3MA==",
"organizations_url": "https://api.github.com/users/mlissner/orgs",
"received_events_url": "https://api.github.com/users/mlissner/received_events",
"repos_url": "https://api.github.com/users/mlissner/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mlissner/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mlissner/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mlissner",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
closed
| true | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 62 |
2016-03-29T16:56:50Z
|
2024-05-21T10:16:30Z
|
2024-05-20T14:35:37Z
|
CONTRIBUTOR
| null |
I have a feeling I'm about to get a swift education on this topic, but I've been thinking about the pros/cons of changing `requests` so that somehow there is a timeout value configured for every request.
I think there are two ways to do this:
1. Provide a default value. I know browsers have a default, so that may be a simple place to begin.
2. Make every user configure this in every request -- bigger API breakage. Probably not the way to go.
The reason I'm thinking about this is because I've used requests for a few years now and until now I didn't realize the importance of providing a timeout. It took one of my programs hanging forever for me to realize that the default here isn't really very good for my purposes. (I'm in the process of updating all my code...)
I also see that a lot of people want `Session` objects to have a timeout parameter, and this might be a way to do that as well.
If a large default were provided to all requests and all sessions, what negative impact would that have? The only thing I can think of is that some programs will get timeout exceptions where they previously hung, which seems like an improvement to me.
## Caveat, added May 13, 2016:
Please don't use this issue to discuss adding a timeout attribute to requests. There are a number of discussions about this elsewhere (search closed issues), and we don't want that conversation to muddy this issue too. Thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 66,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 11,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 77,
"url": "https://api.github.com/repos/psf/requests/issues/3070/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3070/timeline
| null |
completed
| null | null | false |
[
"This is an entirely reasonable suggestion.\n\nHonestly, I'm not averse to doing it: there are definitely worse things to do than this. I'd be open to providing a default timeout. However, I don't think we can do it until 3.0.0.\n\nOf course @kennethreitz as keeper of the spirit of requests has got the final say on this, and should definitely weigh in.\n",
"Changing the default would definitely need to wait until 3.0.0.\n\nI'm not against having a well-reasoned (and therefore, also well-documented) default timeout parameter.\n\nI'm still pretty strongly against sessions having a timeout attribute though and I think that shouldn't muddy the waters of this discussion because it will sidetrack things and prevent an otherwise productive discussion.\n",
"@sigmavirus24 I don't know the argument pro/con sessions having a timeout, but I'm happy to defer to other's judgement and experience on that, keeping these waters clean, like you say. \n\nI'm just happy that I haven't gotten a swift education about timeouts.\n",
"I'm definitely in favor of this in light of dealing with a package that doesn't provide a way to set a timeout, but does accept a fully configured session. My current fix is a specialized `HTTPAdapter` that sets a default timeout if none is provided:\n\n``` python\nclass TimeoutHTTPAdapter(HTTPAdapter):\n def __init__(self, timeout, *args, **kwargs):\n self._timeout = timeout\n super().__init__(*args, **kwargs)\n\n def send(self, request, timeout=False, ...):\n if timeout is None:\n timeout = self._timeout\n return super().send(request, timeout=timeout, ...)\n```\n\nI'd much prefer something like:\n\n``` python\nsession = Session(timeout=...)\n\n# or even\n\nsession.default_timeout = ...\n```\n\nHowever, this current fix isn't too cumbersome either. \n",
":+1: for `session = Session(timeout=...)`. Would you merge a patch?\n",
"@kuraga No. Per @sigmavirus24, and in many many previous discussions:\n\n> I think I could actually still implement it using hooks - create a method which uses the prepared request workflow, and in the hook, just call it again. What would be your suggested solution?\n",
"@Lukasa ok, but which discussion did you cite? Was it private? Which previous discussions?\n",
"Argh, sorry, copy-paste fail:\n\n> I'm still pretty strongly against sessions having a timeout attribute though and I think that shouldn't muddy the waters of this discussion because it will sidetrack things and prevent an otherwise productive discussion.\n",
"@kuraga please search _closed_ issues. There are several discussions of this. I really don't want this diversion to distract from the topic of a _default timeout_ though. So can we **please** stop discussing this now as I asked nicely before. You and @justanr are distracting from the important and _attainable_ portion of this issue.\n",
"@sigmavirus24 I added a caveat to [my initial comment, above](https://github.com/kennethreitz/requests/issues/3070#issue-144324837). Hopefully that should get this discussion focused.\n",
"I'm (as a user) slightly against this proposal.\nThere are following caveats:\n1. There should be several different time-outs: resolve, connect, request, read. What should be defaults for each of them?\n2. If there will be default read time-out, then (for streaming interfaces) there should also be keep-alive packets. Is this implemented?\n3. There must be way to override default and disable time-outs completely.\n\nWhat I've missed?\n",
"At the very least, how about we include the `timeout` arg in the first introductory examples in the documentation? It's bad when intro examples are subtly wrong for the sake of brevity.\n",
"It's extremely unclear to me what \"wrong\" means here. They aren't wrong: they work, as designed. The introductory examples are all clearly typed into the Python interactive interpreter, so they have a timeout: the user sitting there can hit ^C anytime they like.\n\nI'd welcome enhancements to the \"timeouts\" section of the documentation to be more emphatic, that certainly does seem reasonable. But I don't think we need to go through and add the `timeout` arg. \n",
"> It's extremely unclear to me what \"wrong\" means here. They aren't wrong: they work, as designed.\n\nI think the point that's generally acknowledged by this bug is that the design _was_ wrong. There's a general consensus here that adding a default timeout or requiring it as an argument is a good idea. The docs could address that, and that seems like a simple step to take until this issue is resolved in the next major version.\n\nWhat happens in practice is that people grab the examples from the docs, don't read the timeout section carefully (or don't understand its implications -- I didn't for years until I got bit), and then wind up with programs that can hang forever. I completely agree with @chris-martin that until this issue is fixed, all examples in the docs should provide the timeout argument. Otherwise, we're providing examples that can (and probably will) break your programs.\n",
"There is a substantial difference between \"is a good idea in production code\" and \"should be used in every example\". For example, all production code should use Sessions for everything, but the documentation doesn't do that because it doesn't help teach the specific lessons that the documentation is intended to teach.\n\nWhile I appreciate the concern that users will blindly copy code out of the documentation without further consideration, anyone doing that in their product is necessarily going to have sub-par results. This is true in all programming tools. \n",
"That's fair enough, @Lukasa. What about making the Timeout section more explicit then? Right now it says:\n\n> You can tell Requests to stop waiting for a response after a given number of seconds with the timeout parameter.\n\nAnd then goes on with a long, somewhat complicated warning. Could the first part say:\n\n> You can tell Requests to stop waiting for a response after a given number of seconds with the timeout parameter. Nearly all production code should use this parameter in nearly all requests. Failure to do so can cause your program to hang indefinitely:\n\nSomething like that? \n",
"Yup, I'd be happy to merge a PR with that change in it. :smile:\n",
"I think that we've addressed this through documentation. I don't think any of the core team is going to change the API to require a timeout and I think we've done our diligence here.",
"I doubt this will change anybody's mind, but I *vehemently* disagree with closing this bug without a fix. We improved the documentation via the PR that I filed, but in practice I regularly run into programs that hang because they do not have a timeout. It's gotten bad enough that it's one of the first things I think of when a program hangs. \r\n\r\nRequests is a wonderful library but you can't document your way out of a problem like this. Look at the list of issues referencing this one. Documentation is just not enough, or else issues would stop referencing this one.\r\n\r\nI respect the requests maintainers greatly, but I hope you'll reconsider closing this. If it causes API changes, that's what version bumps are for. But I'm not even sure API changes would be needed.\r\n\r\nPlease reconsider.",
"We could consider making a default in 3.0, but I don't know what it would be... 120s?\r\n\r\nIdk, I like our current design. You really shouldn't be hitting the internet without proper timeouts in place ever in production. Any sane engineer knows that — it's not Requests' job to do your job for you. ",
"Here are a few example idle timeout defaults from other libraries, in case that helps:\r\n\r\n* The Haskell [http-client](https://hackage.haskell.org/package/http-client-0.5.7.0/docs/Network-HTTP-Client.html#v:managerResponseTimeout) package, 30 seconds.\r\n* Ruby's [standard library](https://ruby-doc.org/stdlib-2.4.1/libdoc/net/http/rdoc/Net/HTTP.html), 60 seconds.\r\n* The Scala [http4s](https://github.com/http4s/http4s/blob/7468dc30f6cce9f304ed4d1b988a40f4e6d2988b/blaze-client/src/main/scala/org/http4s/client/blaze/bits.scala#L15) library, 60 seconds.",
"Those seem reasonable to me. I just just think this should be a connect-only timeout, not a download timeout. ",
"It would also be overridable via environment variable. ",
"Hooray for reopening this bug. I like the solution of providing a default with an env override. Seems simple enough to me.\r\n\r\n> You really shouldn't be hitting the internet without proper timeouts in place ever in production. Any sane engineer knows that — it's not Requests' job to do your job for you.\r\n\r\nI do disagree with this though. The thing I love about Requests (and Python in general) is that these kinds of details are usually taken care of for me. For me, I think that's what so jarring about this issue. A super simple GET request has a gotcha that could bite you way later.\r\n\r\nAnyway, thanks for reopening. I really appreciate it and I think the community will greatly benefit from a default timeout.",
"@mlissner I don't disagree with your statement — as long as we apply this only to connect timeouts, not download timeouts. Download timeouts should be unlimited by default (people downloading large files). ",
"Environment overrides are already causing us to pull our hair out. I'd like to not add anymore, for what that's worth.",
"And as a user of software, I'd like to not have yet more mysterious inputs to programs, for what that's worth. Nobody's going to bother documenting that their daemon uses `requests` and therefore has behavior that depends on this environment variable.",
"Don't disagree. Will think about it. ",
"> Download timeouts should be unlimited by default (people downloading large files).\r\n\r\nThat's not how Requests timeouts work. They apply per socket I/O syscall, not over the total execution time. That is, they govern how much each call to `read()` blocks. Large file downloads shouldn't be an issue.",
"Has there been any progress on this?\r\n\r\n@mlissner To add to your statement, I think that the current situation kind-of contradicts the stated project goal \"HTTP Requests for Humans\".\r\n\r\nHumans need sensible defaults. Not just in graphical user interfaces, but also in command-line user interfaces and software libraries. And \"no timeout\" is almost never a sensible default. I was very surprised to read about that trap in the manual. This note should be more prominent in the manual, perhaps being the first note in the Quickstart (rather than the last note), because this is surprising and not what you expect from a library that performs network calls.\r\n\r\n(Even better than a more prominent place in the manual, of course, would be to fix this once and for all.)\r\n\r\n@sigmavirus24 @chris-martin I fully agree, and I don't need timeout environment variables, either. All I need is either a sane default timeout, or making the timeout argument mandatory. No more, no less."
] |
https://api.github.com/repos/psf/requests/issues/3069
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3069/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3069/comments
|
https://api.github.com/repos/psf/requests/issues/3069/events
|
https://github.com/psf/requests/pull/3069
| 144,231,246 |
MDExOlB1bGxSZXF1ZXN0NjQ0NDg5NTQ=
| 3,069 |
Update docstring and API doc to document ability to add per-file headers in multipart POST
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/13746639?v=4",
"events_url": "https://api.github.com/users/achermes/events{/privacy}",
"followers_url": "https://api.github.com/users/achermes/followers",
"following_url": "https://api.github.com/users/achermes/following{/other_user}",
"gists_url": "https://api.github.com/users/achermes/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/achermes",
"id": 13746639,
"login": "achermes",
"node_id": "MDQ6VXNlcjEzNzQ2NjM5",
"organizations_url": "https://api.github.com/users/achermes/orgs",
"received_events_url": "https://api.github.com/users/achermes/received_events",
"repos_url": "https://api.github.com/users/achermes/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/achermes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/achermes/subscriptions",
"type": "User",
"url": "https://api.github.com/users/achermes",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-29T11:13:06Z
|
2021-09-08T04:01:11Z
|
2016-03-29T11:31:57Z
|
CONTRIBUTOR
|
resolved
|
I recently needed this function and could not find it in the docs - luckily SO pointed me at PR #1640.
However, I feel like this really should be documented, so here is my attempt.
Note:
- I have not run any tests whatsoever over this as this is "just a docs change" - I don't know if that's ok.
- The API docs seem a bit inaccurate in that they do not mention that `files` can also be a list. Similarly the docsstring in `_encode_files` does not discuss the case where instead of a tuple a file-obj is provided.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3069/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3069/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3069.diff",
"html_url": "https://github.com/psf/requests/pull/3069",
"merged_at": "2016-03-29T11:31:57Z",
"patch_url": "https://github.com/psf/requests/pull/3069.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3069"
}
| true |
[
"\\o/ This looks great, thanks @achermes! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3068
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3068/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3068/comments
|
https://api.github.com/repos/psf/requests/issues/3068/events
|
https://github.com/psf/requests/issues/3068
| 143,847,469 |
MDU6SXNzdWUxNDM4NDc0Njk=
| 3,068 |
Response read hangs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59870?v=4",
"events_url": "https://api.github.com/users/medecau/events{/privacy}",
"followers_url": "https://api.github.com/users/medecau/followers",
"following_url": "https://api.github.com/users/medecau/following{/other_user}",
"gists_url": "https://api.github.com/users/medecau/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/medecau",
"id": 59870,
"login": "medecau",
"node_id": "MDQ6VXNlcjU5ODcw",
"organizations_url": "https://api.github.com/users/medecau/orgs",
"received_events_url": "https://api.github.com/users/medecau/received_events",
"repos_url": "https://api.github.com/users/medecau/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/medecau/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/medecau/subscriptions",
"type": "User",
"url": "https://api.github.com/users/medecau",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 5 |
2016-03-27T20:57:08Z
|
2021-09-08T19:00:26Z
|
2016-03-28T15:35:07Z
|
NONE
|
resolved
|
python 2.7.11
requests installed from pip and up-to-date
URL -> 'http://archive.mid.ru/bdomp/zu_r.nsf/strawebeng!OpenView'
GET request hangs at read time.
I tried to use `stream=True` and then read from `response.raw` and it works. But doesn't finish on normal usage.
I looked at the headers and the `Content-Length` is duplicated.
When I try to read it from `response.headers['Content-Length']` I find a single string: `'133172, 133172'`.
The length is correct otherwise.
I am not entirely sure of what's causing this but my guess is that requests is getting confused by the duplicate header.
_thanks for everything_
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3068/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3068/timeline
| null |
completed
| null | null | false |
[
"> I tried to use `stream=True` and then read from `response.raw` and it works.\n\nWhen you did this read, did you just do `read()`, or did you provide a length to the `read()` call?\n\nSo, the server is getting this wrong. It is strictly forbidden to send multiple content-length headers, and generally speaking I oppose having code to defend against idiot servers doing things wrong. In this specific instance, however, we have another very similar problem with servers sending duplicate `Location` headers, so I wouldn't be opposed to us providing fixes for both at once, as the fix is basically the same.\n",
"Yes, I provided the value I get from the duplicate `Conten-Length` header.\n",
"@medecau Yeah, so you'll almost certainly find that if you call `read()` with no arguments (which is what requests normally does), that hangs the same way that requests does. This is because httplib controls the HTTP framing and is presumably confused about what the actual content-length is.\n\nStrictly speaking the best way to fix this would be to raise a bug against httplib, though the _actual_ best fix is to tell the server to sort itself out.\n",
"Since this is strictly not something we can fix in requests (or urllib3) I think we should close this. Thoughts @Lukasa ?\n",
"I think we technically _can_ fix it in urllib3/requests, but doing it is pretty invasive. I'd like to approach getting it fixed in httplib first so at least python-dev can say we're lunatics.\n"
] |
https://api.github.com/repos/psf/requests/issues/3067
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3067/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3067/comments
|
https://api.github.com/repos/psf/requests/issues/3067/events
|
https://github.com/psf/requests/issues/3067
| 143,795,601 |
MDU6SXNzdWUxNDM3OTU2MDE=
| 3,067 |
No way to make requests redirect POST/PUT/DELETE requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28710?v=4",
"events_url": "https://api.github.com/users/vmalloc/events{/privacy}",
"followers_url": "https://api.github.com/users/vmalloc/followers",
"following_url": "https://api.github.com/users/vmalloc/following{/other_user}",
"gists_url": "https://api.github.com/users/vmalloc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vmalloc",
"id": 28710,
"login": "vmalloc",
"node_id": "MDQ6VXNlcjI4NzEw",
"organizations_url": "https://api.github.com/users/vmalloc/orgs",
"received_events_url": "https://api.github.com/users/vmalloc/received_events",
"repos_url": "https://api.github.com/users/vmalloc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vmalloc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vmalloc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vmalloc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-03-27T10:25:46Z
|
2021-09-08T19:00:25Z
|
2016-03-27T10:31:31Z
|
NONE
|
resolved
|
In the documentation (see http://docs.python-requests.org/en/latest/api/#requests.request) it says that specifying _allow_redirects_ to `True` means that redirecting POST/PUT/DELETE is allowed.
Quick attempts show that the reality is that those requests _ALWAYS_ get redirected as `GET`s, even when this parameter is True (which it is by default). It seems like the tests [here](https://github.com/kennethreitz/requests/blame/3acf3a723892550719ca23bfe1be07549a8ad201/tests/test_requests.py#L165) support this claim.
The one commented out test [here](https://github.com/kennethreitz/requests/blame/3acf3a723892550719ca23bfe1be07549a8ad201/tests/test_requests.py#L208) supports the statement from the documentation, but is commented out (the reason seems to be a bug in httpbin, but no further explanation is supplied)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3067/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3067/timeline
| null |
completed
| null | null | false |
[
"Requests will change the verb used for a redirect in the following cases:\n- On a 303 See Other response, HEAD and GET are left alone: all other verbs get changed to GET, as per [RFC 7231 Section 6.4.4](https://tools.ietf.org/html/rfc7231#section-6.4.4).\n- On a 302 Found response, HEAD and GET are left alone: all other verbs get changed to GET. This is allowed by the note in [RFC 7231 Section 6.4.3](https://tools.ietf.org/html/rfc7231#section-6.4.3), and is extremely common behaviour with all browsers.\n- On a 301 Moved Permanently response, POST is changed to GET. All other verbs are left alone. Again, this is allowed by the note in [RFC 7231 Section 6.4.2](https://tools.ietf.org/html/rfc7231#section-6.4.2) and is extremely common behaviour with browsers.\n\nWhile requests allows redirects, it uses the semantic meaning in the status code to potentially adjust the redirect verb. If a server explicitly wants the redirected user agent to keep the verb the same in all circumstances, servers should use 307 Temporary Redirect and 308 Permanent Redirect.\n",
":+1: thanks. You learn something new every day...\n",
"@lukasa we should add your snippet of text there to the docs. Doubt it'll reduce incoming issues (questions) around this, but it'd be great for the docs. \n"
] |
https://api.github.com/repos/psf/requests/issues/3066
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3066/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3066/comments
|
https://api.github.com/repos/psf/requests/issues/3066/events
|
https://github.com/psf/requests/issues/3066
| 143,481,469 |
MDU6SXNzdWUxNDM0ODE0Njk=
| 3,066 |
auth and data set results in invalid http header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/709223?v=4",
"events_url": "https://api.github.com/users/julian-r/events{/privacy}",
"followers_url": "https://api.github.com/users/julian-r/followers",
"following_url": "https://api.github.com/users/julian-r/following{/other_user}",
"gists_url": "https://api.github.com/users/julian-r/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/julian-r",
"id": 709223,
"login": "julian-r",
"node_id": "MDQ6VXNlcjcwOTIyMw==",
"organizations_url": "https://api.github.com/users/julian-r/orgs",
"received_events_url": "https://api.github.com/users/julian-r/received_events",
"repos_url": "https://api.github.com/users/julian-r/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/julian-r/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/julian-r/subscriptions",
"type": "User",
"url": "https://api.github.com/users/julian-r",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 12 |
2016-03-25T11:30:41Z
|
2021-09-08T13:05:42Z
|
2016-12-08T12:33:12Z
|
NONE
|
resolved
|
https://stackoverflow.com/questions/36216274/uploading-a-0-bytes-file-to-owncloud-with-python-requests-hangs/36217781#36217781
When auth is set and a file object is passed to data, the header contains a content-length AND a Transfer-Encoding: chunked is set in the header and additionally the last chunk is not send.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3066/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3066/timeline
| null |
completed
| null | null | false |
[
"Yup, this is a real bug. `prepare_auth` unconditionally calls `prepare_content_length`, which does the wrong thing for situations where it doesn't know the content-length.\n",
"Working out exactly how best to fix this is tricky: `prepare_content_length` does weirdly quite a lot, and it's not entirely apparent what parts of its logic are actually mandatory.\n",
"Ok, I think the best way to do this is to have `prepare_content_length` short-circuit out if there's a `Transfer-Encoding` header present in the headers already: if we already concluded we want to send via chunked transfer encoding, we probably shouldn't change our minds about that.\n",
"In fact, we maybe want to go higher level than that: perhaps `prepare_auth` should choose not to delegate to `prepare_content_length` when `Transfer-Encoding` is present in the headers.\n",
"@Lukasa has anyone started work on this? I have some free time today, I was thinking I'd take a look at it. If I come up with something I can put together a PR.\n",
"@davidsoncasey I don't believe anyone has taken this up yet, no.\n",
"@Lukasa great, I'm working now on writing a test to replicate the issue. Once I've got that, I may ask you to take a look to verify that I'm correctly capturing the issue, and then I'll proceed with a fix.\n",
"@Lukasa I just made PR #3181, with the solution to not call `prepare_content_length` if the `Transfer-Encoding` header is present. I went back and forth on whether to do the check there or to short-circuit out of `prepare_content_length` and I'm still not sure which I like better, so if you take a look and think it would be better to do within `prepare_content_length` then by all means.\n\nI was thinking it may be more clear to refactor `prepare_body` slightly to not set the `Content-Length` header at all in `prepare_body` (line 442 of models.py), and to call `prepare_content_length` regardless of if the data is a stream. I think then it could be a bit more explicit that the two headers should never coexist in the same request. Let me know if you have thoughts.\n",
"I had a brief chat about this issue with @Lukasa this morning, and wanted to update the status. Due to some recent improvements to `super_len` and `prepare_content_length`, it seems we've inadvertently solved @julian-r's original issue. You'll find the code below (a minor variant of the Stackoverflow question) now works.\n\n``` python\nf = io.BytesIO(b'')\nresp = requests.put('http://httpbin.org/put', auth=('asdf', 'fdsa'), data=f)\nassert 'Transfer-Encoding' in resp.request.headers\nassert 'Content-Length' not in resp.request.headers\n```\n\n~~Note I use the word _works_ loosely. Due to an oversight I made, Requests is currently setting all streamable bodies with a length of 0 to 'Transfer-Encoding: chunked'. This doesn't cause any severely negative behaviour but is likely better if we used 'Content-Length' instead. We can either integrate that into the work in #3338, or I'll open a separate issue/PR for it.~~\n\n---\n\n@davidsoncasey, you've done a lot of great work in #3184 and #3338 that would be beneficial for Requests. Would you be willing to reframe a few parts of it around these recent changes?\n1. I think it would be helpful to move your three non-exception tests from #3338 into the master branch. This would show that the functionality is currently working and will prevent us from regressing it moving forward.\n2. We can then integrate your simplification of `prepare_body` and `prepare_content_length`, along with your exceptions, with the current changes on master. That would be an excellent addition in 3.0.0.\n\nHow does that sound?\n",
"The move of streamable bodies of length 0 to `Transfer-Encoding: chunked` is actually somewhat deliberate: it clarifies the code a great deal, and also handles edge cases where we accidentally decide that a file object has length zero even though it does not.\n",
"Yep, I realized what a rats nest that was going to be when I started digging yesterday. I guess I should have paid more attention to jseabold's work before extending it. Disregard my comment on that above.\n",
"This should be resolved with #3754."
] |
https://api.github.com/repos/psf/requests/issues/3065
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3065/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3065/comments
|
https://api.github.com/repos/psf/requests/issues/3065/events
|
https://github.com/psf/requests/issues/3065
| 143,121,766 |
MDU6SXNzdWUxNDMxMjE3NjY=
| 3,065 |
ImportError No module ngd-httpsclient
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3597266?v=4",
"events_url": "https://api.github.com/users/jamesh38/events{/privacy}",
"followers_url": "https://api.github.com/users/jamesh38/followers",
"following_url": "https://api.github.com/users/jamesh38/following{/other_user}",
"gists_url": "https://api.github.com/users/jamesh38/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jamesh38",
"id": 3597266,
"login": "jamesh38",
"node_id": "MDQ6VXNlcjM1OTcyNjY=",
"organizations_url": "https://api.github.com/users/jamesh38/orgs",
"received_events_url": "https://api.github.com/users/jamesh38/received_events",
"repos_url": "https://api.github.com/users/jamesh38/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jamesh38/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jamesh38/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jamesh38",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 21 |
2016-03-24T01:37:53Z
|
2021-09-08T19:00:26Z
|
2016-03-25T14:06:59Z
|
NONE
|
resolved
|
Using AWS Lambda to send push notifications to Pusher app in Python. When I install Pusher and all its dependencies to a directory and zip up to Lambda I run a simple test and get this error.
No module named ndg.httpsclient.ssl_peer_verification
Here is the code I'm trying to run.
```
from pusher import Pusher
pusher = Pusher(app_id=u'id', key=u'key', secret=u'secret')
def createPitchZip(context, event):
pusher.trigger('testchannel', 'testevent', {u'some': u'data'})
```
I've seen several posts about this but installing the dependencies individually doesn't seem to be helping.
I've tried to post this on pusher github and have gotten to response. Banging my head on the wall
Here is the stack trace
```
Traceback (most recent call last):
File "pusherlambda.py", line 2, in <module>
pusher = Pusher(app_id=u'*****', key=*****', secret=u'****')
File "/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/pusher/pusher.py", line 42, in __init__
from pusher.requests import RequestsBackend
File "/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/pusher/requests.py", line 12, in <module>
import urllib3.contrib.pyopenssl
File "/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/urllib3/contrib/pyopenssl.py", line 49, in <module>
from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT
ImportError: No module named ndg.httpsclient.ssl_peer_verification
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3065/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3065/timeline
| null |
completed
| null | null | false |
[
"I could be wrong but I think this is a urllib3 issue not requests\n\nOn Thu, 24 Mar 2016 11:38 James Hoegerl [email protected] wrote:\n\n> Using AWS Lambda to send push notifications to Pusher app in Python. When\n> I install Pusher and all its dependencies to a directory and zip up to\n> Lambda I run a simple test and get this error.\n> \n> No module named ndg.httpsclient.ssl_peer_verification\n> Here is the code I'm trying to run.\n> \n> from pusher import Pusher\n> \n> pusher = Pusher(app_id=u'id', key=u'key', secret=u'secret')\n> \n> def createPitchZip(context, event):\n> \n> ```\n> pusher.trigger('testchannel', 'testevent', {u'some': u'data'})\n> ```\n> \n> I've seen several posts about this but installing the dependencies\n> individually doesn't seem to be helping.\n> \n> I've tried to post this on pusher github and have gotten to response.\n> Banging my head on the wall\n> \n> Here is the stack trace\n> \n> Traceback (most recent call last):\n> File \"pusherlambda.py\", line 2, in\n> pusher = Pusher(app_id=u'_**', key=__', secret=u'_*_')\n> File\n> \"/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/pusher/pusher.py\",\n> line 42, in *init_\n> from pusher.requests import RequestsBackend\n> File\n> \"/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/pusher/requests.py\",\n> line 12, in\n> import urllib3.contrib.pyopenssl\n> File\n> \"/home/vagrant/Code/Lamdba/venv/lib/python2.7/site-packages/urllib3/contrib/pyopenssl.py\",\n> line 49, in\n> from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT\n> ImportError: No module named ndg.httpsclient.ssl_peer_verification\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3065\n",
"I'll head over and post there I guess. \n",
"So could you provide more steps to reproduce this? Is it something like:\n- create virtualenv\n- pip install some things\n- zip ... the virtualenv?\n- upload to Lambda\n\nMostly, this traceback looks like you're using some kind of unvendored version of requests & urllib3 and I'm not sure how that's happening.\n\nAre you installing ndg-httpsclient with Pusher and including that in your zip file?\n",
"Yah so basically I'll do just what you said.\n\n```\nvirtualenv env\n/env/bin/pip install pusher\n```\n\n<img width=\"1246\" alt=\"screen shot 2016-03-23 at 9 47 32 pm\" src=\"https://cloud.githubusercontent.com/assets/3597266/14007037/e224d230-f140-11e5-9c5d-cb335a295756.png\">\n\nThen i'll cd to site-packages and zip all of them together with my lambda script which is the code above and I'll test it and get that error.\n",
"Can you try not zipping everything together, but instead running in the virtual environment? I think your distribution of the environment is not working correctly: specifically, that the import path for `ndg-httpsclient` is getting munged.\n",
"Right so if I `cd env/lib/python2.7/site-packages/` and create and run a script called `lambda.py` with this code:\n\n```\nfrom pusher import Pusher\npusher = Pusher(app_id=u'***', key=u'***', secret=u'***')\npusher.trigger('testchannel', 'testevent', {u'some': u'data'})\n```\n\nI'll get the same stack trace.\n\n```\n> vagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python lambda.py \n> Traceback (most recent call last):\n> File \"lambda.py\", line 3, in <module>\n> pusher = Pusher(app_id=u'***', key=u'***'', secret=u'***')\n> File \"/home/vagrant/Code/Lamdba/env/lib/python2.7/site-packages/pusher/pusher.py\", line 42, in __init__\n> from pusher.requests import RequestsBackend\n> File \"/home/vagrant/Code/Lamdba/env/lib/python2.7/site-packages/pusher/requests.py\", line 12, in <module>\n> import urllib3.contrib.pyopenssl\n> File \"/home/vagrant/Code/Lamdba/env/lib/python2.7/site-packages/urllib3/contrib/pyopenssl.py\", line 49, in <module>\n> from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT\n> ImportError: No module named ndg.httpsclient.ssl_peer_verification\n```\n",
"The code in pusher is quite confusing: they're trying to call into the urllib3 that is not bundled inside requests, which is for no adequately explainable reason exploding.\n\nFrom your location on the filesystem, can you try the following Python one-liners and tell me what they return?\n\n``` bash\npython -c \"import requests; print requests.__file__\"\npython -c \"import urllib3; print urllib3.__file__\"\npython -c \"import OpenSSL; print OpenSSL.__file__\"\npython -c \"import ndg.httpsclient; print ndg.httpsclient.__file__\"\n```\n",
"```\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -c \"import requests; print requests.__file__\"\nrequests/__init__.pyc\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -c \"import urllib3; print urllib3.__file__\"\nurllib3/__init__.pyc\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -c \"import OpenSSL; print OpenSSL.__file__\"\nOpenSSL/__init__.pyc\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -c \"import ndg.httpsclient; print ndg.httpsclient.__file__\"\nTraceback (most recent call last):\n File \"<string>\", line 1, in <module>\nImportError: No module named ndg.httpsclient\n```\n",
"Well that's bizarre. Does `pip freeze` show that ndg-httpsclient is installed?\n\n@dstufft, any theories about this weirdness?\n",
"> vagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ ../../../bin/pip freeze\n> cffi==1.5.2\n> cryptography==1.3.1\n> enum34==1.1.2\n> idna==2.1\n> ipaddress==1.0.16\n> ndg-httpsclient==0.4.0\n> pusher==1.2.3\n> pyasn1==0.1.9\n> pycparser==2.14\n> pyOpenSSL==16.0.0\n> requests==2.9.1\n> six==1.10.0\n> urllib3==1.14\n",
"@sigmavirus24 Any luck with this?\n",
"What is in your site-packages directory? Mine looks like this:\n\n```\nndg/\n httpsclient/\n __init__.py\n https.py\n ssl_context_util.py\n ssl_peer_verification.py\n # etc.\n```\n",
"Also I just did `pip install requests[security]` and my `pip freeze` output looks like this:\n\n```\ncffi==1.5.2\ncryptography==1.3.1\nenum34==1.1.2\nidna==2.1\nipaddress==1.0.16\nndg-httpsclient==0.4.0\npyasn1==0.1.9\npycparser==2.14\npyOpenSSL==16.0.0\nrequests==2.9.1\nsix==1.10.0\nwheel==0.24.0\n```\n\nI'm having no problem import `ndg.httpsclient`.\n",
"I don't even see urllib3 on your pip freeze. I dont have wheel.\n\n```\nJames-Hoegerls-MacBook-Pro:Music-Pub-Works jjh47$ ls -LR ../Lamdba/env/lib/python2.7/site-packages/ndg\nhttpsclient\n\n../Lamdba/env/lib/python2.7/site-packages/ndg/httpsclient:\n__init__.py https.pyc ssl_peer_verification.py ssl_socket.pyc test utils.py\n__init__.pyc ssl_context_util.py ssl_peer_verification.pyc subj_alt_name.py urllib2_build_opener.py utils.pyc\nhttps.py ssl_context_util.pyc ssl_socket.py subj_alt_name.pyc urllib2_build_opener.pyc\n\n../Lamdba/env/lib/python2.7/site-packages/ndg/httpsclient/test:\nREADME __init__.pyc scripts test_https.pyc test_urllib2.pyc test_utils.pyc\n__init__.py pki test_https.py test_urllib2.py test_utils.py\n\n../Lamdba/env/lib/python2.7/site-packages/ndg/httpsclient/test/pki:\nca localhost.crt localhost.key\n\n../Lamdba/env/lib/python2.7/site-packages/ndg/httpsclient/test/pki/ca:\n08bd99c7.0 ade0138a.0\n\n../Lamdba/env/lib/python2.7/site-packages/ndg/httpsclient/test/scripts:\nopenssl_https_server.sh\n\n```\n",
"> I don't even see urllib3 on your pip freeze. I dont have wheel.\n\nBecause the issue seems to be in the fact that you can't import `ndg.httpsclient`. It has nothing to do with urllib3 being installed or anything.\n\nI wonder if you could share the following:\n\n```\n$ pip --version\n$ python -V\n$ python -c 'import setuptools; setuptools.__version__'\n```\n\nI think this might be namespace package related (given that ndg-httpsclient is a namespace package) but that might be something of a red-herring. I'm not quite certain yet.\n",
"For what it's worth, my virtualenv pip version is 7.1.x and my python version is 2.7.11.\n",
"```\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ ../../../bin/pip --version\npip 8.1.1 from /home/vagrant/Code/Lamdba/env/local/lib/python2.7/site-packages (python 2.7)\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -V\nPython 2.7.8\nvagrant@homestead:~/Code/Lamdba/env/lib/python2.7/site-packages$ python -c 'import setuptools; setuptools.__version__'\n```\n\nLast line gave no output.\n\nThank ya'll so much for the help btw. My first dive into Python.\n",
"I'm so dumb. I didn't use the `activate` command for virtual env and was running pip things with `env/bin/pip install` but I wasn't running programs with `env/bin/python lambda.py`\n\nSo that's working now. But it still wouldn't explain why Lambda would throw that error. Do I need to do something special in my python code to activate the python from my virtualenv?\n",
"@james-hoegerl I don't really know anything about lambda I'm afraid.\n",
"This isn't a requests issue. So I'm closing this.\n",
"Yah was just about to do that. Thanks for your help @sigmavirus24 @Lukasa. Sorry this ended up being so dumb. Heading to lambda forums. \n"
] |
https://api.github.com/repos/psf/requests/issues/3064
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3064/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3064/comments
|
https://api.github.com/repos/psf/requests/issues/3064/events
|
https://github.com/psf/requests/issues/3064
| 142,948,566 |
MDU6SXNzdWUxNDI5NDg1NjY=
| 3,064 |
Honor HTTP strict transport security headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3287067?v=4",
"events_url": "https://api.github.com/users/asieira/events{/privacy}",
"followers_url": "https://api.github.com/users/asieira/followers",
"following_url": "https://api.github.com/users/asieira/following{/other_user}",
"gists_url": "https://api.github.com/users/asieira/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/asieira",
"id": 3287067,
"login": "asieira",
"node_id": "MDQ6VXNlcjMyODcwNjc=",
"organizations_url": "https://api.github.com/users/asieira/orgs",
"received_events_url": "https://api.github.com/users/asieira/received_events",
"repos_url": "https://api.github.com/users/asieira/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/asieira/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asieira/subscriptions",
"type": "User",
"url": "https://api.github.com/users/asieira",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-03-23T12:35:49Z
|
2021-09-08T19:00:27Z
|
2016-03-23T12:36:57Z
|
CONTRIBUTOR
|
resolved
|
[HTTP strict transport security header](https://scotthelme.co.uk/hsts-the-missing-link-in-tls/) helps protect web applications against communications interception and MITM attacks. I would recommend that requests look for and honor the header by default.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3064/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3064/timeline
| null |
completed
| null | null | false |
[
"See shazow/urllib3#608, and the comment I just made on #3063: the same limitations that apply there apply here.\n",
"Perfect, thank you! :bow:\n"
] |
https://api.github.com/repos/psf/requests/issues/3063
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3063/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3063/comments
|
https://api.github.com/repos/psf/requests/issues/3063/events
|
https://github.com/psf/requests/issues/3063
| 142,947,860 |
MDU6SXNzdWUxNDI5NDc4NjA=
| 3,063 |
Honor HTTP public key pinning headers for server certificate validation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3287067?v=4",
"events_url": "https://api.github.com/users/asieira/events{/privacy}",
"followers_url": "https://api.github.com/users/asieira/followers",
"following_url": "https://api.github.com/users/asieira/following{/other_user}",
"gists_url": "https://api.github.com/users/asieira/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/asieira",
"id": 3287067,
"login": "asieira",
"node_id": "MDQ6VXNlcjMyODcwNjc=",
"organizations_url": "https://api.github.com/users/asieira/orgs",
"received_events_url": "https://api.github.com/users/asieira/received_events",
"repos_url": "https://api.github.com/users/asieira/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/asieira/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asieira/subscriptions",
"type": "User",
"url": "https://api.github.com/users/asieira",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-03-23T12:31:56Z
|
2021-09-08T19:00:26Z
|
2016-03-23T12:36:20Z
|
CONTRIBUTOR
|
resolved
|
An interesting protection against MITM attacks could be provided by default if requests looked for and honored [Public-Key-Pins](https://en.wikipedia.org/wiki/HTTP_Public_Key_Pinning) headers in the server response for additional validation of the certificate presented.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3063/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3063/timeline
| null |
completed
| null | null | false |
[
"@asieira This is an idea we've been interested in for a long time: see shazow/urllib3#607.\n\nUnfortunately, any support we're likely to have will be of minimal use in practice unless deliberate effort is taken to harden your application. We'd need a cache in non-ephemeral storage (or the exit of your app cleans the in-memory cache), and PKP headers are extremely uncommon.\n\nRegardless, the issue to track is the one linked above.\n",
"Perfect, thank you! :bow:\n"
] |
https://api.github.com/repos/psf/requests/issues/3062
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3062/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3062/comments
|
https://api.github.com/repos/psf/requests/issues/3062/events
|
https://github.com/psf/requests/issues/3062
| 142,444,095 |
MDU6SXNzdWUxNDI0NDQwOTU=
| 3,062 |
Requests causing NoneType error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17990713?v=4",
"events_url": "https://api.github.com/users/MNgeographer/events{/privacy}",
"followers_url": "https://api.github.com/users/MNgeographer/followers",
"following_url": "https://api.github.com/users/MNgeographer/following{/other_user}",
"gists_url": "https://api.github.com/users/MNgeographer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/MNgeographer",
"id": 17990713,
"login": "MNgeographer",
"node_id": "MDQ6VXNlcjE3OTkwNzEz",
"organizations_url": "https://api.github.com/users/MNgeographer/orgs",
"received_events_url": "https://api.github.com/users/MNgeographer/received_events",
"repos_url": "https://api.github.com/users/MNgeographer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/MNgeographer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MNgeographer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/MNgeographer",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-21T19:16:53Z
|
2021-09-08T19:00:27Z
|
2016-03-21T19:20:21Z
|
NONE
|
resolved
|
I've pip installed requests (requests-2.9.1-py2.py3-none-any). I'm working on a script to download an excel file from my company's SharePoint onto my local drive using Python 2.7. Even if I comment out every last bit of code other than "import requests" at the beginning, I get this error message:
_"AttributeError: 'NoneType' object has no attribute 'a2b_hex' "_
So something about importing the requests module is causing this problem. Here's the full traceback:
_Traceback (most recent call last): File "H:\PythonSandbox\GetSharePoint.py", line 1, in import requests
File "C:\Python27\ArcGIS10.1\lib\site-packages\requests__init__.py", line 58, in from . import utils
File "C:\Python27\ArcGIS10.1\lib\site-packages\requests\utils.py", line 26, in
from .compat import parse_http_list as _parse_list_header
File "C:\Python27\ArcGIS10.1\lib\site-packages\requests\compat.py", line 29, in
import json
File "C:\Python27\ArcGIS10.1\Lib\json__init__.py", line 108, in from .decoder import JSONDecoder
File "C:\Python27\ArcGIS10.1\Lib\json\decoder.py", line 24, in NaN, PosInf, NegInf = _floatconstants()
File "C:\Python27\ArcGIS10.1\Lib\json\decoder.py", line 18, in _floatconstants _BYTES = '7FF80000000000007FF0000000000000'.decode('hex')
File "C:\Python27\ArcGIS10.1\Lib\encodings\hex_codec.py", line 42, in hex_decode
output = binascii.a2b_hex(input)
AttributeError: 'NoneType' object has no attribute 'a2b_hex'
Failed to execute (GetSharePoint)._
Apologies if this isn't worded well, I'm a relative novice and new to github. Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3062/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3062/timeline
| null |
completed
| null | null | false |
[
"This looks like an error in your system. For some reason, the hex_codec module's import of binascii is broken: something is patching it away or otherwise messing with it such that it doesn't not function correctly.\n\nTaking you through debugging that is not really something that we can do here (this is a requests bug tracker, and the issue here is not with requests), but nonetheless, you should be investigating why binascii is not functioning well here. Given that this is in the standard library and that your code apparently _only_ imports requests, either ArcGIS have busted something or your system is pretty remarkably borked.\n\nFWIW, you should be able to reproduce this error by using just the following script:\n\n``` python\nimport json\n```\n\nThat involves no third-party code, which should demonstrate that the problem is not in requests. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3061
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3061/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3061/comments
|
https://api.github.com/repos/psf/requests/issues/3061/events
|
https://github.com/psf/requests/issues/3061
| 142,176,952 |
MDU6SXNzdWUxNDIxNzY5NTI=
| 3,061 |
Getting tunnel CONNECT response headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1206342?v=4",
"events_url": "https://api.github.com/users/jkryanchou/events{/privacy}",
"followers_url": "https://api.github.com/users/jkryanchou/followers",
"following_url": "https://api.github.com/users/jkryanchou/following{/other_user}",
"gists_url": "https://api.github.com/users/jkryanchou/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jkryanchou",
"id": 1206342,
"login": "jkryanchou",
"node_id": "MDQ6VXNlcjEyMDYzNDI=",
"organizations_url": "https://api.github.com/users/jkryanchou/orgs",
"received_events_url": "https://api.github.com/users/jkryanchou/received_events",
"repos_url": "https://api.github.com/users/jkryanchou/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jkryanchou/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkryanchou/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jkryanchou",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 21 |
2016-03-20T16:09:25Z
|
2021-09-08T16:00:25Z
|
2016-03-20T17:11:05Z
|
NONE
|
resolved
|
Hi all,
There was a issue with sending https requests via the rotating proxy service named ProxyMesh. Here is the code snippet.
```
>>> import requests
>>> auth = requests.auth.HTTPProxyAuth('foo', 'PASSWORD')
>>> proxies = {'https': 'http://us-ca.proxymesh.com:31280'}
>>> response = requests.get('https://example.com', proxies=proxies, auth=auth)
```
I couldn't get the tunnel `CONNECT` responses headers in the final response. While it contains the service-specified header X-ProxyMesh-IP which could be useful to requests-rate control.
`http://proxymesh.com/blog/pages/proxy-server-headers.html#https`
I have read source code related to https request handling, and found that requests mainly use the underlaying http library urllib3 to process http/https connections. It uses the ssl.wrap_socket() to wrap ssl suites on http connection and do some subsequent operations on the SSL Tunnel.
ref: https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/util/ssl_.py
Is there any methods could inject or attach the CONNECT responses headers into final response returned by https requests? or some suggestions on using requests with rotating proxy services like ProxyMesh crawlera?
I wonder is there anyone have ever thought how to get the CONNECT response headers?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3061/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3061/timeline
| null |
completed
| null | null | false |
[
"Unfortunately, getting the headers on CONNECT responses is not easy to do, because the underlying HTTP library hides that request away from us. Changing that fact either requires monkey patching the underlying library, totally rewriting the way we handle tunnelling to do it explicitly ourselves, or using a new library. Any of those is a substantial and dangerous chunk of work, and it's just not really of enough value to justify at this time. \n\nWhat do you need from the headers?\n",
"@Lukasa It contains the service-specified header X-ProxyMesh-IP which could be used to control request-rate. I could manage a proxies-count to do deeper requests-handling.\n\nI have searched for a long time, found nothing could help me to figure it out. Could you offer me some advices on getting tunnel CONNECT responses headers. It is so significant to my application.\n",
"Or tell me some helpful libraries could do the stuff. The urllib3 uses ssl to wrap socket which is hard to do monkey-patch. If I insists on using requests to get the specified header. It will require me to do some hard work on it. \n",
"@Lukasa https://bugs.python.org/issue24964 Here is a patch to python httplib which is same to mine. However it was python 3 patch not python 2.7. On the other hand, I think the rotating proxy service design was not easy handful. Oops... I have no idea how could I works with the proxy in a convenient way. \n",
"Wrapping the socket isn't important here: you want to look for the `HTTPConnection._tunnel` method, which you could monkey patch. \n",
"@Lukasa Awesome. Thanks for your suggestion. I'm so sorry for making mistake the purpose of ssl.wrap_socket() method. I will try to do some monkey-patch hacking based on your reply. \n",
"There's no need to apologise! =)\n",
"Yeah, I'm so appreciate for your answer. Thanks a lot. \n",
"@ryanchou1991 Do you have some result? I'm with exactly same problem. \n",
"Yeah. You could inherit the `httplib.HTTPSConnection` to re-implement the `connect` method to do the stuff to get the IP. And it could work well.\n",
"@ryanchou1991 actually i got the IP making a http request before all the flow. I think the propouse is taking an valid IP from proxy, am i right? Is not a problem anymore.\n\nThe problem that I have now is to make proxymesh works with a https URL. I can't make this works. It always response a 407 error. \n\nCan you help me?\n",
"@moacirmoda Yeah. I have tried to work it with the requests. While it couldn't work well. I emailed to the Proxymesh author. He offered me a solution. You could email him for the solution. : ) He would give U a short code snippet for U.\n",
"Nice man! Thanks for the help\n",
"@moacirmoda You're welcome. I originally mail the issue to the maintainer. While I re-post to him some code snippets. and we worked together to figure it out. I finally re-implement by inheriting the `httplib.HTTPSConnection` to do the stuff based on the code snippet from the ProxyMesh maintainer. \n\nWhile my code snippets was so complicated so that I couldn't give you a clear example for u to work with. You could refer the example by email to him. Hope it could help you\n",
"Yeah... I will try to work in `httplib.HTTPSConnection`, and I already send and email to proxymesh support. Thanks for the help.\n",
"@ryanchou1991 \nCould you show me some examples from the snippets? Me emails with the proxymesh are not so efficiently.\n",
"@moacirmoda Sorry, I haven't checked my github notifications yesterday. Have you email him? Or I could offer the code snippets I worked together with him. Here is the code snippet link \n\nhttps://gist.github.com/ryanchou1991/e95d6ef76fbc2ed8d395\n",
"@moacirmoda I remind of a discussion on stackoverflow. It may help you work it out with the issue.\n\nHere is the discussion `https://stackoverflow.com/questions/14665064/using-python-requests-with-existing-socket-connection`\n",
"Hi @ryanchou1991 ,\n\nTheir support send me a snippet like these that you send me.\n\nLook. I did implement your snippet and doesn't work:\n\nhttps://gist.github.com/moacirmoda/122ab1557a062943161e723f8a8c37ad\n\nIf you look in the output, on the headers, there's no 'X-ProxyMesh-IP', it means that proxy doesn't work, I'm right?\n",
"@ryanchou1991 \n\nLook, I made even better: I access `https://www.whatismyip.com` under the proxy:\n\nhttps://gist.github.com/moacirmoda/122ab1557a062943161e723f8a8c37ad\n\nLook the output:\n\n```\nHTTP/1.1 200 Connection established\nX-ProxyMesh-IP: 162.213.38.164\n\n\nYour IP Address Is: 177.148.142.34\n```\n\n`177.148.142.34` is my real ip address. Means not work, right?\n",
"I ran into this issue as well. I couldn't find any nice way to modify `HTTPResponse` to include the proxy headers separately, so instead I just merged them into the final headers dictionary.\n\nThe below code is copied from my [StackOverflow question](https://stackoverflow.com/questions/39068998/reading-connect-headers):\n\n```\nimport socket\nimport httplib\nimport requests\n\nfrom requests.packages.urllib3.connection import HTTPSConnection\nfrom requests.packages.urllib3.connectionpool import HTTPSConnectionPool\nfrom requests.packages.urllib3.poolmanager import ProxyManager\n\nfrom requests.adapters import HTTPAdapter\n\n\nclass ProxyHeaderHTTPSConnection(HTTPSConnection):\n def __init__(self, *args, **kwargs):\n super(ProxyHeaderHTTPSConnection, self).__init__(*args, **kwargs)\n self._proxy_headers = []\n\n def _tunnel(self):\n self.send(\"CONNECT %s:%d HTTP/1.0\\r\\n\" % (self._tunnel_host, self._tunnel_port))\n\n for header, value in self._tunnel_headers.iteritems():\n self.send(\"%s: %s\\r\\n\" % (header, value))\n\n self.send(\"\\r\\n\")\n\n response = self.response_class(self.sock, strict=self.strict, method=self._method)\n version, code, message = response._read_status()\n\n if version == \"HTTP/0.9\":\n # HTTP/0.9 doesn't support the CONNECT verb, so if httplib has\n # concluded HTTP/0.9 is being used something has gone wrong.\n self.close()\n raise socket.error(\"Invalid response from tunnel request\")\n\n if code != 200:\n self.close()\n raise socket.error(\"Tunnel connection failed: %d %s\" % (code, message.strip()))\n\n self._proxy_headers = []\n\n while True:\n line = response.fp.readline(httplib._MAXLINE + 1)\n\n if len(line) > httplib._MAXLINE:\n raise LineTooLong(\"header line\")\n\n if not line or line == '\\r\\n':\n break\n\n # The line is a header, save it\n if ':' in line:\n self._proxy_headers.append(line)\n\n def getresponse(self, buffering=False):\n response = super(ProxyHeaderHTTPSConnection, self).getresponse(buffering)\n response.msg.headers.extend(self._proxy_headers)\n\n return response\n\n\nclass ProxyHeaderHTTPSConnectionPool(HTTPSConnectionPool):\n ConnectionCls = ProxyHeaderHTTPSConnection\n\n\nclass ProxyHeaderProxyManager(ProxyManager):\n def _new_pool(self, scheme, host, port):\n assert scheme == 'https'\n\n return ProxyHeaderHTTPSConnectionPool(host, port, **self.connection_pool_kw)\n\n\nclass ProxyHeaderHTTPAdapter(HTTPAdapter):\n def proxy_manager_for(self, proxy, **proxy_kwargs):\n if proxy in self.proxy_manager:\n manager = self.proxy_manager[proxy]\n else:\n proxy_headers = self.proxy_headers(proxy)\n manager = self.proxy_manager[proxy] = ProxyHeaderProxyManager(\n proxy_url=proxy,\n proxy_headers=proxy_headers,\n num_pools=self._pool_connections,\n maxsize=self._pool_maxsize,\n block=self._pool_block,\n **proxy_kwargs)\n\n return manager\n```\n\nYou can then install the adapter onto a session and pretend the proxy injected the headers into the proxied response, like for HTTP:\n\n```\nsession = requests.Session()\nsession.mount('https://', ProxyHeaderHTTPAdapter())\n\nresponse = session.get('https://example.com', proxies={...})\n```\n\n@moacirmoda The gist you linked to didn't work for me either. Mine does, however: https://gist.github.com/Blender3D/b8763f7b12099198fe1d947613c2739a\n"
] |
https://api.github.com/repos/psf/requests/issues/3060
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3060/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3060/comments
|
https://api.github.com/repos/psf/requests/issues/3060/events
|
https://github.com/psf/requests/pull/3060
| 141,772,335 |
MDExOlB1bGxSZXF1ZXN0NjMzMjY1OTE=
| 3,060 |
Consolidate logic for changing method during redirects
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/272675?v=4",
"events_url": "https://api.github.com/users/benweatherman/events{/privacy}",
"followers_url": "https://api.github.com/users/benweatherman/followers",
"following_url": "https://api.github.com/users/benweatherman/following{/other_user}",
"gists_url": "https://api.github.com/users/benweatherman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/benweatherman",
"id": 272675,
"login": "benweatherman",
"node_id": "MDQ6VXNlcjI3MjY3NQ==",
"organizations_url": "https://api.github.com/users/benweatherman/orgs",
"received_events_url": "https://api.github.com/users/benweatherman/received_events",
"repos_url": "https://api.github.com/users/benweatherman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/benweatherman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/benweatherman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/benweatherman",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-03-18T04:23:45Z
|
2021-09-08T04:01:11Z
|
2016-03-20T18:46:00Z
|
CONTRIBUTOR
|
resolved
|
I only moved the code into a function, there was no actual change to the code. I added a few tests to ensure we're doing things correctly.
The real point of me doing this is to make it easier to bring back `strict_mode` functionality. For you requests youngsters in the crowd, `strict_mode` followed the spec for redirects meaning the method wouldn't change to a GET. The current code follows the browser convention of changing the method to a GET when doing a 302 redirect. However, lots of servers want you to follow the standards (the nerve!) so I'd like to override the logic. Now that the method changing logic is in `rebuild_method`, I can simply override that function instead of overriding the entire `resolve_redirects` function as suggested by kennethreitz/requests#1325
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 1,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3060/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3060/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3060.diff",
"html_url": "https://github.com/psf/requests/pull/3060",
"merged_at": "2016-03-20T18:46:00Z",
"patch_url": "https://github.com/psf/requests/pull/3060.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3060"
}
| true |
[
"So, I'm totally happy with this code change: I think it's a reasonable refactoring that helps improve some clarity, _as well_ as allow @benweatherman to perform the override you want.\n\nI'm also delighted to see some extra tests added for this. I think this change is wonderful! Pinging @kennethreitz or @sigmavirus24 for an extra round of review before merging.\n",
":sparkles: :cake: :sparkles:\n",
"Yay! Thanks y'all for getting this merged quickly. Now for the release... :joy_cat: \n"
] |
https://api.github.com/repos/psf/requests/issues/3059
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3059/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3059/comments
|
https://api.github.com/repos/psf/requests/issues/3059/events
|
https://github.com/psf/requests/pull/3059
| 141,688,800 |
MDExOlB1bGxSZXF1ZXN0NjMyODE2OTc=
| 3,059 |
Raise a ProxyError for proxy related connection issues
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/888394?v=4",
"events_url": "https://api.github.com/users/alexanderad/events{/privacy}",
"followers_url": "https://api.github.com/users/alexanderad/followers",
"following_url": "https://api.github.com/users/alexanderad/following{/other_user}",
"gists_url": "https://api.github.com/users/alexanderad/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alexanderad",
"id": 888394,
"login": "alexanderad",
"node_id": "MDQ6VXNlcjg4ODM5NA==",
"organizations_url": "https://api.github.com/users/alexanderad/orgs",
"received_events_url": "https://api.github.com/users/alexanderad/received_events",
"repos_url": "https://api.github.com/users/alexanderad/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alexanderad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alexanderad/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alexanderad",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-03-17T19:47:29Z
|
2021-09-08T04:01:09Z
|
2016-04-06T19:01:25Z
|
CONTRIBUTOR
|
resolved
|
This one aims to address https://github.com/kennethreitz/requests/issues/3050
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3059/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3059/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3059.diff",
"html_url": "https://github.com/psf/requests/pull/3059",
"merged_at": "2016-04-06T19:01:25Z",
"patch_url": "https://github.com/psf/requests/pull/3059.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3059"
}
| true |
[
"Cool, I'm happy with this. @sigmavirus24 `ProxyError` is a `ConnectionError` subclass, so I think this can go in the next minor release. Agreed?\n",
"I agree. Do we have things we've merged that are not suitable for a minor release though?\n",
"I don't think so.\n"
] |
https://api.github.com/repos/psf/requests/issues/3058
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3058/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3058/comments
|
https://api.github.com/repos/psf/requests/issues/3058/events
|
https://github.com/psf/requests/issues/3058
| 141,629,779 |
MDU6SXNzdWUxNDE2Mjk3Nzk=
| 3,058 |
'SSLError: [Errno 2] No such file or directory' when reconfiguring the CA bundle a Session uses
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-03-17T16:09:46Z
|
2021-09-08T19:00:27Z
|
2016-03-17T16:18:29Z
|
CONTRIBUTOR
|
resolved
|
I want to start off by saying I think this is a pretty unusual use case and it appears to me to boil down to a behaviour in `urllib3`. However, I encountered it while working with `requests` and I'm not familiar enough with `urllib3` to say whether the behaviour is expected or not. It may be that the fix is to document this behaviour in `requests`, or make a change in `urllib3`, but I don't know for sure.
#### Problem Summary
1. Configure a `requests.sessions.Session` with a CA file.
2. Make a request using HTTPS to a host that has a certificate signed by this CA.
3. Reconfigure the `requests.sessions.Session` object with a _new_ CA file and delete the old one.
4. Wait until the connection with the server is closed (either from inactivity or by forcing it with 'Connection: close' on the request made in step 2).
5. Make a second request using HTTPS to a host that has a certificate signed by the new CA.
6. Instead of the same response from step 2, an exception is raised.
The exception looks something like this and is from the reproducer I've included below:
```
Traceback (most recent call last):
File "test_requests.py", line 78, in <module>
headers={'Connection': 'close'})
File "/home/jcline/.virtualenvs/test_requests/lib/python2.7/site-packages/requests/sessions.py", line 480, in get
return self.request('GET', url, **kwargs)
File "/home/jcline/.virtualenvs/test_requests/lib/python2.7/site-packages/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/home/jcline/.virtualenvs/test_requests/lib/python2.7/site-packages/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/home/jcline/.virtualenvs/test_requests/lib/python2.7/site-packages/requests/adapters.py", line 447, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: [Errno 2] No such file or directory
```
I did some digging and what I concluded is that a when a connection is pulled out of the `urllib3` connection pool the CA certificates (maybe even the client certificates?) are not reconfigured on the connection object, so when it goes to validate the server's certificate when it restarts the TCP connection it gets that "No such file or directory" when it tries to read that old CA certificate file.
I thought about this a bit and it may be the case that if you reconfigure the session object with a new CA file and don't delete the old one, you might encounter odd cases where a certificate you expect to pass validation with the new CA bundle fails because the old bundle is actually being used on some of the connections in the pool (but possibly not all of them).
#### Environment
I encountered this on Fedora 23. I installed requests into a clean virtualenv with pip rather than using the distro-provided packages. I saw this with requests-2.9.1.
#### Reproducer
I've written up a little script that reproduces this problem. It uses the same CA for both requests, but they are in different files and the CA file used for the first request is deleted.
```
import os
import tempfile
import requests
# The CA that signed the certificate used at cdn.redhat.com
CA = """
-----BEGIN CERTIFICATE-----
MIIHZDCCBUygAwIBAgIJAOb+QiglyeZeMA0GCSqGSIb3DQEBBQUAMIGwMQswCQYD
VQQGEwJVUzEXMBUGA1UECAwOTm9ydGggQ2Fyb2xpbmExEDAOBgNVBAcMB1JhbGVp
Z2gxFjAUBgNVBAoMDVJlZCBIYXQsIEluYy4xGDAWBgNVBAsMD1JlZCBIYXQgTmV0
d29yazEeMBwGA1UEAwwVRW50aXRsZW1lbnQgTWFzdGVyIENBMSQwIgYJKoZIhvcN
AQkBFhVjYS1zdXBwb3J0QHJlZGhhdC5jb20wHhcNMTAwMzE3MTkwMDQ0WhcNMzAw
MzEyMTkwMDQ0WjCBsDELMAkGA1UEBhMCVVMxFzAVBgNVBAgMDk5vcnRoIENhcm9s
aW5hMRAwDgYDVQQHDAdSYWxlaWdoMRYwFAYDVQQKDA1SZWQgSGF0LCBJbmMuMRgw
FgYDVQQLDA9SZWQgSGF0IE5ldHdvcmsxHjAcBgNVBAMMFUVudGl0bGVtZW50IE1h
c3RlciBDQTEkMCIGCSqGSIb3DQEJARYVY2Etc3VwcG9ydEByZWRoYXQuY29tMIIC
IjANBgkqhkiG9w0BAQEFAAOCAg8AMIICCgKCAgEA2Z+mW7OYcBcGxWS+RSKG2GJ2
csMXiGGfEp36vKVsIvypmNS60SkicKENMYREalbdSjrgfXxPJygZWsVWJ5lHPfBV
o3WkFrFHTIXd/R6LxnaHD1m8Cx3GwEeuSlE/ASjc1ePtMnsHH7xqZ9wdl85b1C8O
scgO7fwuM192kvv/veI/BogIqUQugtG6szXpV8dp4ml029LXFoNIy2lfFoa2wKYw
MiUHwtYgAz7TDY63e8qGhd5PoqTv9XKQogo2ze9sF9y/npZjliNy5qf6bFE+24oW
E8pGsp3zqz8h5mvw4v+tfIx5uj7dwjDteFrrWD1tcT7UmNrBDWXjKMG81zchq3h4
etgF0iwMHEuYuixiJWNzKrLNVQbDmcLGNOvyJfq60tM8AUAd72OUQzivBegnWMit
CLcT5viCT1AIkYXt7l5zc/duQWLeAAR2FmpZFylSukknzzeiZpPclRziYTboDYHq
revM97eER1xsfoSYp4mJkBHfdlqMnf3CWPcNgru8NbEPeUGMI6+C0YvknPlqDDtU
ojfl4qNdf6nWL+YNXpR1YGKgWGWgTU6uaG8Sc6qGfAoLHh6oGwbuz102j84OgjAJ
DGv/S86svmZWSqZ5UoJOIEqFYrONcOSgztZ5tU+gP4fwRIkTRbTEWSgudVREOXhs
bfN1YGP7HYvS0OiBKZUCAwEAAaOCAX0wggF5MB0GA1UdDgQWBBSIS6ZFxEbsj9bP
pvYazyY8kMx/FzCB5QYDVR0jBIHdMIHagBSIS6ZFxEbsj9bPpvYazyY8kMx/F6GB
tqSBszCBsDELMAkGA1UEBhMCVVMxFzAVBgNVBAgMDk5vcnRoIENhcm9saW5hMRAw
DgYDVQQHDAdSYWxlaWdoMRYwFAYDVQQKDA1SZWQgSGF0LCBJbmMuMRgwFgYDVQQL
DA9SZWQgSGF0IE5ldHdvcmsxHjAcBgNVBAMMFUVudGl0bGVtZW50IE1hc3RlciBD
QTEkMCIGCSqGSIb3DQEJARYVY2Etc3VwcG9ydEByZWRoYXQuY29tggkA5v5CKCXJ
5l4wDAYDVR0TBAUwAwEB/zALBgNVHQ8EBAMCAQYwEQYJYIZIAYb4QgEBBAQDAgEG
MCAGA1UdEQQZMBeBFWNhLXN1cHBvcnRAcmVkaGF0LmNvbTAgBgNVHRIEGTAXgRVj
YS1zdXBwb3J0QHJlZGhhdC5jb20wDQYJKoZIhvcNAQEFBQADggIBAJ1hEdNBDTRr
6kI6W6stoogSUwjuiWPDY8DptwGhdpyIfbCoxvBR7F52DlwyXOpCunogfKMRklnE
gH1Wt66RYkgNuJcenKHAhR5xgSLoPCOVF9rDjMunyyBuxjIbctM21R7BswVpsEIE
OpV5nlJ6wkHsrn0/E+Zk5UJdCzM+Fp4hqHtEn/c97nvRspQcpWeDg6oUvaJSZTGM
8yFpzR90X8ZO4rOgpoERukvYutUfJUzZuDyS3LLc6ysamemH93rZXr52zc4B+C9G
Em8zemDgIPaH42ce3C3TdVysiq/yk+ir7pxW8toeavFv75l1UojFSjND+Q2AlNQn
pYkmRznbD5TZ3yDuPFQG2xYKnMPACepGgKZPyErtOIljQKCdgcvb9EqNdZaJFz1+
/iWKYBL077Y0CKwb+HGIDeYdzrYxbEd95YuVU0aStnf2Yii2tLcpQtK9cC2+DXjL
Yf3kQs4xzH4ZejhG9wzv8PGXOS8wHYnfVNA3+fclDEQ1mEBKWHHmenGI6QKZUP8f
g0SQ3PNRnSZu8R+rhABOEuVFIBRlaYijg2Pxe0NgL9FlHsNyRfo6EUrB2QFRKACW
3Mo6pZyDjQt7O8J7l9B9IIURoJ1niwygf7VSJTMl2w3fFleNJlZTGgdXw0V+5g+9
Kg6Ay0rrsi4nw1JHue2GvdjdfVOaWSWC
-----END CERTIFICATE-----
"""
# Write two versions of the CA to temporary files. ca1 is used for the first request
# and ca2 is used for the second (after ca1 has been deleted).
fd, ca1_abs_path = tempfile.mkstemp(dir='/tmp/', text=True)
os.write(fd, CA)
os.close(fd)
fd, ca2_abs_path = tempfile.mkstemp(dir='/tmp/', text=True)
os.write(fd, CA)
os.close(fd)
# Make an initial request with the first CA
session = requests.sessions.Session()
session.verify = ca1_abs_path
response = session.get('https://cdn.redhat.com/',
headers={'Connection': 'close'})
print('Got HTTP ' + str(response.status_code) + ' (403 expected)')
# Remove the first CA and configure the session to use the second one.
os.remove(ca1_abs_path)
session.verify = ca2_abs_path
# This is going to end in a "file not found" because the connection needs to be
# re-established by urllib3 and the certificate on the connection is not
# updated to match the certificate on the connection pool.
try:
response = session.get('https://cdn.redhat.com/',
headers={'Connection': 'close'})
print('Got HTTP ' + str(response.status_code) + ' (403 expected)')
except requests.exceptions.SSLError as e:
os.remove(ca2_abs_path)
raise
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3058/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3058/timeline
| null |
completed
| null | null | false |
[
"Thanks for this very detailed bug report!\n\nUnfortunately, this is a known limitation: see #2863. We have the beginnings of a urllib3 patch outstanding but it's currently stalled a bit due to lack of time. For the moment, I'm afraid that the best approach here is to swap out either sessions or transport adapters when you change any TLS-related setting: it effectively invalidates all the connections, but requests doesn't know that.\n",
"Ahhh, sorry I missed the duplicate. I'll go ahead and close this one. Thanks for the quick response!\n",
"No problem, thanks for putting so much effort into the bug report!\n"
] |
https://api.github.com/repos/psf/requests/issues/3057
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3057/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3057/comments
|
https://api.github.com/repos/psf/requests/issues/3057/events
|
https://github.com/psf/requests/pull/3057
| 141,613,703 |
MDExOlB1bGxSZXF1ZXN0NjMyMzk0MTg=
| 3,057 |
Clarify that SSL verification is on by default
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-03-17T15:17:51Z
|
2021-09-08T04:01:12Z
|
2016-03-17T15:19:59Z
|
CONTRIBUTOR
|
resolved
|
Generally if a kwarg is present it indicates that an option other than the
default is being specified. Putting `verify=True` in the first code sample
for SSL confused me, because it seemed to indicate that you had to specify
`verify=True` to get SSL verification. The opposite is true; SSL verification
is turned on by default and you have to specify `verify=False` to opt out of
SSL verification.
Updates the docs to make this more clear. Furthermore, connections to
https://kennethreitz.com currently time out instead of presenting an invalid
certificate, so I replaced this domain with https://requestb.in, which presents
the same error message as is currently there.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3057/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3057/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3057.diff",
"html_url": "https://github.com/psf/requests/pull/3057",
"merged_at": "2016-03-17T15:19:59Z",
"patch_url": "https://github.com/psf/requests/pull/3057.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3057"
}
| true |
[
"Thanks @kevinburke! :sparkles: :cake: :sparkles:\n",
"What about https://self-signed.badssl.com/? (I am not affiliated, just an occasional user)\nThe problem is, that is validates without SNI...\n"
] |
https://api.github.com/repos/psf/requests/issues/3056
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3056/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3056/comments
|
https://api.github.com/repos/psf/requests/issues/3056/events
|
https://github.com/psf/requests/pull/3056
| 141,326,756 |
MDExOlB1bGxSZXF1ZXN0NjMwOTQxODQ=
| 3,056 |
Import native Python JSON only
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/243799?v=4",
"events_url": "https://api.github.com/users/digitaldavenyc/events{/privacy}",
"followers_url": "https://api.github.com/users/digitaldavenyc/followers",
"following_url": "https://api.github.com/users/digitaldavenyc/following{/other_user}",
"gists_url": "https://api.github.com/users/digitaldavenyc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/digitaldavenyc",
"id": 243799,
"login": "digitaldavenyc",
"node_id": "MDQ6VXNlcjI0Mzc5OQ==",
"organizations_url": "https://api.github.com/users/digitaldavenyc/orgs",
"received_events_url": "https://api.github.com/users/digitaldavenyc/received_events",
"repos_url": "https://api.github.com/users/digitaldavenyc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/digitaldavenyc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitaldavenyc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/digitaldavenyc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 19 |
2016-03-16T16:27:43Z
|
2021-08-28T00:06:25Z
|
2016-03-16T16:51:41Z
|
NONE
|
resolved
|
Pull Request derived from the discussion on issue #3052
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3056/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3056/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3056.diff",
"html_url": "https://github.com/psf/requests/pull/3056",
"merged_at": "2016-03-16T16:51:41Z",
"patch_url": "https://github.com/psf/requests/pull/3056.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3056"
}
| true |
[
"LGTM.\n",
":sparkles: :cake: :sparkles:\n",
"Really, no longer a need for `json` to be in `compat` at all anymore, let alone be called `complexjson`. \n\nBut, that would require touching a lot more code. \n",
"@kennethreitz I actually was going to rename complexjson to json but there are json variable references all over the models.py file and decided it was probably best not to refactor it. If you still think it's a good idea I'm game to change it.\n",
"Perhaps there a different name we could use for the import so it would require less touching of code. `jsonp`?\n",
"Ah, prob best to `import json as complexjson` in `models.py` then (instead of `compat.py`).\n\nIt's a clever name, and I enjoy cleverness. \n",
"Ha! Agreed, complexjson is a good alt name. Best to either change it to json or leave it as is.\n",
"PR looks perfect. \n",
"The reason we need `complexjson` in `models.py` is because we accept `json` as a method/function parameter and for those cases we can't also reference `json` the module without a name collision. Leaving it as such makes life a bit simpler.\n\nThat said, I'm open to changing it to one of the following\n- `loljson`\n- `whyjson`\n- `hailjson`\n- `zomgjson`\n- `whatevenisjson`\n\nAnd probably quite a few other names. ;)\n",
"- `mwson` (micro web services object notation)\n",
"- `perhapsjson`\n- `itcouldbejson`\n- `pleasedontsayjason`\n",
"No, it's fine as is. \n",
"I know. I'm just kidding about alternate names. Alternatively\n- `mycathatesjson`\n- `oneweirdtrickforserializinganddeserializingjson`\n",
"- `magicjson`\n- `sacredjson`\n- `chopjsoncarrywater`\n- `cosmicjson`\n",
"I got it ...\n- `trumpjson`\n\nYou guys are gonna delete the PR now aren't you :(\n",
"TrumpJSON™: It's unbelievable, really, the best JSON you'll ever experience. Trust me, I know JSON. \n",
"It overwrites all your code with this way better code, trust me it's better....\n",
"It's a huuuuuuuuuuuuuuuuuuge download. It makes serialization great again\n",
"anyway this can get in a 2.x release? It's been two years now and `simplejson` is even less relevant than when this was authored"
] |
https://api.github.com/repos/psf/requests/issues/3055
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3055/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3055/comments
|
https://api.github.com/repos/psf/requests/issues/3055/events
|
https://github.com/psf/requests/pull/3055
| 141,319,823 |
MDExOlB1bGxSZXF1ZXN0NjMwOTAxNjY=
| 3,055 |
Import native Python JSON only
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/243799?v=4",
"events_url": "https://api.github.com/users/digitaldavenyc/events{/privacy}",
"followers_url": "https://api.github.com/users/digitaldavenyc/followers",
"following_url": "https://api.github.com/users/digitaldavenyc/following{/other_user}",
"gists_url": "https://api.github.com/users/digitaldavenyc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/digitaldavenyc",
"id": 243799,
"login": "digitaldavenyc",
"node_id": "MDQ6VXNlcjI0Mzc5OQ==",
"organizations_url": "https://api.github.com/users/digitaldavenyc/orgs",
"received_events_url": "https://api.github.com/users/digitaldavenyc/received_events",
"repos_url": "https://api.github.com/users/digitaldavenyc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/digitaldavenyc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitaldavenyc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/digitaldavenyc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-16T16:04:00Z
|
2021-09-08T04:01:12Z
|
2016-03-16T16:17:12Z
|
NONE
|
resolved
|
Pull Request derived from the discussion on issue #3052
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3055/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3055/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3055.diff",
"html_url": "https://github.com/psf/requests/pull/3055",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3055.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3055"
}
| true |
[
"Please branch off of proposed/3.0.0\n"
] |
https://api.github.com/repos/psf/requests/issues/3054
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3054/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3054/comments
|
https://api.github.com/repos/psf/requests/issues/3054/events
|
https://github.com/psf/requests/issues/3054
| 141,319,449 |
MDU6SXNzdWUxNDEzMTk0NDk=
| 3,054 |
Set a default timeout on a Session instance
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/108767?v=4",
"events_url": "https://api.github.com/users/rgov/events{/privacy}",
"followers_url": "https://api.github.com/users/rgov/followers",
"following_url": "https://api.github.com/users/rgov/following{/other_user}",
"gists_url": "https://api.github.com/users/rgov/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rgov",
"id": 108767,
"login": "rgov",
"node_id": "MDQ6VXNlcjEwODc2Nw==",
"organizations_url": "https://api.github.com/users/rgov/orgs",
"received_events_url": "https://api.github.com/users/rgov/received_events",
"repos_url": "https://api.github.com/users/rgov/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rgov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rgov/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rgov",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-03-16T16:02:52Z
|
2016-03-16T16:07:44Z
|
2016-03-16T16:07:15Z
|
NONE
| null |
It would be useful to be able to set a default timeout on a Session instance, which would get used for any requests initiated from that Session in lieu of an explicitly supplied timeout.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3054/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3054/timeline
| null |
completed
| null | null | false |
[
"Please search the issue repository before you open new issues. If you had done so you'd have seen #2856 and #2011 which both cover this issue.\n",
"In the future please search the closed issues for similar ideas. This is one we have rejected numerous times. Further it is trivial to subclass the session class to provide this.\n"
] |
https://api.github.com/repos/psf/requests/issues/3053
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3053/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3053/comments
|
https://api.github.com/repos/psf/requests/issues/3053/events
|
https://github.com/psf/requests/pull/3053
| 141,299,003 |
MDExOlB1bGxSZXF1ZXN0NjMwNzc4NDE=
| 3,053 |
remove simplejson import, native python json import only
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/243799?v=4",
"events_url": "https://api.github.com/users/digitaldavenyc/events{/privacy}",
"followers_url": "https://api.github.com/users/digitaldavenyc/followers",
"following_url": "https://api.github.com/users/digitaldavenyc/following{/other_user}",
"gists_url": "https://api.github.com/users/digitaldavenyc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/digitaldavenyc",
"id": 243799,
"login": "digitaldavenyc",
"node_id": "MDQ6VXNlcjI0Mzc5OQ==",
"organizations_url": "https://api.github.com/users/digitaldavenyc/orgs",
"received_events_url": "https://api.github.com/users/digitaldavenyc/received_events",
"repos_url": "https://api.github.com/users/digitaldavenyc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/digitaldavenyc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitaldavenyc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/digitaldavenyc",
"user_view_type": "public"
}
|
[
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
}
] |
closed
| true | null |
[] | null | 3 |
2016-03-16T14:55:30Z
|
2021-09-08T04:01:13Z
|
2016-03-16T16:00:24Z
|
NONE
|
resolved
|
Pull Request derived from the discussion on issue https://github.com/kennethreitz/requests/issues/3052
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/243799?v=4",
"events_url": "https://api.github.com/users/digitaldavenyc/events{/privacy}",
"followers_url": "https://api.github.com/users/digitaldavenyc/followers",
"following_url": "https://api.github.com/users/digitaldavenyc/following{/other_user}",
"gists_url": "https://api.github.com/users/digitaldavenyc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/digitaldavenyc",
"id": 243799,
"login": "digitaldavenyc",
"node_id": "MDQ6VXNlcjI0Mzc5OQ==",
"organizations_url": "https://api.github.com/users/digitaldavenyc/orgs",
"received_events_url": "https://api.github.com/users/digitaldavenyc/received_events",
"repos_url": "https://api.github.com/users/digitaldavenyc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/digitaldavenyc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitaldavenyc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/digitaldavenyc",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3053/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3053/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3053.diff",
"html_url": "https://github.com/psf/requests/pull/3053",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3053.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3053"
}
| true |
[
"This seems reasonable to me, and it's certainly the smallest diff that could achieve the job. I do wonder if we shouldn't just remove this from `compat` now though, and instead change the imports in the modules that use it.\n",
"@Lukasa That's a good point. There is actually only file that is importing json, [models.py](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L34) So the diff would be about the same. Updated the PR\n",
"This should target proposed/3.0.0 not master.\n"
] |
https://api.github.com/repos/psf/requests/issues/3052
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3052/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3052/comments
|
https://api.github.com/repos/psf/requests/issues/3052/events
|
https://github.com/psf/requests/issues/3052
| 141,095,235 |
MDU6SXNzdWUxNDEwOTUyMzU=
| 3,052 |
Why default to simplejson?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/243799?v=4",
"events_url": "https://api.github.com/users/digitaldavenyc/events{/privacy}",
"followers_url": "https://api.github.com/users/digitaldavenyc/followers",
"following_url": "https://api.github.com/users/digitaldavenyc/following{/other_user}",
"gists_url": "https://api.github.com/users/digitaldavenyc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/digitaldavenyc",
"id": 243799,
"login": "digitaldavenyc",
"node_id": "MDQ6VXNlcjI0Mzc5OQ==",
"organizations_url": "https://api.github.com/users/digitaldavenyc/orgs",
"received_events_url": "https://api.github.com/users/digitaldavenyc/received_events",
"repos_url": "https://api.github.com/users/digitaldavenyc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/digitaldavenyc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/digitaldavenyc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/digitaldavenyc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 39 |
2016-03-15T20:55:07Z
|
2021-09-03T00:10:50Z
|
2016-03-20T19:47:51Z
|
NONE
|
resolved
|
I've recently run into an issue where another library installed simple-json and because the requests library defaults to it if available, it caused all of our json requests to fail due to a decoding problem.
I'm unclear why requests even defaults to simple-json anymore. I'd be happy to contribute to a PR to make the json library for requests more controllable but wanted to submit an issue first. Or perhaps there is another way I'm unaware of that would allow more control over which json library requests will use.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3052/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3052/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report! Please see issue #2516 for the last time this was discussed.\n",
"So to be clear, I am now as I was then open to removing simplejson as an option entirely, but @kennethreitz wanted it there. =)\n",
"I'm not saying it should be removed ... but perhaps there is a better way to control the import. It's possible for libraries to conflict with each other. Could there be a way to specify the json library to requests?\n",
"Historically, I was under the impression that a freshly-compiled `simplejson` has better performance characteristics than the `json` module that is built into Python 2.6. \n",
"I believe the \"speed-ups\" weren't enabled in the `json` module until 2.7.\n\nRecalling from memory here, so I may not be correct. \n",
"Controlling imports is _extremely_ fraught because it relies on delaying the import until some later time in the execution. I am pretty averse to doing that because it causes bugs to appear at weird places and can generally make life weirdly tricky. If we're going to keep it we should keep it, if we're going to remove it we should remove it.\n",
"The last three comments here state our stance nicely https://github.com/kennethreitz/requests/pull/2516#issuecomment-89005463\n",
"So what would you suggest to do when two libraries conflict with each other? Monkey patch the requests library to control which json import is used?\n",
"Conflict with eachother? I don't understand. If simplejson doesn't function properly on your system, then you probably find out why, or uninstall it. \n",
"I'm not against removing if it's really a problem for some users, as I don't think it would impact anyone in any measurable way, but it's currently one of those nice final touches that makes using Requests a pleasure — an extreme level of thoughtfulness.\n\nFive years ago, people cared a lot more about the speed of their encoding/decoding serialization libraries than they seem to today. This was best practice. I don't think it's harmful either. \n",
"It should also be noted that 2.6 is being used far less today than it was five years ago. \n",
"@digitaldavenyc the only acceptable way to control that import flow would be to support an environment variable e.g. `REQUESTS_NO_SIMPLEJSON`. That is not going to happen.\n",
"Yes I'll give my real world scenario ... One library is using requests and after some digging found that it was overriding json decode for various reasons. It works great with requests when using the standard python json library but another python library decided to import simplejson and broke decoding in requests. You could say that to real issue was the override but I think the lack of control over which json library requests use is an issue.\n",
"It should also be noted that our json support is very simple, and in no way necessary for using the library — you can easily `json.loads(response.content)` (although we do a little bit more with encodings, I believe). \n",
"Does simplejson actually have any performance gains over native json in newer versions of Python 3.3+ ?\n",
"I have no idea, I doubt it. I specifically said Python 2.6.\n",
"Or `json.loads(response.text)`. Yes it's nice to use a convenience method, but requests has lots of convenience methods that people bypass in certain cases and this is a great example of one case (where you have another requirement on simplejson which should work exactly identically to the standard library's json module).\n\nQuestions about simplejson's performance are not relevant to requests though.\n",
"I was about to say maybe there is no reason to change anything in requests but if there are no performance related reasons to use simplejson, then why bother including it in requests anymore? \n",
"Have you read any of the comments I made earlier?\n",
"Performance related reasons == Python 2.6.x.\n",
"@kennethreitz Exactly. You just mentioned that not many people use Python 2.6 anymore. Is it still a nice to have feature that's worth including?\n",
"Maybe, maybe not. That's what my statements above were postulating. \n",
"Particularly https://github.com/kennethreitz/requests/issues/3052#issuecomment-197025994\n",
"I agree with you that including simple-json support was great 3-4 years ago, it did show that level of thoughtfulness that makes requests a great library. But I am failing to see the value of including simple-json moving forward in future releases. If only applications using Python 2.6 can get any benefits from using simple-json, it seems pointless to include any longer since that is the lowest version of Python currently supported by requests. \n\nI am just a user of requests but you guys maintain the library and is for you to decide.\n",
"I agree. A few options here:\n1. Keeping things as they are (always preferred)\n2. Removing simplejson logic completely (I think this would be fine)\n3. Limiting simplejson logic to 2.6 only (unideal, but would limit potential side-effects)\n\nI like **1** the best, with a \"if it ain't broke, don't fix it!\" mentality, very loosely held. I know **2** is what @sigmavirus24 and @Lukasa prefer, and if it's worth the (minimal) effort, I'm not against it if they're for it. \n",
"I would think **3** would probably be the best option since Python 2.6 users may still be using simple-json and it is still supported by requests as of now. Could there potentially be any issues with that?\n",
"I don't think anyone will either notice or care. I put it in _for_ them because _I_ cared more than they did :)\n",
"If the community really cared about speed, we'd all be using `ujson` all the time. It's about convenience nowadays, and sticking to the crowd, above most else. \n",
"Found on the internet:\n\n> With simplejson (when using speedups module), string's type depends on its value, i.e. it decodes strings as 'str' type if their characters are ascii-only, but as 'unicode' otherwise; strangely enough, it always decodes strings as unicode (like standard json module) when speedups are disabled.\n\nRookie mistake, although quite common practice pre-python3. Requests did this for years for `Response.content` before I started working on Python 3 support and was forced to make the distinct `Response.text` property for unicode (much better design). \n\nI suspect this is the issue you're running into. \n\nGiven this information, **2** it will be: removing simplejson logic completely.\n",
"2 is probably the best route to take in the long term.\n"
] |
https://api.github.com/repos/psf/requests/issues/3051
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3051/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3051/comments
|
https://api.github.com/repos/psf/requests/issues/3051/events
|
https://github.com/psf/requests/pull/3051
| 141,019,351 |
MDExOlB1bGxSZXF1ZXN0NjI5MzAzMTY=
| 3,051 |
Fix 'Transfer-Encoding: chunked' bug when upload an empty file
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7709535?v=4",
"events_url": "https://api.github.com/users/monhz/events{/privacy}",
"followers_url": "https://api.github.com/users/monhz/followers",
"following_url": "https://api.github.com/users/monhz/following{/other_user}",
"gists_url": "https://api.github.com/users/monhz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/monhz",
"id": 7709535,
"login": "monhz",
"node_id": "MDQ6VXNlcjc3MDk1MzU=",
"organizations_url": "https://api.github.com/users/monhz/orgs",
"received_events_url": "https://api.github.com/users/monhz/received_events",
"repos_url": "https://api.github.com/users/monhz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/monhz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/monhz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/monhz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-15T15:59:26Z
|
2021-09-08T04:01:13Z
|
2016-03-15T16:08:14Z
|
NONE
|
resolved
|
Though we have the `stdout` (which is also a `fileno` type with length=0) to care about, when we post or put a normal empty file, we still can not force to set the `Transfer-Encoding` to `chunked`.
I think it is clearly meaning that "I don't want to transfer the file in chunked type", when someone set the `Content-length=0` in headers. `requests` does too much in this case, and it will bring out some problem when requesting some servers which not support chunked type. The `Transfer-Encoding` parameter should be user's response to add into the headers.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3051/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3051/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3051.diff",
"html_url": "https://github.com/psf/requests/pull/3051",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3051.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3051"
}
| true |
[
"I am :-1: on this change.\n\nThe reality is that I don't care about servers that don't support chunked transfer encoding. [RFC 7230 Section 4.1](https://tools.ietf.org/html/rfc7230#section-4.1) says:\n\n> A recipient MUST be able to parse and decode the chunked transfer coding.\n\nAnd this is hardly a new requirement. RFC 2616, the original HTTP/1.1 specification that was published _nearly 15 years ago_ (in 1999), says in [Section 3.6.1](https://tools.ietf.org/html/rfc7230#section-4.1) that:\n\n> All HTTP/1.1 applications MUST be able to receive and decode the \"chunked\" transfer-coding\n\nThis means that any server that cannot handle chunked transfer encoding is either a) a HTTP/1.0-only server, which is incompatible with requests (a HTTP/1.1 library); or b) deliberately non-compliant with the now _fifteen year old_ HTTP/1.1 specification. Requests has _no duty_ to bow to the whims of server developers that chose only to support more than 15 year old protocols or to ignore specifications when they do so, or to operators of websites that are stuck in the 1990s.\n\nAnd even if that were not true, your reasoning about chunked transfer encoding:\n\n> The Transfer-Encoding parameter should be user's response to add into the headers.\n\nis not now and has never been how requests operates. The requests team has quite consistently told users not to add their own content-length or transfer-encoding headers to the headers dictionary, because requests will ignore them.\n\nThe upshot is that sending an empty chunked body is far and away the nicest way to deal with this situation.\n"
] |
https://api.github.com/repos/psf/requests/issues/3050
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3050/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3050/comments
|
https://api.github.com/repos/psf/requests/issues/3050/events
|
https://github.com/psf/requests/issues/3050
| 140,481,019 |
MDU6SXNzdWUxNDA0ODEwMTk=
| 3,050 |
Getting ProxyError exception out of ConnectionError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/888394?v=4",
"events_url": "https://api.github.com/users/alexanderad/events{/privacy}",
"followers_url": "https://api.github.com/users/alexanderad/followers",
"following_url": "https://api.github.com/users/alexanderad/following{/other_user}",
"gists_url": "https://api.github.com/users/alexanderad/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alexanderad",
"id": 888394,
"login": "alexanderad",
"node_id": "MDQ6VXNlcjg4ODM5NA==",
"organizations_url": "https://api.github.com/users/alexanderad/orgs",
"received_events_url": "https://api.github.com/users/alexanderad/received_events",
"repos_url": "https://api.github.com/users/alexanderad/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alexanderad/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alexanderad/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alexanderad",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-03-13T12:25:25Z
|
2021-09-08T18:00:56Z
|
2016-04-16T04:03:25Z
|
CONTRIBUTOR
|
resolved
|
When a request is made via proxy and a timeout happens you'll get a `ConnectTimeout` error, which is expected:
``` python
import requests
proxies = {
'http': 'http://10.10.1.10:3128',
'https': 'http://10.10.1.10:1080',
}
try:
requests.get(
'http://httpbin.org/ip', proxies=proxies, timeout=1
).json()
except requests.ConnectionError as e:
print(isinstance(e, requests.exceptions.ConnectTimeout)) # true
```
I believe this is possible because of "special handling" here on [these lines](https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L429-L432).
However, if the proxy cause a `Network unreachable` / `No route to host` error, [you'll get](https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L437) a "generic" `ConnectionError`, not a specific `ProxyError` and there is actually no way to get to the `ProxyError`.
> ConnectionError(MaxRetryError("HTTPConnectionPool(host='10.10.1.10', port=3128): Max retries exceeded with url: http://httpbin.org/ip (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x10fe268d0>: Failed to establish a new connection: [Errno 51] Network is unreachable',)))",),)
The question is: is it possible to get somehow down to, at least, `ProxyError` in this case?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3050/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3050/timeline
| null |
completed
| null | null | false |
[
"And yeah, in first place, thanks for great library, @kennethreitz and all the contributors :+1: \n",
"@alexanderad The `ProxyError` you see in that traceback is actually a urllib3 `ProxyError` class. If we wanted to try to abstract the `ProxyError` out we could in principle do that, and it looks like the same change that we had to work around for the timeout error we also have to workaround for the `ProxyError`. \n\nA pull request to fix this should be fairly simple: do you want to tackle it @alexanderad?\n",
"@Lukasa I think I can take a look and provide a patch, thanks for the confirmation. I see some `ProxyError` handling on lines https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L442-L443, need to take closer look what _that_ `ProxyError` refers to in it's original purpose and why we don't see it in this case (perhaps it comes from system-wide configured proxy?)\n",
"@alexanderad So you can work out why we don't see it in this case by looking at the exception itself, which is a nested collection of ever-lower-level exceptions, which you can see more clearly here\n\n```\nConnectionError(\n MaxRetryError(\n \"HTTPConnectionPool(host='10.10.1.10', port=3128): Max retries exceeded with url: http://httpbin.org/ip (Caused by \n ProxyError('Cannot connect to proxy.', \n NewConnectionError(': Failed to establish a new connection: [Errno 51] Network is unreachable',)\n )\n )\",\n ),\n)\n```\n\nSo the specific problem is that this _kind_ of `ProxyError` now causes a `MaxRetryError`, where previously it did not. So we just need some error handling code in the `MaxRetryError` branch to look for a `ProxyError` in the `reason`, and reraise appropriately.\n\nGiven that `ProxyError` is a subclass of `ConnectionError`, this change should be non-breaking.\n",
"Thanks, I'll take care of this.\n",
"Fixed in #3059\n"
] |
https://api.github.com/repos/psf/requests/issues/3049
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3049/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3049/comments
|
https://api.github.com/repos/psf/requests/issues/3049/events
|
https://github.com/psf/requests/pull/3049
| 140,475,628 |
MDExOlB1bGxSZXF1ZXN0NjI2NjgxNTI=
| 3,049 |
Added unit tests for hooks module
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4",
"events_url": "https://api.github.com/users/Stranger6667/events{/privacy}",
"followers_url": "https://api.github.com/users/Stranger6667/followers",
"following_url": "https://api.github.com/users/Stranger6667/following{/other_user}",
"gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Stranger6667",
"id": 1236561,
"login": "Stranger6667",
"node_id": "MDQ6VXNlcjEyMzY1NjE=",
"organizations_url": "https://api.github.com/users/Stranger6667/orgs",
"received_events_url": "https://api.github.com/users/Stranger6667/received_events",
"repos_url": "https://api.github.com/users/Stranger6667/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Stranger6667",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-03-13T11:03:49Z
|
2021-09-08T04:01:10Z
|
2016-04-06T19:04:16Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3049/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3049/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3049.diff",
"html_url": "https://github.com/psf/requests/pull/3049",
"merged_at": "2016-04-06T19:04:16Z",
"patch_url": "https://github.com/psf/requests/pull/3049.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3049"
}
| true |
[
"This also seems reasonable to me, but @Stranger6667 is sat right next to me. ;) @sigmavirus24/@kennethreitz, mind doing an extra review for me? This is +1 from me.\n",
"Hello folks seems somehow I got placed on this email alias By mistake.\nAnyway I can be removed.\n\nThank you\n-Ryan\nOn Sun, Mar 13, 2016 at 7:24 AM Cory Benfield [email protected]\nwrote:\n\n> This also seems reasonable to me, but @Stranger6667\n> https://github.com/Stranger6667 is sat right next to me. ;)\n> @sigmavirus24/@kennethreitz https://github.com/kennethreitz, mind doing\n> an extra review for me? This is +1 from me.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/3049#issuecomment-195938314\n> .\n",
"@ryandebruyn This is managed by GitHub, it's not an email alias. You'll need to take some action to remove yourself via GitHub.\n",
"@ryandebruyn you may be \"Watching\" this repository in which case only you can fix that by visiting the project page while logged in and \"un-watching\" it.\n",
"Hello, @sigmavirus24 @Lukasa !\nI've updated this PR :)\n",
"I have one small note. =)\n",
"@Stranger6667 I don't like having tons of pull requests open, so I'm merging this. If you wouldn't mind, it would be appreciated if you opened another PR that addresses the notes @Lukasa left on your test. \n"
] |
|
https://api.github.com/repos/psf/requests/issues/3048
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3048/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3048/comments
|
https://api.github.com/repos/psf/requests/issues/3048/events
|
https://github.com/psf/requests/pull/3048
| 140,471,356 |
MDExOlB1bGxSZXF1ZXN0NjI2NjcxODU=
| 3,048 |
Added unit tests for structures module
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4",
"events_url": "https://api.github.com/users/Stranger6667/events{/privacy}",
"followers_url": "https://api.github.com/users/Stranger6667/followers",
"following_url": "https://api.github.com/users/Stranger6667/following{/other_user}",
"gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Stranger6667",
"id": 1236561,
"login": "Stranger6667",
"node_id": "MDQ6VXNlcjEyMzY1NjE=",
"organizations_url": "https://api.github.com/users/Stranger6667/orgs",
"received_events_url": "https://api.github.com/users/Stranger6667/received_events",
"repos_url": "https://api.github.com/users/Stranger6667/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Stranger6667",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-03-13T10:09:05Z
|
2021-09-08T04:01:10Z
|
2016-04-06T19:06:36Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3048/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3048/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3048.diff",
"html_url": "https://github.com/psf/requests/pull/3048",
"merged_at": "2016-04-06T19:06:36Z",
"patch_url": "https://github.com/psf/requests/pull/3048.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3048"
}
| true |
[
"@sigmavirus24 @kennethreitz Can one of you two do a separate code review of this? @Stranger6667 is here with me at PyCon SK writing some of these tests. I'm happy with these, but it'd be good if one of you two gave the ok/not-ok.\n",
"Hello @Lukasa @sigmavirus24 !\nI've updated this PR :)\n",
"Awesome, we're very close now @Stranger6667. One small note. =)\n",
"Looks pretty solid!\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3047
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3047/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3047/comments
|
https://api.github.com/repos/psf/requests/issues/3047/events
|
https://github.com/psf/requests/issues/3047
| 140,362,209 |
MDU6SXNzdWUxNDAzNjIyMDk=
| 3,047 |
requests:Failed to establish a new c onnection
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10515655?v=4",
"events_url": "https://api.github.com/users/chengshuyi/events{/privacy}",
"followers_url": "https://api.github.com/users/chengshuyi/followers",
"following_url": "https://api.github.com/users/chengshuyi/following{/other_user}",
"gists_url": "https://api.github.com/users/chengshuyi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chengshuyi",
"id": 10515655,
"login": "chengshuyi",
"node_id": "MDQ6VXNlcjEwNTE1NjU1",
"organizations_url": "https://api.github.com/users/chengshuyi/orgs",
"received_events_url": "https://api.github.com/users/chengshuyi/received_events",
"repos_url": "https://api.github.com/users/chengshuyi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chengshuyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chengshuyi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chengshuyi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-03-12T08:03:56Z
|
2021-09-08T19:00:28Z
|
2016-03-12T09:21:01Z
|
NONE
|
resolved
|
### I can run correctly in my pc,but unusual in Ali cloud server above.This problem has been bothering me three days.I expect someone can help me with this problem.
> On my Windows Server and Ubuntu Server,I conducted the following test
``` python
import requests,sys
url='http://up.photo.qzone.qq.com/cgi-bin/upload/cgi_upload_image?g_tk=1541189572'
files ={'files':('s.jpg',open(sys.path[0]+'\\user\\1005052183437642\\image\\3951851877517768-0.jpg','rb'))}
r = requests.post(url, files=files)
print(r.text)
#I got a response, it shows my server can upload pictures to that server.
```
> But I run the entire program, to execute the above code where exceptionally.Here is my code snippet
``` python
#I think you do not need to consider the correctness of his, because the entire program can be run on my pc
files=self.i_f()
for i in p_list:
files['filename']=(i[-20:],open(i, 'rb'))
r=requests.post(url=u_url,cookies=self.cj,files=files,headers=self.headers)
if self.p_r(r.text) is False:
print('fail:',i)
else:
print('success:',i)
```
> Here is the exception code appears.
>
> ``` python
> Microsoft Windows [版本 6.0.6002]
> 版权所有 (C) 2006 Microsoft Corporation。保留所有权利。
> upload_img_url: http://tjup.photo.qzone.qq.com/cgi-bin/upload/cgi_upload_image?g
> _tk=146481784
> Traceback (most recent call last):
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connection.py", line 137, in _new_conn
> (self.host, self.port), self.timeout, **extra_kw)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\util\connection.py", line 91, in create_con
> nection
> raise err
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\util\connection.py", line 81, in create_con
> nection
> sock.connect(sa)
> TimeoutError: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没
> 有反应,连接尝试失败。
> #Note
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connectionpool.py", line 559, in urlopen
> body=body, headers=headers)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connectionpool.py", line 353, in _make_requ
> est
> conn.request(method, url, **httplib_request_kw)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\htt
> p\client.py", line 1083, in request
> self._send_request(method, url, body, headers)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\htt
> p\client.py", line 1128, in _send_request
> self.endheaders(body)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\htt
> p\client.py", line 1079, in endheaders
> self._send_output(message_body)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\htt
> p\client.py", line 911, in _send_output
> self.send(msg)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\htt
> p\client.py", line 854, in send
> self.connect()
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connection.py", line 162, in connect
> conn = self._new_conn()
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connection.py", line 146, in _new_conn
> self, "Failed to establish a new connection: %s" % e)
> requests.packages.urllib3.exceptions.NewConnectionError: <requests.packages.urll
> ib3.connection.HTTPConnection object at 0x02168A10>: Failed to establish a new c
> onnection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反
> 应,连接尝试失败。
> #Note
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\adapters.py", line 376, in send
> timeout=timeout
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\connectionpool.py", line 609, in urlopen
> _stacktrace=sys.exc_info()[2])
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\packages\urllib3\util\retry.py", line 273, in increment
> raise MaxRetryError(_pool, url, error or ResponseError(cause))
> requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='tju
> p.photo.qzone.qq.com', port=80): Max retries exceeded with url: /cgi-bin/upload/
> cgi_upload_image?g_tk=146481784 (Caused by NewConnectionError('<requests.package
> s.urllib3.connection.HTTPConnection object at 0x02168A10>: Failed to establish a
> new connection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机
> 没有反应,连接尝试失败。',))
> #Note
> During handling of the above exception, another exception occurred:
> Traceback (most recent call last):
> File "main.py", line 10, in <module>
> util.work(u,p)
> File "C:\Users\Administrator\Documents\python\SnToTx\util.py", line 140, in wo
> rk
> p(u,pwd,pic_list,text,video,tcn,r['mid'])
> File "C:\Users\Administrator\Documents\python\SnToTx\util.py", line 94, in p
> rt=rm.r(p_list,t,v,n)
> File "C:\Users\Administrator\Documents\python\SnToTx\txModule\eModule\emotion.
> py", line 88, in r
> r=requests.post(url=u_url,cookies=self.cj,files=files,headers=self.headers)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\api.py", line 107, in post
> return request('post', url, data=data, json=json, **kwargs)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\api.py", line 53, in request
> return session.request(method=method, url=url, **kwargs)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\sessions.py", line 468, in request
> resp = self.send(prep, **send_kwargs)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\sessions.py", line 576, in send
> r = adapter.send(request, **kwargs)
> File "C:\Users\Administrator\AppData\Local\Programs\Python\Python35-32\lib\sit
> e-packages\requests\adapters.py", line 437, in send
> raise ConnectionError(e, request=request)
> requests.exceptions.ConnectionError: HTTPConnectionPool(host='tjup.photo.qzone.q
> q.com', port=80): Max retries exceeded with url: /cgi-bin/upload/cgi_upload_imag
> e?g_tk=146481784 (Caused by NewConnectionError('<requests.packages.urllib3.conne
> ction.HTTPConnection object at 0x02168A10>: Failed to establish a new connection
> : [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接
> 尝试失败。',))
> C:\Users\Administrator\Documents\python\SnToTx>
> ```
The above is the cmd gives me the error message,If you need additional information, please leave a message.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10515655?v=4",
"events_url": "https://api.github.com/users/chengshuyi/events{/privacy}",
"followers_url": "https://api.github.com/users/chengshuyi/followers",
"following_url": "https://api.github.com/users/chengshuyi/following{/other_user}",
"gists_url": "https://api.github.com/users/chengshuyi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chengshuyi",
"id": 10515655,
"login": "chengshuyi",
"node_id": "MDQ6VXNlcjEwNTE1NjU1",
"organizations_url": "https://api.github.com/users/chengshuyi/orgs",
"received_events_url": "https://api.github.com/users/chengshuyi/received_events",
"repos_url": "https://api.github.com/users/chengshuyi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chengshuyi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chengshuyi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chengshuyi",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3047/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3047/timeline
| null |
completed
| null | null | false |
[
"Th error being received is \"failed to establish a new connection\" to host `tjup.photo.qzone.qq.com`. This is probably specific to your machine, because it's working fine here, so I suspect this is specific to your network.\n\nDo you want to try seeing if you can connect directly to that host from your browser?\n",
"I've tried it, but now seems to have found where the problem is.\n\nOn my Windows server above,when i type `telnet tjup.photo.qzone.qq.com 80`,it shows connection fail. Then on my Windows 7,connection succeeded. Perhaps this is where the problem is, do you think\n",
"@chengshuyi Almost certainly, yes. =)\n",
"Well, then i must solve this problem.Thank you for your attention\n"
] |
https://api.github.com/repos/psf/requests/issues/3046
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3046/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3046/comments
|
https://api.github.com/repos/psf/requests/issues/3046/events
|
https://github.com/psf/requests/issues/3046
| 140,137,021 |
MDU6SXNzdWUxNDAxMzcwMjE=
| 3,046 |
python 3.5 requests 2.9.1 can not upload Chinese filename
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8475089?v=4",
"events_url": "https://api.github.com/users/pc10201/events{/privacy}",
"followers_url": "https://api.github.com/users/pc10201/followers",
"following_url": "https://api.github.com/users/pc10201/following{/other_user}",
"gists_url": "https://api.github.com/users/pc10201/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pc10201",
"id": 8475089,
"login": "pc10201",
"node_id": "MDQ6VXNlcjg0NzUwODk=",
"organizations_url": "https://api.github.com/users/pc10201/orgs",
"received_events_url": "https://api.github.com/users/pc10201/received_events",
"repos_url": "https://api.github.com/users/pc10201/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pc10201/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pc10201/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pc10201",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-03-11T09:38:41Z
|
2021-09-08T19:00:29Z
|
2016-03-11T15:17:37Z
|
NONE
|
resolved
|
source code
`import requests
headers = {
'Origin': 'http://home.ctfile.com',
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/49.0.2623.75 Safari/537.36',
'Accept': '_/_',
'Referer': 'http://home.ctfile.com/',
'Accept-Encoding': 'gzip, deflate',
'Accept-Language': 'zh-CN,zh;q=0.8',
}
files = {'file': open('中文.txt', 'rb')}
values = {'name': r'中文.txt', 'filesize': '101'}
url = 'xxxx'
r = requests.post(url, files=files, data=values, headers=headers)
print(r.content)`
I use fiddler to capture browser data.
The right data is:
Content-Disposition: form-data; name="file"; filename="中文.txt"
The python script data is
Content-Disposition: form-data; name="file"; filename*=中文.txt
so I change
Python35\Lib\site-packages\requests\packages\urllib3\fields.py
line 46-47 to
`# value = email.utils.encode_rfc2231(value, 'utf-8')
value = '%s="%s"' % (name, value)`
It is OK now.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3046/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3046/timeline
| null |
completed
| null | null | false |
[
"> The right data is:\n> Content-Disposition: form-data; name=\"file\"; filename=\"中文.txt\"\n\nThat is not the right data. This is:\n\n```\nContent-Disposition: form-data; name=\"file\"; filename*=utf-8''%E4%B8%AD%E6%96%87.txt\n```\n\nThis uses RFC 2231 already. To achieve it, change your code to this:\n\n``` python\nfiles = {'file': open(u'中文.txt', 'rb')}\nvalues = {'name': r'中文.txt', 'filesize': '101'}\n\nurl = 'xxxx'\nr = requests.post(url, files=files, data=values, headers=headers)\n```\n",
"The fiddler right data screenshot\n\n\n\n\n\nnot like\n filename*=utf-8''%E4%B8%AD%E6%96%87.txt\n",
"@pc10201 Did you change your code to match mine, _including_ the unicode string for opening the file?\n",
"I will try tomorrow.I will tell you result.\n",
"I tried your code.\nIt do not work\n\n\n\nIf you use windows,you can use fiddler to capture http data.\nif you use linux or mac os,you can use wireshark.\n",
"@pc10201 In what sense is that not working? I see the filename field formatted correctly.\n",
"The server response is not right.\nThe right data\n\n\n\nThe bad data\n\n\nPlease check the third line with each image.\n",
"Assuming for a moment that the different filenames are unrelated to this problem, the issue here is that the \"right\" case is not using RFC 2231 encoding. This is common, but _wrong_: the server needs to be able to handle the RFC 2231-encoded filename, which is what we're providing.\n",
"You are right.The server may cause this problem.Please close this issue.\n"
] |
https://api.github.com/repos/psf/requests/issues/3045
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3045/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3045/comments
|
https://api.github.com/repos/psf/requests/issues/3045/events
|
https://github.com/psf/requests/issues/3045
| 139,997,937 |
MDU6SXNzdWUxMzk5OTc5Mzc=
| 3,045 |
Version/language widget causes massive URLs/redirect loops
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5297556?v=4",
"events_url": "https://api.github.com/users/nicktimko/events{/privacy}",
"followers_url": "https://api.github.com/users/nicktimko/followers",
"following_url": "https://api.github.com/users/nicktimko/following{/other_user}",
"gists_url": "https://api.github.com/users/nicktimko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nicktimko",
"id": 5297556,
"login": "nicktimko",
"node_id": "MDQ6VXNlcjUyOTc1NTY=",
"organizations_url": "https://api.github.com/users/nicktimko/orgs",
"received_events_url": "https://api.github.com/users/nicktimko/received_events",
"repos_url": "https://api.github.com/users/nicktimko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nicktimko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nicktimko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nicktimko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-03-10T20:18:38Z
|
2021-09-08T18:00:56Z
|
2016-04-16T04:03:47Z
|
CONTRIBUTOR
|
resolved
|
The version/language selector widget causes problems when switching languages. As it's redundant with the Translations sidebar item, maybe removing the links would be OK?
## Reproducing
1. Go to http://docs.python-requests.org/en/master/user/install/ (i.e. _not_ the home page)
2. Click "v:master"
3. Click a language (e.g. "de")
4. End up at:
```
http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master
/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/u
ser_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/use
r_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_
builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_bu
ilds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_buil
ds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds
/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/r
equests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/req
uests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/reque
sts/rtd-builds/master/user_builds/requests/translations/de/latest/user/install/i
ndex/index/index/index/index/index/index/index/index/index/index/index/index/ind
ex/index/index/index/index/index/index/index/
```
## Conditions
- Chrome 48 / Safari 9.0.3
- OS X 10.11
## Expected/Desired
1. Go to http://docs.python-requests.org/en/master/
2. Click the "v:master" version button at the bottom right
3. Click your favorite language.
or:
1. Go to http://docs.python-requests.org/en/master/user/install/
2. Click your favorite language under "Translations" on the sidebar
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3045/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3045/timeline
| null |
completed
| null | null | false |
[
"Uh, so I don't reproduce that with Safari. What browser are you using?\n",
"@Lukasa both Chrome and Safari according to their \"Conditions\" section.\n",
"How about no browser:\n\n```\nimport requests\n\nurl = 'http://docs.python-requests.org/zh_CN/latest/user/install/'\nfor x in range(10):\n print(x, url)\n response = requests.get(url, allow_redirects=False)\n if response.status_code in (301, 302):\n url = response.headers['Location']\n```\n\nYields:\n\n```\n0 http://docs.python-requests.org/zh_CN/latest/user/install/\n1 http://docs.python-requests.org/en/master/user_builds/requests/translations/zh_CN/latest/user/install/index/\n2 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/\n3 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/\n4 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/\n5 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/index/\n6 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/index/index/\n7 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/index/index/index/\n8 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/index/index/index/index/\n9 http://docs.python-requests.org/en/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/rtd-builds/master/user_builds/requests/translations/zh_CN/latest/user/install/index/index/index/index/index/index/index/index/index/\n```\n",
"So my experience of this seems to be that it occurs only when you swap languages from one of the sub pages. If you do the swap at the top level it seems to work fine.\n\nThat _seems_ to me like it's probably an RTD issue? @ericholscher am I right about that, or am I on totally the wrong track here?\n",
"I did mention that in the initial issue ;)\n\nOn RTD's own documentation I can't seem to get it to trigger, but is that because their pages are completely flat (no subfolders it seems)...can't find any other projects using RTD + localization right now.\n",
"From an initial look, the subprojects are using sphinx's HTML builder, and the main project is using HTMLDir, so that URL's don't line up when switching languages. The redirect loop however is likely a bug :) We just deployed some new nginx configs so it's likely an issue with that.\n\neg. http://docs.python-requests.org/zh_CN/latest/user/install.html is a URL that works.\n\nI was just investigating a weird Nginx bug that caused this to happen when given a directory that exists but doesn't have an `index.html` file, as is the case here. I'll look at it more, but the main issue is that the URL is 404'ing in the first place. \n"
] |
https://api.github.com/repos/psf/requests/issues/3044
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3044/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3044/comments
|
https://api.github.com/repos/psf/requests/issues/3044/events
|
https://github.com/psf/requests/issues/3044
| 139,776,178 |
MDU6SXNzdWUxMzk3NzYxNzg=
| 3,044 |
Suggest to user to install 'certifi' Python module if CA bundle is not found and 'certifi' is not loaded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14837209?v=4",
"events_url": "https://api.github.com/users/aahancoc/events{/privacy}",
"followers_url": "https://api.github.com/users/aahancoc/followers",
"following_url": "https://api.github.com/users/aahancoc/following{/other_user}",
"gists_url": "https://api.github.com/users/aahancoc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aahancoc",
"id": 14837209,
"login": "aahancoc",
"node_id": "MDQ6VXNlcjE0ODM3MjA5",
"organizations_url": "https://api.github.com/users/aahancoc/orgs",
"received_events_url": "https://api.github.com/users/aahancoc/received_events",
"repos_url": "https://api.github.com/users/aahancoc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aahancoc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aahancoc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aahancoc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-03-10T03:46:53Z
|
2021-09-08T19:00:29Z
|
2016-03-10T09:01:47Z
|
NONE
|
resolved
|
Requests grabs its CA bundle by either using the 'certifi' library, or if it's not installed, looking at the hard-coded filename 'cacert.pem'. The problem is that if you're on a Linux distro (ex: OpenSUSE) that doesn't have 'certifi' installed by default and doesn't have the CA bundle named 'cacert.pem', all you get is an unhelpful error telling you it could not find a suitable SSL CA certificate bundle.
What I'm suggesting is that, if the filename look fails and 'certifi' is not loaded, Requests adds a message in the "could not find a suitable SSL CA certificate bundle" error message that suggests that the user install 'certifi' to resolve the problem. It would save a lot of headache for people who aren't savvy enough to search the source code and figure out what they need to install in the first place.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3044/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3044/timeline
| null |
completed
| null | null | false |
[
"@aahancoc Requests ships the `cacert.pem` bundle by default, which means that if you're on a Linux distribution and it cannot be found your distribution removed that file themselves. This means your distribution broke your install.\n\nI recommend you take it up with the people breaking our software downstream. =)\n",
"We may not want to encourage the distros to modify thecodebase even further.\n\nUgh. \n"
] |
https://api.github.com/repos/psf/requests/issues/3043
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3043/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3043/comments
|
https://api.github.com/repos/psf/requests/issues/3043/events
|
https://github.com/psf/requests/pull/3043
| 139,608,622 |
MDExOlB1bGxSZXF1ZXN0NjIyNDUyNTU=
| 3,043 |
Refactoring and Improving Readability
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2357025?v=4",
"events_url": "https://api.github.com/users/tusharmakkar08/events{/privacy}",
"followers_url": "https://api.github.com/users/tusharmakkar08/followers",
"following_url": "https://api.github.com/users/tusharmakkar08/following{/other_user}",
"gists_url": "https://api.github.com/users/tusharmakkar08/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tusharmakkar08",
"id": 2357025,
"login": "tusharmakkar08",
"node_id": "MDQ6VXNlcjIzNTcwMjU=",
"organizations_url": "https://api.github.com/users/tusharmakkar08/orgs",
"received_events_url": "https://api.github.com/users/tusharmakkar08/received_events",
"repos_url": "https://api.github.com/users/tusharmakkar08/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tusharmakkar08/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tusharmakkar08/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tusharmakkar08",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
}
] |
closed
| true | null |
[] | null | 11 |
2016-03-09T15:28:46Z
|
2016-03-16T20:28:22Z
|
2016-03-09T18:11:26Z
|
NONE
| null |
Since `and` condition is there, `verify is not True` will be always false.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3043/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3043/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3043.diff",
"html_url": "https://github.com/psf/requests/pull/3043",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3043.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3043"
}
| true |
[
"Hey @tusharmakkar08,\n\nThanks for sending this pull request. Could you explain what bug you were seeing that made you change this? What instigated this change? It's not clear that this exhibits the same behaviour it did before, especially since tests have failed pretty consistently across all versions of Python.\n",
"@sigmavirus24 : In line 174 in `adapters.py` `if url.lower().startswith('https') and verify` is there and then `if not verify` is there. Since `and` condition is there, `if not verify` won't be `True`. The other thing is more pythonic way of if/else i.e in line 200 \n",
"@tusharmakkar08 your commit no longer affects that region and simply turns an easy to read if/else statement into a ternary which is not \"pythonic\" nor is it easier to read.\n\nI'll ask again: What bug are you seeing that prompted this (if there was any bug)?\n",
"There was no bug per se. I was trying to remove lines for increase in readability.\n",
"Fewer lines does not lead to an increase in readability. In this instance, the ternary conditional is particularly unclear when combined with tuple unpacking assignment as well as creating an intermediate tuple in many cases. \n\nIn this instance, I suggest that the only clear readability improvement would come from using tuple unpacking in the first branch of the conditional. I don't think any change beyond that is warranted. \n",
"We're not interested in random refactorings that change logic for no apparent reason. Sorry @tusharmakkar08 \n",
"@Lukasa : I have done couple of more changes and few of them are [anti-patterns](https://www.quantifiedcode.com/app/issue_class/62b0f5b7b69e4a2498568b32bfa30991) as suggested by http://docs.quantifiedcode.com/python-code-patterns/ . This isn't random refactoring, we are removing anti-patterns from code @sigmavirus24. \n\nThanks. \n",
"@tusharmakkar08 while your contributions are appreciated, I want to give you feedback — all of the changes present in this pull request actually _reduce_ readability, significantly.\n",
"@sigmavirus24 @Lukasa @kennethreitz: Since requests is being mentioned over [here](http://docs.python-guide.org/en/latest/writing/reading/) among the best python codes, I believe these kind of antipatterns shouldn't exist over here. \n",
"@tusharmakkar08 please keep in mind that your continuous arguments are emailing roughly 841 people. Since it doesn't seem as if you will respectfully let this conversation end and it serves to benefit none of those 841 people (because it has little value to the project as a whole and is likely _not_ why they are subscribed) I am ending the conversation forcefully by locking it.\n",
"Antipatterns are an antipattern. :)\n"
] |
https://api.github.com/repos/psf/requests/issues/3042
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3042/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3042/comments
|
https://api.github.com/repos/psf/requests/issues/3042/events
|
https://github.com/psf/requests/pull/3042
| 139,476,640 |
MDExOlB1bGxSZXF1ZXN0NjIxODAzMDY=
| 3,042 |
Fix api.rst References
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/72231?v=4",
"events_url": "https://api.github.com/users/bsandrow/events{/privacy}",
"followers_url": "https://api.github.com/users/bsandrow/followers",
"following_url": "https://api.github.com/users/bsandrow/following{/other_user}",
"gists_url": "https://api.github.com/users/bsandrow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bsandrow",
"id": 72231,
"login": "bsandrow",
"node_id": "MDQ6VXNlcjcyMjMx",
"organizations_url": "https://api.github.com/users/bsandrow/orgs",
"received_events_url": "https://api.github.com/users/bsandrow/received_events",
"repos_url": "https://api.github.com/users/bsandrow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bsandrow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bsandrow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bsandrow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-09T05:04:06Z
|
2021-09-08T04:01:14Z
|
2016-03-09T05:35:45Z
|
CONTRIBUTOR
|
resolved
|
`api.rst` references `requests.ConnectTimeout` and `requests.ReadTimeout`, but they aren't imported into the top-level of the package.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3042/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3042/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3042.diff",
"html_url": "https://github.com/psf/requests/pull/3042",
"merged_at": "2016-03-09T05:35:45Z",
"patch_url": "https://github.com/psf/requests/pull/3042.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3042"
}
| true |
[
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3041
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3041/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3041/comments
|
https://api.github.com/repos/psf/requests/issues/3041/events
|
https://github.com/psf/requests/pull/3041
| 139,476,341 |
MDExOlB1bGxSZXF1ZXN0NjIxODAyMzE=
| 3,041 |
Fix autofunction Reference
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/72231?v=4",
"events_url": "https://api.github.com/users/bsandrow/events{/privacy}",
"followers_url": "https://api.github.com/users/bsandrow/followers",
"following_url": "https://api.github.com/users/bsandrow/following{/other_user}",
"gists_url": "https://api.github.com/users/bsandrow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bsandrow",
"id": 72231,
"login": "bsandrow",
"node_id": "MDQ6VXNlcjcyMjMx",
"organizations_url": "https://api.github.com/users/bsandrow/orgs",
"received_events_url": "https://api.github.com/users/bsandrow/received_events",
"repos_url": "https://api.github.com/users/bsandrow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bsandrow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bsandrow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bsandrow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-09T05:02:18Z
|
2021-09-08T04:01:14Z
|
2016-03-09T08:59:44Z
|
CONTRIBUTOR
|
resolved
|
`requests.codes` is a class (`LookupDict`), not a function.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3041/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3041/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3041.diff",
"html_url": "https://github.com/psf/requests/pull/3041",
"merged_at": "2016-03-09T08:59:44Z",
"patch_url": "https://github.com/psf/requests/pull/3041.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3041"
}
| true |
[
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3040
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3040/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3040/comments
|
https://api.github.com/repos/psf/requests/issues/3040/events
|
https://github.com/psf/requests/issues/3040
| 139,220,774 |
MDU6SXNzdWUxMzkyMjA3NzQ=
| 3,040 |
got response with 'Connection: close', instead of closing connection but still send message body
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4945739?v=4",
"events_url": "https://api.github.com/users/zjulmh/events{/privacy}",
"followers_url": "https://api.github.com/users/zjulmh/followers",
"following_url": "https://api.github.com/users/zjulmh/following{/other_user}",
"gists_url": "https://api.github.com/users/zjulmh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zjulmh",
"id": 4945739,
"login": "zjulmh",
"node_id": "MDQ6VXNlcjQ5NDU3Mzk=",
"organizations_url": "https://api.github.com/users/zjulmh/orgs",
"received_events_url": "https://api.github.com/users/zjulmh/received_events",
"repos_url": "https://api.github.com/users/zjulmh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zjulmh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zjulmh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zjulmh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-08T09:10:58Z
|
2021-09-08T19:00:31Z
|
2016-03-08T09:13:05Z
|
NONE
|
resolved
|
When I post a big file to a server with invalid authorization, the server response with a `400` error response and with `"Connection: close"` header before read all the request body and then close the connection, requests still send message body and get `connection aborted` exception, but according to [RFC7230](https://tools.ietf.org/html/rfc7230#section-6.5), the client `SHOULD` monitor the network connection for an error response and immediately cease transmitting the body and close its side of the connection.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3040/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3040/timeline
| null |
completed
| null | null | false |
[
"@zjulmh This is a known problem. Unfortunately, the underlying HTTP library we're using makes it essentially impossible to actually do this: we only find out about the error response when the connection gets closed.\n\nWe'd need to replace httplib to avoid this problem, which is a substantial amount of work.\n"
] |
https://api.github.com/repos/psf/requests/issues/3039
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3039/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3039/comments
|
https://api.github.com/repos/psf/requests/issues/3039/events
|
https://github.com/psf/requests/issues/3039
| 139,213,935 |
MDU6SXNzdWUxMzkyMTM5MzU=
| 3,039 |
Proxy settings not working
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7224106?v=4",
"events_url": "https://api.github.com/users/alejcas/events{/privacy}",
"followers_url": "https://api.github.com/users/alejcas/followers",
"following_url": "https://api.github.com/users/alejcas/following{/other_user}",
"gists_url": "https://api.github.com/users/alejcas/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alejcas",
"id": 7224106,
"login": "alejcas",
"node_id": "MDQ6VXNlcjcyMjQxMDY=",
"organizations_url": "https://api.github.com/users/alejcas/orgs",
"received_events_url": "https://api.github.com/users/alejcas/received_events",
"repos_url": "https://api.github.com/users/alejcas/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alejcas/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alejcas/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alejcas",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 14 |
2016-03-08T08:43:32Z
|
2021-09-08T19:00:31Z
|
2016-03-08T08:54:08Z
|
NONE
|
resolved
|
HI, using urllib2, I can connect through a proxy without problems. But, using requests it's not working:
http://stackoverflow.com/questions/35851227/python-requests-behind-proxy
Can anyone help?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3039/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3039/timeline
| null |
completed
| null | null | false |
[
"Don't use `https` schemes. Your proxy dict should be:\n\n``` python\nproxy_dict = {\n 'http': 'http://10.20.23.5:8080',\n 'https': 'http://10.20.23.5:8080',\n}\n```\n",
"It is not working...\n\nWhy urllib2 is doing it correctly and requests is unable to handle this?\n",
"I notice in your urllib2 code you aren't actually providing a list of proxies. That means it is reading the proxy information from the environment. Requests does that automatically, with no need to provide the proxies dictionary. Try removing it entirely and see if requests sorts the problem out for you.\n",
"Same result with status code 407.\n\npage = requests.get('http://www.google.com')\n\nThis returns:\n'407 Proxy Authentication Required' \n",
"@janscas What operating system are you using?\n",
"It's windows 8.1.\nI'm behind a corporate proxy (Isa Server Forefront TMG)\n\nIt seams the urllib2 can take the proxy settings from the envirenment.\n\nAfter doing this: \n\nproxy = urllib2.ProxyHandler({})\nopener = urllib2.build_opener(proxy)\nurllib2.install_opener(opener)\n\nurllib2.getproxies() returns a dictionary with the correct proxies.\n",
"What's the result of `urllib.getproxies()`, please?\n",
"it s python dict containing the proxies used by the system:\n\nproxy_dict = urllib2.getproxies()\nprint proxy_dict\n{'ftp': 'ftp://10.20.23.5:8080', 'http': 'http://10.20.23.5:8080', 'https': 'https://10.20.23.5:8080'}\n",
"Ok, can you print the headers from the 407 please?\n",
"{'Content-Length': '4128', 'Via': '1.1 SRVMADISA08', 'Proxy-Connection': 'Keep-Alive', 'Proxy-Authenticate': 'Negotiate, Kerberos, NTLM', 'Connection': 'Keep-Alive', 'Pragma': 'no-cache', 'Cache-Control': 'no-cache', 'Content-Type': 'text/html'}\n",
"Ok, so the problem here is that Requests does not out of the box support any of those authorization schemes for proxies. [Requests-NTLM](https://github.com/requests/requests-ntlm) does provide support for authenticating to a proxy, but it will require that you pass your domain credentials to requests NTLM.\n",
"But... why urllib2 does this out of the box?\n",
"@janscas I have no idea. If you want to try to dive deeper into what urllib2 is doing that we're not, you could use Wireshark to check the differences between the request that requests sends and the one that urllib2 sends.\n",
"ok, I'll check this and report back.\n\nThanks a lot\n"
] |
https://api.github.com/repos/psf/requests/issues/3038
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3038/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3038/comments
|
https://api.github.com/repos/psf/requests/issues/3038/events
|
https://github.com/psf/requests/issues/3038
| 139,189,046 |
MDU6SXNzdWUxMzkxODkwNDY=
| 3,038 |
Order of request headers should be preserved when sent to origin server
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/764688?v=4",
"events_url": "https://api.github.com/users/davidfstr/events{/privacy}",
"followers_url": "https://api.github.com/users/davidfstr/followers",
"following_url": "https://api.github.com/users/davidfstr/following{/other_user}",
"gists_url": "https://api.github.com/users/davidfstr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/davidfstr",
"id": 764688,
"login": "davidfstr",
"node_id": "MDQ6VXNlcjc2NDY4OA==",
"organizations_url": "https://api.github.com/users/davidfstr/orgs",
"received_events_url": "https://api.github.com/users/davidfstr/received_events",
"repos_url": "https://api.github.com/users/davidfstr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/davidfstr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidfstr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/davidfstr",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 20 |
2016-03-08T06:04:36Z
|
2021-08-30T00:06:18Z
|
2016-04-29T21:47:35Z
|
NONE
|
resolved
|
#### Summary
If I made a request with code that looks like:
```
requests.get(
'http://127.0.0.1:8765/',
headers=OrderedDict([
('X-Header-1', 'ignoreme'),
('X-Header-2', 'ignoreme'),
('X-Header-3', 'ignoreme'),
('X-Header-4', 'ignoreme'),
])
)
```
I expect the recipient HTTP server to receive the X headers in the same order at they were listed in the code above. However I receive them in a random non-deterministic order.
#### Environment
- Requests 2.9.1
- Python 3.4.1
#### Repro Steps
- Download files in the following gist: https://gist.github.com/davidfstr/0dca5a133fa397cadd92
- Run reflect_server.py in terminal window 1.
- Run reflect_client.py in terminal window 2.
#### Expected Results
- Terminal window 1 shows a request made with the X headers in consecutive order from 1 to 4.
#### Actual Results
- Terminal window 1 shows a request made with the X headers in random non-deterministic order.
#### Notes
The following closed older issues suggest that preservation of order is indeed important:
- https://github.com/kennethreitz/requests/issues/284, https://github.com/kennethreitz/requests/issues/179
- https://github.com/kennethreitz/requests/issues/2668
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3038/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3038/timeline
| null |
completed
| null | null | false |
[
"FWIW, I don't think any of those issues actually indicate that preservation of order on the wire is important. It _is_, but not for any of those reasons. The reason it's important is that headers can in principle be split up into multiple instances if the header is a comma-separated list, and maintaining the order of those parts is important (or the header ends up taking on a different meaning).\n\nThe fundamental reason this doesn't work is because we store the headers in our `CaseInsensitiveDict`, which does not preserve order. At this point we may want to look harder at replacing our own implementation with the one from urllib3, which _does_ preserve order.\n\nThoughts @kennethreitz, @sigmavirus24?\n",
"You should be able to provide your headers as a list of tuples, and the order will be preserved. We do not support OrderedDict this way (though I wouldn't be against fixing this). \n",
"@kennethreitz Nope, that got broken a _long_ time ago for headers. Back in 2012 it looks like.\n",
"Well, it's supposed to work :)\n",
"That code got pretty complicated. \n",
"@kennethreitz We never had a test that enforces that invariant, which means that commit 366e8e849877aea44ce96abebd4f26f5bcce12fb didn't spot that we broke it in the rewrite for v2.0.\n",
"(Might be worth reading my summary above to understand _why_ it doesn't work.)\n",
"Yeah, it was always a \"secret\" feature, so I prob chose to abandon it during the refactor. \n",
"I think this would be a nice feature to restore, but not worth complicating the codebase over. \n",
"@kennethreitz We could always replace the `CaseInsensitiveDict` used in this place with the [`HTTPHeaderMap`](https://github.com/Lukasa/hyper/blob/development/hyper/common/headers.py) from hyper.\n",
"It would still require case-insensitivy, and have a `repr()` identical to a standard dict. \n\nI don't think swapping out core collections this far along in the project is a good idea, so we should lean towards \"no\" as this is lightly discussed. \n",
"Fair enough. =)\n",
"Thank you for your consideration. It sounds like I should take a look at using the underlying urllib3 instead then.\n\nIn case you're curious, I'm writing a lightweight web service that forwards arbitrary HTTP requests to an origin HTTP server and caches+persists the responses. Effectively a caching HTTP proxy. Since the origin HTTP server could depend on all kinds of interesting properties of the original request, the proxy makes special efforts to change the original request and response as little as possible.\n",
"In case other readers are curious, it appears that urllib3 does in fact preserve the order of request headers.\n\nUnfortunately it does not preserve the order of _response_ headers. https://github.com/shazow/urllib3/issues/821 😕\n",
"I have been monkey-patching requests to achieve ordering of request headers for over a year. For all my usecases it is enough just to change underlying `_store` of `CaseInsensitiveDict` from `dict` to `collections.OrderedDict`.\n\n``` patch\nclass CaseInsensitiveDict(collections.MutableMapping):\n def __init__(self, data=None, **kwargs):\n- self._store = dict()\n+ self._store = collections.OrderedDict()\n```\n\nThen I pass headers to request method as `OrderedDict`, just like OP in the example above. I would be nice if this become merged upstream. I don't see any problems or incompatibilities such a change can bring in.\n",
"@Lukasa if @piotrjurkiewicz's patch has no unintended side-effects, I'm +1 for incorporating this. \n",
"I'm fine with it as well, would you like to raise a PR @piotrjurkiewicz?\n",
"For anyone interested, Akamai seems to block requests with the wrong order:\r\n```\r\n$ curl -v -H \"$UA\" -H \"$ACCEPT\" -H \"$ENCODING\" $URL |& grep '< HTTP'\r\n< HTTP/1.1 403 Forbidden\r\n$ curl -v -H \"$ACCEPT\" -H \"$UA\" -H \"$ENCODING\" $URL |& grep '< HTTP'\r\n< HTTP/1.1 301 Moved Permanently\r\n```",
"The problem seems to be still there I tried this code:\r\n```\r\nheaders = collections.OrderedDict([('Host', \"site.com\"),\r\n ('User-Agent', 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:64.0) Gecko/20100101 Firefox/64.0'),\r\n ('Accept', 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8'),\r\n ('Accept-Language', 'it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3'),\r\n ('Accept-Encoding', 'gzip, deflate'),\r\n ('DNT', '1'),\r\n ('Connection', 'close'),\r\n ('Upgrade-Insecure-Requests','1')])\r\n\r\nr = requests.get(\"http://site.com\", headers=headers, proxies=proxy)\r\n```\r\nAnd the request intercepted with Burp is:\r\n```\r\nGET / HTTP/1.1\r\nUser-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.11; rv:64.0) Gecko/20100101 Firefox/64.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8\r\nConnection: close\r\nHost: site.com\r\nAccept-Language: it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3\r\nDNT: 1\r\nUpgrade-Insecure-Requests: 1\r\n```\r\nThe order is not maintained. I am using python 3.7 and requests 2.21.0",
"I would be interested to know why the order is subtly changed as @AlessioCantina described above -- is it being automatically sorted into the correct order according to the user-agent?"
] |
https://api.github.com/repos/psf/requests/issues/3037
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3037/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3037/comments
|
https://api.github.com/repos/psf/requests/issues/3037/events
|
https://github.com/psf/requests/issues/3037
| 139,087,894 |
MDU6SXNzdWUxMzkwODc4OTQ=
| 3,037 |
using ipv6 with requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8584783?v=4",
"events_url": "https://api.github.com/users/1a1a11a/events{/privacy}",
"followers_url": "https://api.github.com/users/1a1a11a/followers",
"following_url": "https://api.github.com/users/1a1a11a/following{/other_user}",
"gists_url": "https://api.github.com/users/1a1a11a/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/1a1a11a",
"id": 8584783,
"login": "1a1a11a",
"node_id": "MDQ6VXNlcjg1ODQ3ODM=",
"organizations_url": "https://api.github.com/users/1a1a11a/orgs",
"received_events_url": "https://api.github.com/users/1a1a11a/received_events",
"repos_url": "https://api.github.com/users/1a1a11a/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/1a1a11a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/1a1a11a/subscriptions",
"type": "User",
"url": "https://api.github.com/users/1a1a11a",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-07T20:40:02Z
|
2021-09-08T19:00:31Z
|
2016-03-07T20:49:53Z
|
NONE
|
resolved
|
Hi I read previous issue on using requests on ipv6, but I don't quite understand how to update the HOST header, could someone point out? (suppose I already have ipv6 address of a website, how do request it?)
Or is there any better solution available now?
Thank you!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3037/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3037/timeline
| null |
completed
| null | null | false |
[
"It's not clear what you mean when you say \"the previous issue\", but I assume it is #3002, which is a bug. You can work around it by doing `requests.get('https://[fe80::00]', headers={'Host': 'fe80::00'})`. Alternatively, wait for the next release, which should contain a fix.\n"
] |
https://api.github.com/repos/psf/requests/issues/3036
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3036/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3036/comments
|
https://api.github.com/repos/psf/requests/issues/3036/events
|
https://github.com/psf/requests/pull/3036
| 138,918,924 |
MDExOlB1bGxSZXF1ZXN0NjE4ODYyNDk=
| 3,036 |
Allow for exceptions from tell()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2016-03-07T08:33:52Z
|
2021-09-08T04:01:13Z
|
2016-03-11T13:30:53Z
|
MEMBER
|
resolved
|
Resolves #3035.
I'm not actually sure that this approach is the right one: I'm inclined to say that, when an exception is hit from `tell()`, that we may want to assume we don't know the length at all (return length 0) and use chunked-transfer encoding. That's a particularly good idea in this case, as frequently stdin has an unknown length altogether.
Thoughts on that point @sigmavirus24 and @jkbrzt?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3036/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3036/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3036.diff",
"html_url": "https://github.com/psf/requests/pull/3036",
"merged_at": "2016-03-11T13:30:53Z",
"patch_url": "https://github.com/psf/requests/pull/3036.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3036"
}
| true |
[
"> I'm inclined to say that, when an exception is hit from tell(), that we may want to assume we don't know the length at all (return length 0) and use chunked-transfer encoding\n\nExactly my thoughts :+1: \n\nBtw, I think `None` would be a better value to represent an unknown length. Length of zero can still be a valid/known one.\n",
"@jkbrzt That's true, it can, but in this requests we also send zero-length files via chunked-transfer encoding. This is because some files report having zero length but still have content in them: again, stdin on some platforms exhibits this behaviour.\n\nI think we broke our fall-back logic there when we added the `tell()` code, so that makes this a regression fix. Regardless, let's switch to that approach. \n",
"FWIW, doing things this way can cause the chunk size to be relatively small (each chunk will be one line in the example you've provided) because we naively iterate over the file, so @jkbrzt you may want to have some fancy detection logic for this case that _enables_ Nagle's algorithm. In the case of the test file here we sent about 40 tinygrams (fewer than 30 data bytes in each packet) which is not the most efficient way to use the network.\n\nRequests could _in principle_ do this too, but it's not clear to me that that's useful in the general case, and detecting the appropriate situation might be hard. We could _try_ to do something smarter here (around line 431 in models.py we catch the problem of calling `super_len()` on generators, so we may want to consider treating `length == 0` and `length == None` differently, where `length == 0` causes us to wrap the file in a generator that issues multiple large reads rather than iterating.\n\nNot sure how I feel about that though.\n",
"That auto-detection could live as an adapter in the toolbelt. That said, I think the better solution for httpie in this case is to do detect stdin file uploads and create a generator that iterates over larger chunks of stdin which is then passed to data, e.g., (untested pseudo-code)\n\n``` py\nif using_stdin:\n requests.post(url, data=chunk_stdin())\n\ndef chunk_stdin():\n # Set stdin to be non-blocking\n while True:\n chunk = sys.stdin.read(STDIN_CHUNK_SIZE)\n if not chunk:\n break\n yield chunk\n```\n",
"This looks fine to me by the way.\n",
"@Lukasa can you add release notes for this too?\n",
"@sigmavirus24 Done. =)\n",
"I love this change. \n",
"@kennethreitz Each time we change something from guessing a content length to just saying \"chunked is fine for this\", I feel like we're striking a victory for the way the web _should_ have been, rather than the status quo. \n",
"@Lukasa Thanks for the fix! \n\nBtw, I think it would useful if there was a warning in the docs about that resulting TCP inefficiency of simply passing a file object you mention above. The current example is simply `data=f`.\n",
"@jkbrzt Yeah, there's probably some value in having a section in the advanced docs about TCP efficiency. Wanna write it? ;)\n"
] |
https://api.github.com/repos/psf/requests/issues/3035
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3035/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3035/comments
|
https://api.github.com/repos/psf/requests/issues/3035/events
|
https://github.com/psf/requests/issues/3035
| 138,887,017 |
MDU6SXNzdWUxMzg4ODcwMTc=
| 3,035 |
Sending piped stdin results in OSError: [Errno 29] Illegal seek
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/326885?v=4",
"events_url": "https://api.github.com/users/jkbrzt/events{/privacy}",
"followers_url": "https://api.github.com/users/jkbrzt/followers",
"following_url": "https://api.github.com/users/jkbrzt/following{/other_user}",
"gists_url": "https://api.github.com/users/jkbrzt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jkbrzt",
"id": 326885,
"login": "jkbrzt",
"node_id": "MDQ6VXNlcjMyNjg4NQ==",
"organizations_url": "https://api.github.com/users/jkbrzt/orgs",
"received_events_url": "https://api.github.com/users/jkbrzt/received_events",
"repos_url": "https://api.github.com/users/jkbrzt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jkbrzt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkbrzt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jkbrzt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-03-07T04:52:44Z
|
2021-09-08T19:00:29Z
|
2016-03-11T13:30:53Z
|
CONTRIBUTOR
|
resolved
|
I'm working on adding support for streamed uploads via `stdin` to HTTPie. It works fine when `stdin` redirected from a file, but fails with `OSError: [Errno 29] Illegal seek` when piping programs together.
**`test.py`:**
``` python
#!/usr/bin/env python
import sys
import requests
r = requests.post(
'http://httpbin.org/post',
data=getattr(sys.stdin, 'buffer', sys.stdin)
)
print(r.text)
```
:smile: **works:**
``` bash
$ python test.py < test.py
```
:disappointed: **fails:**
```
$ cat test.py | python test.py
```
``` python
Traceback (most recent call last):
File "test.py", line 5, in <module>
r = requests.post('http://httpbin.org/post', data=sys.stdin.buffer)
File "python3.5/site-packages/requests/api.py", line 107, in post
return request('post', url, data=data, json=json, **kwargs)
File "python3.5/site-packages/requests/api.py", line 53, in request
return session.request(method=method, url=url, **kwargs)
File "python3.5/site-packages/requests/sessions.py", line 454, in request
prep = self.prepare_request(req)
File "python3.5/site-packages/requests/sessions.py", line 388, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "python3.5/site-packages/requests/models.py", line 296, in prepare
self.prepare_body(data, files, json)
File "python3.5/site-packages/requests/models.py", line 430, in prepare_body
length = super_len(data)
File "python3.5/site-packages/requests/utils.py", line 86, in super_len
current_position = o.tell()
OSError: [Errno 29] Illegal seek
```
---
Tested on OS X 10.11.3 + Python 3.5.1 & 2.7.11 with `requests==2.9.1`.
---
I imagine checking for that error in the appropriate [`except` clause in model.py:431 ](https://github.com/kennethreitz/requests/blob/4f378b0e1a2f5f60e2e57daaec80081483f7150e/requests/models.py#L431) would do the trick. Or a more robust `super_len()`. I'll be happy to submit a pull request, if interested.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3035/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3035/timeline
| null |
completed
| null | null | false |
[
"@jkbrzt I think you're probably right. I'll have a little play today and see if I can convince myself you're definitely right, and then submit a patch.\n"
] |
https://api.github.com/repos/psf/requests/issues/3034
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3034/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3034/comments
|
https://api.github.com/repos/psf/requests/issues/3034/events
|
https://github.com/psf/requests/pull/3034
| 138,842,401 |
MDExOlB1bGxSZXF1ZXN0NjE4NTYzMjE=
| 3,034 |
Support non-ASCII Reason Phrase as per RFC2616
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/447615?v=4",
"events_url": "https://api.github.com/users/rwe/events{/privacy}",
"followers_url": "https://api.github.com/users/rwe/followers",
"following_url": "https://api.github.com/users/rwe/following{/other_user}",
"gists_url": "https://api.github.com/users/rwe/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rwe",
"id": 447615,
"login": "rwe",
"node_id": "MDQ6VXNlcjQ0NzYxNQ==",
"organizations_url": "https://api.github.com/users/rwe/orgs",
"received_events_url": "https://api.github.com/users/rwe/received_events",
"repos_url": "https://api.github.com/users/rwe/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rwe/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rwe/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rwe",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
}
] |
closed
| true | null |
[] | null | 2 |
2016-03-06T21:50:37Z
|
2021-09-08T04:01:14Z
|
2016-03-09T21:31:41Z
|
CONTRIBUTOR
|
resolved
|
RFC2616 specifies that header values and reason phrases are encoded in in ISO-8859-1 or RFC2047 (MIME for text). RFC7230, which obsoletes RC2616, recommends treating non-ASCII header values as 'opaque data'.
In the case of the Reason Phrase, however, we should try to decode it as described due to its possible/encouraged localization and due to its purpose for display to the user; treating it as 'opaque data' is not useful and prevents its intended use.
Note that several production servers, including Apache Tomcat, send responses conforming to RFC2616. Servers that respond only in US-ASCII are unaffected by this change. This _only_ regresses (by introducing possible encoding errors) for servers that send responses conforming to neither RFC, which I believe are either uncommon or non-existent.
Fixes kennethreitz/requests#3009
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3034/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3034/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3034.diff",
"html_url": "https://github.com/psf/requests/pull/3034",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3034.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3034"
}
| true |
[
"I'm sorry, but this reasoning is still not accurate, and so this code is still not right.\n\nFirstly, RFC 2616 is not relevant anymore except inasmuch as it provides guidance to what older implementations may do.\n\nSecondly, this argument is wrong:\n\n> In the case of the Reason Phrase, however, we should try to decode it as described due to its possible localization and its purpose for display to the user; treating it as 'opaque data' is not useful and prevents its intended use.\n\nRFC 7230 is very clear on this:\n\n> The reason-phrase element exists for the sole purpose of providing a textual description associated with the numeric status code, mostly out of deference to earlier Internet application protocols that were more frequently used with interactive text clients. A client SHOULD ignore the reason-phrase content.\n\nThat is to say, we are strongly advised by RFC 7230 to ignore the reason phrase _entirely_ rather than to do any special handling of it.\n\nNow, I do believe that we should aim to fix the original problem, but the original problem is _entirely_ that the default encode/decode cycle of Python was busted. For that reason, we should restrict the scope of our fix to the `raise_for_status` method (any anywhere else where we try to do computation on the reason phrase), at which point we should use `requests.utils.to_native_string` with the `ISO-8859-1` encoding. That does the minimal translation to avoid anything exploding when doing the string interpolation, but otherwise leaves the reason phrase unmolested.\n",
"@erydo expressed on the original issue that they were going hands off. That implies to me that they're done working on this if we're not going to accept their changes. If I'm wrong, I'll be happy to reopen this if they wish to follow our recommendations for the appropriate fix. Otherwise, this will sit open for a while and I'd rather that not happen.\n"
] |
https://api.github.com/repos/psf/requests/issues/3033
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3033/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3033/comments
|
https://api.github.com/repos/psf/requests/issues/3033/events
|
https://github.com/psf/requests/issues/3033
| 138,629,976 |
MDU6SXNzdWUxMzg2Mjk5NzY=
| 3,033 |
Moving documentation translation to translating platform
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2891235?v=4",
"events_url": "https://api.github.com/users/caizixian/events{/privacy}",
"followers_url": "https://api.github.com/users/caizixian/followers",
"following_url": "https://api.github.com/users/caizixian/following{/other_user}",
"gists_url": "https://api.github.com/users/caizixian/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/caizixian",
"id": 2891235,
"login": "caizixian",
"node_id": "MDQ6VXNlcjI4OTEyMzU=",
"organizations_url": "https://api.github.com/users/caizixian/orgs",
"received_events_url": "https://api.github.com/users/caizixian/received_events",
"repos_url": "https://api.github.com/users/caizixian/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/caizixian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caizixian/subscriptions",
"type": "User",
"url": "https://api.github.com/users/caizixian",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
}
] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 10 |
2016-03-05T01:22:38Z
|
2024-05-20T14:35:33Z
|
2024-05-20T14:35:32Z
|
NONE
| null |
#624 There're a bunch of translation repos under https://github.com/requests
They all started with a specific commit version of https://github.com/kennethreitz/requests , and then the translators worked on it.
But we lack a proper process to merge updates from upstream repo to these i18n repos. Translators cannot track what they need to retranslate if there's such change in upstream. As a result, some translation might be outdated, and therefore misleading. Maybe moving the whole translation project to Transifex or crowdin is a better idea.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3033/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3033/timeline
| null |
completed
| null | null | false |
[
"I'm really intrigued by this idea. @kennethreitz?\n",
"Transifex has a really nice permissions system, you can set it to auto\nupdate the source translation from a url and a useful API.\n\nOn Sat, 5 Mar 2016 17:11 Cory Benfield [email protected] wrote:\n\n> I'm really intrigued by this idea. @kennethreitz\n> https://github.com/kennethreitz?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3033#issuecomment-192600901\n> .\n> \n> ## \n> \n> _Stewart Polley_\n> Head of Technical Support\n> [email protected] | www.elvanto.com\n",
"@requests/translation-team\n",
"I'm not at all familiar with Transifex, or how it works. \n\nI think our currently solution is \"pretty good\", and doubt that having a better workflow available will somehow encourage the translators to keep their translations up to date (a few translations are much more ad-lib and freeform, for example). These translations seem to be a \"hack on it over the weekend, once\" type of thing, and are pretty low priority. \n",
"There is a guide for documentation internationalization.\nhttp://www.sphinx-doc.org/en/stable/intl.html\n\nThe basic flow is \n",
"There's an example of inconsistency between i18n repo and main repo\n\nrequests/requests-docs-cn#1\n",
"I think this is worth revisiting again. This definitely is valuable for our translators who do a lot of incredibly important work for this project.",
"Have you looked at Transifex?\n\nFree for open source projects.\n\nOn Sun, 30 Jul. 2017, 10:20 am Ian Stapleton Cordasco, <\[email protected]> wrote:\n\n> I think this is worth revisiting again. This definitely is valuable for\n> our translators who do a lot of incredibly important work for this project.\n>\n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/requests/requests/issues/3033#issuecomment-318868541>,\n> or mute the thread\n> <https://github.com/notifications/unsubscribe-auth/AGk_7R-erNnEvu1UpOneSf9fRk5oPBc4ks5sS8xRgaJpZM4Hp3gK>\n> .\n>\n",
"@TetraEtc I appreciate that you're replying from your email but that's exactly what we're discussing here. Please read existing issue history in the future.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/3032
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3032/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3032/comments
|
https://api.github.com/repos/psf/requests/issues/3032/events
|
https://github.com/psf/requests/pull/3032
| 138,621,807 |
MDExOlB1bGxSZXF1ZXN0NjE3ODk3Njk=
| 3,032 |
Changes for #3028
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3794108?v=4",
"events_url": "https://api.github.com/users/davidsoncasey/events{/privacy}",
"followers_url": "https://api.github.com/users/davidsoncasey/followers",
"following_url": "https://api.github.com/users/davidsoncasey/following{/other_user}",
"gists_url": "https://api.github.com/users/davidsoncasey/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/davidsoncasey",
"id": 3794108,
"login": "davidsoncasey",
"node_id": "MDQ6VXNlcjM3OTQxMDg=",
"organizations_url": "https://api.github.com/users/davidsoncasey/orgs",
"received_events_url": "https://api.github.com/users/davidsoncasey/received_events",
"repos_url": "https://api.github.com/users/davidsoncasey/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/davidsoncasey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidsoncasey/subscriptions",
"type": "User",
"url": "https://api.github.com/users/davidsoncasey",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-03-05T00:01:21Z
|
2021-09-08T04:01:15Z
|
2016-03-06T18:07:59Z
|
NONE
|
resolved
|
I've overwritten the `__contains__` method of `RequestsCookieJar` as discussed in the issue. It looks to be working as expected and tests pass, but this is my first contribution to this project so I could be missing something.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3032/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3032/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3032.diff",
"html_url": "https://github.com/psf/requests/pull/3032",
"merged_at": "2016-03-06T18:07:59Z",
"patch_url": "https://github.com/psf/requests/pull/3032.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3032"
}
| true |
[
"Thanks for this @davidsoncasey, this pretty much looks right! I left a few small notes inline for you to take a look at, but one those are addressed we can merge this!\n",
"@Lukasa great, thanks for taking a look! I'll have a chance later today to make those edits.\n",
"@Lukasa I added a couple fixes, take a look and let me know if there's anything else!\n",
"\\o/ Thanks so much @davidsoncasey! :sparkles: :cake: :sparkles:\n"
] |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.