url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | draft | pull_request | is_pull_request | issue_comments
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/psf/requests/issues/3650
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3650/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3650/comments
|
https://api.github.com/repos/psf/requests/issues/3650/events
|
https://github.com/psf/requests/issues/3650
| 185,522,789 |
MDU6SXNzdWUxODU1MjI3ODk=
| 3,650 |
Cannot import requests when filename is email.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2752396?v=4",
"events_url": "https://api.github.com/users/kocsenc/events{/privacy}",
"followers_url": "https://api.github.com/users/kocsenc/followers",
"following_url": "https://api.github.com/users/kocsenc/following{/other_user}",
"gists_url": "https://api.github.com/users/kocsenc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kocsenc",
"id": 2752396,
"login": "kocsenc",
"node_id": "MDQ6VXNlcjI3NTIzOTY=",
"organizations_url": "https://api.github.com/users/kocsenc/orgs",
"received_events_url": "https://api.github.com/users/kocsenc/received_events",
"repos_url": "https://api.github.com/users/kocsenc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kocsenc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kocsenc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kocsenc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-26T22:04:00Z
|
2021-09-08T14:00:44Z
|
2016-10-26T23:43:34Z
|
NONE
|
resolved
|
This is a weird error, but if you attempt to import requests from a file named `email.py`, the import will fail.
## Steps to Reproduce
1. `mkdir potato && cd potato`
2. `vim email.py`
3. `echo 'import requests' > email.py`
4. `python email.py`
## Actual Result (bug)
Instead of importing successfully, it fails and complains about missing dependencies.
``` bash
$ python email.py
Traceback (most recent call last):
File "email.py", line 1, in <module>
import requests
File "/Library/Python/2.7/site-packages/requests/__init__.py", line 60, in <module>
from .packages.urllib3.exceptions import DependencyWarning
File "/Library/Python/2.7/site-packages/requests/packages/__init__.py", line 29, in <module>
import urllib3
ImportError: No module named urllib3
```
## Machine information
Using the macOS default Python `v2.7.10` and its default `pip`; requests is installed at `/Library/Python/2.7/site-packages/requests`.
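For context, a minimal sketch (a hypothetical helper script, not part of the original report) that makes the shadowing visible; Python searches the script's own directory before the standard library, so `import email` resolves to the local file:
``` python
# check_shadowing.py -- hypothetical helper, not from the original report.
# Run it from the directory that contains email.py.
import email

# With a local email.py present, this prints the path of ./email.py instead of
# the standard library's email/__init__.py. requests (via its vendored urllib3)
# imports the standard-library email module, so shadowing it breaks `import requests`.
print(email.__file__)
```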
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3650/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3650/timeline
| null |
completed
| null | null | false |
[
"`email` is a standard library package name, you should _never_ shadow those. We, rely on it not being shadowed and use it, which is why you cannot import requests. Please change your file name.\n"
] |
https://api.github.com/repos/psf/requests/issues/3649
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3649/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3649/comments
|
https://api.github.com/repos/psf/requests/issues/3649/events
|
https://github.com/psf/requests/issues/3649
| 185,342,992 |
MDU6SXNzdWUxODUzNDI5OTI=
| 3,649 |
The data= parameter barfs on all dictionary-like objects that don't subclass dict itself
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1041265?v=4",
"events_url": "https://api.github.com/users/gruns/events{/privacy}",
"followers_url": "https://api.github.com/users/gruns/followers",
"following_url": "https://api.github.com/users/gruns/following{/other_user}",
"gists_url": "https://api.github.com/users/gruns/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gruns",
"id": 1041265,
"login": "gruns",
"node_id": "MDQ6VXNlcjEwNDEyNjU=",
"organizations_url": "https://api.github.com/users/gruns/orgs",
"received_events_url": "https://api.github.com/users/gruns/received_events",
"repos_url": "https://api.github.com/users/gruns/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gruns/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gruns/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gruns",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2016-10-26T09:37:18Z
|
2021-09-08T14:00:40Z
|
2016-11-03T15:58:22Z
|
NONE
|
resolved
|
Example:
``` python
import requests
from collections import MutableMapping
class mydict(MutableMapping):
    def __init__(self, *args, **kwargs):
        self.d = dict(*args, **kwargs)
    def __iter__(self):
        return iter(self.d)
    def __getitem__(self, key):
        return self.d.get(key)
    def __setitem__(self, key, value):
        self.d[key] = value
    def __delitem__(self, key):
        del self.d[key]
    def __len__(self):
        return len(self.d)
d = {'key':'value'}
myd = mydict(d)
# Works.
print requests.post('http://httpbin.org/post', data=d).json().get('form')
# Should also work but doesn't.
print requests.post('http://httpbin.org/post', data=myd).json().get('form')
```
In models.py, the `is_stream` check
``` python
is_stream = all([
hasattr(data, '__iter__'),
not isinstance(data, (basestring, list, tuple, dict))
])
```
explicitly checks if the `data` parameter is an instance of `dict`, so only
instances of `dict` can be passed through `data=`. All other dictionary-like
objects later bomb with
``` python
TypeError: must be convertible to a buffer, not mydict
```
Is it intentional that all dictionary-like objects that aren't explicit `dict`
subclasses are rejected? I can't fathom why.
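For illustration, a minimal sketch of the direction suggested in the comments that follow, checking against `Mapping` rather than `dict` (Python 3 names are assumed here; `looks_like_stream` is a hypothetical helper, not the actual requests code):
``` python
# Sketch only: any Mapping is treated as form data, so MutableMapping
# subclasses like mydict above would no longer be sent as a stream.
from collections.abc import Mapping

def looks_like_stream(data):
    return all([
        hasattr(data, '__iter__'),
        not isinstance(data, (str, bytes, list, tuple, Mapping)),
    ])
```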
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3649/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3649/timeline
| null |
completed
| null | null | false |
[
"Nope, probably we should check for `Mapping` instead. Would you like to provide a PR to fix this up?\n",
"We had this at some point. Why did we get rid of this?\n",
"I'm honestly not sure. =(\n",
"Hello,\nI am a seasoned Python developer and I want to make my first steps \nas an open source contributor. Can you assign this bug to me?\n",
"@gardiac2002 Unfortunately, GitHub doesn't allow assigning issues to anyone other than contributors or the person who opened the issue. However, no-one on the team is currently working on this, so you are welcome to work on it and open a pull request.\n",
"@Lukasa thank you, I am going to take a look at the issue :)\n",
"I created a pull request for the issue: https://github.com/kennethreitz/requests/pull/3652\nAll tests ran successful from Python2.6 to Python3.5\n",
"It looks like this is now resolved thanks to @gardiac2002's work in #3652.\n"
] |
https://api.github.com/repos/psf/requests/issues/3638
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3638/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3638/comments
|
https://api.github.com/repos/psf/requests/issues/3638/events
|
https://github.com/psf/requests/issues/3638
| 185,192,077 |
MDU6SXNzdWUxODUxOTIwNzc=
| 3,638 |
malformed request block
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12615662?v=4",
"events_url": "https://api.github.com/users/persiaAziz-zz/events{/privacy}",
"followers_url": "https://api.github.com/users/persiaAziz-zz/followers",
"following_url": "https://api.github.com/users/persiaAziz-zz/following{/other_user}",
"gists_url": "https://api.github.com/users/persiaAziz-zz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/persiaAziz-zz",
"id": 12615662,
"login": "persiaAziz-zz",
"node_id": "MDQ6VXNlcjEyNjE1NjYy",
"organizations_url": "https://api.github.com/users/persiaAziz-zz/orgs",
"received_events_url": "https://api.github.com/users/persiaAziz-zz/received_events",
"repos_url": "https://api.github.com/users/persiaAziz-zz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/persiaAziz-zz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/persiaAziz-zz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/persiaAziz-zz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-10-25T18:08:30Z
|
2021-09-08T14:00:45Z
|
2016-10-25T18:13:36Z
|
NONE
|
resolved
|
Specifying both Content-Length and Transfer-Encoding in the header dictionary creates a malformed request block. I think it should throw an exception and abort the request.
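For illustration, a minimal sketch of the pattern the maintainers recommend in the comments that follow: never set Content-Length or Transfer-Encoding by hand and let Requests derive the framing from the body type (httpbin.org is assumed here purely as a test endpoint):
``` python
# Sketch only: Requests picks the framing headers from the body type.
import requests

def gen():
    yield b'pforpersia,'
    yield b'champaignurbana'

# A generator body makes Requests use chunked transfer encoding on its own;
# a bytes or str body makes it compute Content-Length. Neither header is set manually.
resp = requests.post('http://httpbin.org/post', data=gen())
print(resp.status_code)
```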
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3638/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3638/timeline
| null |
completed
| null | null | false |
[
"I'm going to close this in favour of #3637.\n",
"This is a different issue by the way . Code to reproduce:\n\n`\nimport gevent\nimport socket\nimport requests\nimport os\nfrom threading import Thread\nimport sys\nbSTOP = False\ndef handleResponse(response,_args, *_kwargs):\n print(response.status_code)\n\ndef gen():\n yield 'pforpersia,champaignurbana'.encode('utf-8')\n yield 'there'.encode('utf-8')\n\ndef txn_replay():\n try:\n request_session = requests.Session()\n hostname = \"127.0.0.1\"\n port = \"8080\"\n request_session.proxies = {\"http\": \"http://{0}:{1}\".format(hostname, port)}\n hdr = {'Host':'www.blabla.com','content-type': 'application/json', 'Content-Length':'20' \n, 'Content-MD5':'5f4308e950ab4d7188e96ddf740855ec', 'Transfer-Encoding':'Chunked'}\n#, 'Content-Length':'20'\n body=gen()\n response = request_session.get('http://blabla.com/blabla', headers=hdr, stream=True, data=body)\n except UnicodeEncodeError as e:\n print(\"UnicodeEncodeError exception\")\n\n```\nexcept requests.exceptions.ContentDecodingError as e:\n print(\"ContentDecodingError\",e)\nexcept:\n e=sys.exc_info()\n print(\"ERROR in requests: \",e)\n```\n\ndef main():\n txn_replay()\n\nif **name** == '**main**':\n main()\n`\n",
"**wireshark**\n\n`GET http://blabla.com/blabla HTTP/1.1\nAccept: _/_\nConnection: keep-alive\nUser-Agent: python-requests/2.10.0\nAccept-Encoding: gzip, deflate\ncontent-type: application/json\nContent-Length: 20\nHost: www.blabla.com\nContent-MD5: 5f4308e950ab4d7188e96ddf740855ec\nTransfer-Encoding: chunked\n\npforpersia,champaignurbanatherethere\n\nHTTP/1.1 502 Success\nDate: Tue, 25 Oct 2016 20:41:08 GMT\nConnection: close\nServer: ATS/7.1.0\nCache-Control: no-store\nContent-Type: text/html\nContent-Language: en\nContent-Length: 247`\n",
"The problem is that Session generates the request block which is basically malformed with transfer-encoding and content-length being specified and adding message body. This should either take the content-length field out or raise an exception \n",
"This behaviour has been known about Requests for a long time and is intentional. Requests basically totally disregards the user-set headers that have anything to do with request framing (content-length or transfer-encoding). You must not set them.\n"
] |
https://api.github.com/repos/psf/requests/issues/3637
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3637/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3637/comments
|
https://api.github.com/repos/psf/requests/issues/3637/events
|
https://github.com/psf/requests/issues/3637
| 185,191,185 |
MDU6SXNzdWUxODUxOTExODU=
| 3,637 |
Chunked uploads do not deduplicate the Host header field when provided by the user
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12615662?v=4",
"events_url": "https://api.github.com/users/persiaAziz-zz/events{/privacy}",
"followers_url": "https://api.github.com/users/persiaAziz-zz/followers",
"following_url": "https://api.github.com/users/persiaAziz-zz/following{/other_user}",
"gists_url": "https://api.github.com/users/persiaAziz-zz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/persiaAziz-zz",
"id": 12615662,
"login": "persiaAziz-zz",
"node_id": "MDQ6VXNlcjEyNjE1NjYy",
"organizations_url": "https://api.github.com/users/persiaAziz-zz/orgs",
"received_events_url": "https://api.github.com/users/persiaAziz-zz/received_events",
"repos_url": "https://api.github.com/users/persiaAziz-zz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/persiaAziz-zz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/persiaAziz-zz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/persiaAziz-zz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2016-10-25T18:04:42Z
|
2021-09-08T11:00:28Z
|
2017-05-01T14:26:28Z
|
NONE
|
resolved
|
requests.Session() adds an extra Host field to the header dictionary passed as an argument, even if that field is already specified in the dictionary. This happens when Transfer-Encoding is specified in the header dictionary. Having a duplicate Host field in the header confuses Apache Traffic Server.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3637/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3637/timeline
| null |
completed
| null | null | false |
[
"Requests should not send an extra `Host` header field. I can't reproduce this bug at all. Can you provide reproduction code?\n",
"``` python\nimport gevent\nimport socket\nimport requests\nimport os\nfrom threading import Thread\nimport sys\nbSTOP = False\ndef handleResponse(response,*args, **kwargs):\n print(response.status_code)\n\ndef gen():\n yield 'pforpersia,champaignurbana'.encode('utf-8')\n yield 'there'.encode('utf-8')\n\ndef txn_replay():\n try:\n request_session = requests.Session()\n hostname = \"127.0.0.1\"\n port = \"8080\"\n request_session.proxies = {\"http\": \"http://{0}:{1}\".format(hostname, port)}\n hdr = {'Host':'www.blabla.com','content-type': 'application/json',\n, 'Content-MD5':'5f4308e950ab4d7188e96ddf740855ec', 'Transfer-Encoding':'Chunked'}\n body=gen()\n response = request_session.get('http://blabla.com/blabla', headers=hdr, stream=True, data=body)\n\n except UnicodeEncodeError as e:\n print(\"UnicodeEncodeError exception\")\n\n except requests.exceptions.ContentDecodingError as e:\n print(\"ContentDecodingError\",e)\n except:\n e=sys.exc_info()\n print(\"ERROR in requests: \",e)\n\ndef main():\n txn_replay()\n\nif __name__ == '__main__':\n main()\n```\n",
"**Wireshark:**\n\n`1a\npforpersia,champaignurbana\n5\nthere\nGET http://blabla.com/blabla HTTP/1.1\nHost: blabla.com\nAccept: _/_\nAccept-Encoding: gzip, deflate\nConnection: keep-alive\ncontent-type: application/json\nTransfer-Encoding: chunked\nContent-MD5: 5f4308e950ab4d7188e96ddf740855ec\nHost: www.blabla.com\n\n1a\npforpersia,champaignurbana\n5\nthere\n0\n\nHTTP/1.1 400 Invalid HTTP Request\nDate: Tue, 25 Oct 2016 20:25:16 GMT\nConnection: keep-alive\nServer: ATS/7.1.0\nCache-Control: no-store\nContent-Type: text/html\nContent-Language: en\nContent-Length: 220`\n",
"There are two host fields in the request block now. \n",
"Some notes on your code:\n- Please don't provide the Host header, _especially_ as you're just providing the header Requests would set.\n- Doubly, don't provide the Transfer-Encoding header under any circumstances. Requests will not obey it, so it's better to let Requests set its own.\n- Where is that User-Agent coming from?\n\nI continue to be unable to reproduce this locally. Can you tell me more about what your environment is? Requests version, Python version, OS.\n",
"Please ignore the user agent field. I am gathering data from apache traffic server log. The logged requests have all those fields and I am replaying those requests using the python requests library. I have requests 2.10.0 installed. I am running on Ubuntu16.04 with python3.5\n",
"So, we should stop for a moment.\n\nIf your goal is to replay a log as accurately as possible, Requests is a bad choice for you. Requests will try to do a lot of things to be helpful, and all of those things have the potential to change the framing of the request. In particular, you cannot just set the Transfer-Encoding field and expect Requests to obey you: that's not how Requests works.\n\nRegardless, there _is_ a bug here: it's to do with how Requests sends generators. Right now requests has its own code for doing chunked uploads, and that code is clearly not hitting the \"deduplicate Host header\" path that it should. Probably this means we should start using `request_chunked` from `urllib3`, except that _also_ doesn't strip the `Host` header. So, two bugs really.\n",
"The urllib3 issue is tracked in shazow/urllib3#1009.\n",
"@Lukasa, it looks like this was wrapped up in shazow/urllib3#1018. Are there any outstanding pieces left here?",
"I don't think so!"
] |
https://api.github.com/repos/psf/requests/issues/3636
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3636/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3636/comments
|
https://api.github.com/repos/psf/requests/issues/3636/events
|
https://github.com/psf/requests/pull/3636
| 185,164,170 |
MDExOlB1bGxSZXF1ZXN0OTA4NDA5NTQ=
| 3,636 |
Update SSL options on proxy pool manager.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1126767?v=4",
"events_url": "https://api.github.com/users/jortel/events{/privacy}",
"followers_url": "https://api.github.com/users/jortel/followers",
"following_url": "https://api.github.com/users/jortel/following{/other_user}",
"gists_url": "https://api.github.com/users/jortel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jortel",
"id": 1126767,
"login": "jortel",
"node_id": "MDQ6VXNlcjExMjY3Njc=",
"organizations_url": "https://api.github.com/users/jortel/orgs",
"received_events_url": "https://api.github.com/users/jortel/received_events",
"repos_url": "https://api.github.com/users/jortel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jortel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jortel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jortel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-10-25T16:13:09Z
|
2021-09-08T01:21:45Z
|
2016-11-15T09:45:32Z
|
NONE
|
resolved
|
In a previous PR, the SSL options are updated on the main pool manager in `get_connection()` but not when the selected pool manager is a proxy pool manager. This PR fixes that.
In addition, I renamed the `_pool_kw_lock` because it's used in `get_connection()` to create a _critical section_ for both updating the selected pool manager _connection_pool_kw_ and calling `connection_from_url()` without being preempted.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3636/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3636/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3636.diff",
"html_url": "https://github.com/psf/requests/pull/3636",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3636.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3636"
}
| true |
[
"Thanks for this @jortel!\n\nI think that @jeremycline is planning a related PR that focuses on changes in urllib3 first. @jeremycline, how does this relate to your plan and what are your thoughts about this approach versus yours?\n",
"Hey @jortel, I filed #3633 and attached a few proposed patches. Let me know what you think. I need to take the time to write a few tests before I submit a PR, but I will probably have time for that tomorrow night. \n\nI like making a change to urllib3 first because that way the issue of thread safety is all bundled into urllib3 rather than the responsibility of each user of urllib3. \n",
"@jeremycline I like what you've proposed in #3633 and will close this PR in favor of that effort pending the outcome.\n"
] |
https://api.github.com/repos/psf/requests/issues/3635
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3635/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3635/comments
|
https://api.github.com/repos/psf/requests/issues/3635/events
|
https://github.com/psf/requests/issues/3635
| 185,029,246 |
MDU6SXNzdWUxODUwMjkyNDY=
| 3,635 |
session with proxy get error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1875609?v=4",
"events_url": "https://api.github.com/users/Arion-Dsh/events{/privacy}",
"followers_url": "https://api.github.com/users/Arion-Dsh/followers",
"following_url": "https://api.github.com/users/Arion-Dsh/following{/other_user}",
"gists_url": "https://api.github.com/users/Arion-Dsh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Arion-Dsh",
"id": 1875609,
"login": "Arion-Dsh",
"node_id": "MDQ6VXNlcjE4NzU2MDk=",
"organizations_url": "https://api.github.com/users/Arion-Dsh/orgs",
"received_events_url": "https://api.github.com/users/Arion-Dsh/received_events",
"repos_url": "https://api.github.com/users/Arion-Dsh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Arion-Dsh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Arion-Dsh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Arion-Dsh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-10-25T06:25:49Z
|
2021-08-29T00:06:40Z
|
2016-10-26T08:18:28Z
|
NONE
|
resolved
|
➜ ~ python --version
Python 3.5.2
requests 2.11.1
A requests session with a proxy raises this error:
```
Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 595, in urlopen
chunked=chunked)
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 393, in _make_request
six.raise_from(e, None)
File "<string>", line 2, in raise_from
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 389, in _make_request
httplib_response = conn.getresponse()
File "/usr/lib/python3.5/http/client.py", line 1197, in getresponse
response.begin()
File "/usr/lib/python3.5/http/client.py", line 297, in begin
version, status, reason = self._read_status()
File "/usr/lib/python3.5/http/client.py", line 266, in _read_status
raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/lib/python3.5/site-packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 640, in urlopen
_stacktrace=sys.exc_info()[2])
File "/usr/lib/python3.5/site-packages/requests/packages/urllib3/util/retry.py", line 287, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
requests.packages.urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='*.*.*.*', port=8123): Max retries exceeded with url: http://1212.ip138.com/ic.asp (Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response',)))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "proxy.py", line 34, in <module>
pg = s.get(url, auth=ProxyAuth(), timeout=(5, 20))
File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 488, in get
return self.request('GET', url, **kwargs)
File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 596, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3.5/site-packages/requests/adapters.py", line 485, in send
raise ProxyError(e, request=request)
requests.exceptions.ProxyError: HTTPConnectionPool(host='*.*.*.*', port=8123): Max retries exceeded with url: http://1212.ip138.com/ic.asp (Caused by ProxyError('Cannot connect to proxy.', RemoteDisconnected('Remote end closed connection without response',)))
```
My code is like this:
```
import requests
...
proxy = {'http': 'http://%s:%s' %(url, port), 'https': 'http://%s:%s' % (url, port)}
s = requests.Session()
s.proxies.update(proxy)
s.headers.update({'Proxy-Authorization': authHeader})
resp = s.get(url, timeout=6)
print(resp)
```
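For illustration, a minimal sketch of what the first comment below recommends: put the proxy credentials into the proxy URL itself rather than setting a `Proxy-Authorization` header on the session (the credentials, proxy address, and target URL are placeholders):
``` python
# Sketch only -- username, password, proxy address and target URL are placeholders.
import requests

username, password = 'user', 'secret'
proxy_host, proxy_port = '127.0.0.1', 8123

proxy_url = 'http://%s:%s@%s:%s' % (username, password, proxy_host, proxy_port)
proxies = {'http': proxy_url, 'https': proxy_url}

s = requests.Session()
s.proxies.update(proxies)   # no Proxy-Authorization header needed
resp = s.get('http://example.org/', timeout=6)
print(resp.status_code)
```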
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3635/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3635/timeline
| null |
completed
| null | null | false |
[
"I strongly recommend you provide proxy authorization header values in the URL instead, like this:\n\n``` python\nproxy = {'http': 'http://%s:%s@%s:%s\" % (username, password, url, port)'}\n```\n\nSimply adding session headers like you've done here doesn't necessarily actually send them to the proxy. This is particularly true if you're making HTTPS requests. Try removing the `Proxy-Authorization` header and doing the above instead.\n",
"@Lukasa Actually , I can't change the proxy authorization, It's not my proxy server. For now, I use the urllib.request instead. Is it possible this authorization work with requests? \n",
"I'm not asking you to change the authorization itself, just where you put it in your code.\n",
"> I strongly recommend you provide proxy authorization header values in the URL instead, like this:\r\n> \r\n> ```python\r\n> proxy = {'http': 'http://%s:%s@%s:%s\" % (username, password, url, port)'}\r\n> ```\r\n> \r\n> Simply adding session headers like you've done here doesn't necessarily actually send them to the proxy. This is particularly true if you're making HTTPS requests. Try removing the `Proxy-Authorization` header and doing the above instead.\r\n\r\nHi @Lukasa ,\r\nI receive same error and could not resolve.What i should do to overcome this problem?\r\nWhere should i add this : proxy = {'http': 'http://%s:%s@%s:%s\" % (username, password, url, port)'}",
"> e should i add this : proxy = {'http': 'http://%s:%s@%s:%s\" % (username, pass\r\n\r\nYou can add this in your code."
] |
https://api.github.com/repos/psf/requests/issues/3634
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3634/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3634/comments
|
https://api.github.com/repos/psf/requests/issues/3634/events
|
https://github.com/psf/requests/pull/3634
| 184,698,536 |
MDExOlB1bGxSZXF1ZXN0OTA1MjAwMDc=
| 3,634 |
Keep ``verify`` setting when no CA_BUNDLE variable exists
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-10-23T16:07:44Z
|
2021-09-08T02:10:25Z
|
2016-10-25T08:56:53Z
|
CONTRIBUTOR
|
resolved
|
If the `trust_env` flag is set on a session and `verify` is `True`
or `None`, the environment is checked for `CURL_CA_BUNDLE` and
`REQUESTS_CA_BUNDLE`. Before this patch, if neither existed,
`verify` would always be set to `None` rather than `True` when it
was originally `True`.
I found this while working on a patch for https://github.com/kennethreitz/requests/issues/3633.
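For illustration, a minimal sketch of the behaviour this PR asks for (`merge_verify` is a hypothetical helper, not the actual patch): fall back to an environment bundle only when one is actually set, otherwise keep the caller's original `verify` value.
``` python
# Sketch only -- illustrates the intended behaviour, not the real requests code.
import os

def merge_verify(verify, trust_env=True):
    if trust_env and (verify is True or verify is None):
        env_bundle = (os.environ.get('REQUESTS_CA_BUNDLE')
                      or os.environ.get('CURL_CA_BUNDLE'))
        if env_bundle:
            return env_bundle
    # With no bundle variable set, the original value (e.g. True) is preserved.
    return verify
```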
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3634/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3634/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3634.diff",
"html_url": "https://github.com/psf/requests/pull/3634",
"merged_at": "2016-10-25T08:56:53Z",
"patch_url": "https://github.com/psf/requests/pull/3634.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3634"
}
| true |
[
"@Lukasa I've adjusted the test.\n",
"@Lukasa no worries. I haven't really worked with pytest and didn't see examples of patching in the existing tests so I got lazy :frowning_face:.\n",
"Hurrah! No need to worry @jeremycline, this is what code review is for! I'm delighted to say that I'm now happy with this patch, and ready to merge. Thanks so much! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3633
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3633/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3633/comments
|
https://api.github.com/repos/psf/requests/issues/3633/events
|
https://github.com/psf/requests/issues/3633
| 184,638,014 |
MDU6SXNzdWUxODQ2MzgwMTQ=
| 3,633 |
HTTPS requests through proxies in proposed/3.0.0 aren't configured correctly
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-10-22T15:59:26Z
|
2021-09-08T04:00:44Z
|
2018-01-29T13:17:25Z
|
CONTRIBUTOR
|
resolved
|
In current master:
```
>>> import requests
>>> requests.__version__
'2.11.1'
>>> session = requests.Session()
>>> r = session.get('https://www.jcline.org/', verify=True, proxies={'http': 'http://vagrant:vagrant@localhost:3128', 'https': 'http://vagrant:vagrant@localhost:3128'})
>>>
```
In current proposed/3.0.0:
```
>>> import requests
>>> requests.__version__
'3.0.0'
>>> session = requests.Session()
>>> r = session.get('https://www.jcline.org/', verify=True, proxies={'http': 'http://vagrant:vagrant@localhost:3128', 'https': 'http://vagrant:vagrant@localhost:3128'})
requests/packages/urllib3/connectionpool.py:838: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/security.html
InsecureRequestWarning)
>>>
```
This is a problem I introduced in https://github.com/kennethreitz/requests/pull/3109 :disappointed:. What happens right now is if a request is _not_ through a proxy and it's HTTPS, the urllib3 pool manager's `connection_pool_kw` are updated before requesting a new connection using [requests.adapters.HTTPAdapter._update_poolmanager_ssl_kw](https://github.com/kennethreitz/requests/blob/proposed/3.0.0/requests/adapters.py#L204). If it _is_ through a proxy, the keywords aren't updated and the request is made with the default settings for urllib3.
To me, the most appealing way to fix this is to add a keyword argument, `connection_kwargs` or something, to all the `urllib3.poolmanager.PoolManager.connection_from_*` methods that is either merged into `connection_pool_kw` or overrides them. That way `urllib3` can handle getting the connection pool with the new kwargs in a thread-safe manner. Currently, `requests` has to manage updating the keys and getting the new connection pool with a lock. It seems like that would be better in `urllib3`.
The other option is to patch up what's currently in `HTTPAdapter` so it handles updating the proxy manager or plain pool manager based on whether proxies are in use.
What do people think?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1977525?v=4",
"events_url": "https://api.github.com/users/jeremycline/events{/privacy}",
"followers_url": "https://api.github.com/users/jeremycline/followers",
"following_url": "https://api.github.com/users/jeremycline/following{/other_user}",
"gists_url": "https://api.github.com/users/jeremycline/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jeremycline",
"id": 1977525,
"login": "jeremycline",
"node_id": "MDQ6VXNlcjE5Nzc1MjU=",
"organizations_url": "https://api.github.com/users/jeremycline/orgs",
"received_events_url": "https://api.github.com/users/jeremycline/received_events",
"repos_url": "https://api.github.com/users/jeremycline/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jeremycline/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jeremycline/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jeremycline",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3633/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3633/timeline
| null |
completed
| null | null | false |
[
"I am open to having a urllib3 patch for this, at least in principle. What would it look like?\n",
"This current patch I've got should give you the general idea, although I'm still playing with it a bit:\n\n```\nFrom 0b95fd47688d5dcbc7ab0798f5d110e7c17eef86 Mon Sep 17 00:00:00 2001\nFrom: Jeremy Cline <[email protected]>\nDate: Sun, 23 Oct 2016 12:11:10 -0400\nSubject: [PATCH] The ``connection_from_*`` methods now accept pool_kw\n\nWork in progress\n\nSigned-off-by: Jeremy Cline <[email protected]>\n---\n urllib3/poolmanager.py | 47 ++++++++++++++++++++++++++++++-----------------\n 1 file changed, 30 insertions(+), 17 deletions(-)\n\ndiff --git a/urllib3/poolmanager.py b/urllib3/poolmanager.py\nindex 276b54d..ab9a108 100644\n--- a/urllib3/poolmanager.py\n+++ b/urllib3/poolmanager.py\n@@ -2,6 +2,10 @@ from __future__ import absolute_import\n import collections\n import functools\n import logging\n+try:\n+ from threading import RLock\n+except ImportError: # threading is an optional module and may not be present.\n+ from dummy_threading import RLock\n\n from ._collections import RecentlyUsedContainer\n from .connectionpool import HTTPConnectionPool, HTTPSConnectionPool\n@@ -120,6 +124,7 @@ class PoolManager(RequestMethods):\n # override them.\n self.pool_classes_by_scheme = pool_classes_by_scheme\n self.key_fn_by_scheme = key_fn_by_scheme.copy()\n+ self.new_pool_lock = RLock()\n\n def __enter__(self):\n return self\n@@ -155,7 +160,7 @@ class PoolManager(RequestMethods):\n \"\"\"\n self.pools.clear()\n\n- def connection_from_host(self, host, port=None, scheme='http'):\n+ def connection_from_host(self, host, port=None, scheme='http', pool_kwargs=None):\n \"\"\"\n Get a :class:`ConnectionPool` based on the host, port, and scheme.\n\n@@ -166,14 +171,19 @@ class PoolManager(RequestMethods):\n if not host:\n raise LocationValueError(\"No host specified.\")\n\n- request_context = self.connection_pool_kw.copy()\n- request_context['scheme'] = scheme or 'http'\n- if not port:\n- port = port_by_scheme.get(request_context['scheme'].lower(), 80)\n- request_context['port'] = port\n- request_context['host'] = host\n+ with self.new_pool_lock:\n+ # Update the connection_pool_kw and get a new pool atomically.\n+ for k, v in pool_kwargs.items():\n+ self.connection_pool_kw[k] = v\n\n- return self.connection_from_context(request_context)\n+ request_context = self.connection_pool_kw.copy()\n+ request_context['scheme'] = scheme or 'http'\n+ if not port:\n+ port = port_by_scheme.get(request_context['scheme'].lower(), 80)\n+ request_context['port'] = port\n+ request_context['host'] = host\n+\n+ return self.connection_from_context(request_context)\n\n def connection_from_context(self, request_context):\n \"\"\"\n@@ -209,7 +219,7 @@ class PoolManager(RequestMethods):\n\n return pool\n\n- def connection_from_url(self, url):\n+ def connection_from_url(self, url, pool_kwargs=None):\n \"\"\"\n Similar to :func:`urllib3.connectionpool.connection_from_url` but\n doesn't pass any additional parameters to the\n@@ -219,9 +229,10 @@ class PoolManager(RequestMethods):\n constructor.\n \"\"\"\n u = parse_url(url)\n- return self.connection_from_host(u.host, port=u.port, scheme=u.scheme)\n+ return self.connection_from_host(u.host, port=u.port, scheme=u.scheme,\n+ pool_kwargs=pool_kwargs)\n\n- def urlopen(self, method, url, redirect=True, **kw):\n+ def urlopen(self, method, url, redirect=True, pool_kwargs=None, **kw):\n \"\"\"\n Same as :meth:`urllib3.connectionpool.HTTPConnectionPool.urlopen`\n with custom cross-host redirect logic and only sends the request-uri\n@@ -231,7 +242,8 @@ class PoolManager(RequestMethods):\n 
:class:`urllib3.connectionpool.ConnectionPool` can be chosen for it.\n \"\"\"\n u = parse_url(url)\n- conn = self.connection_from_host(u.host, port=u.port, scheme=u.scheme)\n+ conn = self.connection_from_host(u.host, port=u.port, scheme=u.scheme,\n+ pool_kwargs=pool_kwargs)\n\n kw['assert_same_host'] = False\n kw['redirect'] = False\n@@ -322,13 +334,13 @@ class ProxyManager(PoolManager):\n super(ProxyManager, self).__init__(\n num_pools, headers, **connection_pool_kw)\n\n- def connection_from_host(self, host, port=None, scheme='http'):\n+ def connection_from_host(self, host, port=None, scheme='http', pool_kwargs=None):\n if scheme == \"https\":\n return super(ProxyManager, self).connection_from_host(\n- host, port, scheme)\n+ host, port, scheme, pool_kwargs=pool_kwargs)\n\n return super(ProxyManager, self).connection_from_host(\n- self.proxy.host, self.proxy.port, self.proxy.scheme)\n+ self.proxy.host, self.proxy.port, self.proxy.scheme, pool_kwargs=pool_kwargs)\n\n def _set_proxy_headers(self, url, headers=None):\n \"\"\"\n@@ -345,7 +357,7 @@ class ProxyManager(PoolManager):\n headers_.update(headers)\n return headers_\n\n- def urlopen(self, method, url, redirect=True, **kw):\n+ def urlopen(self, method, url, redirect=True, pool_kwargs=None, **kw):\n \"Same as HTTP(S)ConnectionPool.urlopen, ``url`` must be absolute.\"\n u = parse_url(url)\n\n@@ -356,7 +368,8 @@ class ProxyManager(PoolManager):\n headers = kw.get('headers', self.headers)\n kw['headers'] = self._set_proxy_headers(url, headers)\n\n- return super(ProxyManager, self).urlopen(method, url, redirect=redirect, **kw)\n+ return super(ProxyManager, self).urlopen(method, url, redirect=redirect,\n+ pool_kwargs=None, **kw)\n-- \n2.9.3\n```\n\nAnd then requests would get a patch something like:\n\n```\nFrom 4fd2d962b6883860beea03ca599fa38959a0e7ad Mon Sep 17 00:00:00 2001\nFrom: Jeremy Cline <[email protected]>\nDate: Sun, 23 Oct 2016 12:16:02 -0400\nSubject: [PATCH] Use ``pool_kwargs`` when getting a new connection\n\nWork in progress\n\nSigned-off-by: Jeremy Cline <[email protected]>\n---\n requests/adapters.py | 54 +++++++++++++++++++++++++++-------------------------\n 1 file changed, 28 insertions(+), 26 deletions(-)\n\ndiff --git a/requests/adapters.py b/requests/adapters.py\nindex 1032d2e..41f0d8c 100644\n--- a/requests/adapters.py\n+++ b/requests/adapters.py\n@@ -201,7 +201,8 @@ class HTTPAdapter(BaseAdapter):\n\n return manager\n\n- def _update_poolmanager_ssl_kw(self, verify, cert):\n+ @staticmethod\n+ def _pool_kwargs(verify, cert):\n \"\"\"Update the :class:`PoolManager <urllib3.poolmanager.PoolManager>`\n connection_pool_kw with the necessary SSL configuration. 
This method\n should not be called from user code, and is only exposed for use when\n@@ -215,6 +216,8 @@ class HTTPAdapter(BaseAdapter):\n key concatenated in a single file, or as a tuple of\n (cert_file, key_file).\n \"\"\"\n+ pool_kwargs = {}\n+\n if verify:\n\n cert_loc = None\n@@ -229,25 +232,27 @@ class HTTPAdapter(BaseAdapter):\n if not cert_loc:\n raise Exception(\"Could not find a suitable SSL CA certificate bundle.\")\n\n- self.poolmanager.connection_pool_kw['cert_reqs'] = 'CERT_REQUIRED'\n+ pool_kwargs['cert_reqs'] = 'CERT_REQUIRED'\n\n if not os.path.isdir(cert_loc):\n- self.poolmanager.connection_pool_kw['ca_certs'] = cert_loc\n- self.poolmanager.connection_pool_kw['ca_cert_dir'] = None\n+ pool_kwargs['ca_certs'] = cert_loc\n+ pool_kwargs['ca_cert_dir'] = None\n else:\n- self.poolmanager.connection_pool_kw['ca_cert_dir'] = cert_loc\n- self.poolmanager.connection_pool_kw['ca_certs'] = None\n+ pool_kwargs['ca_cert_dir'] = cert_loc\n+ pool_kwargs['ca_certs'] = None\n else:\n- self.poolmanager.connection_pool_kw['cert_reqs'] = 'CERT_NONE'\n- self.poolmanager.connection_pool_kw['ca_certs'] = None\n- self.poolmanager.connection_pool_kw['ca_cert_dir'] = None\n+ pool_kwargs['cert_reqs'] = 'CERT_NONE'\n+ pool_kwargs['ca_certs'] = None\n+ pool_kwargs['ca_cert_dir'] = None\n\n if cert:\n if not isinstance(cert, basestring):\n- self.poolmanager.connection_pool_kw['cert_file'] = cert[0]\n- self.poolmanager.connection_pool_kw['key_file'] = cert[1]\n+ pool_kwargs['cert_file'] = cert[0]\n+ pool_kwargs['key_file'] = cert[1]\n else:\n- self.poolmanager.connection_pool_kw['cert_file'] = cert\n+ pool_kwargs['cert_file'] = cert\n+\n+ return pool_kwargs\n\n def build_response(self, req, resp):\n \"\"\"Builds a :class:`Response <requests.Response>` object from a urllib3\n@@ -295,21 +300,18 @@ class HTTPAdapter(BaseAdapter):\n :param proxies: (optional) A Requests-style dictionary of proxies used on this request.\n :rtype: requests.packages.urllib3.ConnectionPool\n \"\"\"\n- with self._pool_kw_lock:\n- if url.lower().startswith('https'):\n- self._update_poolmanager_ssl_kw(verify, cert)\n-\n- proxy = select_proxy(url, proxies)\n+ pool_kwargs = self._pool_kwargs(verify, cert)\n+ proxy = select_proxy(url, proxies)\n\n- if proxy:\n- proxy = prepend_scheme_if_needed(proxy, 'http')\n- proxy_manager = self.proxy_manager_for(proxy)\n- conn = proxy_manager.connection_from_url(url)\n- else:\n- # Only scheme should be lower case\n- parsed = urlparse(url)\n- url = parsed.geturl()\n- conn = self.poolmanager.connection_from_url(url)\n+ if proxy:\n+ proxy = prepend_scheme_if_needed(proxy, 'http')\n+ proxy_manager = self.proxy_manager_for(proxy)\n+ conn = proxy_manager.connection_from_url(url, pool_kwargs=pool_kwargs)\n+ else:\n+ # Only scheme should be lower case\n+ parsed = urlparse(url)\n+ url = parsed.geturl()\n+ conn = self.poolmanager.connection_from_url(url, pool_kwargs=pool_kwargs)\n\n return conn\n\n-- \n2.9.3\n```\n\nIf those look generally reasonable I'll write some tests and open a urllib3 PR where we can hammer out the details.\n",
"Why does the patch change the connection pool keyword arguments on the PoolManager?\n\nAnyway, this patch does seem like it's a reasonable enough direction to go, so please head on over to urllib3 and open a PR.\n",
"I like the direction this is headed. In `PoolManager.connection_from_host()` consider:\n\n```\nrequest_context = self.connection_pool_kw.copy()\nrequest_context.update(pool_kwargs or {})\n```\n\ninstead of updating the `PoolManager.connection_pool_kw`. \n\nThis has two advantages:\n- Don't need the lock.\n- Updating the PoolManager.connection_pool_kw when getting a connection seems unintuitive and an undesirable side effect.\n\nThe patch in _requests_ looks good.\n\nLooking forward to seeing the PR.\n",
"@jeremycline, was there any work left to do here, or can we close this out? I was under the impression we solved this with your last patch.",
"@nateprewitt Yep, it should all be done, I just forgot to close this issue."
] |
https://api.github.com/repos/psf/requests/issues/3632
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3632/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3632/comments
|
https://api.github.com/repos/psf/requests/issues/3632/events
|
https://github.com/psf/requests/pull/3632
| 184,513,030 |
MDExOlB1bGxSZXF1ZXN0OTA0MDMzOTQ=
| 3,632 |
default dispose_func, less condition check
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1734093?v=4",
"events_url": "https://api.github.com/users/ls0f/events{/privacy}",
"followers_url": "https://api.github.com/users/ls0f/followers",
"following_url": "https://api.github.com/users/ls0f/following{/other_user}",
"gists_url": "https://api.github.com/users/ls0f/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ls0f",
"id": 1734093,
"login": "ls0f",
"node_id": "MDQ6VXNlcjE3MzQwOTM=",
"organizations_url": "https://api.github.com/users/ls0f/orgs",
"received_events_url": "https://api.github.com/users/ls0f/received_events",
"repos_url": "https://api.github.com/users/ls0f/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ls0f/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ls0f/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ls0f",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-21T15:23:33Z
|
2021-09-08T02:10:25Z
|
2016-10-21T15:25:10Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3632/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3632/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3632.diff",
"html_url": "https://github.com/psf/requests/pull/3632",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3632.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3632"
}
| true |
[
"Thanks for this! However, urllib3 is a separate project that Requests includes, in its entirety, with no patches. You'll need to provide this patch against urllib3 directly.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3631
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3631/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3631/comments
|
https://api.github.com/repos/psf/requests/issues/3631/events
|
https://github.com/psf/requests/pull/3631
| 184,430,871 |
MDExOlB1bGxSZXF1ZXN0OTAzNDQyMjQ=
| 3,631 |
little modification in consume_socket_content
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1734093?v=4",
"events_url": "https://api.github.com/users/ls0f/events{/privacy}",
"followers_url": "https://api.github.com/users/ls0f/followers",
"following_url": "https://api.github.com/users/ls0f/following{/other_user}",
"gists_url": "https://api.github.com/users/ls0f/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ls0f",
"id": 1734093,
"login": "ls0f",
"node_id": "MDQ6VXNlcjE3MzQwOTM=",
"organizations_url": "https://api.github.com/users/ls0f/orgs",
"received_events_url": "https://api.github.com/users/ls0f/received_events",
"repos_url": "https://api.github.com/users/ls0f/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ls0f/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ls0f/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ls0f",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-10-21T08:44:44Z
|
2021-09-08T02:10:26Z
|
2016-10-21T08:58:30Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3631/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3631/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3631.diff",
"html_url": "https://github.com/psf/requests/pull/3631",
"merged_at": "2016-10-21T08:58:30Z",
"patch_url": "https://github.com/psf/requests/pull/3631.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3631"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/3630
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3630/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3630/comments
|
https://api.github.com/repos/psf/requests/issues/3630/events
|
https://github.com/psf/requests/issues/3630
| 184,134,463 |
MDU6SXNzdWUxODQxMzQ0NjM=
| 3,630 |
Explicit DNSError subclass for failed DNS lookups
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/515889?v=4",
"events_url": "https://api.github.com/users/techtonik/events{/privacy}",
"followers_url": "https://api.github.com/users/techtonik/followers",
"following_url": "https://api.github.com/users/techtonik/following{/other_user}",
"gists_url": "https://api.github.com/users/techtonik/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/techtonik",
"id": 515889,
"login": "techtonik",
"node_id": "MDQ6VXNlcjUxNTg4OQ==",
"organizations_url": "https://api.github.com/users/techtonik/orgs",
"received_events_url": "https://api.github.com/users/techtonik/received_events",
"repos_url": "https://api.github.com/users/techtonik/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/techtonik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/techtonik/subscriptions",
"type": "User",
"url": "https://api.github.com/users/techtonik",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 7 |
2016-10-20T04:49:17Z
|
2024-05-20T14:36:35Z
|
2024-05-20T14:36:35Z
|
NONE
| null |
There is no cross-platform way to catch DNS lookup errors with `requests`. At the very least, the error message I see on Windows (https://stackoverflow.com/questions/40145631/precisely-catch-dns-error-with-python-requests) looks different from the one on Linux in https://github.com/kennethreitz/requests/issues/3550
It would be nice to get a dedicated exception for this use case.
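For readers landing here later, a rough sketch of how a DNS failure can be singled out once urllib3 ships its dedicated exception (see the comments below). This assumes urllib3 >= 2.0 and mirrors the `MaxRetryError.reason` inspection used in the workarounds in this thread; `is_dns_failure` is an illustrative helper, not a requests API.
``` python
# Sketch only, assuming urllib3 >= 2.0 (which adds NameResolutionError).
# requests wraps urllib3's MaxRetryError in ConnectionError, so we dig the
# original reason back out of the wrapped exception.
import requests
from urllib3.exceptions import MaxRetryError, NameResolutionError


def is_dns_failure(exc):
    """Return True if a requests.ConnectionError was caused by a failed DNS lookup."""
    wrapped = exc.args[0] if exc.args else None
    if isinstance(wrapped, MaxRetryError):
        return isinstance(wrapped.reason, NameResolutionError)
    return False


try:
    requests.head("http://wowsucherror")
except requests.ConnectionError as exc:
    print("DNSLookupError" if is_dns_failure(exc) else "other connection error")
```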
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 4,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/3630/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3630/timeline
| null |
completed
| null | null | false |
[
"So this is somewhat reasonable, but made more opaque by the fact that urllib3 gets in the way here. urllib3 will basically always turn a connection setup problem like this one into a `MaxRetriesError` (because of its retry processing). That means for us to fire off an appropriate error in all cases means that we need to go and look at every possible exception that can fire.\n\nI think, as a first step, you should propose that urllib3 raise special exceptions when it gets problems from getaddrinfo. From there, Requests can introspect and look for such an exception, which gives us a better shot at raising an appropriate error. And at the very least, you will then be able to dig in to the exceptions to pull out the original (as all of those exceptions are just wrappers).\n",
"Reported above. Do you have access to OS X machine to test what `requests.head('http://wowsucherror')` returns there currently?\n",
"In 2.7.12:\n\n```\nrequests.exceptions.ConnectionError: HTTPConnectionPool(host='wowsucherror', port=80): Max retries exceeded with url: / (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x10e22ff10>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known',))\n```\n",
"The hack so far:\n\n``` python\nimport requests\n\ndef sitecheck(url):\n status = None\n message = ''\n try:\n resp = requests.head('http://' + url)\n status = str(resp.status_code)\n except requests.ConnectionError as exc:\n # filtering DNS lookup error from other connection errors\n # (until https://github.com/shazow/urllib3/issues/1003 is resolved)\n if type(exc.message) != requests.packages.urllib3.exceptions.MaxRetryError:\n raise\n reason = exc.message.reason \n if type(reason) != requests.packages.urllib3.exceptions.NewConnectionError:\n raise\n if type(reason.message) != str:\n raise\n if (\"[Errno 11001] getaddrinfo failed\" in reason.message or # Windows\n \"[Errno -2] Name or service not known\" in reason.message or # Linux\n \"[Errno 8] nodename nor servname \" in reason.message): # OS X\n message = 'DNSLookupError'\n else:\n raise\n\n return url, status, message\n\nprint sitecheck('wowsucherror')\nprint sitecheck('google.com')\n```\n\n```\n('wowsucherror', None, 'DNSLookupError')\n('google.com', '302', '')\n```\n",
"I'm not sure what the status of this is a few years later... I found the above didn't work for me on current python, so came up with this. Note I'm only returning the OK status here.\r\n\r\n```\r\ndef check_live(url):\r\n try:\r\n r = requests.get(url)\r\n live = r.ok\r\n except requests.ConnectionError as e:\r\n if 'MaxRetryError' not in str(e.args) or 'NewConnectionError' not in str(e.args):\r\n raise\r\n if \"[Errno 8]\" in str(e) or \"[Errno 11001]\" in str(e) or [\"Errno -2\"] in str(e):\r\n print('DNSLookupError')\r\n live = False\r\n else:\r\n raise\r\n except:\r\n raise\r\n return live\r\n```",
"In case someone is wondering what is the status as of now: [urllib3 v2.0 will have a separate exception for this](https://github.com/urllib3/urllib3/issues/2065), see https://github.com/urllib3/urllib3/commit/1831327b881880ed871f96f56a6977d360042e1b and https://github.com/urllib3/urllib3/commit/8a1ac9f8db13d0673188d542152253d6e133eec9.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/3629
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3629/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3629/comments
|
https://api.github.com/repos/psf/requests/issues/3629/events
|
https://github.com/psf/requests/issues/3629
| 183,880,578 |
MDU6SXNzdWUxODM4ODA1Nzg=
| 3,629 |
Use of both "files" and "json" in POST causes no data to be sent
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/249497?v=4",
"events_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/events{/privacy}",
"followers_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/followers",
"following_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/following{/other_user}",
"gists_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/wQwRtaufxJw7UFLCXzXz",
"id": 249497,
"login": "wQwRtaufxJw7UFLCXzXz",
"node_id": "MDQ6VXNlcjI0OTQ5Nw==",
"organizations_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/orgs",
"received_events_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/received_events",
"repos_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/wQwRtaufxJw7UFLCXzXz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-19T06:57:18Z
|
2021-09-08T14:00:47Z
|
2016-10-19T08:47:07Z
|
NONE
|
resolved
|
From a Python3 client to a Rails server
This doesn't throw an error:
`response = requests.post(url, files=files, json=data)`
And when I send it, the files reach their destination, but **not the data**.
```
({"file"=>#<ActionDispatch::Http::UploadedFile:0x007fe825f6a260 @tempfile=#<Tempfile:/var/folders/hk/k9bttnlx3w114g2730b1bzrm0000gp/T/RackMultipart20161018-34415-1kizyxd.jpg>, @original_filename="b688139f3be64de09606dd603a437d27.jpg", @content_type=nil, @headers="Content-Disposition: form-data; name=\"file\"; filename=\"b688139f3be64de09606dd603a437d27.jpg\"\r\n">, "form"=>:json, "controller"=>"api/v1/agents/events", "action"=>"create", "agent_id"=>"5"})
```
Now if I remove files from the named parameters, it works as intended, but **without the files**:
`response = requests.post(url, json=data)`
```
({"device_id"=>"5", "wifi_networks"=>[{"mode"=>"Master", "frequency"=>"2.452 GHz", "ssid"=>"ATT29426J6", "signal"=>-22, "channel"=>9, "address"=>"E0:B7:0A:61:0F:D0", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "48 Mb/s"], "quality_now"=>"70", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.447 GHz", "ssid"=>"WiPS", "signal"=>-90, "channel"=>8, "address"=>"00:12:5F:10:16:C0", "encryption_type"=>nil, "bitrates"=>["1 Mb/s", "48 Mb/s"], "quality_now"=>"20", "quality_max"=>"70", "encrypted"=>false}, {"mode"=>"Master", "frequency"=>"2.452 GHz", "ssid"=>"WiPS", "signal"=>-87, "channel"=>9, "address"=>"00:12:5F:11:6E:E6", "encryption_type"=>nil, "bitrates"=>["1 Mb/s", "48 Mb/s"], "quality_now"=>"23", "quality_max"=>"70", "encrypted"=>false}, {"mode"=>"Master", "frequency"=>"2.462 GHz", "ssid"=>"", "signal"=>-86, "channel"=>11, "address"=>"26:4E:5A:94:FD:E2", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"24", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"", "signal"=>-83, "channel"=>6, "address"=>"36:1F:E4:E1:90:EC", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"27", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"xfinitywifi", "signal"=>-88, "channel"=>6, "address"=>"DC:FE:07:93:52:2A", "encryption_type"=>nil, "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"22", "quality_max"=>"70", "encrypted"=>false}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"Tunneltop", "signal"=>-86, "channel"=>6, "address"=>"38:6B:BB:CA:70:B0", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "48 Mb/s"], "quality_now"=>"24", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"xfinitywifi", "signal"=>-86, "channel"=>6, "address"=>"4E:7A:8A:3C:60:4E", "encryption_type"=>nil, "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"24", "quality_max"=>"70", "encrypted"=>false}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"Law Office 1", "signal"=>-83, "channel"=>6, "address"=>"2C:30:33:EB:93:6B", "encryption_type"=>"wpa2", "bitrates"=>["1 Mb/s", "48 Mb/s"], "quality_now"=>"27", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"PINC", "signal"=>-92, "channel"=>6, "address"=>"50:A7:33:06:7A:F8", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"18", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.437 GHz", "ssid"=>"", "signal"=>-86, "channel"=>6, "address"=>"5E:7A:8A:3C:60:4E", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"24", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.462 GHz", "ssid"=>"", "signal"=>-88, "channel"=>11, "address"=>"BE:34:26:1B:23:DC", "encryption_type"=>"wpa", "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"22", "quality_max"=>"70", "encrypted"=>true}, {"mode"=>"Master", "frequency"=>"2.462 GHz", "ssid"=>"xfinitywifi", "signal"=>-90, "channel"=>11, "address"=>"16:4E:5A:94:FD:E2", "encryption_type"=>nil, "bitrates"=>["1 Mb/s", "54 Mb/s"], "quality_now"=>"20", "quality_max"=>"70", "encrypted"=>false}], "pressure"=>101614.79052823315, "event_type"=>"Photo", "altitude"=>-24.35893568577765, "agent_public_key"=>"coyvfx0wlsri6thjuapg2nz3mb57841d9keq", "user_agent_id"=>"1", "gps"=>{"error"=>{}, "altitude"=>0.0, "longitude"=>0.0, "latitude"=>0.0}, "temperature"=>21.1, "accelerometer"=>[4088, 2, 1028], 
"agent_private_key"=>"", "form"=>:json, "controller"=>"api/v1/agents/events", "action"=>"create", "agent_id"=>"5", "event"=>{}})
```
Any ideas?
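Not part of the original report, but a minimal sketch of the usual workaround for readers who hit the same limitation: serialize the payload yourself and attach it as an extra multipart field, since `files=` and `json=` cannot be combined in a single call. The URL, the `event` field name, and the trimmed payload below are placeholders, not values from the report.
``` python
# Minimal sketch of the workaround (placeholder URL/field names).
import json
import requests

url = "https://example.invalid/api/v1/agents/5/events"    # placeholder
data = {"device_id": "5", "event_type": "Photo"}           # trimmed example payload

with open("b688139f3be64de09606dd603a437d27.jpg", "rb") as fh:
    files = {
        "file": fh,                                             # the binary upload
        "event": (None, json.dumps(data), "application/json"),  # JSON as a form part
    }
    response = requests.post(url, files=files)

print(response.status_code)
```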
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3629/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3629/timeline
| null |
completed
| null | null | false |
[
"`files` and `json` do not work together, unfortunately. While I'd be open to receiving a patch to have those work together, you'll find that you almost certainly need the low-level control of the [requests-toolbelt `StreamingMultipartDataEncoder`](https://toolbelt.readthedocs.io/en/latest/uploading-data.html#streaming-multipart-data-encoder).\n"
] |
https://api.github.com/repos/psf/requests/issues/3628
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3628/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3628/comments
|
https://api.github.com/repos/psf/requests/issues/3628/events
|
https://github.com/psf/requests/issues/3628
| 183,810,790 |
MDU6SXNzdWUxODM4MTA3OTA=
| 3,628 |
Inconsistent results when querying a server with requests.post(data=dict)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/407407?v=4",
"events_url": "https://api.github.com/users/jpmelos/events{/privacy}",
"followers_url": "https://api.github.com/users/jpmelos/followers",
"following_url": "https://api.github.com/users/jpmelos/following{/other_user}",
"gists_url": "https://api.github.com/users/jpmelos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jpmelos",
"id": 407407,
"login": "jpmelos",
"node_id": "MDQ6VXNlcjQwNzQwNw==",
"organizations_url": "https://api.github.com/users/jpmelos/orgs",
"received_events_url": "https://api.github.com/users/jpmelos/received_events",
"repos_url": "https://api.github.com/users/jpmelos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jpmelos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jpmelos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jpmelos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-18T21:41:50Z
|
2021-09-08T14:00:47Z
|
2016-10-18T22:01:18Z
|
NONE
|
resolved
|
Python version: 2.7.12 and 3.5.2
Requests version: 2.11.1
I'm sorry if this is a duplicate, but I don't know what the issue is, so I also don't know how to search for it... But this might be a bug, so I decided to report it.
When I run this in Python 2:
``` python
import requests
response = requests.post('http://evds.tcmb.gov.tr/cgi-bin/famecgi', data={
'cgi': '$ozetweb',
'ARAVERIGRUP': 'bie_yymkpyuk.db',
'DIL': 'UK',
'ONDALIK': '5',
'wfmultiple_selection': 'ZAMANSERILERI',
'f_begdt': '07-01-2005',
'f_enddt': '07-10-2016',
'ZAMANSERILERI': ['TP.PYUK1', 'TP.PYUK2', 'TP.PYUK21', 'TP.PYUK22', 'TP.PYUK3', 'TP.PYUK4', 'TP.PYUK5', 'TP.PYUK6'],
'YON': '3',
'SUBMITDEG': 'Report',
'GRTYPE': '1',
'EPOSTA': 'xxx',
'RESIMPOSTA': '***',
})
print(response.text)
```
I always get the same response, consistently, no matter how many times I run it. If I run this in Python 3, I get a different response. I checked the request body being sent to the server, and the request headers, and they are all the same in both versions of Python. The correct response is the one obtained from running this in Python 2.
Since both versions of Python are supported, and the same API is exposed, the same result should happen. I also asked this on Stack Overflow, and at least one other developer was able to reproduce and confirm this happens: http://stackoverflow.com/questions/40118133/library-requests-getting-different-results-unpredictably
You should check the discussion on Stack Overflow, as some people entertained some more scenarios.
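A small sketch for anyone hitting the same behaviour (per the resolution in the comments below, the server is order-sensitive): passing `data` as a list of 2-tuples pins the field order explicitly, so it is identical on every Python version.
``` python
# Sketch: an order-preserving form body via a list of 2-tuples (same endpoint as above).
import requests

response = requests.post(
    "http://evds.tcmb.gov.tr/cgi-bin/famecgi",
    data=[
        ("cgi", "$ozetweb"),
        ("ARAVERIGRUP", "bie_yymkpyuk.db"),
        ("DIL", "UK"),
        ("ONDALIK", "5"),
        # ... remaining fields, in the exact order the server expects; repeated
        # keys (e.g. several ZAMANSERILERI entries) are simply listed repeatedly.
    ],
)
print(response.status_code)
```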
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/407407?v=4",
"events_url": "https://api.github.com/users/jpmelos/events{/privacy}",
"followers_url": "https://api.github.com/users/jpmelos/followers",
"following_url": "https://api.github.com/users/jpmelos/following{/other_user}",
"gists_url": "https://api.github.com/users/jpmelos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jpmelos",
"id": 407407,
"login": "jpmelos",
"node_id": "MDQ6VXNlcjQwNzQwNw==",
"organizations_url": "https://api.github.com/users/jpmelos/orgs",
"received_events_url": "https://api.github.com/users/jpmelos/received_events",
"repos_url": "https://api.github.com/users/jpmelos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jpmelos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jpmelos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jpmelos",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3628/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3628/timeline
| null |
completed
| null | null | false |
[
"The server cares about the order of the parameters. Closing the issue as it is invalid.\n"
] |
https://api.github.com/repos/psf/requests/issues/3627
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3627/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3627/comments
|
https://api.github.com/repos/psf/requests/issues/3627/events
|
https://github.com/psf/requests/pull/3627
| 183,749,952 |
MDExOlB1bGxSZXF1ZXN0ODk4NjA0OTU=
| 3,627 |
remove RequestsCookieJar specific update call
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-10-18T17:18:25Z
|
2021-09-08T02:10:14Z
|
2016-10-27T19:53:18Z
|
MEMBER
|
resolved
|
This is a minor fix in the same vein as #3591. This function calls `update` on the CookieJar, which only exists on `RequestsCookieJar`, not the standard library `cookielib.CookieJar`. While this function will now be somewhat trivial, this change ensures it maintains backwards compatibility. It may be worth discussing removal in the future.
I have a test as well @Lukasa but it seems a bit overkill. I can include it if you'd like though.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3627/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3627/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3627.diff",
"html_url": "https://github.com/psf/requests/pull/3627",
"merged_at": "2016-10-27T19:53:18Z",
"patch_url": "https://github.com/psf/requests/pull/3627.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3627"
}
| true |
[
"Hey @Lukasa, just pinging on this when you've got a moment. This should be the last problematic use of `update` on CookieJars in the codebase.\n",
"Thanks @nateprewitt! :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3626
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3626/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3626/comments
|
https://api.github.com/repos/psf/requests/issues/3626/events
|
https://github.com/psf/requests/issues/3626
| 183,690,632 |
MDU6SXNzdWUxODM2OTA2MzI=
| 3,626 |
Use merge_environment_settings method in sessions.send method
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1916237?v=4",
"events_url": "https://api.github.com/users/yozel/events{/privacy}",
"followers_url": "https://api.github.com/users/yozel/followers",
"following_url": "https://api.github.com/users/yozel/following{/other_user}",
"gists_url": "https://api.github.com/users/yozel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/yozel",
"id": 1916237,
"login": "yozel",
"node_id": "MDQ6VXNlcjE5MTYyMzc=",
"organizations_url": "https://api.github.com/users/yozel/orgs",
"received_events_url": "https://api.github.com/users/yozel/received_events",
"repos_url": "https://api.github.com/users/yozel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/yozel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yozel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/yozel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-18T13:45:36Z
|
2021-09-08T14:00:48Z
|
2016-10-18T13:49:10Z
|
NONE
|
resolved
|
I work on a Python project that uses requests. As a developer, I have set these environment variables: HTTP_PROXY, HTTPS_PROXY and REQUESTS_CA_BUNDLE, so I can see all of my requests with mitmproxy. Today, I decided to use the session.send() method with a prepared request instead of using .get(), .post() or .request(). But then I realized that .send() doesn't respect my environment variables.
So, I think the merge_environment_settings method should be used in the .send() method instead of .request(). What do you think about that? Should I open a pull request for that?
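For reference, a minimal sketch of the workaround the reply below points to: call `merge_environment_settings` yourself before `send()`, so the prepared-request flow still honours HTTP_PROXY/HTTPS_PROXY/REQUESTS_CA_BUNDLE. The URL is a placeholder.
``` python
# Sketch of the prepared-request flow with environment settings merged in by hand.
import requests

session = requests.Session()
prepped = session.prepare_request(requests.Request("GET", "https://example.invalid/"))

# Pull proxies / verify / cert / stream settings from the environment explicitly.
settings = session.merge_environment_settings(prepped.url, {}, None, None, None)
response = session.send(prepped, **settings)
print(response.status_code)
```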
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3626/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3626/timeline
| null |
completed
| null | null | false |
[
"@yozel Thanks for this issue!\n\nPer #2807, this is intentional behaviour. `merge_environment_settings` is a public, documented method on the `Session`: if you're planning to move away from using `Session.request` to the prepared request flow, this is one of the things that needs to become part of your workflow if you want its behaviour.\n"
] |
https://api.github.com/repos/psf/requests/issues/3625
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3625/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3625/comments
|
https://api.github.com/repos/psf/requests/issues/3625/events
|
https://github.com/psf/requests/pull/3625
| 183,444,457 |
MDExOlB1bGxSZXF1ZXN0ODk2NDM1NTE=
| 3,625 |
fix issue when the file-like object raises an IOError with tell
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6294659?v=4",
"events_url": "https://api.github.com/users/mie00/events{/privacy}",
"followers_url": "https://api.github.com/users/mie00/followers",
"following_url": "https://api.github.com/users/mie00/following{/other_user}",
"gists_url": "https://api.github.com/users/mie00/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mie00",
"id": 6294659,
"login": "mie00",
"node_id": "MDQ6VXNlcjYyOTQ2NTk=",
"organizations_url": "https://api.github.com/users/mie00/orgs",
"received_events_url": "https://api.github.com/users/mie00/received_events",
"repos_url": "https://api.github.com/users/mie00/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mie00/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mie00/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mie00",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-10-17T15:14:38Z
|
2021-09-08T02:10:26Z
|
2016-10-21T07:20:37Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3625/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3625/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3625.diff",
"html_url": "https://github.com/psf/requests/pull/3625",
"merged_at": "2016-10-21T07:20:37Z",
"patch_url": "https://github.com/psf/requests/pull/3625.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3625"
}
| true |
[
"Good spot!\n\nUnfortunately, this fix has the effect of ensuring that files of this type pass transparently but send no data, as we've called `seek` on them to seek to the end. In this instance, probably the easiest fix is to move the `if` block to an `else` on the original `try`. That way, we know that the previous call to `tell()` didn't explode.\n",
"Isn't is possible for seek itself to give an IOError?\n\nfor example:\n\n``` python\nimport os\nf = os.fdopen(0) # getting a file object for the standard input\nf.seek(10)\n```\n\nwill give an IOError.\n\nSo I wonder if they can be handled with the same `except`, or they should be handled separately.\n",
"@mie00 The question isn't \"can `seek()` throw an `IOError`\", it's \"can `seek()` throw an `IOError` in a situation where `tell()` does not\". I feel like the odds of that are pretty low. I'm inclined to want to worry about that when we know it's an issue.\n",
"I tried to find a case, but I couldn't. The only thing that I was able to find is that `tell` giving `IOError` while `seek` gives `UnsupportedOperation` which is handled elsewhere (for example `requests.Response.raw`).\n\nSo, PR updated.\n",
"Ok, rather than block on @sigmavirus24 having time to swing back to this, I'll merge this now. @sigmavirus24 can provide feedback whenever he gets time back. Thanks for the work @mie00!\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3624
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3624/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3624/comments
|
https://api.github.com/repos/psf/requests/issues/3624/events
|
https://github.com/psf/requests/issues/3624
| 183,326,003 |
MDU6SXNzdWUxODMzMjYwMDM=
| 3,624 |
how can I pass gmail's cookie to requests session?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7452500?v=4",
"events_url": "https://api.github.com/users/alireza-amirsamimi/events{/privacy}",
"followers_url": "https://api.github.com/users/alireza-amirsamimi/followers",
"following_url": "https://api.github.com/users/alireza-amirsamimi/following{/other_user}",
"gists_url": "https://api.github.com/users/alireza-amirsamimi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alireza-amirsamimi",
"id": 7452500,
"login": "alireza-amirsamimi",
"node_id": "MDQ6VXNlcjc0NTI1MDA=",
"organizations_url": "https://api.github.com/users/alireza-amirsamimi/orgs",
"received_events_url": "https://api.github.com/users/alireza-amirsamimi/received_events",
"repos_url": "https://api.github.com/users/alireza-amirsamimi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alireza-amirsamimi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alireza-amirsamimi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alireza-amirsamimi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-17T05:03:15Z
|
2021-09-08T15:00:37Z
|
2016-10-17T07:23:35Z
|
NONE
|
resolved
|
Hello
Assume that I define a session
`r = requests.Session()`
and I want to set cookies for it:
`r.cookies = my_cookies`
I know that I must use a cookie jar, and I know that I must use the dictionary format. But I have a problem!
Normally cookies are in this format
`ukey=*****; __qca=*****; _ga=******;`
and they are easily convertible to dictionary format
`{'ukey' : '****' , '__qca' : '****' , '__ga' : '*****' }`
but how can I deal with Gmail's cookies? They have a different format, for example
`S=gmail=****; COMPASS=gmail=****;`
I tested SimpleCookie, which someone suggested in this answer on stackoverflow.com
http://stackoverflow.com/questions/32281041/converting-cookie-string-into-python-dict
but I got this error message
```
  File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 510, in head
    return self.request('HEAD', url, **kwargs)
  File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python3.5/site-packages/requests/sessions.py", line 611, in send
    extract_cookies_to_jar(self.cookies, request, r.raw)
  File "/usr/lib/python3.5/site-packages/requests/cookies.py", line 133, in extract_cookies_to_jar
    jar.extract_cookies(res, req)
AttributeError: 'dict' object has no attribute 'extract_cookies'
```
Please guide me
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3624/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3624/timeline
| null |
completed
| null | null | false |
[
"Hey @alireza-amirsamimi, thanks for opening this ticket. Requests currently doesn't support passing a `dict` object to Session's `cookies` attribute. You must first convert your dictionary into a `CookieJar`-like object before passing it to the session. I've attached the code below that you'll need to solve your issue. In the future, it's preferred that questions regarding how to use Requests are opened on Stackoverflow instead of the Github ticket tracker. Thanks again!\n\ne.g.\n\n``` python\nfrom http.cookies import SimpleCookie\nfrom requests.cookies import cookiejar_from_dict\nfrom requests import Session\n\nr = Session()\nmy_cookie = SimpleCookie()\nmy_cookie.load('rawcookie=string;')\n\ncookies = {key: morsel.value for key, morsel in my_cookie.items()}\n\nr.cookies = cookiejar_from_dict(cookies)\n```\n\nJust to make note of it, this is an occurrence of #3595.\n"
] |
https://api.github.com/repos/psf/requests/issues/3623
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3623/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3623/comments
|
https://api.github.com/repos/psf/requests/issues/3623/events
|
https://github.com/psf/requests/issues/3623
| 183,120,321 |
MDU6SXNzdWUxODMxMjAzMjE=
| 3,623 |
Preparing POST request hook
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8962926?v=4",
"events_url": "https://api.github.com/users/pmart123/events{/privacy}",
"followers_url": "https://api.github.com/users/pmart123/followers",
"following_url": "https://api.github.com/users/pmart123/following{/other_user}",
"gists_url": "https://api.github.com/users/pmart123/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pmart123",
"id": 8962926,
"login": "pmart123",
"node_id": "MDQ6VXNlcjg5NjI5MjY=",
"organizations_url": "https://api.github.com/users/pmart123/orgs",
"received_events_url": "https://api.github.com/users/pmart123/received_events",
"repos_url": "https://api.github.com/users/pmart123/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pmart123/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pmart123/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pmart123",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-14T18:31:47Z
|
2021-09-08T15:00:37Z
|
2016-10-14T19:39:27Z
|
NONE
|
resolved
|
First, as countless others have said, I'd like to thank all of the contributors for such an excellent package.
I am trying to maintain an auto-incrementing id for a given session and url every time I submit a post request. For example, say I have an authenticated session object.
### Example
``` python
ID_KEY = 'id'
counter = 1
data = 'blah'
payload = {ID_KEY: counter, 'other_stuff': data}
resp = session.post(SPECIFIC_URL, json=payload)
counter += 1
payload = {ID_KEY: counter, 'other_stuff': data}
resp2 = session.post(SPECIFIC_URL, json=payload)
```
My easy solution is an adapter class that wraps the session object with its own post method, but I was wondering if I could use either a prepared request or a hook to achieve the same effect more cleanly?
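For what it's worth, a rough sketch of the wrapper approach suggested in the reply below; `CountingSession` and `ID_KEY` are illustrative names, not anything provided by requests.
``` python
# Rough sketch of a Session wrapper that injects an auto-incrementing id into
# every JSON POST body. CountingSession/ID_KEY are illustrative names only.
import itertools
import requests

ID_KEY = "id"


class CountingSession(requests.Session):
    def __init__(self):
        super().__init__()
        self._counter = itertools.count(1)

    def post(self, url, data=None, json=None, **kwargs):
        if json is not None:
            json = dict(json, **{ID_KEY: next(self._counter)})
        return super().post(url, data=data, json=json, **kwargs)


# usage:
# session = CountingSession()
# session.post(SPECIFIC_URL, json={"other_stuff": "blah"})   # sends id=1
# session.post(SPECIFIC_URL, json={"other_stuff": "blah"})   # sends id=2
```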
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3623/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3623/timeline
| null |
completed
| null | null | false |
[
"A Session wrapper or something similar is the best approach here. We got rid of most of our hooks, and you can't hook into the request factory easily. \n"
] |
https://api.github.com/repos/psf/requests/issues/3622
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3622/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3622/comments
|
https://api.github.com/repos/psf/requests/issues/3622/events
|
https://github.com/psf/requests/pull/3622
| 183,012,163 |
MDExOlB1bGxSZXF1ZXN0ODkzNTc3ODg=
| 3,622 |
Update makefile to only ship tagged releases.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-10-14T10:08:31Z
|
2021-09-08T02:10:27Z
|
2016-10-14T16:38:17Z
|
MEMBER
|
resolved
|
Extra work that spun out of #3620.
We've had it as our policy that we only ship tagged releases of our vendored modules for a while now. This makefile update consolidates that by ensuring that the makefile will only ever use the latest tag for updates.
For the moment this is probably a bit naive, because if any of our dependencies tagged backport releases we'd be totally screwed. Right now they don't, and I think we'll notice if they start doing it, so we should be safe.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3622/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3622/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3622.diff",
"html_url": "https://github.com/psf/requests/pull/3622",
"merged_at": "2016-10-14T16:38:17Z",
"patch_url": "https://github.com/psf/requests/pull/3622.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3622"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3621
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3621/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3621/comments
|
https://api.github.com/repos/psf/requests/issues/3621/events
|
https://github.com/psf/requests/pull/3621
| 183,010,234 |
MDExOlB1bGxSZXF1ZXN0ODkzNTY0MjE=
| 3,621 |
Add a missing backport.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-10-14T09:59:08Z
|
2021-09-08T02:10:26Z
|
2016-10-14T16:38:33Z
|
MEMBER
|
resolved
|
Spotted this while working on #3620.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3621/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3621/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3621.diff",
"html_url": "https://github.com/psf/requests/pull/3621",
"merged_at": "2016-10-14T16:38:33Z",
"patch_url": "https://github.com/psf/requests/pull/3621.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3621"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3620
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3620/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3620/comments
|
https://api.github.com/repos/psf/requests/issues/3620/events
|
https://github.com/psf/requests/pull/3620
| 183,009,639 |
MDExOlB1bGxSZXF1ZXN0ODkzNTYwMjc=
| 3,620 |
Better support for internationalized domain names.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2016-10-14T09:56:36Z
|
2021-09-08T02:10:25Z
|
2016-10-21T12:09:04Z
|
MEMBER
|
resolved
|
Fixes #3616. This adds support for IDNA 2008 by vendoring the idna module, with the kind permission of @kjd.
For those keeping track, changes like this are another reason that Requests should stay out of the stdlib. ;) See also: [CPython issue 17305](https://bugs.python.org/issue17305).
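As a quick illustration (not taken from the PR itself), this is roughly the IDNA 2008 encode/decode round trip the vendored module provides; the hostname is just an example.
``` python
# Quick illustration of the idna module's encode/decode round trip.
import idna

ace = idna.encode("ドメイン.テスト")   # -> ASCII-compatible (punycode) bytes
print(ace)
print(idna.decode(ace))               # back to the Unicode hostname
```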
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3620/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3620/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3620.diff",
"html_url": "https://github.com/psf/requests/pull/3620",
"merged_at": "2016-10-21T12:09:04Z",
"patch_url": "https://github.com/psf/requests/pull/3620.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3620"
}
| true |
[
"\"This adds support for IDNA 2008 by vendoring the idna module\"\n\nWhy copying the whole module? Why not using a dependency? requests already uses requirements.txt.\n",
"@haypo Without wanting to get too far into the weeds here, it remains the policy of the Requests project that we vendor all dependencies. If you'd like to discuss the amendment of that policy I recommend you email Kenneth, as all such decisions are his to make.\n\nYou'll also note that our requirements.txt are _development_ requirements, not our actual requirements for functioning.\n",
"\"it remains the policy of the Requests project that we vendor all dependencies\"\n\nOh ok. I misunderstood the requirements.txt file.\n",
"@haypo No problem. =) This is an unusual policy, and it'll be interesting to see how much longer we keep it, but on a personal level I'd be totally happy to unvendor our deps.\n",
"Ok, addressed that issue.\n",
"Cool, nothing else jumps out at me :shipit:\n",
"I'll change the way imports are performed in the upstream.\n",
":cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/3619
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3619/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3619/comments
|
https://api.github.com/repos/psf/requests/issues/3619/events
|
https://github.com/psf/requests/issues/3619
| 182,969,308 |
MDU6SXNzdWUxODI5NjkzMDg=
| 3,619 |
ValueError: negative shift count in _proxy_bypass_macosx_sysconf
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12994663?v=4",
"events_url": "https://api.github.com/users/AndiCui/events{/privacy}",
"followers_url": "https://api.github.com/users/AndiCui/followers",
"following_url": "https://api.github.com/users/AndiCui/following{/other_user}",
"gists_url": "https://api.github.com/users/AndiCui/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/AndiCui",
"id": 12994663,
"login": "AndiCui",
"node_id": "MDQ6VXNlcjEyOTk0NjYz",
"organizations_url": "https://api.github.com/users/AndiCui/orgs",
"received_events_url": "https://api.github.com/users/AndiCui/received_events",
"repos_url": "https://api.github.com/users/AndiCui/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/AndiCui/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AndiCui/subscriptions",
"type": "User",
"url": "https://api.github.com/users/AndiCui",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-10-14T05:52:09Z
|
2021-09-08T14:00:46Z
|
2016-10-21T14:41:03Z
|
NONE
|
resolved
|
Please see https://github.com/pypa/pip/issues/4009
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3619/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3619/timeline
| null |
completed
| null | null | false |
[
"Sorry you were referred here, but this isn't a requests problem _either_. You should be able to reproduce the problem with this code, which uses entirely modules in the standard library:\n\n``` python\nfrom urllib.request import proxy_bypass\nfrom urllib.parse import urlparse\n\nnetloc = urlparse(\"https://pypi.python.org/pypi\").netloc\nproxy_bypass(netloc)\n```\n\nI should note that I'm a little bit confused here as to why you're encountering this problem with the config you showed in the other post. Is your HTTPS config the same?\n",
"My HTTPS config is absolutely the same.\n",
"Were you able to verify the problem with the standard library code?\n",
"Closing due to inactivity.\n",
"Sorry for the delated answer\nI cannot recreate the problem with the lib code, but the network config is changed a bit. (No major changes though) \n",
"@AndiCui So, to be clear, using Requests gives you the problem but using the code above does not.\n",
"I retried once. This time it recreated the issue.\nThe problem happens when I have a “Location” enabled that is other than the Location which I had that config. \n\n```\n>>> proxy_bypass(netloc)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/urllib/request.py\", line 2546, in proxy_bypass\n return proxy_bypass_macosx_sysconf(host)\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/urllib/request.py\", line 2523, in proxy_bypass_macosx_sysconf\n return _proxy_bypass_macosx_sysconf(host, proxy_settings)\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/urllib/request.py\", line 2509, in _proxy_bypass_macosx_sysconf\n if (hostIP >> mask) == (base >> mask):\nValueError: negative shift count\n```\n",
"So this is not a Requests issue I'm afraid. 😔\n",
"To be clear @AndiCui your problem is arising from the standard library, not from within Requests itself. If you're seeing this problem using Requests, we can handle it, but we the ValueError being raised needs to be addressed in the standard library and you should file a bug on bugs.python.org for that.\n"
] |
https://api.github.com/repos/psf/requests/issues/3618
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3618/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3618/comments
|
https://api.github.com/repos/psf/requests/issues/3618/events
|
https://github.com/psf/requests/issues/3618
| 182,944,589 |
MDU6SXNzdWUxODI5NDQ1ODk=
| 3,618 |
form action without schema'ed ACTION fails with MissingSchema
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/287758?v=4",
"events_url": "https://api.github.com/users/chadmiller/events{/privacy}",
"followers_url": "https://api.github.com/users/chadmiller/followers",
"following_url": "https://api.github.com/users/chadmiller/following{/other_user}",
"gists_url": "https://api.github.com/users/chadmiller/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/chadmiller",
"id": 287758,
"login": "chadmiller",
"node_id": "MDQ6VXNlcjI4Nzc1OA==",
"organizations_url": "https://api.github.com/users/chadmiller/orgs",
"received_events_url": "https://api.github.com/users/chadmiller/received_events",
"repos_url": "https://api.github.com/users/chadmiller/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/chadmiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/chadmiller/subscriptions",
"type": "User",
"url": "https://api.github.com/users/chadmiller",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-10-14T01:42:37Z
|
2021-09-08T15:00:38Z
|
2016-10-14T07:24:02Z
|
NONE
|
resolved
|
I used `requests` (and mechanicalsoup) to `.get()` a web page that has a form on it. The form tag looks like
`<form action="/submit_uri" method="post">`
I filled in part of the form, and then `.submit()`ed the form.
```
  File "mechanicalsoup/browser.py", line 114, in submit
request = self._prepare_request(form, url, **kwargs)
File "mechanicalsoup/browser.py", line 109, in _prepare_request
return self.session.prepare_request(request)
File "requests/sessions.py", line 394, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "requests/models.py", line 294, in prepare
self.prepare_url(url, params)
File "requests/models.py", line 354, in prepare_url
raise MissingSchema(error)
requests.exceptions.MissingSchema: Invalid URL '/submit_uri': No schema supplied. Perhaps you meant http:///submit_uri?
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3618/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3618/timeline
| null |
completed
| null | null | false |
[
"You're telling Requests to `get` at the URI of `/submit_uri` which isn't a valid URI.\n\nYou need to get the URI of the form, IE, `https://www.example.com` and add this to the URI before sending the `get` request.\n",
"Correct.\n\nResolving a URL you find in the HTML of a page into a gettable URL is an extremely complex operation, and Requests doesn't do it for you because it's fundamentally not a HTML library. The [WHATWG URL Specification](https://url.spec.whatwg.org) defines the logic for taking a base URL (the URL of the page you requested) and processing a URL on that page (in this case a path-absolute URL) into a new URL. Unfortunately, I am aware of no Python library that fully implements that logic at this time, but something like [rfc3986](http://pypi.python.org/pypi/rfc3986) is probably good enough.\n",
"Here's what I expect out of requests:\n\nMethods that request pages newly take a context url, and interpret the passed in destination, pulling from the context url when necessary.\n\n`get(\"foo\", context=\"http://other/location/bar\")` gets `http://other/location/foo`\n\n`get(\"/baz\", context=\"http://other/location/bar\")` gets `http://other/baz`\n\n`get(\"://third/baz\", context=\"https://other/location/bar\")` gets `https://third/baz`\n",
"Why would you expect that out of Requests? Building URLs in this way is not a HTTP concern.\n",
"@chadmiller other libraries will do that for you (e.g., `rfc3986`) because they are not in the HTTP specification, they're in other (URI) specifications.\n"
] |
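To make the resolution step from the comments above concrete, here is a minimal sketch using the standard library's `urllib.parse.urljoin` (the thread also mentions `rfc3986` for fuller handling). The page URL is a made-up example; only the `/submit_uri` action comes from the report.

``` python
from urllib.parse import urljoin  # Python 3 standard library

# Hypothetical page that served the form; only "/submit_uri" comes from the report.
page_url = "https://www.example.com/login/start"
form_action = "/submit_uri"

# urljoin resolves the path-absolute action against the page's scheme and host.
target = urljoin(page_url, form_action)
print(target)  # -> https://www.example.com/submit_uri

# The absolute URL can then be submitted normally, e.g.
# requests.post(target, data={"field": "value"})
```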
https://api.github.com/repos/psf/requests/issues/3617
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3617/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3617/comments
|
https://api.github.com/repos/psf/requests/issues/3617/events
|
https://github.com/psf/requests/pull/3617
| 182,775,945 |
MDExOlB1bGxSZXF1ZXN0ODkxOTIwNjE=
| 3,617 |
Note that @jeremycline is now our contact.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-10-13T12:28:59Z
|
2021-09-08T02:10:27Z
|
2016-10-13T14:08:10Z
|
MEMBER
|
resolved
|
Congratulations @jeremycline!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3617/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3617/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3617.diff",
"html_url": "https://github.com/psf/requests/pull/3617",
"merged_at": "2016-10-13T14:08:10Z",
"patch_url": "https://github.com/psf/requests/pull/3617.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3617"
}
| true |
[
"Thanks :smile: - I'll do my best to fill Ralph's shoes and hopefully not cause any downstream packaging trouble for you all!\n",
"Cheers @jeremycline!\n"
] |
https://api.github.com/repos/psf/requests/issues/3616
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3616/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3616/comments
|
https://api.github.com/repos/psf/requests/issues/3616/events
|
https://github.com/psf/requests/issues/3616
| 182,570,188 |
MDU6SXNzdWUxODI1NzAxODg=
| 3,616 |
Consider vendoring the IDNA library.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"color": "777777",
"default": false,
"description": null,
"id": 162780722,
"name": "Question/Not a bug",
"node_id": "MDU6TGFiZWwxNjI3ODA3MjI=",
"url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug"
}
] |
closed
| true | null |
[] | null | 5 |
2016-10-12T16:21:01Z
|
2021-09-08T14:00:46Z
|
2016-10-21T12:09:04Z
|
MEMBER
|
resolved
|
As has been noted on the [Python Security SIG mailing list](https://mail.python.org/pipermail/security-sig/2016-October/000122.html), Python currently only includes the deprecated IDNA 2003 codec in the standard library. This is problematic, because IDNA 2003 is forbidden for some ccTLDs, such as `.de`: instead, the newer IDNA 2008 standard should be used. That standard is implemented in the PyPI package `idna`.
If we're going to support IDNA 2008, I would like it to not be optional: it leads to a fairly substantial and difficult to debug change, so it should be supported fully. To do that and keep in touch with Kenneth's wishes regarding dependencies, we'd have to vendor it.
I'd like contributors and packagers, particularly @eriol, @ralphbean, and @sigmavirus24 to weigh in with their thoughts here. What are your thoughts here? I'd also, if possible, like to hear from @kjd: while `idna` is licensed under a BSD-like license that should be broadly compatible with our own Apache 2.0 and so should present no legal blockers to vendoring, I'd still rather do it with the original author's permission than without it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3616/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3616/timeline
| null |
completed
| null | null | false |
[
"It is fine by me. Happy to support widespread use of the modern IDNA standard however I can.\n",
"I'm 👍 on it. I think making it non-optional is an excellent idea.\n",
"+1 for me. I agree with @sigmavirus24 on making it non-optional.\n",
"Hey, @ralphbean has asked me to take over being the point of contact for Fedora packaging interactions.\n\nI'm fine with this and I'm also fine with it being non-optional.\n",
"Thanks for your feedback folks! I've done the work in #3620: please cast your eye over it. This is particularly true for downstream packagers, who will likely want to consider the effect this dependency will have on your packaging ecosystems.\n"
] |
https://api.github.com/repos/psf/requests/issues/3615
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3615/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3615/comments
|
https://api.github.com/repos/psf/requests/issues/3615/events
|
https://github.com/psf/requests/pull/3615
| 181,759,916 |
MDExOlB1bGxSZXF1ZXN0ODg1MDIxOTU=
| 3,615 |
Add Documentation for custom methods
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6897645?v=4",
"events_url": "https://api.github.com/users/StewPoll/events{/privacy}",
"followers_url": "https://api.github.com/users/StewPoll/followers",
"following_url": "https://api.github.com/users/StewPoll/following{/other_user}",
"gists_url": "https://api.github.com/users/StewPoll/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/StewPoll",
"id": 6897645,
"login": "StewPoll",
"node_id": "MDQ6VXNlcjY4OTc2NDU=",
"organizations_url": "https://api.github.com/users/StewPoll/orgs",
"received_events_url": "https://api.github.com/users/StewPoll/received_events",
"repos_url": "https://api.github.com/users/StewPoll/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/StewPoll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/StewPoll/subscriptions",
"type": "User",
"url": "https://api.github.com/users/StewPoll",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-07T20:24:10Z
|
2021-09-08T01:21:37Z
|
2016-10-12T10:11:30Z
|
CONTRIBUTOR
|
resolved
|
This PR simply adds the documentation for custom method verbs that people may need or wish to use, such as in #3611
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3615/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3615/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3615.diff",
"html_url": "https://github.com/psf/requests/pull/3615",
"merged_at": "2016-10-12T10:11:30Z",
"patch_url": "https://github.com/psf/requests/pull/3615.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3615"
}
| true |
[
"Thanks @TetraEtc! :sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3614
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3614/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3614/comments
|
https://api.github.com/repos/psf/requests/issues/3614/events
|
https://github.com/psf/requests/issues/3614
| 181,678,347 |
MDU6SXNzdWUxODE2NzgzNDc=
| 3,614 |
Support 100-continue (expect100)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/103634?v=4",
"events_url": "https://api.github.com/users/redixin/events{/privacy}",
"followers_url": "https://api.github.com/users/redixin/followers",
"following_url": "https://api.github.com/users/redixin/following{/other_user}",
"gists_url": "https://api.github.com/users/redixin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/redixin",
"id": 103634,
"login": "redixin",
"node_id": "MDQ6VXNlcjEwMzYzNA==",
"organizations_url": "https://api.github.com/users/redixin/orgs",
"received_events_url": "https://api.github.com/users/redixin/received_events",
"repos_url": "https://api.github.com/users/redixin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/redixin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/redixin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/redixin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-07T14:08:19Z
|
2021-09-08T15:00:40Z
|
2016-10-07T15:07:05Z
|
NONE
|
resolved
|
This feature is used when a client wants to check whether the request body will be accepted by the server before sending the body.
This is achieved by setting the "Expect: 100-continue" request header. The server will respond with the status line "HTTP/1.1 100 Continue" if the request headers are OK, or with the actual full response if something goes wrong. The client should not send the body if the response was not "HTTP/1.1 100 Continue".
Some useful links:
Related section in pep333:
https://www.python.org/dev/peps/pep-0333/#http-1-1-expect-continue
How it looks like in aiohttp client (see expect100 parameter):
http://aiohttp.readthedocs.io/en/stable/client_reference.html#basic-api
More detailed description:
https://tools.ietf.org/html/rfc7231#section-5.1.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3614/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3614/timeline
| null |
completed
| null | null | false |
[
"Hi @redixin ,\n\nAfter a quick search I found https://github.com/kennethreitz/requests/issues/713 which is an old issue that provides the reasoning for why Requests does not support this flow. In fact, this is because we are built on top of httplib. If you want to support 100-continue properly in Requests, you need to go convince the core python developers to support this first.\n\nIn the future, please search for similar issues.\n"
] |
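Because Requests (via httplib) cannot pause between sending the headers and the body, the flow described in the issue can only be demonstrated at the socket level. Below is a rough, simplified sketch of the `Expect: 100-continue` handshake against a hypothetical plain-HTTP endpoint; real clients also need to handle servers that omit the interim response or time out.

``` python
import socket

# Hypothetical endpoint used purely for illustration.
HOST, PORT, PATH = "example.com", 80, "/upload"
body = b"hello world"

head = (
    "POST {0} HTTP/1.1\r\n"
    "Host: {1}\r\n"
    "Content-Length: {2}\r\n"
    "Expect: 100-continue\r\n"
    "Connection: close\r\n"
    "\r\n"
).format(PATH, HOST, len(body)).encode("ascii")

with socket.create_connection((HOST, PORT), timeout=10) as sock:
    sock.sendall(head)                        # send only the headers
    interim = sock.recv(4096)                 # wait for the server's verdict
    if interim.startswith(b"HTTP/1.1 100"):   # server agreed to receive the body
        sock.sendall(body)
        print(sock.recv(4096).decode("latin-1", "replace"))
    else:                                     # a final response arrived; do not send the body
        print(interim.decode("latin-1", "replace"))
```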
https://api.github.com/repos/psf/requests/issues/3613
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3613/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3613/comments
|
https://api.github.com/repos/psf/requests/issues/3613/events
|
https://github.com/psf/requests/issues/3613
| 181,576,246 |
MDU6SXNzdWUxODE1NzYyNDY=
| 3,613 |
Update streaming request headers
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10917363?v=4",
"events_url": "https://api.github.com/users/thisbejim/events{/privacy}",
"followers_url": "https://api.github.com/users/thisbejim/followers",
"following_url": "https://api.github.com/users/thisbejim/following{/other_user}",
"gists_url": "https://api.github.com/users/thisbejim/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/thisbejim",
"id": 10917363,
"login": "thisbejim",
"node_id": "MDQ6VXNlcjEwOTE3MzYz",
"organizations_url": "https://api.github.com/users/thisbejim/orgs",
"received_events_url": "https://api.github.com/users/thisbejim/received_events",
"repos_url": "https://api.github.com/users/thisbejim/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/thisbejim/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/thisbejim/subscriptions",
"type": "User",
"url": "https://api.github.com/users/thisbejim",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-10-07T03:10:03Z
|
2021-09-08T15:00:39Z
|
2016-10-08T14:11:40Z
|
NONE
|
resolved
|
Is there any way to update headers on a streaming request without restarting the request?
I am using an Authorization header with tokens that go stale after an hour, so without updating the header to include new tokens a stream can only run for an hour.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3613/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3613/timeline
| null |
completed
| null | null | false |
[
"Headers are sent at the start of the message. How do you think you would update them without a new request?\n\nThe RFC specifies \"trailers\" but those are for the end of the message, not for the middle of a message and those aren't widely supported (and thus are not supported by requests).\n"
] |
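As the answer above says, headers go out once at the start of a request, so the practical pattern is to start a new streaming request whenever the token is about to go stale. A rough sketch of that pattern follows; `get_fresh_token()`, `handle()` and the URL are placeholders, not anything from the thread.

``` python
import time
import requests

STREAM_URL = "https://example.invalid/stream"  # placeholder endpoint

def get_fresh_token():
    """Placeholder: fetch a new token from your auth provider."""
    return "new-token"

def handle(line):
    """Placeholder per-event processing."""
    if line:
        print(line)

while True:
    headers = {"Authorization": "Bearer " + get_fresh_token()}
    started = time.time()
    resp = requests.get(STREAM_URL, headers=headers, stream=True)
    try:
        for line in resp.iter_lines():
            handle(line)
            # Reconnect with a fresh token shortly before the hour is up.
            if time.time() - started > 55 * 60:
                break
    finally:
        resp.close()
```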
https://api.github.com/repos/psf/requests/issues/3612
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3612/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3612/comments
|
https://api.github.com/repos/psf/requests/issues/3612/events
|
https://github.com/psf/requests/issues/3612
| 181,576,034 |
MDU6SXNzdWUxODE1NzYwMzQ=
| 3,612 |
Memory Bug in requests when being used with subprocess module.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17210104?v=4",
"events_url": "https://api.github.com/users/KylePiira/events{/privacy}",
"followers_url": "https://api.github.com/users/KylePiira/followers",
"following_url": "https://api.github.com/users/KylePiira/following{/other_user}",
"gists_url": "https://api.github.com/users/KylePiira/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/KylePiira",
"id": 17210104,
"login": "KylePiira",
"node_id": "MDQ6VXNlcjE3MjEwMTA0",
"organizations_url": "https://api.github.com/users/KylePiira/orgs",
"received_events_url": "https://api.github.com/users/KylePiira/received_events",
"repos_url": "https://api.github.com/users/KylePiira/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/KylePiira/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/KylePiira/subscriptions",
"type": "User",
"url": "https://api.github.com/users/KylePiira",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-10-07T03:07:34Z
|
2021-09-08T15:00:40Z
|
2016-10-07T20:15:09Z
|
NONE
|
resolved
|
So I ran into this while working on a more complex project; however, here is a short demo to showcase the bug.
**NOTE:** I highly recommend you open some type of task viewer program to monitor RAM usage and know when to `Ctrl C` to prevent your OS from crashing.
# Steps to Reproduce:
a. Create Directory
b. Create two new files in the directory called script.py and queue.py
c. Put the following code into script.py:
``` python
import requests
```
d. Put the following code into queue.py:
``` python
from subprocess import call
call(['python','script.py'])
```
e. Now run the following command:
``` bash
python script.py
```
Your computer should now begin rapidly using RAM and will eventually crash (it took about 30 seconds for my Ubuntu install to crash on a rig with 8 GB of RAM).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3612/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3612/timeline
| null |
completed
| null | null | false |
[
"So this is happening because of how `six`, a package used by Requests, does its module discovery. You can confirm by changing `import requests` to `import six.moves.queue` (the line in `urllib3` causing this) and you'll get the same behavior. Your queue.py ends up getting imported and run, which obviously results in an infinite loop and your PC spawns Python processes until it crashes.\n\nThe solution is to name queue.py something else, preferably not a name shadowing a standard library module, because as you can see that can result in some very weird behavior and is generally considered a bad idea. If you want to file a bug with six their repo is [here](https://bitbucket.org/gutworth/six) but the simpler solution is probably you just change your filename.\n",
"@drpoggi is correct. There's not much to be done here within requests (or six for that matter). It's usually best to not to shadow the names of standard library modules.\n"
] |
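A quick way to confirm the shadowing described above is to ask Python (3) where the suspect module would actually be loaded from, without importing it; this generic check is mine, not something from the thread.

``` python
import importlib.util

# Module name you suspect is being shadowed (here: the stdlib "queue" reached via six).
spec = importlib.util.find_spec("queue")

# A path inside your project directory, rather than the standard library,
# means a local file (e.g. queue.py) is shadowing the module and should be renamed.
print(spec.origin if spec else "not found")
```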
https://api.github.com/repos/psf/requests/issues/3611
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3611/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3611/comments
|
https://api.github.com/repos/psf/requests/issues/3611/events
|
https://github.com/psf/requests/issues/3611
| 181,378,862 |
MDU6SXNzdWUxODEzNzg4NjI=
| 3,611 |
MKCOL verb from WebDAV
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/82182?v=4",
"events_url": "https://api.github.com/users/paul-hammant/events{/privacy}",
"followers_url": "https://api.github.com/users/paul-hammant/followers",
"following_url": "https://api.github.com/users/paul-hammant/following{/other_user}",
"gists_url": "https://api.github.com/users/paul-hammant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/paul-hammant",
"id": 82182,
"login": "paul-hammant",
"node_id": "MDQ6VXNlcjgyMTgy",
"organizations_url": "https://api.github.com/users/paul-hammant/orgs",
"received_events_url": "https://api.github.com/users/paul-hammant/received_events",
"repos_url": "https://api.github.com/users/paul-hammant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/paul-hammant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/paul-hammant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/paul-hammant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-10-06T10:29:35Z
|
2021-09-08T15:00:41Z
|
2016-10-06T13:16:52Z
|
NONE
|
resolved
|
MKCOL would be super useful - make a directory on the remote server.
curl has it:
``` bash
curl -u user:pwd -X MKCOL http://yourserver.com/baseurl/newdiruwannacreate
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3611/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3611/timeline
| null |
completed
| null | null | false |
[
"From what I know the -X flag means you're using a custom metbod, laying\noutside normal HTTP requests.\n\nYou should be able to achieve this using `requests.request('MKCOL',\n**kwargs)`\n\nI may be wrong though.\n\nOn Thu, 6 Oct 2016, 8:29 PM Paul Hammant [email protected] wrote:\n\n> MKCOL would be super useful - make a directory on the remote server.\n> \n> curl has it:\n> \n> curl -u user:pwd -X MKCOL http://yourserver.com/baseurl/newdiruwannacreate\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3611, or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AGk_7Wb4lCaJIKH_tBRks0B4TB0DuM1lks5qxM2lgaJpZM4KPyV0\n> .\n",
"@TetraEtc is correct. Cheers!\n",
"Thanks :)\n"
] |
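Following the advice in the comments, here is a minimal sketch of issuing the WebDAV MKCOL verb through `requests.request`, mirroring the curl command from the report (the URL and credentials are the same placeholders used there).

``` python
import requests

# Equivalent of: curl -u user:pwd -X MKCOL http://yourserver.com/baseurl/newdiruwannacreate
resp = requests.request(
    "MKCOL",
    "http://yourserver.com/baseurl/newdiruwannacreate",
    auth=("user", "pwd"),
)

# A WebDAV server answers 201 Created when the collection (directory) is made.
print(resp.status_code)
```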
https://api.github.com/repos/psf/requests/issues/3610
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3610/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3610/comments
|
https://api.github.com/repos/psf/requests/issues/3610/events
|
https://github.com/psf/requests/issues/3610
| 181,073,201 |
MDU6SXNzdWUxODEwNzMyMDE=
| 3,610 |
Failed to establish a new connection: [Errno 61] Connection refused
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11945009?v=4",
"events_url": "https://api.github.com/users/xiazhibin/events{/privacy}",
"followers_url": "https://api.github.com/users/xiazhibin/followers",
"following_url": "https://api.github.com/users/xiazhibin/following{/other_user}",
"gists_url": "https://api.github.com/users/xiazhibin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xiazhibin",
"id": 11945009,
"login": "xiazhibin",
"node_id": "MDQ6VXNlcjExOTQ1MDA5",
"organizations_url": "https://api.github.com/users/xiazhibin/orgs",
"received_events_url": "https://api.github.com/users/xiazhibin/received_events",
"repos_url": "https://api.github.com/users/xiazhibin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xiazhibin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiazhibin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xiazhibin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-10-05T06:16:47Z
|
2021-09-08T15:00:41Z
|
2016-10-05T06:23:59Z
|
NONE
|
resolved
|
The error is described in this [stackoverflow](http://stackoverflow.com/questions/39854217/proxy-error-when-using-flask-as-a-simple-http-proxy) question.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11945009?v=4",
"events_url": "https://api.github.com/users/xiazhibin/events{/privacy}",
"followers_url": "https://api.github.com/users/xiazhibin/followers",
"following_url": "https://api.github.com/users/xiazhibin/following{/other_user}",
"gists_url": "https://api.github.com/users/xiazhibin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xiazhibin",
"id": 11945009,
"login": "xiazhibin",
"node_id": "MDQ6VXNlcjExOTQ1MDA5",
"organizations_url": "https://api.github.com/users/xiazhibin/orgs",
"received_events_url": "https://api.github.com/users/xiazhibin/received_events",
"repos_url": "https://api.github.com/users/xiazhibin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xiazhibin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiazhibin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xiazhibin",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3610/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3610/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/3609
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3609/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3609/comments
|
https://api.github.com/repos/psf/requests/issues/3609/events
|
https://github.com/psf/requests/issues/3609
| 180,978,775 |
MDU6SXNzdWUxODA5Nzg3NzU=
| 3,609 |
[Errno -3] When using Threading
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22095848?v=4",
"events_url": "https://api.github.com/users/htpasswd/events{/privacy}",
"followers_url": "https://api.github.com/users/htpasswd/followers",
"following_url": "https://api.github.com/users/htpasswd/following{/other_user}",
"gists_url": "https://api.github.com/users/htpasswd/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/htpasswd",
"id": 22095848,
"login": "htpasswd",
"node_id": "MDQ6VXNlcjIyMDk1ODQ4",
"organizations_url": "https://api.github.com/users/htpasswd/orgs",
"received_events_url": "https://api.github.com/users/htpasswd/received_events",
"repos_url": "https://api.github.com/users/htpasswd/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/htpasswd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/htpasswd/subscriptions",
"type": "User",
"url": "https://api.github.com/users/htpasswd",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-10-04T19:11:19Z
|
2021-09-08T15:00:38Z
|
2016-10-12T09:33:44Z
|
NONE
|
resolved
|
Hello. Here is an example of the code I'm using:
``` python
import threading
import resource
import time
import sys

import requests

# Maximum Open File Limit for the thread limiter.
maxOpenFileLimit = resource.getrlimit(resource.RLIMIT_NOFILE)[0]  # For example, it shows 50.
threadLimiter = maxOpenFileLimit  # Upper bound on concurrently running threads.

# I will use one session for every Thread.
requestSessions = requests.Session()
# Making the requests Pool bigger.
adapter = requests.adapters.HTTPAdapter(pool_connections=maxOpenFileLimit, pool_maxsize=maxOpenFileLimit, pool_block=True)
requestSessions.mount('http://', adapter)
requestSessions.mount('https://', adapter)


def requestAction(a1, a2):
    global number
    time.sleep(1)  # My actions with Requests for each thread.
    number = number + 1
    print number


number = 0  # Count of complete actions
ThreadsActions = []  # Action tasks.
for i in range(100):  # I have 100 websites I need to do in parallel threads.
    a1 = i
    for n in range(3):  # Every website I need to "GET" in 3 threads
        a2 = n
        ThreadsActions.append(threading.Thread(target=requestAction, args=(a1, a2)))

for item in ThreadsActions:
    # But I can't do more than 50 Threads at once, because of maxOpenFileLimit.
    while True:
        # Thread limiter, analogue of BoundedSemaphore.
        if (int(threading.activeCount()) < threadLimiter):
            item.start()
            break
        else:
            time.sleep(.1)
            continue

for item in ThreadsActions:
    item.join()
```
But I'm getting an error: `HTTPSConnectionPool(host='www.example.org', port=443): Max retries exceeded with url: /page?news-25 (Caused by NewConnectionError('<requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fdc5c138b90>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution',))`
As I read in other issues, it might happen when some sockets are stuck in the CLOSE_WAIT state. If there are a lot of sockets, for example because of a slow connection or slow servers, the script might reach the point where all the sockets are in the CLOSE_WAIT state. I tried to use `max_retries=10`, and got errors about `Too many open files`.
How can I pause new requests if all the sockets are in the CLOSE_WAIT state? Or why am I getting these errors?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3609/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3609/timeline
| null |
completed
| null | null | false |
[
"Requests doesn't provide a way to pause making new requests based on the status of sockets. It doesn't see the sockets itself and even so, it couldn't introspect them for that status.\n\nYou should try creating a separate session per thread. This is what the [toolbelt](https://github.com/sigmavirus24/requests-toolbelt) does and it should also avoid any situations where the same socket is used for two different requests. (The Session object used to be thread-safe but those tests disappeared and so the guarantees of thread-safety no longer really exist. It's safer to use one session per thread.)\n",
"Then what's the purpose of `pool_block=True`? Shouldn't it make some bottleneck when all the sockets are in use? Maybe it safer to use `resp = requests.get` with `resp.close()` to make sure all the sockets are under the script's control?\n",
"\"Temporary failure in name resolution\" corresponds to EAI_AGAIN. There are _many_ possible causes for such an error, not just the number of sockets in CLOSE_WAIT. Have you seen any evidence that this is the actual problem?\n\nRegardless, your code is not going to do quite what you think. You're trying to limit the number of open file handles you get by using getrlimit. However, you aren't actually limiting yourself to that. `pool_connections` is actually the number of connection pools to keep hold of (one per host), and `pool_maxsize` is the maximum number of connections allowed in each pool. That means you're allowing `maxOpenFileLimit**2` sockets. Even if this did do what you want, you'd still run the risk of running out of file descriptors, because you're not allowing for the fact that stdin/stdout/stderr are almost certainly open.\n\nGiven that raising the retry limit fixed the problem in temporary name resolution, it's extremely likely that you want to do that and fix your handling of the maximum number of connections.\n\nRegardless, at this time there is no evidence that this is a bug in Requests, so I'm closing this issue. Thanks for the report!\n"
] |
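A rough sketch of the "one Session per thread" suggestion from the comments, using `threading.local()` so each worker lazily builds and reuses its own `Session`; the pool sizes and URLs are placeholders, not taken from the report.

``` python
import threading
import requests

thread_local = threading.local()

def get_session():
    # Each thread lazily creates, then reuses, its own Session and adapter.
    if not hasattr(thread_local, "session"):
        session = requests.Session()
        adapter = requests.adapters.HTTPAdapter(pool_connections=10, pool_maxsize=10)
        session.mount("http://", adapter)
        session.mount("https://", adapter)
        thread_local.session = session
    return thread_local.session

def fetch(url):
    resp = get_session().get(url, timeout=10)
    print(url, resp.status_code)

urls = ["https://httpbin.org/get"] * 5  # placeholder URL list
threads = [threading.Thread(target=fetch, args=(u,)) for u in urls]
for t in threads:
    t.start()
for t in threads:
    t.join()
```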
https://api.github.com/repos/psf/requests/issues/3608
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3608/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3608/comments
|
https://api.github.com/repos/psf/requests/issues/3608/events
|
https://github.com/psf/requests/issues/3608
| 180,231,088 |
MDU6SXNzdWUxODAyMzEwODg=
| 3,608 |
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/192655?v=4",
"events_url": "https://api.github.com/users/opt9/events{/privacy}",
"followers_url": "https://api.github.com/users/opt9/followers",
"following_url": "https://api.github.com/users/opt9/following{/other_user}",
"gists_url": "https://api.github.com/users/opt9/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/opt9",
"id": 192655,
"login": "opt9",
"node_id": "MDQ6VXNlcjE5MjY1NQ==",
"organizations_url": "https://api.github.com/users/opt9/orgs",
"received_events_url": "https://api.github.com/users/opt9/received_events",
"repos_url": "https://api.github.com/users/opt9/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/opt9/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/opt9/subscriptions",
"type": "User",
"url": "https://api.github.com/users/opt9",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-30T06:37:47Z
|
2021-09-08T15:00:42Z
|
2016-09-30T07:44:27Z
|
NONE
|
resolved
|
Error messages
``` bash
(env) ➜ test $ python download.py
Traceback (most recent call last):
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 595, in urlopen
chunked=chunked)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 352, in _make_request
self._validate_conn(conn)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 831, in _validate_conn
conn.connect()
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 289, in connect
ssl_version=resolved_ssl_version)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py", line 308, in ssl_wrap_socket
return context.wrap_socket(sock, server_hostname=server_hostname)
File "/usr/lib/python3.5/ssl.py", line 377, in wrap_socket
_context=self)
File "/usr/lib/python3.5/ssl.py", line 752, in __init__
self.do_handshake()
File "/usr/lib/python3.5/ssl.py", line 988, in do_handshake
self._sslobj.do_handshake()
File "/usr/lib/python3.5/ssl.py", line 633, in do_handshake
self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/adapters.py", line 423, in send
timeout=timeout
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 621, in urlopen
raise SSLError(e)
requests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "download.py", line 17, in <module>
main()
File "download.py", line 14, in main
download('https://www.kmac.co.kr/inc/mailfiledownload.asp?mode=notice&idx=6770&gubun=filename')
File "download.py", line 8, in download
res = requests.get(url, stream=True)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/api.py", line 70, in get
return request('get', url, params=params, **kwargs)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/api.py", line 56, in request
return session.request(method=method, url=url, **kwargs)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/sessions.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/sessions.py", line 596, in send
r = adapter.send(request, **kwargs)
File "/home/dev2/test/env/lib/python3.5/site-packages/requests/adapters.py", line 497, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)
```
My environment
``` bash
(env) ➜ test $ python -V
Python 3.5.2
```
``` bash
(env) ➜ test $ python -c "import ssl; print(ssl.OPENSSL_VERSION)"
OpenSSL 1.0.2g 1 Mar 2016
```
``` bash
(env) ➜ test $ cat /etc/os-release
NAME="Ubuntu"
VERSION="16.04.1 LTS (Xenial Xerus)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 16.04.1 LTS"
VERSION_ID="16.04"
HOME_URL="http://www.ubuntu.com/"
SUPPORT_URL="http://help.ubuntu.com/"
BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/"
```
``` bash
(env) ➜ test $ pip list
pip (8.1.2)
requests (2.11.1)
setuptools (28.0.0)
wheel (0.30.0a0)
```
Test code
``` python
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import requests


def download(url):
    with open('output.dat', "wb") as f:
        res = requests.get(url, stream=True)
        for chunk in res.iter_content(8192):
            f.write(chunk)
    print('output.dat download complete!')


def main():
    download('https://www.kmac.co.kr/inc/mailfiledownload.asp?mode=notice&idx=6770&gubun=filename')


if __name__ == "__main__":
    main()
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3608/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3608/timeline
| null |
completed
| null | null | false |
[
"So the problem here is that your security configuration is sufficiently high that you cannot connect to the remote peer. Specifically, Requests has a very restrictive set of ciphers: it only uses ciphers that are not known to be flawed. That means that in recent releases the set is quite restricted: in the release you're on that means AES and 3DES are the only supported ciphers. In an upcoming release that'll be further restricted to just AES due to known flaws in 3DES on large files.\n\nThe problem here is that the server only seems to support _utterly abominable_ ciphers. Check this out:\n\n```\nopenssl s_client -connect www.kmac.co.kr:443 \n...\nNew, TLSv1/SSLv3, Cipher is RC4-MD5\n...\n```\n\nThat cipher there? That's a _terrible_ cipher. RC4 is known-broken, as is MD5. Either of these _by themselves_ would be a reason not to use the cipher suite, combining them is the height of absurdity. That made me want to check this server in SSLLabs: the result is [here](https://www.ssllabs.com/ssltest/analyze.html?d=www.kmac.co.kr), but let me show you the highlights:\n- Grade: F\n - Experimental: This server is vulnerable to the DROWN attack. Grade set to F.\n - This server supports SSL 2, which is obsolete and insecure, and can be used against TLS (DROWN attack). Grade set to F. \n - This server supports 512-bit export suites and might be vulnerable to the FREAK attack. Grade set to F. \n - This server uses SSL 3, which is obsolete and insecure. Grade capped to B.\n - The server supports only older protocols, but not the current best TLS 1.2. Grade capped to C.\n - This server accepts RC4 cipher, but only with older protocol versions. Grade capped to B.\n - The server does not support Forward Secrecy with the reference browsers.\n- It supports 14 cipher suites, of which 13 (!) are _totally_ insecure. Almost all use MD5 as a hash function, many use DES (not 3DES, regular one-time DES), several are export strength, a couple use RC2 (!), and almost everything that use SHA are sunk because they use RC4.\n- There is _one_ cipher suite in that list that current Requests is willing to speak, `TLS_RSA_WITH_3DES_EDE_CBC_SHA`, it's the weakest on our current list, and it's going away in the next major release of Requests.\n\nThe pièce de résistance on this sandwich of terribleness is that, in fact, there's no reason Requests and this server shouldn't be able to speak. We do have a protocol overlap (we both allow TLSv1.0), and we do have a cipher suite overlap. However, it seems like the server has a _maximum number of cipher suites_ it will look through in a handshake before it gives up. You can actually test this by taking the full Requests cipher suite string (`'ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:ECDH+HIGH:DH+HIGH:ECDH+3DES:DH+3DES:RSA+AESGCM:RSA+AES:RSA+HIGH:RSA+3DES:!aNULL:!eNULL:!MD5'`) and passing it to the OpenSSL command-line tool and attempting to connect. Each time you fail, remove one element from the front. Eventually, this will _start to work_.\n\nThe TL;DR here is that this server is terrible, it only supports terribly weak TLS, and if at all possible you should _avoid using it_. I strongly recommend not speaking to this server in any way, shape, or form. If you _absolutely must_, you can set `requests.packages.urllib.util.ssl_.DEFAULT_CIPHERS = 'RSA+3DES'`, and that will allow your connection to succeed. But I strongly discourage doing that.\n"
] |
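The workaround quoted at the end of the comment can be written out as below. Note that the attribute path in the comment appears to drop a "3": on 2016-era Requests the vendored module lives at `requests.packages.urllib3`. As the comment stresses, loosening the cipher list like this should be a last resort.

``` python
import requests

# Allow the one weak suite (RSA + 3DES) that the server in the report still offers.
# Path assumes the vendored urllib3 shipped with requests 2.x at the time.
requests.packages.urllib3.util.ssl_.DEFAULT_CIPHERS = "RSA+3DES"

resp = requests.get("https://www.kmac.co.kr/", timeout=10)
print(resp.status_code)
```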
https://api.github.com/repos/psf/requests/issues/3607
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3607/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3607/comments
|
https://api.github.com/repos/psf/requests/issues/3607/events
|
https://github.com/psf/requests/pull/3607
| 180,211,371 |
MDExOlB1bGxSZXF1ZXN0ODc0Mjg1OTg=
| 3,607 |
Remove error swallowing exception catching of AttributeError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/299380?v=4",
"events_url": "https://api.github.com/users/frankier/events{/privacy}",
"followers_url": "https://api.github.com/users/frankier/followers",
"following_url": "https://api.github.com/users/frankier/following{/other_user}",
"gists_url": "https://api.github.com/users/frankier/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/frankier",
"id": 299380,
"login": "frankier",
"node_id": "MDQ6VXNlcjI5OTM4MA==",
"organizations_url": "https://api.github.com/users/frankier/orgs",
"received_events_url": "https://api.github.com/users/frankier/received_events",
"repos_url": "https://api.github.com/users/frankier/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/frankier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frankier/subscriptions",
"type": "User",
"url": "https://api.github.com/users/frankier",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-30T03:20:09Z
|
2021-09-08T02:10:28Z
|
2016-09-30T12:17:17Z
|
CONTRIBUTOR
|
resolved
|
This exception catching is far too aggressive, in my opinion. A lot of code runs inside this block, and it could throw AttributeError for any number of reasons. It's actually not clear to me what error it's trying to catch. It made it quite difficult for me to debug this problem: https://github.com/shazow/urllib3/pull/990
This PR removes it altogether since, as far as I can see, it is not needed in normal operation and can only serve to mask errors.
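As a hedged illustration of the pattern being removed (the names here are hypothetical, not the actual requests code):

```python
# Hypothetical illustration: a broad AttributeError catch silently swallows
# any AttributeError raised inside the try block, not just the one the
# author intended to guard against, which makes real bugs hard to debug.
def iter_chunks(raw):
    try:
        for chunk in raw.stream(1024):  # a typo such as raw.streem() would be hidden too
            yield chunk
    except AttributeError:
        # originally meant to cover "raw is None"; masks unrelated errors instead
        pass
```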
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3607/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3607/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3607.diff",
"html_url": "https://github.com/psf/requests/pull/3607",
"merged_at": "2016-09-30T12:17:17Z",
"patch_url": "https://github.com/psf/requests/pull/3607.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3607"
}
| true |
[
"I did some ~~dumpster~~ history diving and found https://github.com/kennethreitz/requests/commit/f7968b6797a50716a2f177ad23439a3c6bb9f683 and https://github.com/kennethreitz/requests/commit/296d8cc097b694ec8a9c4d8d7c00abed4755e5d5\n\nThe first is linked to https://github.com/kennethreitz/requests/issues/236 which indicates that at times `Response.raw` used to possibly be None. I don't think this could happen any longer (I think 98% of the time there's a valid urllib3 HTTPResponse) and we no longer access `self.raw` directly in this method, so I suspect this is probably safe.\n"
] |
https://api.github.com/repos/psf/requests/issues/3606
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3606/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3606/comments
|
https://api.github.com/repos/psf/requests/issues/3606/events
|
https://github.com/psf/requests/pull/3606
| 179,946,490 |
MDExOlB1bGxSZXF1ZXN0ODcyNDIwNjM=
| 3,606 |
Issue #3597 - Rework Bytestring Host Test to use PyTest Httpbin
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5546456?v=4",
"events_url": "https://api.github.com/users/bbamsch/events{/privacy}",
"followers_url": "https://api.github.com/users/bbamsch/followers",
"following_url": "https://api.github.com/users/bbamsch/following{/other_user}",
"gists_url": "https://api.github.com/users/bbamsch/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bbamsch",
"id": 5546456,
"login": "bbamsch",
"node_id": "MDQ6VXNlcjU1NDY0NTY=",
"organizations_url": "https://api.github.com/users/bbamsch/orgs",
"received_events_url": "https://api.github.com/users/bbamsch/received_events",
"repos_url": "https://api.github.com/users/bbamsch/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bbamsch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbamsch/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bbamsch",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-29T03:50:11Z
|
2021-09-08T02:10:29Z
|
2016-09-29T07:09:22Z
|
CONTRIBUTOR
|
resolved
|
Removes the dependency on an internet connection for the test added in PR #3598 for Issue #3597.
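A hedged sketch of what such a test can look like with the pytest-httpbin fixture (the test name and header value are illustrative, not the exact test from the PR):

```python
import requests

# Illustrative sketch: pytest-httpbin provides an `httpbin` fixture that runs
# a local httpbin instance, so the test no longer needs an internet connection.
def test_request_with_bytestring_host_header(httpbin):
    r = requests.get(httpbin.url + '/get', headers={'Host': b'example.com'})
    assert r.status_code == 200
```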
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3606/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3606/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3606.diff",
"html_url": "https://github.com/psf/requests/pull/3606",
"merged_at": "2016-09-29T07:09:22Z",
"patch_url": "https://github.com/psf/requests/pull/3606.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3606"
}
| true |
[
"Looks good to me @bbamsch, thanks for fixing this up!\n"
] |
https://api.github.com/repos/psf/requests/issues/3605
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3605/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3605/comments
|
https://api.github.com/repos/psf/requests/issues/3605/events
|
https://github.com/psf/requests/issues/3605
| 179,905,294 |
MDU6SXNzdWUxNzk5MDUyOTQ=
| 3,605 |
ndg package not being installed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/558175?v=4",
"events_url": "https://api.github.com/users/variable/events{/privacy}",
"followers_url": "https://api.github.com/users/variable/followers",
"following_url": "https://api.github.com/users/variable/following{/other_user}",
"gists_url": "https://api.github.com/users/variable/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/variable",
"id": 558175,
"login": "variable",
"node_id": "MDQ6VXNlcjU1ODE3NQ==",
"organizations_url": "https://api.github.com/users/variable/orgs",
"received_events_url": "https://api.github.com/users/variable/received_events",
"repos_url": "https://api.github.com/users/variable/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/variable/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/variable/subscriptions",
"type": "User",
"url": "https://api.github.com/users/variable",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2016-09-28T22:14:36Z
|
2021-09-08T13:05:30Z
|
2016-09-29T07:11:23Z
|
NONE
|
resolved
|
So I had a very long day debugging why I was getting
`EOF occurred in violation of protocol (_ssl.c:590)`
It turns out that `__init__.py` tried to monkey-patch `ssl_wrap_socket`:
```
from .packages.urllib3.contrib import pyopenssl
pyopenssl.inject_into_urllib3()
```
But it couldn't, because the import `from .packages.urllib3.contrib import pyopenssl` failed,
since `packages/urllib3/contrib/pyopenssl.py` in turn tried to import:
`from ndg.httpsclient.ssl_peer_verification import SUBJ_ALT_NAME_SUPPORT`
but `ndg` didn't exist.
Once I installed `ndg-httpsclient` then the error went away.
What I am wondering is why the `ndg-httpsclient` package was not installed when installing requests. Or is it supposed to be installed separately?
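For reference, a minimal sketch of the optional injection involved here (requests 2.x layout assumed): the real `__init__.py` wraps it in a try/except, so a missing `ndg-httpsclient` only disables the pyOpenSSL backend instead of breaking the import.

```python
# Minimal sketch of the optional pyOpenSSL injection (requests 2.x layout assumed).
# If any optional dependency (pyOpenSSL, ndg-httpsclient, pyasn1) is missing,
# the import fails and requests quietly falls back to the stdlib ssl module.
try:
    from requests.packages.urllib3.contrib import pyopenssl
    pyopenssl.inject_into_urllib3()
except ImportError:
    pass
```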
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3605/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3605/timeline
| null |
completed
| null | null | false |
[
"The reason we don't install it when you install requests is that it's optional. Requests can, on many systems, successfully handle TLS without needing any third-party libraries. However, some users will want to use a newer version of OpenSSL or enable Requests to use more advanced features. Those require some additional third-party dependencies that are strictly _optional_. They can be requested when installing Requests by running `pip install requests[security]`.\n",
"`pip install requests[security]` works great on Mac OS X Sierra but on Ubuntu 14.04, it ran fine but didn't seem to have any effect, any idea why? \n\n```\n >>> ssl.OPENSSL_VERSION\n 'OpenSSL 1.0.1f 6 Jan 2014'\n $ openssl version\n OpenSSL 1.0.1f 6 Jan 2014\n```\n\nBTW I am using `pyenv` and `pyenv-virtualenvwrapper`if that makes a difference.\n",
"@IntelBob The security packages don't install a newer OpenSSL on Linux and, even if they did, they don't monkeypatch the standard library. Are you having a specific problem on Ubuntu?\n",
"@Lukasa I am getting intermittent `EOF occurred in violation of protocol (_ssl.c:590)` error on Ubuntu, should I just upgrade Ubuntu openssl package? Looks like the latest openssl version on Ubuntu is 1.0.1g, would installing that version help with my issue? I am not explicitly using ssl in my code, just using requests to access https URLs. \n",
"You can try, but intermittent errors rather suggest that the problem isn't with OpenSSL. Are the errors intermittent in the sense that they only occur for some hosts, or intermittent in the sense that sometimes a request works for a host and sometimes it fails?\n",
"It's the latter, intermittent in the sense that it works almost all the time and occasionally it will generate that error, it's not reliably reproducible so I don't know how to debug it, I am on Python 2.7.12. Also I am using requests with threads and unsure if such an error might have caused a thread to hang (I am using try...exception block), that's the core of my issue is that I get these random hanging threads and no discernible cause. \n\nAlso sometimes i also get this error `[('Connection aborted.', BadStatusLine(\"''\",))]`. I am making these requests through a proxy server and that has also thrown intermittent errors. I read from another issue that using `requests.session` might help, right now I am using `requests.get`, `requests.delete` in all the threads. Any insights will be much appreciated. \n",
"So both of these intermittent issues smell like issues that have nothing to do with your OpenSSL version and everything to do with various kinds of network weirdness. Both of these errors basically boil down to \"the remote peer shut the connection down when we didn't expect them to\". Working out a more specific understanding of _why_ the remote peer is doing that is much harder if you don't have access to server logs.\n",
"Yeah, I kinda had the same feeling since the errors are intermittent and random and unfortunately I don't have access to the backend logs. As a side note, do you know if using `requests.get(), requests.delete()` in multiple threads is safe? I assume so since `requests` is thread-safe and i have a `try...except` block that the thread should end normally even in the case of such an exception. \n",
"Yes, `requests.*` methods share no state so you can safely use them across threads.\n",
"Hi @Lukasa \r\n\r\nI the same error , and it stuck me for couple of days :+1: \r\n`urllib2.URLError: <urlopen error EOF occurred in violation of protocol (_ssl.c:590)>`\r\nmy python version is 2.7.10, and I use urllib2 \r\nbelow is my related code:\r\n`cookieprocessor = urllib2.HTTPCookieProcessor()\r\n opener = urllib2.build_opener(cookieprocessor)\r\n urllib2.install_opener(opener)\r\n request = urllib2.Request(url,postParams)\r\n if sys.version_info < (2, 7, 9):\r\n file = urllib2.urlopen(request)\r\n else:\r\n ctx = ssl.create_default_context()\r\n ctx.check_hostname = False\r\n ctx.verify_mode = ssl.CERT_NONE\r\n print (\"---\")\r\n file = urllib2.urlopen(request, context=ctx, timeout=30)\r\n fileInfo = file.read()`\r\n\r\nPlease help me...",
"@szyl111 Your code is using urllib2. This issue tracker is for support with the Python Requests library only. You'll need to go discuss your problem upstream with the Python Development team."
] |
https://api.github.com/repos/psf/requests/issues/3604
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3604/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3604/comments
|
https://api.github.com/repos/psf/requests/issues/3604/events
|
https://github.com/psf/requests/pull/3604
| 179,747,699 |
MDExOlB1bGxSZXF1ZXN0ODcxMDE3NzQ=
| 3,604 |
fixes broken link on documentation page
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4259587?v=4",
"events_url": "https://api.github.com/users/iamprakashom/events{/privacy}",
"followers_url": "https://api.github.com/users/iamprakashom/followers",
"following_url": "https://api.github.com/users/iamprakashom/following{/other_user}",
"gists_url": "https://api.github.com/users/iamprakashom/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/iamprakashom",
"id": 4259587,
"login": "iamprakashom",
"node_id": "MDQ6VXNlcjQyNTk1ODc=",
"organizations_url": "https://api.github.com/users/iamprakashom/orgs",
"received_events_url": "https://api.github.com/users/iamprakashom/received_events",
"repos_url": "https://api.github.com/users/iamprakashom/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/iamprakashom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iamprakashom/subscriptions",
"type": "User",
"url": "https://api.github.com/users/iamprakashom",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-09-28T11:31:11Z
|
2021-09-08T02:10:29Z
|
2016-09-28T11:32:53Z
|
CONTRIBUTOR
|
resolved
|
It fixes a broken link on the documentation page and updates AUTHORS.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3604/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3604/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3604.diff",
"html_url": "https://github.com/psf/requests/pull/3604",
"merged_at": "2016-09-28T11:32:53Z",
"patch_url": "https://github.com/psf/requests/pull/3604.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3604"
}
| true |
[
"Thanks @iamprakashom! :sparkles: :cake: :sparkles:\n",
"Thank you very much @Lukasa for merging my PR. :smile: \n"
] |
https://api.github.com/repos/psf/requests/issues/3603
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3603/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3603/comments
|
https://api.github.com/repos/psf/requests/issues/3603/events
|
https://github.com/psf/requests/issues/3603
| 179,687,386 |
MDU6SXNzdWUxNzk2ODczODY=
| 3,603 |
Link on Documentation web page not working
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4259587?v=4",
"events_url": "https://api.github.com/users/iamprakashom/events{/privacy}",
"followers_url": "https://api.github.com/users/iamprakashom/followers",
"following_url": "https://api.github.com/users/iamprakashom/following{/other_user}",
"gists_url": "https://api.github.com/users/iamprakashom/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/iamprakashom",
"id": 4259587,
"login": "iamprakashom",
"node_id": "MDQ6VXNlcjQyNTk1ODc=",
"organizations_url": "https://api.github.com/users/iamprakashom/orgs",
"received_events_url": "https://api.github.com/users/iamprakashom/received_events",
"repos_url": "https://api.github.com/users/iamprakashom/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/iamprakashom/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iamprakashom/subscriptions",
"type": "User",
"url": "https://api.github.com/users/iamprakashom",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-09-28T06:04:27Z
|
2021-09-08T15:00:43Z
|
2016-09-29T10:01:10Z
|
CONTRIBUTOR
|
resolved
|
The link for _connection pooling_ on [this page](http://docs.python-requests.org/en/master/user/advanced/#session-objects), line 3, looks broken and shows the message
_SORRY
This Page doesn't exist yet._
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3603/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3603/timeline
| null |
completed
| null | null | false |
[
"Seems that the old documentation on connection pooling moved around in urllib. The page it references (http://urllib3.readthedocs.io/en/1.5/pools.html) no longer exists after some refactoring was done.\n\nPerhaps a better alternative is:\nhttp://urllib3.readthedocs.io/en/latest/reference/index.html#module-urllib3.connectionpool\nor\nhttp://urllib3.readthedocs.io/en/latest/advanced-usage.html#customizing-pool-behavior\n",
"Yup, that looks wrong. I recommend the first link you posted @bbamsch. Would either of you be interested in submitting a pull request to fix that?\n",
"I am working on it.\n",
"@Lukasa, If I understand right, first link you are referring to is http://urllib3.readthedocs.io/en/1.5/pools.html ?\n",
"No, sorry, I meant http://urllib3.readthedocs.io/en/latest/reference/index.html#module-urllib3.connectionpool\n",
"thank you\n",
"This is resolved with #3604.\n"
] |
https://api.github.com/repos/psf/requests/issues/3602
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3602/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3602/comments
|
https://api.github.com/repos/psf/requests/issues/3602/events
|
https://github.com/psf/requests/issues/3602
| 179,492,612 |
MDU6SXNzdWUxNzk0OTI2MTI=
| 3,602 |
Self-signed certificate passed via verify=/path/to/cert still triggers certificate verify failed
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/147512?v=4",
"events_url": "https://api.github.com/users/do3cc/events{/privacy}",
"followers_url": "https://api.github.com/users/do3cc/followers",
"following_url": "https://api.github.com/users/do3cc/following{/other_user}",
"gists_url": "https://api.github.com/users/do3cc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/do3cc",
"id": 147512,
"login": "do3cc",
"node_id": "MDQ6VXNlcjE0NzUxMg==",
"organizations_url": "https://api.github.com/users/do3cc/orgs",
"received_events_url": "https://api.github.com/users/do3cc/received_events",
"repos_url": "https://api.github.com/users/do3cc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/do3cc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/do3cc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/do3cc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 13 |
2016-09-27T13:26:56Z
|
2021-09-08T06:00:43Z
|
2016-10-05T09:02:17Z
|
NONE
|
resolved
|
```
-> request = session.get(url, verify=cert)
(Pdb) c
Traceback (most recent call last):
File "inventory.py", line 195, in <module>
inventory()
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/click/core.py", line 716, in __call__
return self.main(*args, **kwargs)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/click/core.py", line 696, in main
rv = self.invoke(ctx)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/click/core.py", line 889, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/click/core.py", line 534, in invoke
return callback(*args, **kwargs)
File "inventory.py", line 192, in inventory
print get_host(host)
File "inventory.py", line 174, in get_host
for host in get_all_infrastructure_elements():
File "inventory.py", line 98, in get_all_infrastructure_elements
request = session.get(url, verify=cert)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/requests/sessions.py", line 488, in get
return self.request('GET', url, **kwargs)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
resp = self.send(prep, **send_kwargs)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/requests/sessions.py", line 596, in send
r = adapter.send(request, **kwargs)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/cachecontrol/adapter.py", line 47, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "/Users/do3cc/dev/ansible/lib/python2.7/site-packages/requests/adapters.py", line 497, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",)
```
This happens on macOS; openssl is installed via Homebrew.
I have installed requests with pip via `requests[security]`.
Versions:
- python: 2.7.10
- SSL: python -c 'import ssl;print(ssl.OPENSSL_VERSION)'
**OpenSSL 1.0.2j 26 Sep 2016**
I have downloaded the certificate via:
```
/usr/local/Cellar/openssl/1.0.2j/bin/openssl s_client -showcerts -connect ourserver:443 </dev/null | openssl x509 -outform PEM > eam_cert.pem
```
I have verified that the cert itself is ok via:
```
curl --cacert eam_cert.pem https://ourserver
```
This used to work in the past, but the server got a new certificate.
Right now I am out of ideas about what I could be doing wrong here.
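A hedged sketch of the resolution reached later in this thread (the file name is a placeholder): the file passed to `verify` has to contain the full chain, i.e. the root CA plus the intermediates concatenated into one PEM bundle, not just the leaf certificate.

```python
import requests

# Hedged sketch of the eventual fix discussed below: pass a PEM bundle that
# contains the root CA and all intermediate certificates. The file name here
# is a placeholder for such a concatenated bundle.
session = requests.Session()
resp = session.get('https://eam.company.com/', verify='company_full_chain.pem')
print(resp.status_code)
```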
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/147512?v=4",
"events_url": "https://api.github.com/users/do3cc/events{/privacy}",
"followers_url": "https://api.github.com/users/do3cc/followers",
"following_url": "https://api.github.com/users/do3cc/following{/other_user}",
"gists_url": "https://api.github.com/users/do3cc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/do3cc",
"id": 147512,
"login": "do3cc",
"node_id": "MDQ6VXNlcjE0NzUxMg==",
"organizations_url": "https://api.github.com/users/do3cc/orgs",
"received_events_url": "https://api.github.com/users/do3cc/received_events",
"repos_url": "https://api.github.com/users/do3cc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/do3cc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/do3cc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/do3cc",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3602/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3602/timeline
| null |
completed
| null | null | false |
[
"Are you able to share with us the output of `openssl x509 -in eam_cert.pem -noout -text`?\n",
"Yes, here it is:\n\n```\nCertificate:\n Data:\n Version: 3 (0x2)\n Serial Number:\n 12:a6:13:9d:00:01:00:04:f3:bb\n Signature Algorithm: sha256WithRSAEncryption\n Issuer: DC=net, DC=it, CN=Company Group Enterprise CA3\n Validity\n Not Before: Aug 11 13:28:38 2016 GMT\n Not After : Aug 10 13:28:38 2021 GMT\n Subject: C=DE, L=Somewhere, O=Company AG, CN=eam.company.com\n Subject Public Key Info:\n Public Key Algorithm: rsaEncryption\n RSA Public Key: (2048 bit)\n Modulus (2048 bit):\n 00:c3:20:4e:2f:05:3a:6b:6b:85:82:7e:2e:a1:83:\n 85:98:a4:2f:cd:46:88:bc:3f:40:02:69:e7:97:8f:\n e5:18:63:57:f9:63:48:c6:66:47:eb:b8:3b:34:ca:\n 86:c8:3e:df:a4:f3:5c:0b:62:a5:61:85:1e:9a:e0:\n 68:f1:92:ed:a6:22:47:cf:ae:b9:da:7b:67:f8:16:\n 36:44:1c:21:8b:82:1e:f6:e8:25:69:53:36:06:00:\n 79:fe:e8:a5:7e:20:8f:39:f7:12:ad:0d:1d:19:55:\n 33:3a:6a:93:5e:98:b6:35:f8:01:d4:45:5b:dd:c8:\n f5:80:02:8a:30:53:23:77:2a:06:92:c1:95:8e:ce:\n fc:69:9f:4d:4a:ee:b8:b7:28:d0:4f:ca:1a:c3:c7:\n 07:08:c4:aa:82:98:3f:e9:89:ac:92:1a:7a:3f:ca:\n 6a:7e:03:3f:e1:1a:36:64:e2:d3:cb:1a:11:38:de:\n a0:b9:3a:6b:6f:40:da:9a:43:e6:d5:af:db:c2:a3:\n 15:67:69:8e:ff:a1:5d:80:22:c7:4a:09:e1:fe:06:\n 2c:eb:2e:d6:db:66:37:f4:29:c2:d4:9a:47:c5:8a:\n 76:be:2a:1e:72:11:d4:77:7c:68:28:0a:bb:07:08:\n 61:02:1c:1b:43:5d:cc:0e:4a:48:f6:1f:4c:45:6c:\n 03:5f\n Exponent: 65537 (0x10001)\n X509v3 extensions:\n X509v3 Subject Key Identifier:\n A8:7A:25:98:F8:D8:1D:01:60:AC:72:0C:63:0B:FB:60:E7:47:50:18\n X509v3 Authority Key Identifier:\n keyid:04:F4:7E:8A:12:F6:92:4E:87:8D:A1:78:C9:2F:2A:D5:64:01:4A:6D\n\n X509v3 CRL Distribution Points:\n URI:ldap:///CN=Company%20Group%20Enterprise%20CA3(1),CN=hostname,CN=CDP,CN=Public%20Key%20Services,CN=Services,CN=Configuration,DC=it,DC=net?certificateRevocationList?base?objectClass=cRLDistributionPoint\n URI:http://pki.xgrp.net/CA3/Company%20Group%20Enterprise%20CA3(1).crl\n\n Authority Information Access:\n CA Issuers - URI:ldap:///CN=Company%20Group%20Enterprise%20CA3,CN=AIA,CN=Public%20Key%20Services,CN=Services,CN=Configuration,DC=it,DC=net?cACertificate?base?objectClass=certificationAuthority\n CA Issuers - URI:http://pki.xgrp.net/CA3/hostname.it.net_Company%20Group%20Enterprise%20CA3(1).crt\n\n X509v3 Key Usage: critical\n Digital Signature, Key Encipherment\n 1.3.6.1.4.1.311.21.7:\n 0-.%+.....7....t...\n...\"...-....a...u...4..d...\n X509v3 Extended Key Usage:\n TLS Web Server Authentication\n 1.3.6.1.4.1.311.21.10:\n 0.0\n..+.......\n Signature Algorithm: sha256WithRSAEncryption\n 66:f1:bf:eb:a8:d3:3f:d2:49:6f:93:d1:7b:b4:94:13:d1:91:\n 9a:f9:c0:74:3b:f7:78:59:54:64:1e:3f:96:ae:ab:f8:7d:0b:\n 68:63:00:f3:57:a2:20:76:49:dc:dd:95:10:b4:c4:d3:d8:e9:\n d1:ee:ed:a9:48:58:ff:82:3a:d6:7a:cd:f1:00:08:32:20:27:\n bc:f1:ce:9b:5c:9b:2c:67:69:2b:bf:d8:8e:03:5a:d6:da:c3:\n 70:06:f1:a3:55:d2:8e:52:1d:65:05:5b:46:33:4f:f4:f2:70:\n d2:55:00:c7:f7:f3:11:e6:20:08:02:81:c8:67:4b:57:ce:9a:\n bf:8a:d0:e4:35:8e:67:29:38:60:32:cb:29:19:19:d6:90:57:\n cd:2b:29:ce:56:1c:e0:61:bc:dc:b6:3d:2f:64:50:38:50:0f:\n 37:09:4c:47:18:3b:e4:21:a5:bf:d2:28:40:7a:a7:a1:27:3b:\n 64:a0:e7:9d:d9:40:94:42:11:34:bd:a8:b3:09:66:7e:48:d8:\n 8e:ec:d9:61:62:29:31:33:be:ff:e7:12:e0:8e:94:e3:37:80:\n db:39:18:1f:ca:fc:a4:b9:41:1d:ab:6c:ef:98:f5:2f:2d:d7:\n 31:c2:13:ef:1b:77:4e:bc:26:9d:8a:d2:c3:a1:2c:22:a5:3b:\n ed:d8:b0:91\n```\n",
"That certificate is not self-signed. As a result, it's not a root certificate and is not a valid issuer for that cert. I suspect you have the root cert for the connection in your macOS Keychain, which is what curl is actually using to validate the connection.\n",
"@Lukasa Thanks for your responses so far. Curl did not use my macos keychain, when I didn't pass the certificate as an argument, curl would not download anything either.\nI followed the crt link in the output above, downloaded that certificate, checked the output too and so on to get something that might just be the root certificate. It still does not work but next I first try to find a human who might have the proper root certificate.\n",
"@do3cc Can you print `curl -V` for me please?\n",
"```\n* Rebuilt URL to: https://eam.company.com/\n* Trying 192.168.239.102...\n* Connected to eam.company.com (192.168.239.102) port 443 (#0)\n* TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256\n* Server certificate: eam.company.com\n* Server certificate: company Group Enterprise CA3\n* Server certificate: company Group Internal Root CA\n> GET / HTTP/1.1\n> Host: eam.company.com\n> User-Agent: curl/7.43.0\n> Accept: */*\n>\n< HTTP/1.1 200 OK\n< Server: Apache-Coyote/1.1\n< Accept-Ranges: bytes\n< ETag: W/\"97-1434464263000\"\n< Last-Modified: Tue, 16 Jun 2015 14:17:43 GMT\n< Content-Type: text/html\n< Content-Length: 97\n< Date: Tue, 27 Sep 2016 14:24:12 GMT\n<\n<html>\n\n<head>\n<meta http-equiv=\"refresh\" content=\"0;URL=/rep/\">\n</head>\n\n<body>\n</body>\n\n* Connection #0 to host eam.company.com left intact\n</html>%\n```\n",
"Sorry, I meant literally just the command `curl -V` (case-sensitive, that capital V is on purpose).\n",
"yeah, I just noticed myself:\n\n```\ncurl 7.43.0 (x86_64-apple-darwin15.0) libcurl/7.43.0 SecureTransport zlib/1.2.5\nProtocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtsp smb smbs smtp smtps telnet tftp\nFeatures: AsynchDNS IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB SSL libz UnixSockets\n```\n",
"Yup, so it remains the case that curl is using `SecureTransport` here. Regardless of whether curl found a root in the keychain or not, curl is using a totally different library than Requests is to build its cert chain. That will always cause a deviance in behaviour. =(\n",
"Hi, I got an answer and the certificate files. \nIt was still failing while adding all certs to my browser made my browser accept the website. Googling for how to validate the cert via ssl gave me this page http://stackoverflow.com/questions/25482199/verify-a-certificate-chain-using-openssl-verify and after that I realized that it is not enough to pass the root certificate to `verify` but also the intermediate certificates. \n\nOur org uses its own root certificate and intermediate certificates. To verify the https connection, requests needs the complete certificate chain. \nCreating a cert file where I pasted the root certificate and intermediate certificates did just that.\n\n@Lukasa Thank you so much for your quick help!\n",
"@Lukasa @do3cc Hi, sorry for tagging you in an old issue but I have a similar question regarding how requests verify SSL certs. If I want to verify SSL cert of a Server which is using a chain like:\r\nCert1 (Root Certificate, self-signed) --> Cert2(Intermediate cert, for ABC, issued by root CA) --> Cert3(Server Cert issued by ABC )\r\nWhat I have noticed is, if I put Cert2 in a directory and point to it by environment variable 'REQUESTS_CA_BUNDLE' (after running 'c_rehash' utility on it) or if I directly point to it in the requests method call, I am still receiving SSL verification error. From my understanding it seems root certificate(Cert1 in this scenario) is required to verify the SSL certs successfully.\r\nIs there a way to verify SSL certificate for Server(Cert3) with just the intermediate cert(Cert2 in this case) without using root certificate?",
"No, Requests (more properly OpenSSL) always requires a root certificate to successfully build a chain. ",
"@Lukasa Thanks for the response."
] |
https://api.github.com/repos/psf/requests/issues/3601
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3601/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3601/comments
|
https://api.github.com/repos/psf/requests/issues/3601/events
|
https://github.com/psf/requests/issues/3601
| 179,433,092 |
MDU6SXNzdWUxNzk0MzMwOTI=
| 3,601 |
When will requests support HTTP/2?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6284532?v=4",
"events_url": "https://api.github.com/users/fendouai/events{/privacy}",
"followers_url": "https://api.github.com/users/fendouai/followers",
"following_url": "https://api.github.com/users/fendouai/following{/other_user}",
"gists_url": "https://api.github.com/users/fendouai/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fendouai",
"id": 6284532,
"login": "fendouai",
"node_id": "MDQ6VXNlcjYyODQ1MzI=",
"organizations_url": "https://api.github.com/users/fendouai/orgs",
"received_events_url": "https://api.github.com/users/fendouai/received_events",
"repos_url": "https://api.github.com/users/fendouai/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fendouai/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fendouai/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fendouai",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-09-27T08:52:49Z
|
2021-09-08T15:00:44Z
|
2016-09-27T09:50:28Z
|
NONE
|
resolved
|
Our project needs an HTTP/2 client.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3601/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3601/timeline
| null |
completed
| null | null | false |
[
"There is no roadmap for this. HTTP/2 is a complex protocol and it is not likely that Requests in its current form will achieve good support for it any time soon. In the meantime, you can [use this](https://hyper.readthedocs.io/en/latest/quickstart.html#requests-integration).\n",
"Read the docs appears to be down. Fyi\n\nOn Tue, 27 Sep 2016, 7:51 PM Cory Benfield [email protected] wrote:\n\n> There is no roadmap for this. HTTP/2 is a complex protocol and it is not\n> likely that Requests in its current form will achieve good support for it\n> any time soon. In the meantime, you can use this\n> https://hyper.readthedocs.io/en/latest/quickstart.html#requests-integration\n> .\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3601#issuecomment-249818864,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AGk_7ehXc8joUXLgtfbtMW3v9m-iQRUVks5quOb5gaJpZM4KHaKv\n> .\n",
"Loading fine for me here.\n",
"All resolved. Must have been a temporary DNS issue.\n\nOn Tue, 27 Sep 2016, 7:58 PM Cory Benfield [email protected] wrote:\n\n> Loading fine for me here.\n> \n> —\n> You are receiving this because you commented.\n> \n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3601#issuecomment-249820462,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AGk_7SXTxilVQFhiFn25z9gq-DKVx9qnks5quOiggaJpZM4KHaKv\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/3600
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3600/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3600/comments
|
https://api.github.com/repos/psf/requests/issues/3600/events
|
https://github.com/psf/requests/pull/3600
| 179,327,409 |
MDExOlB1bGxSZXF1ZXN0ODY4MTE2NjU=
| 3,600 |
Indicate shell command in HISTORY is code-like.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/416575?v=4",
"events_url": "https://api.github.com/users/frewsxcv/events{/privacy}",
"followers_url": "https://api.github.com/users/frewsxcv/followers",
"following_url": "https://api.github.com/users/frewsxcv/following{/other_user}",
"gists_url": "https://api.github.com/users/frewsxcv/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/frewsxcv",
"id": 416575,
"login": "frewsxcv",
"node_id": "MDQ6VXNlcjQxNjU3NQ==",
"organizations_url": "https://api.github.com/users/frewsxcv/orgs",
"received_events_url": "https://api.github.com/users/frewsxcv/received_events",
"repos_url": "https://api.github.com/users/frewsxcv/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/frewsxcv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/frewsxcv/subscriptions",
"type": "User",
"url": "https://api.github.com/users/frewsxcv",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-26T20:21:20Z
|
2021-09-08T02:10:30Z
|
2016-09-27T07:06:27Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3600/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3600/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3600.diff",
"html_url": "https://github.com/psf/requests/pull/3600",
"merged_at": "2016-09-27T07:06:27Z",
"patch_url": "https://github.com/psf/requests/pull/3600.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3600"
}
| true |
[
"Thanks @frewsxcv!\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3599
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3599/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3599/comments
|
https://api.github.com/repos/psf/requests/issues/3599/events
|
https://github.com/psf/requests/issues/3599
| 179,301,485 |
MDU6SXNzdWUxNzkzMDE0ODU=
| 3,599 |
Windows C runtime error R6034 when importing requests in embedded Python
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7239951?v=4",
"events_url": "https://api.github.com/users/alannorton/events{/privacy}",
"followers_url": "https://api.github.com/users/alannorton/followers",
"following_url": "https://api.github.com/users/alannorton/following{/other_user}",
"gists_url": "https://api.github.com/users/alannorton/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alannorton",
"id": 7239951,
"login": "alannorton",
"node_id": "MDQ6VXNlcjcyMzk5NTE=",
"organizations_url": "https://api.github.com/users/alannorton/orgs",
"received_events_url": "https://api.github.com/users/alannorton/received_events",
"repos_url": "https://api.github.com/users/alannorton/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alannorton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alannorton/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alannorton",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2016-09-26T18:22:37Z
|
2021-09-08T15:00:43Z
|
2016-09-26T18:31:12Z
|
NONE
|
resolved
|
When I import requests from Python embedded in C++, e.g. by calling
`PyRun_SimpleString("import requests")`,
I get the C runtime error R6034. This happens with Visual Studio 2010 or 2013, using the latest version of requests. Is there a workaround?
Please copy [email protected] on any response to this issue.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3599/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3599/timeline
| null |
completed
| null | null | false |
[
"Sorry, but this issue has nothing to do with requests. Requests is a pure-Python library, meaning that if normal Python code works, it should work. The error almost certainly comes from your embedding. Please investigate there first, and reopen this issue if you find strong evidence that suggests that Requests is at fault. \n",
"I would agree with you if all the modules that Requests imports are pure-Python. However, since we are only seeing this problem with Requests (no other imports exhibit this problem), there must be something that Requests is importing that uses a C runtime. Would you please tell me what imports are occurring in Requests?\n",
"Requests by default imports only the standard library. It optionally may attempt to import pyopenssl, which could have C bindings. \n",
"Cory,\nI hear what you are saying, and yet it is clear to me that there is a\nproblem with using Requests with embedded Python in Windows. I have a\ntrivial (10-line) program that reproduces the issue (attached). I also\nhave a visual studio project that uses this program, you can download it\nfrom: http://vis.ucar.edu/~alan/TEMP/importrequest.zip\nYou should be able to demonstrate the C runtime error quite easily.\nPlease check it out,\n-Alan\n\nOn Mon, Sep 26, 2016 at 2:44 PM, Cory Benfield [email protected]\nwrote:\n\n> Requests by default imports only the standard library. It optionally may\n> attempt to import pyopenssl, which could have C bindings.\n> \n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3599#issuecomment-249691856,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AG55D_C5-_voipHX232wItL9jGNi4Ljbks5quC7DgaJpZM4KG1ti\n> .\n",
"I am not disputing that there is a problem using Requests with embedded Python in Windows: you're trying to do it, and you hit a C runtime error. I am suggesting that the problem is not _Requests_.\n\nIn this case, what you're doing is extremely far outside the expertise of this development team. At best, I can tell you that when I run your script on my Unix development box, I encounter no problem. Of course, I have had to remember to pass the appropriate compiler configuration (obtained from running `python-config --cflags` and `python-config --ldflags`) to ensure that any shared libraries that were compiled against that Python are appropriately linkable. If I do that, then the import behaves as expected with no errors.\n\nThat means, as far as I can see, there are two problems: either you are not passing the correct compiler/linker flags, or this problem is Windows specific. Unfortunately, I do not have sufficient experience or access to Windows environments to easily investigate this issue. That means that I need to rely on you to confirm that everything is functioning correctly. For example, it would be worth trying to import a few other 3rd-party libraries to check whether it really is only Requests, or whether the problem is broader than that.\n",
"Cory,\nThanks for your serious examination of this problem.\nWe do know that this is a Windows-specific problem. The code that imports\nRequests is working fine on Mac and Linux. We have not performed an\nexhaustive search, but we have imported many (i.e. dozens of) different\nmodules and Requests is the only one that causes the C runtime error. If\nyou send me a list of all the modules that are imported by Requests, then I\ncould test each of them to see if one of them causes the runtime error.\n\nWe have verified that this problem occurs on different Windows platforms\nand different Visual Studio versions. If anyone on your team has Windows\nexpertise and can offer us any suggestions we would very much appreciate\nit. We are expecting to release VAPOR 2.6 (http://www.vapor.ucar.edu) in\nthe next month and one of our features depends on a working Requests\nmodule. About 1/3 of our users are Windows-based.\nRegards,\n-Alan\n\nOn Tue, Sep 27, 2016 at 3:49 AM, Cory Benfield [email protected]\nwrote:\n\n> I am not disputing that there is a problem using Requests with embedded\n> Python in Windows: you're trying to do it, and you hit a C runtime error. I\n> am suggesting that the problem is not _Requests_.\n> \n> In this case, what you're doing is extremely far outside the expertise of\n> this development team. At best, I can tell you that when I run your script\n> on my Unix development box, I encounter no problem. Of course, I have had\n> to remember to pass the appropriate compiler configuration (obtained from\n> running python-config --cflags and python-config --ldflags) to ensure\n> that any shared libraries that were compiled against that Python are\n> appropriately linkable. If I do that, then the import behaves as expected\n> with no errors.\n> \n> That means, as far as I can see, there are two problems: either you are\n> not passing the correct compiler/linker flags, or this problem is Windows\n> specific. Unfortunately, I do not have sufficient experience or access to\n> Windows environments to easily investigate this issue. That means that I\n> need to rely on you to confirm that everything is functioning correctly.\n> For example, it would be worth trying to import a few other 3rd-party\n> libraries to check whether it really is only Requests, or whether the\n> problem is broader than that.\n> \n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3599#issuecomment-249818599,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AG55Dz0hmkzeABlJ0W_WB7b9JSbInPkQks5quOa8gaJpZM4KG1ti\n> .\n",
"@alannorton Here are all the import statements in the requests codebase. I should note that, while I'm happy to help here, this is information that would have been just as easy for you to obtain as it was for me.\n\nFirstly, the stdlib imports:\n\n```\nargparse\nbase64\nbinascii\ncalendar\ncgi\ncodecs\ncollections\ncontextlib\ncopy\ndatetime\nemail.utils\nencodings.idna\nerrno\nfunctools\nhashlib\nhmac\nio\nitertools\nlogging\nmimetypes\noperator\nos\nos.path\nre\nselect\nsix\nsocket\nssl\nstruct\nsys\nthreading\ntime\ntypes\nuuid\nwarnings\nzlib\n```\n\nThen the optional 3rd party imports:\n\n```\nOpenSSL.SSL\nntlm\npyasn1.codec.der\npyasn1.type\n```\n",
"I have good news! The Windows C runtime error goes away when we use Python\n2.7.11 instead of 2.7.8. So no fix is needed.\n\nOn Tue, Sep 27, 2016 at 10:21 AM, Cory Benfield [email protected]\nwrote:\n\n> @alannorton https://github.com/alannorton Here are all the import\n> statements in the requests codebase. I should note that, while I'm happy to\n> help here, this is information that would have been just as easy for you to\n> obtain as it was for me.\n> \n> Firstly, the stdlib imports:\n> \n> argparse\n> base64\n> binascii\n> calendar\n> cgi\n> codecs\n> collections\n> contextlib\n> copy\n> datetime\n> email.utils\n> encodings.idna\n> errno\n> functools\n> hashlib\n> hmac\n> io\n> itertools\n> logging\n> mimetypes\n> operator\n> os\n> os.path\n> re\n> select\n> six\n> socket\n> ssl\n> struct\n> sys\n> threading\n> time\n> types\n> uuid\n> warnings\n> zlib\n> \n> Then the optional 3rd party imports:\n> \n> OpenSSL.SSL\n> ntlm\n> pyasn1.codec.der\n> pyasn1.type\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3599#issuecomment-249917076,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AG55D_b-uoRWNGNwbI8aGifDJ2FxrAsRks5quUKVgaJpZM4KG1ti\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/3598
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3598/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3598/comments
|
https://api.github.com/repos/psf/requests/issues/3598/events
|
https://github.com/psf/requests/pull/3598
| 179,142,667 |
MDExOlB1bGxSZXF1ZXN0ODY2ODM1NDU=
| 3,598 |
Issue #3597 - Avoid bytestring/str hodgepodge when resolving cookie from Request URL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5546456?v=4",
"events_url": "https://api.github.com/users/bbamsch/events{/privacy}",
"followers_url": "https://api.github.com/users/bbamsch/followers",
"following_url": "https://api.github.com/users/bbamsch/following{/other_user}",
"gists_url": "https://api.github.com/users/bbamsch/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bbamsch",
"id": 5546456,
"login": "bbamsch",
"node_id": "MDQ6VXNlcjU1NDY0NTY=",
"organizations_url": "https://api.github.com/users/bbamsch/orgs",
"received_events_url": "https://api.github.com/users/bbamsch/received_events",
"repos_url": "https://api.github.com/users/bbamsch/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bbamsch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbamsch/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bbamsch",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-09-26T05:31:21Z
|
2021-09-08T02:10:30Z
|
2016-09-28T07:56:38Z
|
CONTRIBUTOR
|
resolved
|
When resolving the request's full URL, the Host header is used to override the URL's host; in Python 3 this causes a mix of str and bytestring objects when the Host header is set to a bytestring. The patch checks that the ParseResult contains str components and the host is a non-str type before attempting to decode the bytestring to str.
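A hedged, simplified sketch of the check described above (Python 3 shown; not the exact diff):

```python
from urllib.parse import urlparse, urlunparse

def full_url_with_host_override(url, host):
    """Simplified sketch of the check described above, not the exact patch."""
    parsed = urlparse(url)
    # Only decode when urlparse returned str components but the Host header
    # value arrived as bytes, so everything handed to urlunparse matches.
    if not isinstance(host, str) and isinstance(parsed.scheme, str):
        host = host.decode('utf-8')
    return urlunparse([parsed.scheme, host, parsed.path, parsed.params,
                       parsed.query, parsed.fragment])
```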
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3598/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3598/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3598.diff",
"html_url": "https://github.com/psf/requests/pull/3598",
"merged_at": "2016-09-28T07:56:38Z",
"patch_url": "https://github.com/psf/requests/pull/3598.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3598"
}
| true |
[
"Patch should address Issue #3597\n",
"All appears to be in order 😃 \n\n```\n============================= test session starts =============================\nplatform win32 -- Python 3.5.2, pytest-2.8.7, py-1.4.31, pluggy-0.3.1\nrootdir: C:\\Users\\Brian\\PycharmProjects\\requests, inifile:\nplugins: httpbin-0.0.7, cov-2.2.1, mock-0.11.0\ncollected 381 items\n\ntests\\test_hooks.py ...\ntests\\test_lowlevel.py .........\ntests\\test_requests.py .............................................................................................................................................X.........X....................................................\ntests\\test_structures.py ....................\ntests\\test_testserver.py ...........\ntests\\test_utils.py ..s...................................................................................................................................\n\n============== 378 passed, 1 skipped, 2 xpassed in 55.53 seconds ==============\n```\n",
"Went through your comments and reworked the module of internal references to to_native_string. 👍 \n"
] |
https://api.github.com/repos/psf/requests/issues/3597
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3597/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3597/comments
|
https://api.github.com/repos/psf/requests/issues/3597/events
|
https://github.com/psf/requests/issues/3597
| 179,070,635 |
MDU6SXNzdWUxNzkwNzA2MzU=
| 3,597 |
TypeError in MockRequest when 'Host' header set to bytestring
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5546456?v=4",
"events_url": "https://api.github.com/users/bbamsch/events{/privacy}",
"followers_url": "https://api.github.com/users/bbamsch/followers",
"following_url": "https://api.github.com/users/bbamsch/following{/other_user}",
"gists_url": "https://api.github.com/users/bbamsch/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bbamsch",
"id": 5546456,
"login": "bbamsch",
"node_id": "MDQ6VXNlcjU1NDY0NTY=",
"organizations_url": "https://api.github.com/users/bbamsch/orgs",
"received_events_url": "https://api.github.com/users/bbamsch/received_events",
"repos_url": "https://api.github.com/users/bbamsch/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bbamsch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbamsch/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bbamsch",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 3 |
2016-09-25T06:04:14Z
|
2021-09-08T15:00:43Z
|
2016-09-29T07:09:41Z
|
CONTRIBUTOR
|
resolved
|
This issue is specific to Python 3.
An incorrect assumption is made in requests.cookies.MockRequest on Python 3, [see here](https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L52):
```
class MockRequest(object):
...
def get_full_url(self):
# Only return the response's URL if the user hadn't set the Host
# header
if not self._r.headers.get('Host'):
return self._r.url
# If they did set it, retrieve it and reconstruct the expected domain
host = self._r.headers['Host']
parsed = urlparse(self._r.url)
# Reconstruct the URL as we expect it
return urlunparse([
parsed.scheme, host, parsed.path, parsed.params, parsed.query,
parsed.fragment
])
...
```
In Python 2 this works fine, since str and unicode objects can be used interchangeably in urlunparse.
In Python 3 this is an issue because `host = self._r.headers['Host']` may return a bytestring while `parsed = urlparse(self._r.url)` returns a ParseResult containing str objects. Combining the two causes urlunparse to raise a TypeError.
Steps to reproduce:
```
>>> import requests
>>> a = requests.Request('GET', 'http://www.google.com', headers={'Host': b'www.google.com'})
>>> b = requests.cookies.MockRequest(a)
>>> b.get_full_url()
```
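As a user-side workaround (a sketch, assuming the header value is plain ASCII), the Host header can be passed as a native string so that urlunparse only ever sees str:

```python
import requests

req = requests.Request('GET', 'http://www.google.com',
                       headers={'Host': b'www.google.com'.decode('ascii')})
mock = requests.cookies.MockRequest(req)
print(mock.get_full_url())  # no TypeError: header and ParseResult are both str
```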
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3597/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3597/timeline
| null |
completed
| null | null | false |
[
"Yup, this looks wrong to me. It does look like we need to ensure that we have a match between the two types. In fact, we should probably ensure we're always returning the correct type for the platform.\n",
"@Lukasa \nI've attempted a patch of this issue in PR #3598 \nWould appreciate your input on the patch.\n\n:smile:\n",
"Will follow up with a fix for the dependency on httpbin.org. Please leave this open until then.\n"
] |
https://api.github.com/repos/psf/requests/issues/3596
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3596/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3596/comments
|
https://api.github.com/repos/psf/requests/issues/3596/events
|
https://github.com/psf/requests/issues/3596
| 179,047,905 |
MDU6SXNzdWUxNzkwNDc5MDU=
| 3,596 |
TooManyRedirects error in session.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2313132?v=4",
"events_url": "https://api.github.com/users/Hretic/events{/privacy}",
"followers_url": "https://api.github.com/users/Hretic/followers",
"following_url": "https://api.github.com/users/Hretic/following{/other_user}",
"gists_url": "https://api.github.com/users/Hretic/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Hretic",
"id": 2313132,
"login": "Hretic",
"node_id": "MDQ6VXNlcjIzMTMxMzI=",
"organizations_url": "https://api.github.com/users/Hretic/orgs",
"received_events_url": "https://api.github.com/users/Hretic/received_events",
"repos_url": "https://api.github.com/users/Hretic/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Hretic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hretic/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Hretic",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-24T19:37:49Z
|
2021-09-08T15:00:45Z
|
2016-09-24T22:57:58Z
|
NONE
|
resolved
|
I don't know if this is related to the problem, but I'm sending a request to my own app from localhost to the server (it's basically the same app on both sides: one on the server and one on localhost).
I'm using requests 2.11.1.
```
resp = requests.get('http://apineginpay.in/api/init')
return HttpResponse( resp.text )
```
> TooManyRedirects at /pointer/init
> Exceeded 30 redirects.
> Request Method: GET
> Request URL: http://localhost:8000/pointer/init
> Django Version: 1.9.6
> Exception Type: TooManyRedirects
> Exception Value:
> Exceeded 30 redirects.
> Exception Location: c:\Python344\lib\site-packages\requests\sessions.py in resolve_redirects, line 110
> Python Executable: c:\Python344\python.exe
> Python Version: 3.4.4
> Traceback Switch to copy-and-paste view
>
> c:\Python344\lib\site-packages\django\core\handlers\base.py in get_response
> response = self.process_exception_by_middleware(e, request) ...
> ▶ Local vars
> c:\Python344\lib\site-packages\django\core\handlers\base.py in get_response
> response = wrapped_callback(request, _callback_args, *_callback_kwargs) ...
> ▶ Local vars
> C:\wamp\www\djangoapp\pointer\views.py in init
> resp = requests.get('http://apineginpay.in/api/init') ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\api.py in get
> return request('get', url, params=params, *_kwargs) ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\api.py in request
> return session.request(method=method, url=url, *_kwargs) ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\sessions.py in request
> resp = self.send(prep, **send_kwargs) ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\sessions.py in send
> history = [resp for resp in gen] if allow_redirects else [] ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\sessions.py in <listcomp>
> history = [resp for resp in gen] if allow_redirects else [] ...
> ▶ Local vars
> c:\Python344\lib\site-packages\requests\sessions.py in resolve_redirects
> raise TooManyRedirects('Exceeded %s redirects.' % self.max_redirects, response=resp)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2313132?v=4",
"events_url": "https://api.github.com/users/Hretic/events{/privacy}",
"followers_url": "https://api.github.com/users/Hretic/followers",
"following_url": "https://api.github.com/users/Hretic/following{/other_user}",
"gists_url": "https://api.github.com/users/Hretic/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Hretic",
"id": 2313132,
"login": "Hretic",
"node_id": "MDQ6VXNlcjIzMTMxMzI=",
"organizations_url": "https://api.github.com/users/Hretic/orgs",
"received_events_url": "https://api.github.com/users/Hretic/received_events",
"repos_url": "https://api.github.com/users/Hretic/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Hretic/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Hretic/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Hretic",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3596/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3596/timeline
| null |
completed
| null | null | false |
[
"i solved this by setting user-agent in the headers (i wasn't sending any headers before )\n"
] |
https://api.github.com/repos/psf/requests/issues/3595
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3595/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3595/comments
|
https://api.github.com/repos/psf/requests/issues/3595/events
|
https://github.com/psf/requests/pull/3595
| 178,911,980 |
MDExOlB1bGxSZXF1ZXN0ODY1MzY4MDI=
| 3,595 |
Allow dicts for Session cookies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2016-09-23T15:51:56Z
|
2021-09-08T02:10:12Z
|
2016-11-10T13:50:06Z
|
MEMBER
|
resolved
|
Right now you can provide a dictionary to the `cookies` param almost everywhere in Requests and it will be converted into a RequestsCookieJar (aka "just work"). The one place this isn't happening is with Session, which will accept a dictionary without complaint but then fail when you try to send a PreparedRequest.
I discussed wanting to make this part of the API a bit more predictable with @Lukasa, and this was a suggested fix for the problem. I did a fair amount of testing outside of the included test and I don't believe this will adversely affect functionality. The one case I found is where someone provides a CookieJar-like object that doesn't inherit from `cookielib.CookieJar` and doesn't have an `__iter__` method. This seems like an unlikely case and wasn't _really_ supported before, so I wouldn't consider this breaking.
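For reference, a sketch of the explicit conversion users have to do today (relying on the existing `cookiejar_from_dict` helper; the cookie values are placeholders):

```python
import requests
from requests.cookies import cookiejar_from_dict

s = requests.Session()
# Explicitly convert the dict into a RequestsCookieJar before assigning it,
# which is roughly what this change would do implicitly for plain dicts.
s.cookies = cookiejar_from_dict({'token': 'abc123'})
```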
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3595/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3595/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3595.diff",
"html_url": "https://github.com/psf/requests/pull/3595",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3595.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3595"
}
| true |
[
"@sigmavirus24, if we're planning on enforcing cookies requiring domains with your work in #2714, then I'd agree this PR may not be needed. If we don't intend to deprecate the use of dict->cookies though, I do think this should work the same everywhere. The documentation is already a bit opaque on this, and my original suggestion to @Lukasa was to update it to reflect how updating cookies currently works with a Session. If we want to make a note of this change in the documentation, I think simply adding \"As of 2.12\" at the beginning should be sufficient.\n\nI can make the changes for `self.cookies` to `self._cookies` if there's a consensus to move forward with this. I'm not sure I'm clear on the answer to that at present though. I'll wait for a go ahead from you and @Lukasa before I make any further changes.\n",
"very brilliant~~\n",
"@sigmavirus24 @Lukasa, if we don't want to address this issue by allowing dictionaries, perhaps we should at least raise an exception when someone tries to assign one? I'm not particularly keen on this approach, but the current point of failure doesn't generate a helpful error.\n",
"@nateprewitt I think @sigmavirus24 was tentatively in favour of this change, at least as a temporary stopping-off point before something better. He was just asking for a change in the descriptor protocol.\n",
"So here's my genuine perspective:\n\nDictionaries cannot adequately describe everything about a particular cookie. In fact, I'd go so far as to say that setting cookies on a session (using dictionaries) is a security _risk_ if not full out _vulnerability_. When we specify cookies as a dictionary, there's no Host provided which means that with very few exceptions we will send those cookies to _every_ host that session talks to. Using a dictionary means you can't effectively say \"Only send this cookie to https://example.com\". In my opinion we should absolutely do something to warn folks, but raising an exception is backwards incompatible and this approach (while internally consistent) is a footgun more than anything.\n",
"@sigmavirus24 I agree with you about there being a risk of unintentionally sending cookies over the wire with session cookies, but I think that isn't entirely coupled to this issue. Instead of allowing people to pass a dictionary, we currently have them dump a dictionary into a RequestsCookieJar which has no domain info either. So this is an extra step for the sake of purity, that currently provides the user no extra safety or benefit.\n\nI can see how this modification may reenforce bad behaviour that should probably be removed in 3.0.0, so in the meantime, I'll propose this. I think adding a section in the documentation detailing _how_ to do this correctly would go a long way. We should also at the very least add a warning when setting Session's cookies attribute with a dict, directing them to the documentation.\n\nSomething like this could go in the docs with an explanation of why using the domain property is important. An iterable-based version of `set` or `set_cookie` would probably be a useful helper for 3.0.0 too.\n\n``` python\n\ns = Session()\ncookiejar = RequestsCookieJar()\ndomain = '.google.com'\nfor name, value in cookie_dict.items():\n cookiejar.set(name, value, domain=domain)\ns.cookies = cookiejar\n```\n",
"Like @nateprewitt said, there is benefits here we can not ignore.\nWhen the user want to use `add_dict_to_cookiejar` this will add the dict to `''`domain. This is most likely not the desired scenario. \n\nFor example:\n\n```\ns = requests.Session()\ns.get('https://google.com')\n\n# Generating cookie dict, based on the sessions cookies and other source \nnew_cookie_dict = generate(s.cookies)\nadd_dict_to_cookiejar(s.cookies, new_cookie_dict)\n\n# s.cookies might have duplicated cookies on different domains s.cookies['dup'] will error CookieConflictError \n```\n",
"Alright @Lukasa, @sigmavirus24,\n\nI took a swing at my [last comment](https://github.com/kennethreitz/requests/pull/3595#issuecomment-254563756). It comes in two pieces. \n\n3e4e5b7: This adds documentation for adding dictionaries with domains. It's currently based off of the solution below, but if it's decided that isn't a good fit, we can change the code to my comment above. This at least provides direction for the user to do the right thing, rather than identical behaviour to a dictionary, with more code.\n\ne2a4f9f & 4454849: I added in a `**kwargs` option to the `add_dict_to_cookiejar` and `cookiejar_from_dict` to allow cookie parameters. This will allow you to pass dictionary-wide cookie params to use the current proposed solution for session cookies \"safely\". This behaviour resembles what the browser does with set-cookie on a per request basis, allows for _fairly_ easy extensibility and it's a minimal code change. It definitely needs some updated doc strings but I wanted to get feed back before proceeding.\n\nLastly, I think that a warning, pointing users to the documentation, when setting the session cookie to a dict is low overhead and will prevent issues like #3624.\n\nThis is becoming a much longer running discussion than I anticipated after discussing opening this patch with Lukasa. It may be worth closing this PR and moving it into an issue, but I'll defer to your judgements on how you want this cataloged.\n",
"I can get behind moving this to an issue for discussion. It's clear @sigmavirus24 has a bigger goal for this API than I originally did, and so I'd like us to move to a forum where he can participate effectively despite his current high workload.\n",
"Alright, I'm going to close this for now then and open a likely broader cookie issue.\n"
] |
https://api.github.com/repos/psf/requests/issues/3594
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3594/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3594/comments
|
https://api.github.com/repos/psf/requests/issues/3594/events
|
https://github.com/psf/requests/pull/3594
| 178,108,842 |
MDExOlB1bGxSZXF1ZXN0ODYwMDk4MjI=
| 3,594 |
#3576 for Proposed/3.0.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-20T16:24:26Z
|
2021-09-08T02:10:15Z
|
2016-09-21T09:41:00Z
|
MEMBER
|
resolved
|
Merging change from #3576 into the proposed/3.0.0 branch as per @sigmavirus24's request.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3594/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3594/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3594.diff",
"html_url": "https://github.com/psf/requests/pull/3594",
"merged_at": "2016-09-21T09:41:00Z",
"patch_url": "https://github.com/psf/requests/pull/3594.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3594"
}
| true |
[
"I wish there was a plane landing emoji.\n"
] |
https://api.github.com/repos/psf/requests/issues/3593
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3593/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3593/comments
|
https://api.github.com/repos/psf/requests/issues/3593/events
|
https://github.com/psf/requests/issues/3593
| 178,101,984 |
MDU6SXNzdWUxNzgxMDE5ODQ=
| 3,593 |
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:590)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22324812?v=4",
"events_url": "https://api.github.com/users/shiranhonigFiverr/events{/privacy}",
"followers_url": "https://api.github.com/users/shiranhonigFiverr/followers",
"following_url": "https://api.github.com/users/shiranhonigFiverr/following{/other_user}",
"gists_url": "https://api.github.com/users/shiranhonigFiverr/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shiranhonigFiverr",
"id": 22324812,
"login": "shiranhonigFiverr",
"node_id": "MDQ6VXNlcjIyMzI0ODEy",
"organizations_url": "https://api.github.com/users/shiranhonigFiverr/orgs",
"received_events_url": "https://api.github.com/users/shiranhonigFiverr/received_events",
"repos_url": "https://api.github.com/users/shiranhonigFiverr/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shiranhonigFiverr/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shiranhonigFiverr/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shiranhonigFiverr",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2016-09-20T15:57:39Z
|
2021-09-08T15:00:46Z
|
2016-09-20T18:58:01Z
|
NONE
|
resolved
|
Hi,
I don't know why, but all of a sudden I get this error; it worked perfectly fine before.
I'm working with Python 2.7.10 on Windows.
**This is my code:**
```
def checkCreds():
response= requests.get(apiPath + 'hello world' , auth=(user, pasw), verify=False)
return response.status_code
```
**and I get this error:**
```
File "C:\Users\sh\GUI.py", line 343, in handleLogin
if checkCreds()==HTTP_OK:
File "C:\Users\sh\GUI.py", line 212, in checkCreds
response= requests.get(apiPath + 'hello world' , auth=(user, pasw), verify=False)
File "C:\Python27\lib\requests\api.py", line 69, in get
return request('get', url, params=params, **kwargs)
File "C:\Python27\lib\requests\api.py", line 50, in request
response = session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\requests\sessions.py", line 471, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\requests\sessions.py", line 579, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\requests\adapters.py", line 430, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:590)
```
Please help!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3593/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3593/timeline
| null |
completed
| null | null | false |
[
"This error is almost always the result of the server not liking your TLS configuration. Can you show me the output of `python -m pip freeze` and `python -c \"import ssl; print ssl.OPENSSL_VERSION\"` please?\n",
"`python -m pip freeze`\ncffi==1.8.3\nchatter==0.1.3\ncryptography==1.5\ncx-Freeze==4.3.4\ncycler==0.10.0\nelementtree==1.2.6.post2005\nenum34==1.1.6\net-xmlfile==1.0.1\nfreetype-py==1.0.2\nfuture==0.15.2\nidna==2.1\nipaddress==1.0.17\njdcal==1.2\nJinja2==2.8\nMarkupSafe==0.23\nmatplotlib==1.5.1\nMySQL-python==1.2.4b4\nndg-httpsclient==0.4.2\nnltk==3.2.1\nnumpy==1.10.1\nopenpyxl==2.4.0a1\npathlib==1.0.1\npefile==2016.3.28\npy2exe==0.6.9\npyasn1==0.1.9\npycparser==2.14\npyenchant==1.6.6\npyforms==0.1.3\npyinstaller==3.2\npymssql==2.1.1\npynsist==1.7\npyodbc==3.0.7\nPyOpenGL==3.1.0\npyOpenSSL==16.1.0\npyparsing==2.1.4\npypng==0.0.18\npython-dateutil==2.5.3\npytz==2016.4\npywin32==220\nrequests==2.11.1\nrequests-download==0.1.1\nsix==1.10.0\nurllib3==1.17\nvisvis==1.8\nwin-cli-launchers==0.1\nxlsxwriter==0.8.7\nyarg==0.1.9\n\n`python -c \"import ssl; print ssl.OPENSSL_VERSION\"`\nOpenSSL 1.0.2a 19 Mar 2015\n",
"Hrm, did you update Requests in any way?\n",
"I tried but it says it is already up to date (but I got the error above before trying to update)\n",
"No, I'm sorry, I wasn't clear. You said that \"it worked perfectly fine before\". What I'm trying to work out is, what did you to between the time it last worked and now, when it does not work.\n",
"Nothing actually. The last time I modified the code was 10 days ago and it worked fine back then.\n",
"Literally _nothing_ changed? You didn't change the installed versions of packages, your virtual environment, your OpenSSL version, your operating system, your network, _nothing_? \n",
"The only thing I can think of is that I installed pyinstaller and pynsist in order to make my files executable, but I don't think I changed versions of packages that worked fine (but definitely didn't change virtual environment, OpenSSL version, operating system, or network)\n",
"The thing I did try this morning is to add environment variable HTTPS_PROXY but this gave me this:\n`File \"C:\\Users\\sh\\GUI3.py\", line 216, in checkCreds\n response= requests.get(apiPath + 'hello world' , auth=(user, pasw), verify=False)\n File \"C:\\Python27\\lib\\requests\\api.py\", line 69, in get\n return request('get', url, params=params, **kwargs)\n File \"C:\\Python27\\lib\\requests\\api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"C:\\Python27\\lib\\requests\\sessions.py\", line 471, in request\n resp = self.send(prep, **send_kwargs)\n File \"C:\\Python27\\lib\\requests\\sessions.py\", line 579, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\Python27\\lib\\requests\\adapters.py\", line 430, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)\n`\n",
"HTTPS_PROXY is unlikely to be relevant. Do you have proxies configured?\n",
"yes\n",
"@Sh-Ho So if your system hasn't changed there are only two options. Either the proxy has changed or the server has. In either case, the failure mode is simply that the connection is getting closed in the face of the Client Hello TLS message. It's extremely hard to work out what that is without quite a lot of knowledge and some guess work. Ideally we'd have a packet capture from the previous handshake and one from the new handshake.\n\nHowever, given the way these kinds of things usually go, I'd start by investigating the proxy. Ideally you should consult with the operator of the proxy to see why it's shutting down your handshake.\n"
] |
https://api.github.com/repos/psf/requests/issues/3592
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3592/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3592/comments
|
https://api.github.com/repos/psf/requests/issues/3592/events
|
https://github.com/psf/requests/issues/3592
| 178,079,668 |
MDU6SXNzdWUxNzgwNzk2Njg=
| 3,592 |
single request causing connection-pool to repeat multiple times
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4296161?v=4",
"events_url": "https://api.github.com/users/jefftune/events{/privacy}",
"followers_url": "https://api.github.com/users/jefftune/followers",
"following_url": "https://api.github.com/users/jefftune/following{/other_user}",
"gists_url": "https://api.github.com/users/jefftune/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jefftune",
"id": 4296161,
"login": "jefftune",
"node_id": "MDQ6VXNlcjQyOTYxNjE=",
"organizations_url": "https://api.github.com/users/jefftune/orgs",
"received_events_url": "https://api.github.com/users/jefftune/received_events",
"repos_url": "https://api.github.com/users/jefftune/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jefftune/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jefftune/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jefftune",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-09-20T14:40:44Z
|
2021-09-03T00:10:47Z
|
2016-09-20T14:44:33Z
|
NONE
|
resolved
|
I posted the same issue on [stackoverflow](http://stackoverflow.com/questions/39594458/python-requests-single-request-causing-connection-pool-to-repeat-multiple-time)
**Question:** In the background, the **requests** module repeats "Starting new HTTPS connection" followed by an equal number of method calls, and these repeated background actions increase over the runtime of my program, up to **20 times**. This is concerning to me.
1. For each "Starting new HTTPS connection", is the **requests** module actually starting a new connection?
2. For each method call "PUT ...", is the **requests** module actually calling the endpoint?
3. If either or both of the above are true, what should I do to prevent the requests module from making repeated connections and calls?
I turned on **DEBUG logging** for the requests module.
Initially, for the first couple of minutes, the logs generated by **requests** look fine: a single "Starting new HTTPS connection" followed by a single method call, for example, "PUT":
```
2016-09-20 05:14:48 INFO "Starting new HTTPS connection (1): ****.**.**"
2016-09-20 05:14:49 DEBUG ""PUT /*/***/***/*** HTTP/1.1" 200 17"
```
However, for a single action request, the number of "Starting new HTTPS connection" entries, followed by the same number of actions, then starts to increase more and more, from 1 to 2 to 3 and up to 20 repeats.
This happens regardless of the endpoint or action method used.
These repeated log entries appear to be coming from: **requests.packages.urllib3.connectionpool**
Here is an example where the **requests** module repeats 3 times for each request:
```
2016-09-20 05:15:08 INFO "Starting new HTTPS connection (1): ****.**.**"
2016-09-20 05:15:08 INFO "Starting new HTTPS connection (1): ****.**.**"
2016-09-20 05:15:08 INFO "Starting new HTTPS connection (1): ****.**.**"
2016-09-20 05:15:08 DEBUG ""PUT /*/***/***/*** HTTP/1.1" 200 17"
2016-09-20 05:15:08 DEBUG ""PUT /*/***/***/*** HTTP/1.1" 200 17"
2016-09-20 05:15:08 DEBUG ""PUT /*/***/***/*** HTTP/1.1" 200 17"
```
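For what it's worth, a sketch of reusing a single `Session` so the urllib3 connection pool is shared across requests (URL and payload are placeholders):

```python
import requests

s = requests.Session()  # one Session -> one shared connection pool
for i in range(3):
    # Requests to the same host should reuse the pooled connection rather
    # than logging "Starting new HTTPS connection" each time.
    r = s.put('https://example.org/resource', data={'n': i})
    print(r.status_code)
```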
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3592/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3592/timeline
| null |
completed
| null | null | false |
[
"We cannot answer your question until you show us the code you've written producing these warnings. Also, since this is a _question_ it should be answered on StackOverflow (where you asked it originally).\n",
"@sigmavirus24 \n\nI am not sure why this was **closed**.\n\nI cannot provide the code because it contains proprietary credentials.\n\nSo, what would cause requests would make repeated \"Starting new HTTPS connection\" followed by repeated method calls?\n",
"@jefftune Multithreaded use of Requests is the most likely case. In the absence of any kind of code, we cannot effectively debug your situation: you are asking us to look at two logs and to discern everything you are doing with the code. We know nothing about your environment, the manner in which you're using the code, whether you're running with multiple sessions, or anything else.\n\nIf you could _strip_ the credentials from your code (e.g. replace them with `****`), then we can help. But otherwise, what you're asking us to do is to blindfold ourselves, tie both our hands behind our back, sit on a chair in a darkened room, and then work out what item you're holding in your hand based on the sound it makes when you drop it. We can _try_, but our odds are bad.\n",
"@Lukasa \n\nThanks for the feedback, I will look into **Multithreaded use of Requests**\n",
"@jefftune this was **closed** because:\n- it's not a defect\n- this project answers questions on StackOverflow (although maybe not as quickly as you would like)\n- we cannot help you without providing some sample\n\ntherefore there's nothing actionable to come out of this issue.\n",
"@sigmavirus24 \n\nUnderstood, thank you\n",
"Hi, I am having the same issue.\r\nIf I request for the url that starts with “https” using GET/POST method with logger module, then the logger prints repeated logs just like @jefftune’s case. (in the case of \"http\", it works really well). \r\nI can't access your StackOverflow link. \r\n@sigmavirus24 said that you got the answer on StackOverflow.\r\nCan you give me the link or where I can find the link?\r\n\r\nThank you"
] |
https://api.github.com/repos/psf/requests/issues/3591
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3591/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3591/comments
|
https://api.github.com/repos/psf/requests/issues/3591/events
|
https://github.com/psf/requests/pull/3591
| 177,847,331 |
MDExOlB1bGxSZXF1ZXN0ODU4MzYwOTk=
| 3,591 |
Fix RequestsCookieJar specific update call
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-19T17:33:08Z
|
2021-09-08T02:10:16Z
|
2016-09-23T09:32:52Z
|
MEMBER
|
resolved
|
This should solve the issue in #3579. `merge_cookies` performs the same `update` call inside a try/except block, which falls back to the same logic that [`update`](https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L347-L353) itself implements for non-`RequestsCookieJar`-like objects.
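A rough usage sketch of `merge_cookies` with a plain `http.cookiejar.CookieJar` (illustrative only):

```python
from http.cookiejar import CookieJar

from requests.cookies import RequestsCookieJar, merge_cookies

plain_jar = CookieJar()  # has no update() method of its own
merged = merge_cookies(RequestsCookieJar(), plain_jar)
print(type(merged))  # RequestsCookieJar, so later update() calls are safe
```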
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3591/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3591/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3591.diff",
"html_url": "https://github.com/psf/requests/pull/3591",
"merged_at": "2016-09-23T09:32:52Z",
"patch_url": "https://github.com/psf/requests/pull/3591.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3591"
}
| true |
[
"Ok, per my note above I'm going ahead and merging this. Thanks @nateprewitt!\n"
] |
https://api.github.com/repos/psf/requests/issues/3580
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3580/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3580/comments
|
https://api.github.com/repos/psf/requests/issues/3580/events
|
https://github.com/psf/requests/pull/3580
| 177,561,250 |
MDExOlB1bGxSZXF1ZXN0ODU2NjY2OTI=
| 3,580 |
Add workaround to avoid implicit import of encodings.idna. Fixes #3578.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/46059?v=4",
"events_url": "https://api.github.com/users/carsonyl/events{/privacy}",
"followers_url": "https://api.github.com/users/carsonyl/followers",
"following_url": "https://api.github.com/users/carsonyl/following{/other_user}",
"gists_url": "https://api.github.com/users/carsonyl/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/carsonyl",
"id": 46059,
"login": "carsonyl",
"node_id": "MDQ6VXNlcjQ2MDU5",
"organizations_url": "https://api.github.com/users/carsonyl/orgs",
"received_events_url": "https://api.github.com/users/carsonyl/received_events",
"repos_url": "https://api.github.com/users/carsonyl/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/carsonyl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/carsonyl/subscriptions",
"type": "User",
"url": "https://api.github.com/users/carsonyl",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-09-17T07:16:45Z
|
2021-09-08T02:10:30Z
|
2016-09-17T07:18:59Z
|
CONTRIBUTOR
|
resolved
|
This is the workaround needed to avoid "LookupError: unknown encoding: idna" when Requests is used in a threaded context and the Python standard library is inside a ZIP, as is the case with embedded distributions of Python.
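The workaround amounts to a no-op import performed at module import time (a sketch; the comment wording is mine, not the merged patch):

```python
# Pre-load the idna codec on import, so worker threads never trigger the lazy
# codec lookup and hit "LookupError: unknown encoding: idna".
import encodings.idna  # noqa: F401  (imported for its side effect only)
```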
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3580/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3580/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3580.diff",
"html_url": "https://github.com/psf/requests/pull/3580",
"merged_at": "2016-09-17T07:18:59Z",
"patch_url": "https://github.com/psf/requests/pull/3580.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3580"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3579
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3579/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3579/comments
|
https://api.github.com/repos/psf/requests/issues/3579/events
|
https://github.com/psf/requests/issues/3579
| 177,543,771 |
MDU6SXNzdWUxNzc1NDM3NzE=
| 3,579 |
Error when passing CookieJar to Session via PreparedRequest
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/172168?v=4",
"events_url": "https://api.github.com/users/agacek/events{/privacy}",
"followers_url": "https://api.github.com/users/agacek/followers",
"following_url": "https://api.github.com/users/agacek/following{/other_user}",
"gists_url": "https://api.github.com/users/agacek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/agacek",
"id": 172168,
"login": "agacek",
"node_id": "MDQ6VXNlcjE3MjE2OA==",
"organizations_url": "https://api.github.com/users/agacek/orgs",
"received_events_url": "https://api.github.com/users/agacek/received_events",
"repos_url": "https://api.github.com/users/agacek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/agacek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/agacek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/agacek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2016-09-17T00:10:42Z
|
2021-09-08T15:00:44Z
|
2016-09-25T13:35:37Z
|
NONE
|
resolved
|
I'm using requests 2.11.1. The following code throws an exception:
```
from requests import Request, Session
from http.cookiejar import CookieJar
cj = CookieJar()
req = Request('GET', 'http://google.com', cookies=cj).prepare()
resp = Session().send(req)
```
The exception is:
```
Traceback (most recent call last):
File "test.py", line 6, in <module>
resp = Session().send(req)
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 617, in send
history = [resp for resp in gen] if allow_redirects else []
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 617, in <listcomp>
history = [resp for resp in gen] if allow_redirects else []
File "C:\Anaconda3\lib\site-packages\requests\sessions.py", line 159, in resolve_redirects
prepared_request._cookies.update(self.cookies)
AttributeError: 'CookieJar' object has no attribute 'update'
```
I'm able to avoid this by using `Session.prepare_request` or by passing the cookies via `Session.cookies`.
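For completeness, a sketch of the two workarounds mentioned above:

```python
from http.cookiejar import CookieJar

from requests import Request, Session

cj = CookieJar()
s = Session()

# Workaround 1: let the Session prepare the request, which merges the plain
# CookieJar into a RequestsCookieJar for you.
prepared = s.prepare_request(Request('GET', 'http://google.com', cookies=cj))
resp = s.send(prepared)

# Workaround 2: attach the cookies to the Session instead of the request;
# Session.cookies is a RequestsCookieJar and knows how to absorb a CookieJar.
s.cookies.update(cj)
resp = s.get('http://google.com')
```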
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3579/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3579/timeline
| null |
completed
| null | null | false |
[
"Thanks for opening this issue, @agacek. So the problem we're encountering here is a difference in how we \"prepare\" the CookieJar between a `PreparedRequest` and a `Session`.\n\n`Session` will always [merge the cookies into a fresh `RequestsCookieJar`](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L375-L376) when `prepare_request()` is called. `RequestsCookieJar` will have the update method, and is a subclass of `cookielib.CookieJar` so everything works fine.\n\n`PreparedRequest` however doesn't perform this same merge, and won't modify the CookieJar if it's an instance of cookielib.CookieJar, so this one remains untouched. I'm not sure if this was an intentional feature of `PreparedRequest` to allow people to do what they wanted, or if it was an oversight. The code base is obviously expecting `cookies` to be `RequestsCookieJar`-like now though. If we're enforcing this in Session, we should probably also implement the same logic in `prepare_cookies` for the `PreparedRequest`. The user can always override the `cookies` param with their own CookieJar after if needed.\n\n@agacek as an interim fix, I would suggest you either use the [`RequestsCookieJar`](https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L172) class instead of `http.cookiejar.CookieJar`, or use the [`merge_cookies`](https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L521) function to merge your existing `CookieJar` into a new `RequestsCookieJar`. You can, of course, also continue using `Session.prepare_request()` too. Hope that helps!\n",
"Another quick thought, if we don't want to coerce a `CookieJar` into a `RequestsCookieJar`, we could also replace the [line](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L159) in `resolve_redirects` with `merge_cookies` to catch this error.\n\nIt seems odd to me though to subtly create a new object in `Session`, but not do this consistently in `PreparedRequest`. This isn't the first time this behaviour has been problematic (#3416).\n",
"We may need to tolerate having non-RequestsCookieJar cookiejars provided during the PreparedRequest flow. It may be that the best way to do that is to duplicate the `update` method into a helper in `cookies.py` which we can use.\n",
"I believe that's what [this portion](https://github.com/kennethreitz/requests/blob/master/requests/cookies.py#L533-L538) of `merge_cookies` is doing. Were you suggesting moving that logic out into another helper function to avoid the extra conditionals?\n",
"Hrm, I hadn't noticed that. `merge_cookies` is _probably_ the right choice here, yes.\n",
"This should be resolved with #3591.\n"
] |
https://api.github.com/repos/psf/requests/issues/3578
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3578/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3578/comments
|
https://api.github.com/repos/psf/requests/issues/3578/events
|
https://github.com/psf/requests/issues/3578
| 177,534,940 |
MDU6SXNzdWUxNzc1MzQ5NDA=
| 3,578 |
"LookupError: unknown encoding: idna" in worker threads, under embedded Python 3.5
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/46059?v=4",
"events_url": "https://api.github.com/users/carsonyl/events{/privacy}",
"followers_url": "https://api.github.com/users/carsonyl/followers",
"following_url": "https://api.github.com/users/carsonyl/following{/other_user}",
"gists_url": "https://api.github.com/users/carsonyl/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/carsonyl",
"id": 46059,
"login": "carsonyl",
"node_id": "MDQ6VXNlcjQ2MDU5",
"organizations_url": "https://api.github.com/users/carsonyl/orgs",
"received_events_url": "https://api.github.com/users/carsonyl/received_events",
"repos_url": "https://api.github.com/users/carsonyl/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/carsonyl/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/carsonyl/subscriptions",
"type": "User",
"url": "https://api.github.com/users/carsonyl",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 5 |
2016-09-16T22:43:02Z
|
2021-09-08T15:00:46Z
|
2016-09-17T07:19:01Z
|
CONTRIBUTOR
|
resolved
|
The following snippet raises an exception on Windows, using the [embedded distribution of Python 3.5.2 x64](https://www.python.org/downloads/release/python-352/):
``` python
from concurrent.futures import ThreadPoolExecutor
import requests
def task(_):
return requests.get("https://example.org").reason
executor = ThreadPoolExecutor(max_workers=2)
for x in executor.map(task, [(1,), (1,),]):
print(x)
```
```
OK
Traceback (most recent call last):
File "repro.py", line 7, in <module>
for x in executor.map(task, [(1,), (1,),]):
File "concurrent\futures\_base.py", line 556, in result_iterator
File "concurrent\futures\_base.py", line 398, in result
File "concurrent\futures\_base.py", line 357, in __get_result
File "concurrent\futures\thread.py", line 55, in run
File "repro.py", line 4, in task
return requests.get("https://example.org").reason
File "K:\PR\pybug\requests\api.py", line 70, in get
return request('get', url, params=params, **kwargs)
File "K:\PR\pybug\requests\api.py", line 56, in request
return session.request(method=method, url=url, **kwargs)
File "K:\PR\pybug\requests\sessions.py", line 461, in request
prep = self.prepare_request(req)
File "K:\PR\pybug\requests\sessions.py", line 394, in prepare_request
hooks=merge_hooks(request.hooks, self.hooks),
File "K:\PR\pybug\requests\models.py", line 294, in prepare
self.prepare_url(url, params)
File "K:\PR\pybug\requests\models.py", line 361, in prepare_url
host = host.encode('idna').decode('utf-8')
LookupError: unknown encoding: idna
```
This problem seems specific to the standard library being inside a zip file in the embedded distribution, as I couldn't reproduce this in regular distributions, or when Python35.zip is extracted out to Lib.
`import encodings.idna` succeeds. Doing this in the main thread prior to launching the workers will avoid this problem, as per http://stackoverflow.com/a/13057751. Though it looks like the `str.encode()` call is the culprit, changing my task function to be `'foo'.encode('idna')` doesn't trigger the exception.
I've tested this using Requests 2.10.0 and 2.11.1. This is also reproducible using `multiprocessing.pool.ThreadPool`.
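As confirmed in the comments below, importing the codec before (or inside) the worker code avoids the error; a minimal sketch of the in-task variant:

```python
import requests


def task(_):
    import encodings.idna  # ensure the idna codec is loaded before prepare_url
    return requests.get("https://example.org").reason
```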
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3578/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3578/timeline
| null |
completed
| null | null | false |
[
"@rbcarson Can you confirm for me: does adding `import encodings.idna` to your `task` function in your example also fix the problem?\n",
"@Lukasa Yes, it does.\n",
"Ok, so I think that means the solution is to have a no-op import. =)\n",
"If we put that import in the `models.py` file I think that fix will be sufficient. Let's put a comment on it though so we remember why we did it!\n",
"I've already started down this path in my own code. It seemed a bit shady to me, but if it's good enough for upstream... :smirk: \n\nDo you think this problem lies with the interpreter, and thus worth reporting to python.org?\n"
] |
https://api.github.com/repos/psf/requests/issues/3577
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3577/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3577/comments
|
https://api.github.com/repos/psf/requests/issues/3577/events
|
https://github.com/psf/requests/issues/3577
| 177,530,988 |
MDU6SXNzdWUxNzc1MzA5ODg=
| 3,577 |
response.iter_lines() did not print the last line of stream log.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7462257?v=4",
"events_url": "https://api.github.com/users/haocs/events{/privacy}",
"followers_url": "https://api.github.com/users/haocs/followers",
"following_url": "https://api.github.com/users/haocs/following{/other_user}",
"gists_url": "https://api.github.com/users/haocs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/haocs",
"id": 7462257,
"login": "haocs",
"node_id": "MDQ6VXNlcjc0NjIyNTc=",
"organizations_url": "https://api.github.com/users/haocs/orgs",
"received_events_url": "https://api.github.com/users/haocs/received_events",
"repos_url": "https://api.github.com/users/haocs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/haocs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haocs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/haocs",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2016-09-16T22:12:50Z
|
2021-09-08T15:00:42Z
|
2016-09-30T17:16:46Z
|
NONE
|
resolved
|
```
python: 3.4
requests: 2.9.1
```
I implemented the following function to fetch the streaming log from the server continuously.
``` py
def stream_trace(streaming_url, user_name, password):
import requests
r = requests.get(streaming_url, auth=(user_name, password), stream=True)
for line in r.iter_lines(decode_unicode=True):
if line:
print(line)
```
The above code fetches and prints the log successfully, however its behavior differs from what I expected.
Basically, it holds back the last line of the current content/chunk and prints it together with the next chunk of logs.
For example, let's say there are two chunks of logs from the server; the expected output is:
```
# first chunk
a
# second chunk
b
c
```
what the `stream_trace` function actually printed out ('a' was printed with the 2nd chunk and 'c' was missing):
```
# first chunk
# second chunk
a
b
```
Could you help me figure out what may have gone wrong? Thanks.
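As a side note, here is a minimal sketch of the `chunk_size=1` variant suggested in the discussion below, assuming the same streaming endpoint; it trades throughput for immediate line delivery:
``` python
import requests

def stream_trace_line_by_line(streaming_url, user_name, password):
    r = requests.get(streaming_url, auth=(user_name, password), stream=True)
    # chunk_size=1 makes iter_lines look for newlines as each byte arrives,
    # so the last line of a chunk is printed immediately (at a performance cost).
    for line in r.iter_lines(chunk_size=1, decode_unicode=True):
        if line:
            print(line)
```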
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7462257?v=4",
"events_url": "https://api.github.com/users/haocs/events{/privacy}",
"followers_url": "https://api.github.com/users/haocs/followers",
"following_url": "https://api.github.com/users/haocs/following{/other_user}",
"gists_url": "https://api.github.com/users/haocs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/haocs",
"id": 7462257,
"login": "haocs",
"node_id": "MDQ6VXNlcjc0NjIyNTc=",
"organizations_url": "https://api.github.com/users/haocs/orgs",
"received_events_url": "https://api.github.com/users/haocs/received_events",
"repos_url": "https://api.github.com/users/haocs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/haocs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/haocs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/haocs",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3577/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3577/timeline
| null |
completed
| null | null | false |
[
"So `iter_lines` has a somewhat unexpected implementation. Naïvely, we would expect that `iter_lines` would receive data as it arrives and look for newlines. In practice, this is not what it does. Instead it waits to read an entire `chunk_size`, and only then searches for newlines. This is a consequence of the underlying httplib implementation, which only allows for file-like reading semantics, rather then the early return semantics usually associated with a socket.\n\nYou can get the effect you want by setting the chunk size to `1`. However, this will drastically reduce performance. If you can tolerate late log delivery, then it is probably enough to leave the implementation as it is: when the connection is eventually closed, all of the lines should safely be delivered and no data will be lost. The only caveat here is that if the connection is closed uncleanly, then we will probably throw an exception rather then return the buffered data. \n",
"Thanks @Lukasa \nHowever, setting `chunk_size` to `1` or `None` did not change the results in my case. It seems that my issue is related to https://github.com/kennethreitz/requests/issues/2020 . Requests somehow handles chucked-encoding differently as curl does.\nThe following example shows different results GET from my log-server using curl and requests.\n\n``` bash\n# curl --raw -v 'https://some-url.com'\n< Transfer-Encoding: chunked\n< Content-Type: text/plain; charset=utf-8\n\n4f\n2016-09-20T09:56:36 Welcome, you are now connected to log-streaming service.\n\n37\nPORT is:\\\\\\\\.\\\\pipe\\\\xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx\n\n11\nSomething wrong\n\n12\nSomething normal\n\n```\n\n``` python\n# r.iter_content(hunk_size=None, decode_unicode=False)\n\nb'2016-09-20T10:12:09 Welcome, you are now connected to log-streaming service.'\nb'\\r\\nPORT is:\\\\\\\\.\\\\pipe\\\\xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx\\r'\nb'\\nSomething normal'\nb'\\r\\nSomething wrong\\r'\nb'\\n'\nb'PORT is:\\\\\\\\.\\\\pipe\\\\xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx\\r\\n'\nb'Something wrong\\r\\nSomething '\nb'normal\\r\\n'\nb'PORT is:\\\\\\\\.\\\\pipe\\\\xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx\\r\\nSomething '\nb'wrong\\r\\nSomething '\n```\n\nWe can see that `iter_content` get the correct data as well as CRLF but chunks them in a different way. Since `iter_lines` internally called `iter_content`, the line split differently accordingly. \n",
"I didn't realise you were getting chunked content. In that case, can you try the latest Requests with `iter_content(None)`? That should actually give you chunks.\n",
"I tried with v2.11 but saw the same issue. Does `iter_content` chunk the data based on the chuck_size provided by server?\n",
"@haocs Only when using `chunk_size=None`.\n",
"Even with `chunk_size=None`, the length of content generated from `iter_content` is different to `chunk_size` from server. \n\n```\n# r.iter_contents(chunk_size=None)\n[length of chunk] 0x4d\nb'2016-09-20T21:41:52 Welcome, you are now connected to log-streaming service.'\n[length of chunk] 0x38\nb'\\r\\nPORT is:\\\\\\\\.\\\\pipe\\\\xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\\r'\n\n######\n#curl --raw\n4f\n2016-09-20T22:00:30 Welcome, you are now connected to log-streaming service.\n\n37\nPORT is:\\\\.\\pipe\\xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx\n```\n\nThe above snippet shows two chunks that fetched by requests and curl from server. It seems that requests did not handle trailing CRLF(which is part of the chunk) properly. For example, chunk_size of the first chunk indicate the size is `4F` but `iter_content` only received `4D` length and add `\\r\\n` to the beginning of the next chunk.\n",
"I don't observe this problem on Python when using `https://mkcert.org/generate/`, where Requests generates exactly the same chunk boundaries as curl. Can you confirm for me please that server really is generating the exact same chunk boundaries in each case?\n\nCan you also confirm for me that you ran your test on v2.11?\n",
"yes, I tested against v2.11.1. Requests works fine with `https://mkcert.org/generate/`. \nOne difference I noticed is that chunks from my testing server contains a `\\r\\n` explicitly at the end of each line(and the length of `\\r\\n` has been included in chunk length). Versus the `mkcert.org` ones don't have. Will this cause any trouble for Requests to process chunks?\nBTW. my testing is running against Azure [kudu server](https://github.com/projectkudu/kudu). If necessary, I can provide a testing account as well as repro steps.\n",
"mkcert.org provides a `\\r\\n` at the end of each chunk too, because it's required to by RFC 7230 Section 4.1. That section says that a chunked body looks like this:\n\n```\n<chunk size in hex>\\r\\n\n<chunk data, whose total length is == chunk size>\\r\\n\n```\n\nNote that the `\\r\\n` at the end is excluded from the chunk size. Does your output end each chunk with _two_ `\\r\\n`, one counted in the body and one that isn't? Because it's supposed to. The raw body above seems to be overcounting its chunk sizes by counting the CRLF characters in the chunk size, when it should not.\n",
"I understand the end `\\r\\n` of each chunk should not be counted in chunk_size. But another `\\r\\n` should be, right? However, per my testing, requests ignored both `\\r\\n` if I understand correctly.\nI implemented another request function using urllib3 and it performed same as curl did. Which makes me believe that requests skipped `\\r\\n` when iterates contents.\nPlease see the following results from urllib3 and requests.\n\n``` python\n# python req_urllib3.py\n[len of chunk] 0x4f\nb'2016-09-23T19:27:27 Welcome, you are now connected to log-streaming service.\\r\\n'\n[len of chunk] 0x39\nb'2016-09-23T19:28:27 No new trace in the past 1 min(s).\\r\\n'\n\n\n# python req_requests.py\n[len of chunk] 0x4d\nb'2016-09-23T19:25:09 Welcome, you are now connected to log-streaming service.'\n[len of chunk] 0x39\nb'\\r\\n2016-09-23T19:26:09 No new trace in the past 1 min(s).'\n```\n",
"Requests uses urllib3 directly and performs no additional post processing in this case. `iter_content(None)` is identical to `stream(None)`.\n",
"What's the urllib3 version shipped with requests v2.11? \n",
"Ok, I could repro this \"issue\" with urllib3. If I use urllib3 and set `accept_encoding=True`, it will give me exactly what. Seems Requests by default set header `Accept-Encoding=Ture` if called by `requests.get()`\n",
"So do you see the problem go away if you set `headers={'Accept-Encoding': 'identity'}`?\n",
"Yes. After I set `headers={'Accept-Encoding': 'identity'}`, `iter_content(chunk_size=None, decode_unicode=False)` worked as expected.\n",
"Ok. This strongly suggests that the problem is the way that the server is handling gzipping a chunked body. It would be very interesting if possible to see the raw data stream.\n",
"Since I could observe same problem using curl or urllib3 with gzip enabled, obviously this not necessary to be an issue of requests. Thank you very much for the help, issue closed. \n"
] |
https://api.github.com/repos/psf/requests/issues/3576
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3576/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3576/comments
|
https://api.github.com/repos/psf/requests/issues/3576/events
|
https://github.com/psf/requests/pull/3576
| 177,466,281 |
MDExOlB1bGxSZXF1ZXN0ODU2MDE3MjQ=
| 3,576 |
Fix encoding issue in test_response_reason_unicode_fallback
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-09-16T16:23:45Z
|
2021-09-08T02:10:16Z
|
2016-09-20T14:48:40Z
|
MEMBER
|
resolved
|
`test_response_reason_unicode_fallback` is currently failing in Python 2. The original proposed fix in #3557 unfortunately didn't address the underlying issue.
While this patch should fix the test now, it may be worth taking a brief moment to discuss how we envision the original PR #3554 working. Currently an error retrieved from an `except` clause can't be cast as `str`, `unicode`, written to a file, or easily decoded in Python 2. It prints to the console fine, but that's about it. This seems like a semi-serious issue for making this exception usable. I need to run a few more tests this evening or tomorrow, so we can sit on this for the weekend unless someone has some obvious insight I'm missing.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3576/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3576/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3576.diff",
"html_url": "https://github.com/psf/requests/pull/3576",
"merged_at": "2016-09-20T14:48:40Z",
"patch_url": "https://github.com/psf/requests/pull/3576.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3576"
}
| true |
[
"Ok, I'm gonna call this good to go I guess. I feel like the behavior of the unicode string in an Exception is less than ideal but it's apparently one of the many long running unicode issues in Python 2.x.\n\nFor anyone reading this commit, trying to figure out how to do anything useful with the exception in Python 2, both of these should work:\n\n``` python\nunicode(e.message) # unicode\nunicode(e.message).decode('utf-8') # str\n```\n",
"@sigmavirus24 Would you like to take a look at this?\n",
"@Lukasa I would like to pull it and run the tests locally. I suspect we might have a locale mismatch which is why this _appears_ to work for @nateprewitt and not for me. This is why I was using bytes explicitly previously.\n",
"Hey @sigmavirus24, while this could be a locale mismatch, there are 5 other tests in the same file using unicode characters in unicode strings like this (including \\xf6). I _think_ I figured out why we were getting different results, and [detailed it](https://github.com/kennethreitz/requests/pull/3557#issuecomment-246908451), along with most of my other findings in #3557. I had @Lukasa confirm he could reproduce this issue before I opened the PR. Is this patch not working on your system?\n",
"Quoting myself:\n\n> I would like to pull it and run the tests locally. \n",
"> ...which is why **this** _appears_ to work for @nateprewitt and not for me.\n\nSorry @sigmavirus24, I interpreted \"this\" as meaning \"this patch\". I was just trying to clarify.\n",
"Ok, so this test as written passes for me, as do all the tests. My locale is:\n\n```\n(env) cory@heimdall:requests/ % locale\nLANG=\"en_GB.UTF-8\"\nLC_COLLATE=\"en_GB.UTF-8\"\nLC_CTYPE=\"en_GB.UTF-8\"\nLC_MESSAGES=\"en_GB.UTF-8\"\nLC_MONETARY=\"en_GB.UTF-8\"\nLC_NUMERIC=\"en_GB.UTF-8\"\nLC_TIME=\"en_GB.UTF-8\"\nLC_ALL=\n```\n",
"Thanks @nateprewitt. Care to update proposed/3.0.0 too then?\n",
"Sure thing, I'll move the changes over sometime today.\n"
] |
https://api.github.com/repos/psf/requests/issues/3575
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3575/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3575/comments
|
https://api.github.com/repos/psf/requests/issues/3575/events
|
https://github.com/psf/requests/pull/3575
| 177,221,245 |
MDExOlB1bGxSZXF1ZXN0ODU0MzQ4MTk=
| 3,575 |
added results of benchmark test about using session
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3514015?v=4",
"events_url": "https://api.github.com/users/vitaly-zdanevich/events{/privacy}",
"followers_url": "https://api.github.com/users/vitaly-zdanevich/followers",
"following_url": "https://api.github.com/users/vitaly-zdanevich/following{/other_user}",
"gists_url": "https://api.github.com/users/vitaly-zdanevich/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vitaly-zdanevich",
"id": 3514015,
"login": "vitaly-zdanevich",
"node_id": "MDQ6VXNlcjM1MTQwMTU=",
"organizations_url": "https://api.github.com/users/vitaly-zdanevich/orgs",
"received_events_url": "https://api.github.com/users/vitaly-zdanevich/received_events",
"repos_url": "https://api.github.com/users/vitaly-zdanevich/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vitaly-zdanevich/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vitaly-zdanevich/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vitaly-zdanevich",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-09-15T16:15:41Z
|
2021-09-08T01:21:40Z
|
2016-11-17T01:25:45Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3575/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3575/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3575.diff",
"html_url": "https://github.com/psf/requests/pull/3575",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3575.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3575"
}
| true |
[
"Thanks for this! I'm inclined to want to avoid having a specific benchmark, and instead to speak in general terms. Something general about approximate improvement would be better. Are you interested in that change?\n",
"When I first time read this documentation with words `significant performance` about session my thoughts was `eh maybe ten milliseconds`, I did not know that for every https call without session server must send certificate to client and get response and that this is seconds but not milliseconds, I was surprised when maked lazy test of it and get this numbers. Why are you against benchmarks? It is good illustration, I love benchmarks in any kind :)\n",
"I'm not against benchmarks at all: I'm against benchmarks in _narrative documentation_. I'd be happy to have the benchmark in a `.py` file in the docs tree and then have the narrative docs just say what you said above with a link to that benchmark. It just doesn't need to be inline on the page, that's all.\n",
"There's been no response to this pull request since feedback was provided. We can reopen this if interest in it is renewed at any point.\n"
] |
|
https://api.github.com/repos/psf/requests/issues/3574
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3574/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3574/comments
|
https://api.github.com/repos/psf/requests/issues/3574/events
|
https://github.com/psf/requests/pull/3574
| 177,076,107 |
MDExOlB1bGxSZXF1ZXN0ODUzMzg4MDc=
| 3,574 |
Streaming responses require encoding to be set
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1097666?v=4",
"events_url": "https://api.github.com/users/sentientcucumber/events{/privacy}",
"followers_url": "https://api.github.com/users/sentientcucumber/followers",
"following_url": "https://api.github.com/users/sentientcucumber/following{/other_user}",
"gists_url": "https://api.github.com/users/sentientcucumber/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sentientcucumber",
"id": 1097666,
"login": "sentientcucumber",
"node_id": "MDQ6VXNlcjEwOTc2NjY=",
"organizations_url": "https://api.github.com/users/sentientcucumber/orgs",
"received_events_url": "https://api.github.com/users/sentientcucumber/received_events",
"repos_url": "https://api.github.com/users/sentientcucumber/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sentientcucumber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sentientcucumber/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sentientcucumber",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-09-15T02:58:02Z
|
2021-09-08T02:10:31Z
|
2016-09-16T10:36:03Z
|
NONE
|
resolved
|
Resolves the issues called out in #3359 and #3481 when `Response.iter_content(decode_unicode=True)` would return bytes instead of unicode. This is a breaking change as it raises an exception if `Response.encoding` is not set before invoking this function.
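A minimal sketch of what callers would have to do after this change; the URL is a hypothetical placeholder:
``` python
import requests

r = requests.get('http://example.invalid/stream', stream=True)  # hypothetical endpoint
r.encoding = 'utf-8'  # must now be set explicitly before decoding
for chunk in r.iter_content(chunk_size=1024, decode_unicode=True):
    assert not isinstance(chunk, bytes)  # chunks come back as text, not bytes
```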
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 1,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3574/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3574/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3574.diff",
"html_url": "https://github.com/psf/requests/pull/3574",
"merged_at": "2016-09-16T10:36:03Z",
"patch_url": "https://github.com/psf/requests/pull/3574.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3574"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/3573
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3573/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3573/comments
|
https://api.github.com/repos/psf/requests/issues/3573/events
|
https://github.com/psf/requests/issues/3573
| 177,032,100 |
MDU6SXNzdWUxNzcwMzIxMDA=
| 3,573 |
convert all headers to bytes before invoking urllib3/httplib
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14330?v=4",
"events_url": "https://api.github.com/users/stefanfoulis/events{/privacy}",
"followers_url": "https://api.github.com/users/stefanfoulis/followers",
"following_url": "https://api.github.com/users/stefanfoulis/following{/other_user}",
"gists_url": "https://api.github.com/users/stefanfoulis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/stefanfoulis",
"id": 14330,
"login": "stefanfoulis",
"node_id": "MDQ6VXNlcjE0MzMw",
"organizations_url": "https://api.github.com/users/stefanfoulis/orgs",
"received_events_url": "https://api.github.com/users/stefanfoulis/received_events",
"repos_url": "https://api.github.com/users/stefanfoulis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/stefanfoulis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/stefanfoulis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/stefanfoulis",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-09-14T21:55:23Z
|
2021-09-08T15:00:47Z
|
2016-09-15T06:15:49Z
|
NONE
|
resolved
|
related to #3177 and shazow/urllib3#855
It's really easy to get this wrong. See sam-washington/requests-aws4auth#24 for an example.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3573/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3573/timeline
| null |
completed
| null | null | false |
[
"There is no requirement for us to pass urllib3 bytes. urllib3's interface has always allowed native strings on the platform, which includes unicode on Python 3. The specific error you're looking at is one that we have already fixed in urllib3, so a fix will be coming down the pipe in the next few months. =)\n",
"@Lukasa what is the change in urllib3 that will solve this issue?\n",
"It's shazow/urllib3#719\n"
] |
https://api.github.com/repos/psf/requests/issues/3572
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3572/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3572/comments
|
https://api.github.com/repos/psf/requests/issues/3572/events
|
https://github.com/psf/requests/pull/3572
| 177,019,238 |
MDExOlB1bGxSZXF1ZXN0ODUzMDM4Mzg=
| 3,572 |
Swallow SSL errors raised by eventlet
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/304831?v=4",
"events_url": "https://api.github.com/users/benkuhn/events{/privacy}",
"followers_url": "https://api.github.com/users/benkuhn/followers",
"following_url": "https://api.github.com/users/benkuhn/following{/other_user}",
"gists_url": "https://api.github.com/users/benkuhn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/benkuhn",
"id": 304831,
"login": "benkuhn",
"node_id": "MDQ6VXNlcjMwNDgzMQ==",
"organizations_url": "https://api.github.com/users/benkuhn/orgs",
"received_events_url": "https://api.github.com/users/benkuhn/received_events",
"repos_url": "https://api.github.com/users/benkuhn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/benkuhn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/benkuhn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/benkuhn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-09-14T20:58:57Z
|
2021-09-08T02:10:31Z
|
2016-09-15T06:34:28Z
|
NONE
|
resolved
|
Fixes #3104 (or the weird mutant version of it that happens if you use eventlet).
When eventlet monkey-patches ssl, the resulting `socket.read` calls throw an instance of the same class (`SSLError`) with a different message (`'timed out'` instead of `'read operation timed out'`). This causes the SSLError to not get swallowed by urllib3 or requests.
Please let me know if you'd prefer the issue to be fixed in another way, happy to adjust--this was just the clearest place to fix it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3572/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3572/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3572.diff",
"html_url": "https://github.com/psf/requests/pull/3572",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3572.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3572"
}
| true |
[
"Urllib3 is vendored with Requests, you'll want to submit your PR to [their repo](https://github.com/shazow/urllib3).\n",
"Yup, as @drpoggi said, Requests doesn't carry any patches to urllib3, we just use it as is. Please make this change to urllib3 and open a PR there. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3571
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3571/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3571/comments
|
https://api.github.com/repos/psf/requests/issues/3571/events
|
https://github.com/psf/requests/issues/3571
| 176,936,189 |
MDU6SXNzdWUxNzY5MzYxODk=
| 3,571 |
ValueError encountered
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/92340?v=4",
"events_url": "https://api.github.com/users/antlong/events{/privacy}",
"followers_url": "https://api.github.com/users/antlong/followers",
"following_url": "https://api.github.com/users/antlong/following{/other_user}",
"gists_url": "https://api.github.com/users/antlong/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/antlong",
"id": 92340,
"login": "antlong",
"node_id": "MDQ6VXNlcjkyMzQw",
"organizations_url": "https://api.github.com/users/antlong/orgs",
"received_events_url": "https://api.github.com/users/antlong/received_events",
"repos_url": "https://api.github.com/users/antlong/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/antlong/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/antlong/subscriptions",
"type": "User",
"url": "https://api.github.com/users/antlong",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-09-14T15:18:11Z
|
2021-09-08T15:00:47Z
|
2016-09-14T20:09:06Z
|
NONE
|
resolved
|
https://dpaste.de/t0kx
More info available if needed.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3571/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3571/timeline
| null |
completed
| null | null | false |
[
"@antlong Can you provide your code, please? Can you also tell us what version of Requests you're using, and where you got it from?\n",
"https://dpaste.de/zUtN\n\nRequests 1.2.3 from pip.\n\nNote: The urls in the file contain special characters.\n",
"So Requests 1.2.3 is _ancient_: I'm almost certain that we fixed this in the last two years. Please upgrade your Requests. =)\n",
"Ahhhh forgot i was on an older centos dev box running this. Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/3570
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3570/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3570/comments
|
https://api.github.com/repos/psf/requests/issues/3570/events
|
https://github.com/psf/requests/issues/3570
| 176,795,572 |
MDU6SXNzdWUxNzY3OTU1NzI=
| 3,570 |
socks5 does not work well
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4336119?v=4",
"events_url": "https://api.github.com/users/ljdawn/events{/privacy}",
"followers_url": "https://api.github.com/users/ljdawn/followers",
"following_url": "https://api.github.com/users/ljdawn/following{/other_user}",
"gists_url": "https://api.github.com/users/ljdawn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ljdawn",
"id": 4336119,
"login": "ljdawn",
"node_id": "MDQ6VXNlcjQzMzYxMTk=",
"organizations_url": "https://api.github.com/users/ljdawn/orgs",
"received_events_url": "https://api.github.com/users/ljdawn/received_events",
"repos_url": "https://api.github.com/users/ljdawn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ljdawn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ljdawn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ljdawn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-09-14T01:59:17Z
|
2021-09-08T15:00:48Z
|
2016-09-14T07:07:42Z
|
NONE
|
resolved
|
Hi. When I use proxy = {'http': 'socks5://IP:PORT', 'https': 'socks5://IP:PORT'}, I get "AssertionError: Not supported proxy scheme socks5". I have upgraded my requests (version 2.11.1) and my environment is macOS.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3570/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3570/timeline
| null |
completed
| null | null | false |
[
"```\nproxies = {'http': 'socks5://127.0.0.1:9050',\n 'https': 'socks5://127.0.0.1:9050'}\nrequests.get(url, proxies=proxies)\n```\n\nit works for me. I suggest you debug to find error.\n",
"@PegasusWang This suggests that your PySocks isn't working well. `pip install -U pysocks==1.5.6` should resolve your problem.\n"
] |
https://api.github.com/repos/psf/requests/issues/3569
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3569/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3569/comments
|
https://api.github.com/repos/psf/requests/issues/3569/events
|
https://github.com/psf/requests/issues/3569
| 176,329,994 |
MDU6SXNzdWUxNzYzMjk5OTQ=
| 3,569 |
[translation error] We should use "码" instead of "吗"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/20574212?v=4",
"events_url": "https://api.github.com/users/zhancongc/events{/privacy}",
"followers_url": "https://api.github.com/users/zhancongc/followers",
"following_url": "https://api.github.com/users/zhancongc/following{/other_user}",
"gists_url": "https://api.github.com/users/zhancongc/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhancongc",
"id": 20574212,
"login": "zhancongc",
"node_id": "MDQ6VXNlcjIwNTc0MjEy",
"organizations_url": "https://api.github.com/users/zhancongc/orgs",
"received_events_url": "https://api.github.com/users/zhancongc/received_events",
"repos_url": "https://api.github.com/users/zhancongc/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhancongc/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhancongc/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhancongc",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-12T09:21:17Z
|
2021-09-08T15:00:49Z
|
2016-09-12T09:26:24Z
|
NONE
|
resolved
|

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3569/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3569/timeline
| null |
completed
| null | null | false |
[
"This translation is hosted [here](https://github.com/requests/requests-docs-cn). Can you please open a PR against that repository?\n"
] |
https://api.github.com/repos/psf/requests/issues/3568
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3568/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3568/comments
|
https://api.github.com/repos/psf/requests/issues/3568/events
|
https://github.com/psf/requests/issues/3568
| 176,309,269 |
MDU6SXNzdWUxNzYzMDkyNjk=
| 3,568 |
unusual performance difference between http.client and python-requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2700942?v=4",
"events_url": "https://api.github.com/users/pawelmhm/events{/privacy}",
"followers_url": "https://api.github.com/users/pawelmhm/followers",
"following_url": "https://api.github.com/users/pawelmhm/following{/other_user}",
"gists_url": "https://api.github.com/users/pawelmhm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pawelmhm",
"id": 2700942,
"login": "pawelmhm",
"node_id": "MDQ6VXNlcjI3MDA5NDI=",
"organizations_url": "https://api.github.com/users/pawelmhm/orgs",
"received_events_url": "https://api.github.com/users/pawelmhm/received_events",
"repos_url": "https://api.github.com/users/pawelmhm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pawelmhm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pawelmhm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pawelmhm",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-12T07:20:21Z
|
2021-09-08T15:00:49Z
|
2016-09-12T07:41:01Z
|
CONTRIBUTOR
|
resolved
|
I originally asked this question [here](http://stackoverflow.com/questions/39435443/why-is-python-3-http-client-so-much-faster-than-python-requests/39438703#39438703), but maybe this is a good place to ask as well.
I noticed an unusual difference in performance between python-requests and http.client in Python 3.5.2. It puzzles me and I'm really curious what causes it. I have two code samples that do the same thing in Python, and one of them (http.client) is significantly faster than the other. Why is that?
http.client code:
``` python
import http.client
conn = http.client.HTTPConnection("localhost", port=8000)
for i in range(1000):
conn.request("GET", "/")
r1 = conn.getresponse()
body = r1.read()
print(r1.status)
conn.close()
```
python-requests
``` python
import requests
with requests.Session() as session:
for i in range(1000):
r = session.get("http://localhost:8000")
print(r.status_code)
```
Now, if I run both of them, python-requests is always significantly slower.
On my machine, http.client takes:
```
0.35user 0.10system 0:00.71elapsed 64%CPU
```
and python-requests:
```
1.76user 0.10system 0:02.17elapsed 85%CPU
```
I'm testing with python SimpleHTTPServer (python -m http.server).
A Stack Overflow user suggests that the performance difference is caused by python-requests not caching hostname lookups properly. I tried to verify this claim but have not found evidence to support it. When I run cProfile and look at the number of calls (ncalls) to socket.getaddrinfo, I see that both code samples make the same number of calls to getaddrinfo.
```
# running http.client
~/p/p/requests (master) python -m cProfile cc.py | grep getaddrinfo
1000 0.003 0.000 0.036 0.000 socket.py:715(getaddrinfo)
1000 0.021 0.000 0.026 0.000 {built-in method _socket.getaddrinfo}
# running requests
requests ~/p/p/requests (master) python -m cProfile r.py | grep getaddrinfo 09:18:00
1000 0.003 0.000 0.040 0.000 socket.py:715(getaddrinfo)
1000 0.026 0.000 0.030 0.000 {built-in method _socket.getaddrinfo}
```
For some reason, python-requests does spend more time in this function, but I'm not sure whether this explains requests' slowness.
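As a rough follow-up sketch (same local `python -m http.server` assumption), sorting the profile by cumulative time shows where the whole call stack spends its time rather than a single function:
``` python
import cProfile
import pstats

import requests

def run():
    with requests.Session() as session:
        for _ in range(100):
            session.get("http://localhost:8000")

cProfile.run("run()", "requests.prof")
# Print the 20 entries with the largest cumulative time.
pstats.Stats("requests.prof").sort_stats("cumulative").print_stats(20)
```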
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3568/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3568/timeline
| null |
completed
| null | null | false |
[
"> Stack Overflow user suggests that performance difference is caused by python-requests not caching hostname lookups properly. \n\nThat would affect httplib just as much as it affects us. Given that we use httplib for our low-level HTTP, I'd be startled it if was caching hostname lookups and we weren't.\n\nThe reason Requests is slower is because it does _substantially_ more than httplib. httplib can be thought of as the bottom layer of the stack: it does the low-level wrangling of sockets. Requests is two layers further up, and adds things like cookies, connection pooling, additional settings, and kinds of other fun things. This is _necessarily_ going to slow things down. We simply have to compute a lot more than httplib does.\n\nYou can see this by looking at cProfile results for Requests: there's just _way more_ result than there is for httplib. This is always to be expected with high-level libraries: they add more overhead because they have to do a lot more work.\n\nWhile we can look at targetted performance improvements, the sheer height of the call stack in all cases is going to hurt our performance markedly. That means that the complaint that \"requests is slower than httplib\" is always going to be true: it's like complaining that \"requests is slower than sending carefully crafted raw bytes down sockets.\" That's true, and it'll always be true: there's nothing we can do about that.\n"
] |
https://api.github.com/repos/psf/requests/issues/3567
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3567/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3567/comments
|
https://api.github.com/repos/psf/requests/issues/3567/events
|
https://github.com/psf/requests/issues/3567
| 176,252,319 |
MDU6SXNzdWUxNzYyNTIzMTk=
| 3,567 |
Start without network hang-up
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8550352?v=4",
"events_url": "https://api.github.com/users/lrodorigo/events{/privacy}",
"followers_url": "https://api.github.com/users/lrodorigo/followers",
"following_url": "https://api.github.com/users/lrodorigo/following{/other_user}",
"gists_url": "https://api.github.com/users/lrodorigo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lrodorigo",
"id": 8550352,
"login": "lrodorigo",
"node_id": "MDQ6VXNlcjg1NTAzNTI=",
"organizations_url": "https://api.github.com/users/lrodorigo/orgs",
"received_events_url": "https://api.github.com/users/lrodorigo/received_events",
"repos_url": "https://api.github.com/users/lrodorigo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lrodorigo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lrodorigo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lrodorigo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-09-11T16:38:17Z
|
2021-09-08T15:00:50Z
|
2016-09-11T17:50:23Z
|
NONE
|
resolved
|
Hi,
I am trying to handle and recover from intermittent network conditions while using requests.
I tried this very simple script:
https://gist.github.com/lrodorigo/30b00c20f5994aaf2911d7443b8f1f15
And i noticed as follows:
- if the script is started with _wifi-on_, it works as expected (Connection Alive and exception tracebacks are printed coherently with the network status)
- if the script is started with **wifi-off**, it continuously raises a **ConnectionError: HTTPConnectionPool(host='www.google.com', port=80): Max retries exceeded** exception, **even if I connect to the internet while it's running**.
I'm running on Arch Linux Kernel 4.5.4-1-ARCH, Python 2.7.11 x64, Requests v2.10.0
Has anyone experienced this kind of issue?
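For context, the recovery being attempted looks roughly like the sketch below (URL, timeout, and backoff values are illustrative, not taken from the linked gist):
``` python
import time

import requests

def get_with_retry(url, attempts=5, backoff=2.0):
    for attempt in range(attempts):
        try:
            return requests.get(url, timeout=5)
        except requests.exceptions.ConnectionError:
            # Network unreachable or DNS failure: wait and retry.
            time.sleep(backoff * (attempt + 1))
    raise RuntimeError("network still unavailable after %d attempts" % attempts)
```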
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3567/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3567/timeline
| null |
completed
| null | null | false |
[
"This seems like it may be related to #3223. I don't think there was any resolution with that issue though.\n",
"It's my strong belief that this is a behaviour outside of Requests manifesting. If you aren't using a `Session`, there is essentially no persistent state across requests. Can you try your script with urllib2 to confirm whether you experience the same issue?\n",
"I am not using Session.\nI experience same issue also with urllib2.\n\nI just work-rounded with a custom error exit-code and a shell script that\nmonitor it and restarts my program in a loop until, but it's really ugly.\n\n2016-09-11 19:15 GMT+02:00 Cory Benfield [email protected]:\n\n> It's my strong belief that this is a behaviour outside of Requests\n> manifesting. If you aren't using a Session, there is essentially no\n> persistent state across requests. Can you try your script with urllib2 to\n> confirm whether you experience the same issue?\n> \n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3567#issuecomment-246191585,\n> or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AIJ30Pwwu4PL2X0CjVW7uRVOMTfnUI-wks5qpDcigaJpZM4J6C7F\n> .\n\n## \n\n_Luigi R._\n",
"If you experience the same issue with urllib2 then there is very little we can do to help you. Given that I tried to reproduce this on OS X and was unable to, it seems to be a problem with the way the Linux networking stack is configured. You'll need to pursue debugging it with the Linux folks or your OS vendor. \n"
] |
https://api.github.com/repos/psf/requests/issues/3566
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3566/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3566/comments
|
https://api.github.com/repos/psf/requests/issues/3566/events
|
https://github.com/psf/requests/issues/3566
| 176,125,121 |
MDU6SXNzdWUxNzYxMjUxMjE=
| 3,566 |
behavior of encoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1165771?v=4",
"events_url": "https://api.github.com/users/fredzannarbor/events{/privacy}",
"followers_url": "https://api.github.com/users/fredzannarbor/followers",
"following_url": "https://api.github.com/users/fredzannarbor/following{/other_user}",
"gists_url": "https://api.github.com/users/fredzannarbor/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fredzannarbor",
"id": 1165771,
"login": "fredzannarbor",
"node_id": "MDQ6VXNlcjExNjU3NzE=",
"organizations_url": "https://api.github.com/users/fredzannarbor/orgs",
"received_events_url": "https://api.github.com/users/fredzannarbor/received_events",
"repos_url": "https://api.github.com/users/fredzannarbor/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fredzannarbor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fredzannarbor/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fredzannarbor",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-09T22:21:19Z
|
2021-09-08T15:00:51Z
|
2016-09-09T22:33:25Z
|
NONE
|
resolved
|
Hello,
I am having a terrible time figuring out what requests is doing here.
``` python
r = requests.get('http://0.0.0.0:5006/buy', params=payload)
print(r.url)  # debug
print(r.encoding)
print(r.text)
print(r.headers)
```
The debugging commands in the console produce:
```
payload from cli is {"key1": "foo", "key2": "bar"}
payload in function is {"key1": "foo", "key2": "bar"}
http://0.0.0.0:5006/buy?%7B%22key1%22:%20%22foo%22,%20%22key2%22:%20%22bar%22%7D
ISO-8859-1
```
The URL looks wrong to me; is it somehow being created incorrectly?
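For comparison, a minimal sketch of the behaviour when `params` is given a dict instead of a pre-serialized JSON string (the host, port, and path are kept from the example above):
``` python
import requests

payload = {"key1": "foo", "key2": "bar"}  # a dict, not a JSON-encoded string
r = requests.get('http://0.0.0.0:5006/buy', params=payload)
print(r.url)  # http://0.0.0.0:5006/buy?key1=foo&key2=bar (parameter order may vary)
```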
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3566/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3566/timeline
| null |
completed
| null | null | false |
[
"The URL is right; its urlencoding the string payload you gave it. You cannot just send arbitrary data in a URL: as it is structured, certain bytes have semantic meaning. If you don't want that meaning, you need to encode those bytes using what is called urlencoding. That's what we have done here. \n"
] |
https://api.github.com/repos/psf/requests/issues/3565
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3565/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3565/comments
|
https://api.github.com/repos/psf/requests/issues/3565/events
|
https://github.com/psf/requests/issues/3565
| 175,963,897 |
MDU6SXNzdWUxNzU5NjM4OTc=
| 3,565 |
Allow the customization of the default session creation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/838250?v=4",
"events_url": "https://api.github.com/users/cdman/events{/privacy}",
"followers_url": "https://api.github.com/users/cdman/followers",
"following_url": "https://api.github.com/users/cdman/following{/other_user}",
"gists_url": "https://api.github.com/users/cdman/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cdman",
"id": 838250,
"login": "cdman",
"node_id": "MDQ6VXNlcjgzODI1MA==",
"organizations_url": "https://api.github.com/users/cdman/orgs",
"received_events_url": "https://api.github.com/users/cdman/received_events",
"repos_url": "https://api.github.com/users/cdman/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cdman/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cdman/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cdman",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
},
{
"color": "fbca04",
"default": false,
"description": null,
"id": 615414998,
"name": "GAE Support",
"node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=",
"url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support"
}
] |
closed
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2024-05-19T18:29:04Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/34",
"id": 11073254,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels",
"node_id": "MI_kwDOABTKOs4AqPbm",
"number": 34,
"open_issues": 0,
"state": "open",
"title": "Bankruptcy",
"updated_at": "2024-05-20T14:37:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/34"
}
| 8 |
2016-09-09T09:15:31Z
|
2024-05-20T14:36:31Z
|
2024-05-20T14:36:31Z
|
NONE
| null |
I'm trying to use the `sift-python` library, which depends on requests, on Google App Engine (https://github.com/SiftScience/sift-python/issues/52). There is the option of using `requests-toolbelt` to monkeypatch `requests` to work on GAE; however, monkeypatching would not be needed if `requests` allowed the customization of the "session factory". I.e. in `api.request` have something like:
``` python
with session_factory() as session:
return session.request(method=method, url=url, **kwargs)
```
With `session_factory` having a default value of `sessions.Session`.
Then we could write code like:
``` python
from urllib3.contrib.appengine import AppEngineManager
requests.set_default_session_factory(AppEngineManager)
```
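The maintainers' suggested alternative (see the comments on this issue) is for the consuming library to accept a caller-supplied session instead of relying on a process-global factory. A minimal sketch, with a hypothetical `fetch_listing` helper standing in for the library code:
``` python
import requests

def fetch_listing(url, session=None):
    # Callers who don't care still get a pooled Session; callers on GAE
    # can pass in a session whose adapters wrap urllib3's AppEngineManager.
    session = session or requests.Session()
    response = session.get(url)
    response.raise_for_status()
    return response.json()
```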
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3565/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3565/timeline
| null |
completed
| null | null | false |
[
"I'm honestly fairly unconvinced of the value of this. It only works if people opt-in to it in their libraries, and that's functionally equivalent to having the library you're using let you pass in a Session or SessionFactory callable. \n",
"@Lukasa - it also works if library authors _don't opt-out_ by using the Session API. Ie. if they just write `request.get` (which I suspect most of them do) instead of setting up a session and calling methods on it, this \"default session factory\" would be used.\n\nAlso, perhaps some guidance in form of documentation - ie. \"How to use requests if you're a library author\" - could be useful. It could cover things like: allow the users of your library to customize how requests are made, to set up proxies, etc.\n",
"I have to agree with @Lukasa here.\n\n> it also works if library authors don't opt-out by using the Session API. Ie. if they just write request.get (which I suspect most of them do) instead of setting up a session and calling methods on it, this \"default session factory\" would be used.\n\nIf someone is building a library that is interacting with a service using requests and not using a Session they're performing a **severe** disservice to their users. Their users are paying the performance price for the library author not using a Session. Further, if the library is not providing a way for their users to customize the session (or provide their own ready-made one) then they're also hurting their users. Poor usage of requests is not requests responsibility to fix.\n\n> Also, perhaps some guidance in form of documentation - ie. \"How to use requests if you're a library author\" - could be useful. It could cover things like: allow the users of your library to customize how requests are made, to set up proxies, etc.\n\nPeople don't often like to be told how to do things, and certain authors will violently disagree with what I've said above. We already have people who send pull requests to change the code-style of requests from Kenneth's style to their own personal style. The last thing we need is another avenue for people to send pull requests that are based on their own opinions of how to do a thing.\n\nBlog posts are the better place for this kind of discussion, not Requests' documentation.\n",
"> Blog posts are the better place for this kind of discussion, not Requests' documentation.\r\n\r\nRespectfully, I disagree. \r\n\r\nThis paragraph belongs in the documentation, perhaps a page titled \"Recommendations for Package Maintainers who use Requests\":\r\n\r\n> If someone is building a library that is interacting with a service using requests and not using a Session they're performing a severe disservice to their users. Their users are paying the performance price for the library author not using a Session. Further, if the library is not providing a way for their users to customize the session (or provide their own ready-made one) then they're also hurting their users.\r\n\r\nSincerely, a library author who used `requests` the wrong way and should have used a Session but didn't, and wasted a lot more time on fixing their mistake and repackaging than I would have liked.",
"So I agree that *that* paragraph belongs in the documentation: I was talking about overriding the TLS config. 😉",
"That paragraph needs rephrasing to not be completely condescending though.",
"Agreed, condescension isn't a great look for us.",
"In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated."
] |
https://api.github.com/repos/psf/requests/issues/3564
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3564/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3564/comments
|
https://api.github.com/repos/psf/requests/issues/3564/events
|
https://github.com/psf/requests/issues/3564
| 175,892,780 |
MDU6SXNzdWUxNzU4OTI3ODA=
| 3,564 |
get request with a payload param key having a '-' dash in the word, does not work
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1267234?v=4",
"events_url": "https://api.github.com/users/mnikhil-git/events{/privacy}",
"followers_url": "https://api.github.com/users/mnikhil-git/followers",
"following_url": "https://api.github.com/users/mnikhil-git/following{/other_user}",
"gists_url": "https://api.github.com/users/mnikhil-git/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mnikhil-git",
"id": 1267234,
"login": "mnikhil-git",
"node_id": "MDQ6VXNlcjEyNjcyMzQ=",
"organizations_url": "https://api.github.com/users/mnikhil-git/orgs",
"received_events_url": "https://api.github.com/users/mnikhil-git/received_events",
"repos_url": "https://api.github.com/users/mnikhil-git/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mnikhil-git/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mnikhil-git/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mnikhil-git",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
}
] |
closed
| true | null |
[] | null | 1 |
2016-09-08T23:47:19Z
|
2021-09-08T08:00:29Z
|
2017-07-30T00:14:35Z
|
NONE
|
resolved
|
example:
url_payload = { 'username': user, 'severity': 'critical', 'job-type': 'spark', 'finished-time-begin': START_TIME, 'finished-time-end': FINISH_TIME}
With requests this results in a non-200 response code, while testing with curl, Postman, or a browser with the same parameters (including the ones containing '-') yields correct results.
Any clues?
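For what it's worth, a dash in a key is perfectly legal in the `params` dict and is passed through unchanged, so the key names alone should not cause a non-200 response. A minimal sketch against a hypothetical endpoint:
``` python
import requests

params = {'severity': 'critical', 'job-type': 'spark'}
r = requests.get('http://example.com/api/jobs', params=params)
print(r.url)  # -> http://example.com/api/jobs?severity=critical&job-type=spark
```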
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3564/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3564/timeline
| null |
completed
| null | null | false |
[
"We need a lot more details about the actual code before being able to be helpful.\n"
] |
https://api.github.com/repos/psf/requests/issues/3563
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3563/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3563/comments
|
https://api.github.com/repos/psf/requests/issues/3563/events
|
https://github.com/psf/requests/pull/3563
| 175,831,728 |
MDExOlB1bGxSZXF1ZXN0ODQ1MzE3OTQ=
| 3,563 |
use enforce_content_length in Requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 12 |
2016-09-08T18:45:28Z
|
2021-09-08T01:21:41Z
|
2016-11-16T13:29:51Z
|
MEMBER
|
resolved
|
This is a **DO NOT MERGE** until we bring urllib3 1.17 into the Requests `proposed/3.0.0` branch, which should likely happen before 3.0.0 gets out the door.
This is the final step for the work in shazow/urllib3#949 and addresses the issues originally raised in shazow/urllib3#723 and #2833.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/3563/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3563/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3563.diff",
"html_url": "https://github.com/psf/requests/pull/3563",
"merged_at": "2016-11-16T13:29:51Z",
"patch_url": "https://github.com/psf/requests/pull/3563.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3563"
}
| true |
[
"@Lukasa, here's a bit of a hiccup I'd appreciate some input on. Currently all data reads in Requests end up using `itercontent`. \n\nCurrently, regardless of whether or not the content is streamed, we end up initially running it through the `generate` function inside `itercontent`. This function assumes we're using Transfer-Encoding: chunked if it's called, and raises `ChunkedEncodingError` wrapping urllib3's `ProtocolError`. This is clearly incorrect because we're not chunked for responses with Content-Length but I'm not sure how best to catch this. \n\nWe could parse the exception string for 'IncompleteRead', but that seems clunky to me. We could also add a conditional inside to raise `ProtocolError` on `stream=False` and `ChunkedEncodingError` on `stream=True` but that may be subtle enough to confuse since it's the same base error. The last option I can think of is removing `ChunkedEncodingError` completely and just raising the unmodified `ProtocolError` instead. Thoughts?\n",
"...since when does `iter_content()` make any assumptions about the transfer-encoding of the response? Can you point to where in the code you believe this assumption is made?\n",
"Oh, nevermind, I see! There's no need for all that, we can just check the response headers for a transfer encoding. =)\n",
"This is [the block](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L678-L679) I'm hitting now that we're raising the appropriate `ProtocolError` in urllib3. Perhaps this is simply a wording issue on the [exception](https://github.com/kennethreitz/requests/blob/master/requests/exceptions.py#L88-L89) though. When I read \"chunked encoding\", I immediately go to Transfer-Encoding, but perhaps that's me misreading things.\n",
"Ok, great! To clarify, are you suggesting a conditional in the `except ProtocolError` checking the headers? Is raising the `ProtocolError` otherwise appropriate?\n\nEdit: The reason I ask is it would adversely affect `resolve_redirect` [here](https://github.com/kennethreitz/requests/blob/master/requests/sessions.py#L106-L107).\n",
"We must raise a Requests exception.\n",
"Alright, I'll run with that then. Thanks!\n",
"@Lukasa we'll need to merge master into Proposed/3.0.0 to run the tests normally, but with 1.19 in Requests 2.12 this should be ready for a look.\n",
"That merge is done. =)\n",
"Thanks @Lukasa! I'll merge it in now.\n",
"Branch updated and tests are running clean. I think this is ready for a review whenever you find a spare moment.\n",
"Ok, I think this looks good! Thanks so much @nateprewitt! :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/3562
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3562/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3562/comments
|
https://api.github.com/repos/psf/requests/issues/3562/events
|
https://github.com/psf/requests/issues/3562
| 175,797,019 |
MDU6SXNzdWUxNzU3OTcwMTk=
| 3,562 |
Same error here " bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22066790?v=4",
"events_url": "https://api.github.com/users/CHW2017/events{/privacy}",
"followers_url": "https://api.github.com/users/CHW2017/followers",
"following_url": "https://api.github.com/users/CHW2017/following{/other_user}",
"gists_url": "https://api.github.com/users/CHW2017/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CHW2017",
"id": 22066790,
"login": "CHW2017",
"node_id": "MDQ6VXNlcjIyMDY2Nzkw",
"organizations_url": "https://api.github.com/users/CHW2017/orgs",
"received_events_url": "https://api.github.com/users/CHW2017/received_events",
"repos_url": "https://api.github.com/users/CHW2017/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CHW2017/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CHW2017/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CHW2017",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-08T16:16:01Z
|
2021-09-08T15:00:52Z
|
2016-09-08T16:46:54Z
|
NONE
|
resolved
|
import urllib3
import urllib3.contrib.pyopenssl
import certifi
urllib3.contrib.pyopenssl.inject_into_urllib3()
http = urllib3.PoolManager(
    cert_reqs='CERT_REQUIRED',  # Force certificate check.
    ca_certs=certifi.where(),   # Path to the Certifi bundle.
)
# You're ready to make verified HTTPS requests.
try:
    r = http.request('GET', 'https://devapi.trustmile.com/static/index.html')
    print r.status
except urllib3.exceptions.SSLError as e:
    print e
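This snippet exercises urllib3 directly (and the report was redirected there), but for anyone arriving from requests, the equivalent verified request looks roughly like this; requests checks certificates against certifi's bundle by default:
``` python
import requests

try:
    r = requests.get('https://devapi.trustmile.com/static/index.html', timeout=10)
    print(r.status_code)
except requests.exceptions.SSLError as e:
    print(e)
```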
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3562/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3562/timeline
| null |
completed
| null | null | false |
[
"This is a bug report for urllib3, but you have issued it against Requests. =) Please open this bug report on the urllib3 repository.\n"
] |
https://api.github.com/repos/psf/requests/issues/3561
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3561/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3561/comments
|
https://api.github.com/repos/psf/requests/issues/3561/events
|
https://github.com/psf/requests/pull/3561
| 175,639,898 |
MDExOlB1bGxSZXF1ZXN0ODQzOTc4NDg=
| 3,561 |
Allow use of 'no_proxy' in the proxies argument
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/578155?v=4",
"events_url": "https://api.github.com/users/JohnVillalovos/events{/privacy}",
"followers_url": "https://api.github.com/users/JohnVillalovos/followers",
"following_url": "https://api.github.com/users/JohnVillalovos/following{/other_user}",
"gists_url": "https://api.github.com/users/JohnVillalovos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JohnVillalovos",
"id": 578155,
"login": "JohnVillalovos",
"node_id": "MDQ6VXNlcjU3ODE1NQ==",
"organizations_url": "https://api.github.com/users/JohnVillalovos/orgs",
"received_events_url": "https://api.github.com/users/JohnVillalovos/received_events",
"repos_url": "https://api.github.com/users/JohnVillalovos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JohnVillalovos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JohnVillalovos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JohnVillalovos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2016-09-08T00:47:43Z
|
2021-09-07T00:06:35Z
|
2017-02-10T17:21:29Z
|
CONTRIBUTOR
|
resolved
|
Add the ability to add 'no_proxy' and a value to the 'proxies'
dictionary argument.
https://github.com/kennethreitz/requests/issues/2817
Closes gh-2817
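A sketch of the usage this pull request proposes; the proxy host and no_proxy entries below are placeholders, and the `no_proxy` key follows the proposal in gh-2817 rather than a long-standing documented API:
``` python
import requests

proxies = {
    'http': 'http://proxy.internal:3128',
    'https': 'http://proxy.internal:3128',
    # Hosts matching these patterns would be contacted directly, not via the proxy.
    'no_proxy': 'localhost,127.0.0.1,.internal.example.com',
}
requests.get('http://localhost:8080/health', proxies=proxies)
```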
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3561/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3561/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3561.diff",
"html_url": "https://github.com/psf/requests/pull/3561",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3561.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3561"
}
| true |
[
"Alright, good start @JohnVillalovos! I've left some feedback in the diff that would be good to have addressed, thanks. =D\n",
"Closing due to inactivity (not sure if that's our fault or not, sorry!). Please re-submit if you're still interested in contributing this code :)",
"As far as I know the submitted patch is still valid, though I haven't dug into it.\r\n\r\nI thought I had responded to all comments and latest version took them into account.",
"@JohnVillalovos that's true but we cannot reopen this since your repository seems to have been deleted.",
"Okay. Looks like I have some merge conflicts. I will create another pull request after I fix it.",
"Thanks @JohnVillalovos ",
"Created new pull request: https://github.com/kennethreitz/requests/pull/3865"
] |
https://api.github.com/repos/psf/requests/issues/3560
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3560/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3560/comments
|
https://api.github.com/repos/psf/requests/issues/3560/events
|
https://github.com/psf/requests/issues/3560
| 175,624,505 |
MDU6SXNzdWUxNzU2MjQ1MDU=
| 3,560 |
TypeError: unsupported operand type(s) for -=: 'Retry' and 'int'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17013620?v=4",
"events_url": "https://api.github.com/users/ethanaward/events{/privacy}",
"followers_url": "https://api.github.com/users/ethanaward/followers",
"following_url": "https://api.github.com/users/ethanaward/following{/other_user}",
"gists_url": "https://api.github.com/users/ethanaward/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ethanaward",
"id": 17013620,
"login": "ethanaward",
"node_id": "MDQ6VXNlcjE3MDEzNjIw",
"organizations_url": "https://api.github.com/users/ethanaward/orgs",
"received_events_url": "https://api.github.com/users/ethanaward/received_events",
"repos_url": "https://api.github.com/users/ethanaward/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ethanaward/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ethanaward/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ethanaward",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2016-09-07T22:44:04Z
|
2021-09-08T07:00:29Z
|
2016-09-08T09:23:39Z
|
NONE
|
resolved
|
### Description:
I've been trying to install pip into a virtualenv as part of building a Linux package. When installing pip, it always fails with the same error in requests. I've seen the same error in shazow/urllib3#567, but the fix there doesn't seem to have worked for me.
### What I've run:
```
dh build-arch -S --buildsystem=dh_virtualenv --setuptools --python=python2.7 --no-test
dh_testdir -a -O-S -O--buildsystem=dh_virtualenv -O--setuptools -O--python=python2.7 -O--no-test
dh_update_autotools_config -a -O-S -O--buildsystem=dh_virtualenv -O--setuptools -O--python=python2.7 -O--no-test
dh_auto_configure -a -O-S -O--buildsystem=dh_virtualenv -O--setuptools -O--python=python2.7 -O--no-test
mkdir -p build/usr/lib/mycroft-core
dh_auto_build -a -O-S -O--buildsystem=dh_virtualenv -O--setuptools -O--python=python2.7 -O--no-test
cd build
virtualenv --always-copy --clear /«PKGBUILDDIR»/build/usr/lib/mycroft-core
Not deleting /«PKGBUILDDIR»/build/usr/lib/mycroft-core/bin
New python executable in /«PKGBUILDDIR»/build/usr/lib/mycroft-core/bin/python2
Also creating executable in /«PKGBUILDDIR»/build/usr/lib/mycroft-core/bin/python
Installing setuptools, pkg_resources, pip, wheel...
Complete output from command /build/mycroft-core-...oft-core/bin/python2 - setuptools pkg_resources pip wheel:
Collecting setuptools
Exception:
Traceback (most recent call last):
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/basecommand.py", line 209, in main
status = self.run(options, args)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/commands/install.py", line 328, in run
wb.build(autobuilding=True)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/wheel.py", line 748, in build
self.requirement_set.prepare_files(self.finder)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/req/req_set.py", line 360, in prepare_files
ignore_dependencies=self.ignore_dependencies))
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/req/req_set.py", line 512, in _prepare_file
finder, self.upgrade, require_hashes)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/req/req_install.py", line 273, in populate_link
self.link = finder.find_requirement(self, upgrade)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/index.py", line 442, in find_requirement
all_candidates = self.find_all_candidates(req.name)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/index.py", line 400, in find_all_candidates
for page in self._get_pages(url_locations, project_name):
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/index.py", line 545, in _get_pages
page = self._get_page(location)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/index.py", line 648, in _get_page
return HTMLPage.get_page(link, session=self.session)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/index.py", line 757, in get_page
"Cache-Control": "max-age=600",
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 480, in get
return self.request('GET', url, **kwargs)
File "/usr/share/python-wheels/pip-8.1.1-py2.py3-none-any.whl/pip/download.py", line 378, in request
return super(PipSession, self).request(method, url, *args, **kwargs)
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 468, in request
resp = self.send(prep, **send_kwargs)
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/sessions.py", line 576, in send
r = adapter.send(request, **kwargs)
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/CacheControl-0.11.5-py2.py3-none-any.whl/cachecontrol/adapter.py", line 46, in send
resp = super(CacheControlAdapter, self).send(request, **kw)
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/adapters.py", line 376, in send
Traceback (most recent call last):
timeout=timeout
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py", line 610, in urlopen
File "/usr/lib/python3/dist-packages/virtualenv.py", line 2363, in <module>
_stacktrace=sys.exc_info()[2])
File "/«PKGBUILDDIR»/build/usr/lib/mycroft-core/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/util/retry.py", line 228, in increment
total -= 1
TypeError: unsupported operand type(s) for -=: 'Retry' and 'int'
----------------------------------------
...Installing setuptools, pkg_resources, pip, wheel...done.
main()
File "/usr/lib/python3/dist-packages/virtualenv.py", line 719, in main
symlink=options.symlink)
File "/usr/lib/python3/dist-packages/virtualenv.py", line 988, in create_environment
download=download,
File "/usr/lib/python3/dist-packages/virtualenv.py", line 918, in install_wheel
call_subprocess(cmd, show_stdout=False, extra_env=env, stdin=SCRIPT)
File "/usr/lib/python3/dist-packages/virtualenv.py", line 812, in call_subprocess
% (cmd_desc, proc.returncode))
OSError: Command /build/mycroft-core-...oft-core/bin/python2 - setuptools pkg_resources pip wheel failed with error code 2
Running virtualenv with interpreter /usr/bin/python2
dh_auto_build: virtualenv --always-copy --clear /«PKGBUILDDIR»/build/usr/lib/mycroft-core returned exit code 1
cd /«PKGBUILDDIR»
debian/rules:10: recipe for target 'build-arch' failed
make: *** [build-arch] Error 25
dpkg-buildpackage: error: debian/rules build-arch gave error exit status 2
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3560/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3560/timeline
| null |
completed
| null | null | false |
[
"In this instance, this is almost certainly Debian/Ubuntu's problem and not ours: you appear to be using Debian/Ubuntu's packages. That means you need to follow up with Debian/Ubuntu in the first case, as they maintain several patches and are responsible for supporting those packages in their system. If they find that the problem can be reproduced using the source that we distributed to them, then we'll accept the bug report here.\n\nSorry we can't be more helpful!\n",
"@Lukasa could you give any pointers to debug this? I'm getting the same error when I try to use pip. I'm trying to understand what's going on, but I don't know where to start",
"@elopio The first question is to work out what OS you're on and where your packages have come from. How did you get pip? How did you get Requests? Where are they coming from in your OS?",
"I'm using ubuntu core 16 on beaglebone black. I installed pip from the 16.04 archive.\r\nI added a debug print and found that my board can't ping pypi.python.org. There's something weird with the dns, so I added the ip to /etc/hosts, and the error no longer appears.",
"So that has *masked* the underlying problem. You say you got pip from 16.04: did you install Requests from anywhere else?",
"I tried with the one installed by pip with sudo, the one from the ubuntu archive, and the one installed in a virtualenv. py2 and py3. They all failed the same way:\r\n\r\n```\r\n File \"/home/ubuntu/.ENV/share/python-wheels/requests-2.9.1-py2.py3-none-any.whl/requests/adapters.py\", line 376, in send\r\n timeout=timeout\r\n File \"/home/ubuntu/.ENV/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/connectionpool.py\", line 610, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"/home/ubuntu/.ENV/share/python-wheels/urllib3-1.13.1-py2.py3-none-any.whl/urllib3/util/retry.py\", line 228, in increment\r\n total -= 1\r\nTypeError: unsupported operand type(s) for -=: 'Retry' and 'int'\r\n\r\n```",
"Once you have installed via sudo with pip, all bets are off. Ubuntu's built in packages make substantial modifications to both pip and Requests, which includes removing stuff. If you *ever* mix the Ubuntu packages with the pip ones, a mess will ensue and it basically permanently damages them unless you can reinstall from Ubuntu's sources. \r\n\r\nYou have to start with a virtualenv if you want pip, otherwise restrict yourself to apt-get.",
"@Lukasa can you try to explain this to me? I am trying to install a virtualenv, and it the virtualenv is trying to install pip, and I get this error. If I install pip first, I still get this error. Should I install pip first, or virtualenv first?",
"@cfossace It depends how you're installing them. On Linux, you should do `apt-get install python-virtualenv` (or the `yum` equivalent), and then use `virtualenv`. Don't install pip or virtualenv any other way. At that point, when you create virtualenvs you will be able to have isolated Python environments that don't affect your system.",
"Thank you! I didn't realize python-virtualenv was a package.\n\nOn Fri, Aug 25, 2017 at 3:28 AM, Cory Benfield <[email protected]>\nwrote:\n\n> @cfossace\n> <https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fgithub.com%2Fcfossace&data=02%7C01%7Ccfossace%40villanova.edu%7C0bfb1a2ead2e44a9108108d4eb8ae7be%7C765a8de5cf9444f09cafae5bf8cfa366%7C0%7C0%7C636392429213691672&sdata=8NFbq8D6Bdu%2BnTocxi%2BbAJaczOjqqQ8taAJI3t5grQ0%3D&reserved=0>\n> It depends how you're installing them. On Linux, you should do apt-get\n> install python-virtualenv (or the yum equivalent), and then use virtualenv.\n> Don't install pip or virtualenv any other way. At that point, when you\n> create virtualenvs you will be able to have isolated Python environments\n> that don't affect your system.\n>\n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> <https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fgithub.com%2Frequests%2Frequests%2Fissues%2F3560%23issuecomment-324844916&data=02%7C01%7Ccfossace%40villanova.edu%7C0bfb1a2ead2e44a9108108d4eb8ae7be%7C765a8de5cf9444f09cafae5bf8cfa366%7C0%7C0%7C636392429213691672&sdata=uO7pfPYNAEXkKHZobD7IErbHrRSZwG5ZufkVwbhjdK0%3D&reserved=0>,\n> or mute the thread\n> <https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Fgithub.com%2Fnotifications%2Funsubscribe-auth%2FAIOMc9pD6XN47SKmDBX8HEGoJ7kBcCb3ks5sbnemgaJpZM4J3bzG&data=02%7C01%7Ccfossace%40villanova.edu%7C0bfb1a2ead2e44a9108108d4eb8ae7be%7C765a8de5cf9444f09cafae5bf8cfa366%7C0%7C0%7C636392429213691672&sdata=0wYRDY73g2VcL7lRMaKxwTwcF0OhY0ATCCXIb%2FLvsU8%3D&reserved=0>\n> .\n>\n"
] |
https://api.github.com/repos/psf/requests/issues/3559
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3559/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3559/comments
|
https://api.github.com/repos/psf/requests/issues/3559/events
|
https://github.com/psf/requests/issues/3559
| 175,593,078 |
MDU6SXNzdWUxNzU1OTMwNzg=
| 3,559 |
Utils.check_header_validity fails on integer values
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5606946?v=4",
"events_url": "https://api.github.com/users/napalmer7/events{/privacy}",
"followers_url": "https://api.github.com/users/napalmer7/followers",
"following_url": "https://api.github.com/users/napalmer7/following{/other_user}",
"gists_url": "https://api.github.com/users/napalmer7/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/napalmer7",
"id": 5606946,
"login": "napalmer7",
"node_id": "MDQ6VXNlcjU2MDY5NDY=",
"organizations_url": "https://api.github.com/users/napalmer7/orgs",
"received_events_url": "https://api.github.com/users/napalmer7/received_events",
"repos_url": "https://api.github.com/users/napalmer7/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/napalmer7/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/napalmer7/subscriptions",
"type": "User",
"url": "https://api.github.com/users/napalmer7",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-09-07T20:07:07Z
|
2021-09-08T15:00:52Z
|
2016-09-07T20:15:38Z
|
NONE
|
resolved
|
I am working on an app for ThreatConnect using their python SDK. If I use requests v2.11.0 it works fine, but testing against v2.11.1 caused an error in check_header_validity. This issue is easily reproducible with any call against requests where you specify a Content-Length header using an integer.
The check_header_validity method in requests/utils.py currently expects values to be either bytes or string types, using an isinstance check against the type 'bytes'. If an integer value is specified for a header (i.e. Content-Length), it fails the isinstance check and is then assumed to be a string.
The call to re.match requires an input of type bytes/str, so when an integer is passed in it fails, throwing an InvalidHeader exception.
Considering that this check is in place to ensure that there are no leading spaces, it should be acceptable to positively validate all numerical values without a re.match call. Alternatively, you could do a hard conversion 'value = str(value)'.
ThreatConnect request object
```
{'_content_type': None, '_request_uri': '/v2/indicators/files', '_headers': {'Content-Length': 142, 'Content-Type': 'application/json'}, '_remaining_results': 1, '_path_url': None, '_description': u'adding indicator 868CB6B12D319973349D8A7752C23C05.', '_body': '{"size": "6657", "rating": 5, "confidence": 95, "sha1": "19D049110800D02F15956CE90E76CE429FC4DB9F", "md5": "868CB6B12D319973349D8A7752C23C05"}', '_payload': {'createActivityLog': 'false'}, '_resource_type': <ResourceType.FILES: 535>, '_owner': None, '_success_callback': None, '_result_start': 0, '_http_method': 'POST', '_failure_callback': None, '_owner_allowed': True, '_track': False, '_resource_pagination': False, '_result_limit': 0}
```
Trace:
```
Traceback (most recent call last):
File "dg_tc.py", line 173, in <module>
"TLP Red")
File "dg_tc.py", line 78, in process_data
indicator.commit()
File "/home/tester/.local/lib/python2.7/site-packages/threatconnect/IndicatorObject.py", line 1193, in commit
api_response = self._tc.api_request(ro)
File "/home/tester/.local/lib/python2.7/site-packages/threatconnect/ThreatConnect.py", line 289, in api_request
request_prepped.prepare_headers(ro.headers)
File "/home/tester/.local/lib/python2.7/site-packages/requests/models.py", line 409, in prepare_headers
check_header_validity(header)
File "/home/tester/.local/lib/python2.7/site-packages/requests/utils.py", line 800, in check_header_validity
"not %s" % (value, type(value)))
requests.exceptions.InvalidHeader: Header value 142 must be of type str or bytes, not <type 'int'>
```
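The workaround that matches the maintainers' position (see the comments) is to coerce header values to strings before handing them to requests. A minimal sketch; the URL is illustrative, and in practice requests computes Content-Length itself when given a body:
``` python
import requests

body = '{"md5": "868CB6B12D319973349D8A7752C23C05"}'
headers = {
    'Content-Type': 'application/json',
    # Must be str (or bytes); an int here raises InvalidHeader on 2.11.1+.
    'Content-Length': str(len(body)),
}
requests.post('https://api.example.com/v2/indicators/files', data=body, headers=headers)
```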
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3559/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3559/timeline
| null |
completed
| null | null | false |
[
"Hey @napalmer7, this is the intention of this check. All header values must be passed as strings in Requests. There are unfortunately more complicated datatypes than `int`, such as `datetime`, which won't format correctly in a `str()` call. You can find relevant discussions and reasoning in #3366 and #3477.\n",
"@napalmer7 **Please** search closed issues for relevant discussions before opening new ones.\n"
] |
https://api.github.com/repos/psf/requests/issues/3558
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3558/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3558/comments
|
https://api.github.com/repos/psf/requests/issues/3558/events
|
https://github.com/psf/requests/issues/3558
| 175,499,119 |
MDU6SXNzdWUxNzU0OTkxMTk=
| 3,558 |
Using Python's requests lib throwing ProxyError
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14370681?v=4",
"events_url": "https://api.github.com/users/flzq/events{/privacy}",
"followers_url": "https://api.github.com/users/flzq/followers",
"following_url": "https://api.github.com/users/flzq/following{/other_user}",
"gists_url": "https://api.github.com/users/flzq/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/flzq",
"id": 14370681,
"login": "flzq",
"node_id": "MDQ6VXNlcjE0MzcwNjgx",
"organizations_url": "https://api.github.com/users/flzq/orgs",
"received_events_url": "https://api.github.com/users/flzq/received_events",
"repos_url": "https://api.github.com/users/flzq/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/flzq/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/flzq/subscriptions",
"type": "User",
"url": "https://api.github.com/users/flzq",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 13 |
2016-09-07T13:13:43Z
|
2021-08-31T00:06:42Z
|
2016-09-07T14:23:56Z
|
NONE
|
resolved
|
When I use `requests.get('http://www.google.com')`, it throws a `ProxyError`:
```
ProxyError: HTTPConnectionPool(host=My vps's IP , port=80): Max retries exceeded with url: http://www.google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x03EB66F0>: Failed to establish a new connection: [Errno 10061] ',)))
```
When I request a website over the HTTP protocol, the host turns into my VPS's IP, even though I have shut down the VPN. Using the HTTPS protocol does not trigger this issue.
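The two workarounds suggested in the comments, sketched out; note they only affect requests itself, not other libraries that read the system proxy settings:
``` python
import requests

# Option 1: ignore HTTP_PROXY / IE proxy settings for this session entirely.
session = requests.Session()
session.trust_env = False
print(session.get('http://www.google.com').status_code)

# Option 2: override the discovered proxy for plain HTTP on a single request.
print(requests.get('http://www.google.com', proxies={'http': None}).status_code)
```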
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3558/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3558/timeline
| null |
completed
| null | null | false |
[
"@coderfreedom You almost certainly have the `HTTP_PROXY` environment variable set. Why is that set if you don't want HTTP tools to use your VPS?\n",
"@Lukasa But I close it in IE settings. \nsometimes I need VPN to climbe the Great Fire Wall : ) you know, a lot of websites I can't visit. Sometimes visit github I also need use VPN.\n",
"Hrm. Can you run this for me?\n\n``` python\nimport requests\nprint requests.utils.get_environ_proxies('http://www.google.com')\n```\n",
"@Lukasa \n\n```\n{'http': '107.160.9.10'}\n```\n\nIt was my vps's IP, why????\n",
"I don't know, but it's somewhere in your system: either the HTTP_PROXY environment variable or elsewhere in your system configuration. Either way, we're just doing what your system tells us to do. You can probably override that by setting `proxies={'http': None}`.\n",
"@Lukasa I use this way before. But sometimes I need use othre lib which can't set proxies, such as `builtwith.parse('http://example.webscraping.com')` it also through `Error: <urlopen error [Errno 10061] >`\n",
"@coderfreedom In that case, this is a configuration issue on your machine. I'm afraid I can't really help beyond that. =(\n",
"@coderfreedom you can use `trust_env=False` to avoid using any proxies discovered in your environment as a work around until you figure out your system configuration problem.\n",
"@Lukasa Anyway, thank you for your help : )\nThis problem has troubled me for months, It drives me crazy \"_\"\nI think it's time to use Mac OS. : )\n",
"@sigmavirus24 But another lib will still not work. \n",
"@Lukasa @sigmavirus24 Does python has a global variable to carvery it???\n",
"@coderfreedom : Did you solve this problem? I also met VPN related issue.\n",
"it read your IE's proxy setting, you can try unset proxy setting in inetcpl.cpl."
] |
https://api.github.com/repos/psf/requests/issues/3557
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3557/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3557/comments
|
https://api.github.com/repos/psf/requests/issues/3557/events
|
https://github.com/psf/requests/pull/3557
| 175,269,053 |
MDExOlB1bGxSZXF1ZXN0ODQxMzc4NTc=
| 3,557 |
Fix how we test fallback to latin-1 reason encoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 6 |
2016-09-06T14:56:26Z
|
2021-09-08T02:10:31Z
|
2016-09-06T14:58:02Z
|
CONTRIBUTOR
|
resolved
|
This was terribly broken and found while merging master into proposed/3.0.0
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3557/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3557/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3557.diff",
"html_url": "https://github.com/psf/requests/pull/3557",
"merged_at": "2016-09-06T14:58:02Z",
"patch_url": "https://github.com/psf/requests/pull/3557.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3557"
}
| true |
[
"Sorry I let Python 2 slip through the cracks. I am, however, still seeing a failure in python 2.7.10 on multiple systems with this merge. The issue appears to be that we're handing a unicode string in Python 2 to the HTTPError which is producing an \"unprintable HTTPError\" message when it can't be decoded to ascii. This was also present in the original test in PR #3385, we just weren't testing the string. Encoding the actual [error message](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L862-L869) string before passing it to the exception fixes this problem in Python 2.\n\ne.g.\n\n``` python\nif http_error_msg:\n if is_py2:\n http_error_msg = http_error_msg.encode('utf-8')\n raise HTTPError(http_error_msg, response=self)\n```\n\n**Edit:** @sigmavirus24 can you confirm this is failing for you in Python 2 still and that my tests aren't borked?\n\n**Second Edit:** I can't find anything functionally different in this PR from pre-#3557 except that we're now comparing a string object with a unicode object in the final assert statement. I'm not particularly tied to the encoding ordering, but I do think that this test should match the other unicode tests above it. Is there a reason that the encoding needs to be done during the assertion (bytes->str) instead of at the beginning (unicode->bytes)? These will always be unicode strings, so I'm not sure if we gain anything from dropping the reason phrase down to latin-1 during the final assertion (other than it needs to not be bytes). Apologies if I'm missing something obvious.\n\nThis would be [my proposed solution](https://github.com/nateprewitt/requests/commit/0cf4e700d2ce30c9422d788d705c5496d5401d3d) for fixing the failing tests. Like I said, I'm not tied to reverting but I think related tests are easier to understand if they share common pieces.\n",
"@nateprewitt this wasn't working on Python 2 _or_ Python 3 before but this passes for me on Linux and OSX. What are you testing on?\n",
"Odd, it's been working with 3.5.1 on my system since I opened the PR. When you opened this this morning, I checked 2.7.10 on OSX 10.10 and it is not working. The call to str(e) with a unicode string produces an `UnicodeEncodeError: 'ascii' codec can't encode characters in position...` response. I'm reproducing this with this test code, which mimics how we're [passing the message to HTTPError](https://github.com/nateprewitt/requests/blob/master/requests/models.py#L862-L869). The string will always be unicode.\n\n``` python\ne = Exception(u'testöö')\nprint(str(e))\n```\n\nPerhaps this is due to something I've configured in all of my environments though. I'm encountering the same errors with with 2.7.12 on RHEL6.\n",
"Also if you look at the [test](https://github.com/sigmavirus24/requests/blob/42d4eaf6eaad8b43e1048b8ac06aec760c2285ab/tests/test_requests.py#L1016-L1023) immediately above this one, it is very similar in structure except for the string comparison at the end.\n\n#3557 passes in the identical bytes to what was passed prior to the patch:\n\n``` python\nreason = u'Komponenttia ei löydy'\nbytes_reason = b'Komponenttia ei l\\xf6ydy'\nreason.encode('latin-1') == bytes_reason\n```\n\n and they're both comparing an identical unicode reason with a unicode error message.\n\n``` python\nreason = u'Komponenttia ei löydy'\nbytes_reason = b'Komponenttia ei l\\xf6ydy'\nbytes_reason.decode('latin-1') == reason\n```\n\n Do you know what the test was failing on in Python 3? \n",
"@sigmavirus24, here's some tox runs on blank travis instances displaying what I'm seeing locally.\n\nPre-#3557: https://travis-ci.org/nateprewitt/requests/builds/158050706 (87f9693)\nPost-#3557: https://travis-ci.org/nateprewitt/requests/builds/158050156 (2041adb)\nMy Proposed Patch: https://travis-ci.org/nateprewitt/requests/builds/158051134 (0cf4e70)\n",
"Alright @sigmavirus24, I think I've finally gotten to the bottom of the confusion here (or most of it). It looks like you wrote and commited [a separate test](https://github.com/kennethreitz/requests/blob/proposed/3.0.0/tests/test_requests.py#L1054-L1069) for proposed/3.0.0 than you submitted here, which does in fact pass on all versions. This patch as it stands on master though is functionally identical to the test beforehand and still fails for Python 2. \n\nWe could move the test from proposed/3.0.0 over to master to fix the problem, but I feel like it may be unnecessarily verbose. You perform an identical check by simply switching `str(e)` to `e.value.args[0]`. I've updated my commit (39e8c0d) to use this instead, still with the reversion of the encodings for the reasons listed in the comments above. As I said before, not tied to that, but I'd rather keep the tests similar if possible.\n\nI'm fairly confident this is the correct approach at this point, but please let me know if you see something I'm not. I'll open this as a PR tomorrow afternoon unless you'd like to make the changes yourself.\n\nHere's the [Travis build](https://travis-ci.org/nateprewitt/requests/builds/159784486), if you wanna take a look.\n"
] |
https://api.github.com/repos/psf/requests/issues/3556
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3556/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3556/comments
|
https://api.github.com/repos/psf/requests/issues/3556/events
|
https://github.com/psf/requests/issues/3556
| 175,192,290 |
MDU6SXNzdWUxNzUxOTIyOTA=
| 3,556 |
"Requests is the only Non-GMO HTTP library for Python, safe for human consumption."
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1174730?v=4",
"events_url": "https://api.github.com/users/mouuff/events{/privacy}",
"followers_url": "https://api.github.com/users/mouuff/followers",
"following_url": "https://api.github.com/users/mouuff/following{/other_user}",
"gists_url": "https://api.github.com/users/mouuff/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mouuff",
"id": 1174730,
"login": "mouuff",
"node_id": "MDQ6VXNlcjExNzQ3MzA=",
"organizations_url": "https://api.github.com/users/mouuff/orgs",
"received_events_url": "https://api.github.com/users/mouuff/received_events",
"repos_url": "https://api.github.com/users/mouuff/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mouuff/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mouuff/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mouuff",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 24 |
2016-09-06T08:46:00Z
|
2021-09-04T00:06:14Z
|
2016-09-06T09:06:44Z
|
NONE
|
resolved
|
GMOs are safe
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 42,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 6,
"hooray": 5,
"laugh": 6,
"rocket": 0,
"total_count": 59,
"url": "https://api.github.com/repos/psf/requests/issues/3556/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3556/timeline
| null |
completed
| null | null | false |
[
"This is _absolutely_ not the venue for this discussion.\n",
"@Lukasa too serious\n",
"@mouuff it's a joke :)\n",
"https://github.com/drtshock/Potato\n",
"@mouuff haha, behold—art!\n",
"What _is_ GMO?\n\nI'm wondering because it's in the first sentence of the repo's `README` and I've never heard of it and have no clue what it's supposed to mean in a programming context. I googled it and only came up with _genetically modified organism_. Is this some sort of in-joke I don't get?\n",
"Yes :)\n",
"I don't know the meaning either, and try to figure it out, but still puzzled. I'm not a native English. @kerstin ",
"@Celthi it's a joke :)",
"@kennethreitz Oh, I see, :) :) :). Thank you.",
"I think @kerstin's and @Celthi's reactions sums this up: this has no place in a README for such a widely depended on project.\r\n\r\nFunny and confusing? Yes.\r\nProfessional? No.\r\n\r\n@Lukasa If this is not the place for this discussion, then it probably shouldn't be in the README in the first place. Seems like complaining about an open invitation.",
"Software is perfectly able to have a sense of humor. ",
"I hadn't thought about it in terms of professionalism – though maybe more because I feel like I can't prescribe what is and what isn't in contexts like these –, but yeah, the problem I see with it is more that inside jokes like this can be pretty confusing for people who are not native speakers of English and/or programming novices, and thus feel exclusionary. (\"Is this something I should know about? If I don't, is it a problem?\")",
"I'm sorry to see that this issue has been closed out-of-hand. I absolutely agree that this is not the venue for a discussion of the merits of GMOs, but by including this sentence - whether it be a joke or not - you are implicitly aligning the requests project with one side of that debate.\r\n\r\nIf you still think this is just a joke rather than a political issue, consider how you would feel if it instead said \"The only GMO-enhanced HTTP library for Python: nutritious and healthy\". If your reaction to that sentence is also \"haha, good joke\" then well done: you really are impartial. But don't expect all your users to laugh it off in the same way.\r\n\r\nEssentially, I consider your joke to be in poor taste. Perhaps we should also have a joke involving some racial stereotypes for completeness?\r\n\r\nAs others have already said: a software library should not be taking political sides. It's possible to have a sense of humour without doing so. ",
"I don't think I agree with the notion that software/libraries shouldn't take political sides ever. There's certainly code out there which is written specifically to help activists or spread awareness (about politicial issues, political figures and their actions etc.) or to make people stop and think (a good bunch of art-related code, I'd guess). In my opinion all of that is valid and even important as long as the motivations behind it aren't unsavory.\r\n\r\n(Following that, I don't get why one would equate a critical stance on a controversial topic with dystopian qualities – which is how I read it – with racial jokes.)\r\n\r\nIt's fine if we disagree on this, I just don't want my previous comments to be misinterpreted. As stated above, I believe the issue is that the joke is inaccessible to a good portion of people wanting to use/using the library due to the potentially unfamiliar abbreviation and its general off-topic-ness, not because it's \"taking a side\".",
"Confused me as well. Maybe mentioning the fact it's a joke for non-native english?",
"Life's too short to get caught up by stuff like this. Just ignore the joke if you cannot handle it.",
"@laneschmidt your subtle pejorative is not missed. One could equally insult you with, \"don't enter a conversation if you aren't mature enough to address people like adults\" or even \"life's too short to enter a conversation that you haven't meaningfully added to. Just delete your github account if you can't handle it\".",
"Echoing what @cornfeedhobo said, I was wondering why you, @laneschmidt, would trigger email notifications for everyone subscribed to this thread so far for what's basically a non-comment, thereby wasting all of our time, when you feel life's too short for stuff like this...",
"Whether GMO is safe is a serious scientific discussion because of all the miss-information out there. Implying that this statement is a joke and blame people for not getting it shows poor taste.",
"To ensure that non native english speakers also understand this joke, no abbreviation should be used. A smiley at the end could also contribute to understanding. :-)",
"This joke is perfectly normal and should not be treated such seriously as \"not professional\" and confusing. This is just a tagline, *nothing else.*\r\n@keikoro",
"@episodeyang is right. There is a lot of miss-information out there and joke-or-not it contributes to it.\r\nIt's ignoring such little stuff is what got us into this pseudoscientific mess of a world we have right now.\r\nOn the other hand I do appreciate a joke when I see one, especially in professional surroundings.\r\nI'd just replace it with something like \"Not tested on animals... they were too scared by the snake.\" or alike.\r\n\r\nUpdate: I'd be equally concerned if it said \"the only gluten free library\", since that also while being a criminally overused marketing scheme (with no standing science proof, except for people with a very rare condition), would perpetuate pseudoscience myths (FUDs if you will) rather than be an innocent joke.",
"Seems to be finally resolved in the commit [7ca0e86a6752a5a763e5e82d94f34a522a976708](https://github.com/psf/requests/commit/7ca0e86a6752a5a763e5e82d94f34a522a976708#diff-04c6e90faac2675aa89e2176d2eec7d8)."
] |
https://api.github.com/repos/psf/requests/issues/3555
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3555/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3555/comments
|
https://api.github.com/repos/psf/requests/issues/3555/events
|
https://github.com/psf/requests/issues/3555
| 175,145,969 |
MDU6SXNzdWUxNzUxNDU5Njk=
| 3,555 |
Feature request: support options for suppressing InsecurePlatformWarning and requests.exceptions.InvalidHeader
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-09-06T02:03:28Z
|
2021-09-08T15:00:52Z
|
2016-09-06T08:29:09Z
|
NONE
|
resolved
|
I think requests is designed to maintain high security. But for prototyping or trial use, those restrictions are a nuisance, especially the two errors below.
- InsecurePlatformWarning: A true SSLContext object is not available
- requests.exceptions.InvalidHeader: Invalid return character or leading space in header
I'll show the code below.
``` python
...
# [Pesky Case-1]
# This application-specific header includes spaces and marks,
# but requests raises a 'requests.exceptions.InvalidHeader' error.
headers = {
    "Authorization": " Basic q3io5yaFA="
}
# [Pesky Case-2]
# If you use the 'verify=False' option, requests emits an 'InsecurePlatformWarning' warning.
r = requests.get(url, headers=headers, verify=False)
...
```
I think supporting an option for suppressing them would be beneficial, like this.
```
r = requests.get(url, headers=headers, disable_security_assertion=True)
# ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ like this!
```
What do you think?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3555/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3555/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report! However, we will not be disabling either of these features.\n\nFirstly, the warning. You've identified it as an `InsecurePlatformWarning`, but I suspect it's actually an `InsecureRequestWarning`, as `InsecurePlatformWarning` is not caused by passing `verify=False`. This warning is intended to be a reminder to you to remove `verify=False` as soon as feasibly possible. There is _almost never_ a reason to pass `verify=False`. If you're using a self-signed cert on your service, you can simply download the self-signed certificate using a web browser or OpenSSL, save it to a PEM file, and then pass the path to that file to `verify` instead. This will prevent you bumping into errors. Certainly you should never allow your software to enter production with `verify=False` in place. That's why this error is there, and providing an extra switch to easily defeat it would entirely remove the purpose of it.\n\nSecondly, the exception. This exception is a hard fail, and again we will not provide tools to circumvent it. You should _never_, under _any circumstances_, emit a header with leading whitespace. While in principle such a header is syntactically valid HTTP/1.1, in practice other HTTP/1.1 implementations frequently interpret such a header incorrectly, most commonly by failing to parse it entirely. This opens up the possibility of subtle errors or request smuggling attacks, and the risk is doubly strong in the case of any code that passes user-specified data to Requests. Given that there is no _legitimate_ purpose for disabling this test (there is no reason you should ever need leading whitespace in your header field value), there is no reason to make this bypassable. Simply adjust the code that is generating your headers. =)\n",
"@Lukasa Thank you for response.\n\nThe REST API server needs a header leading whitespace. In this system, we must use the header field `\"Authorization\" : \" Basic (SECURITY-TOKEN)\"` for the authorization to the system. Spaces also must be needed! :sob: \n\nI think this system's REST API design is so bad :sob: \n",
"My goodness. Really?\n\nIf you absolutely _must_ do that, you can use the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests) which will let you modify headers _after_ we have prepared them, at which point we won't do further validation.\n\nBoy that's silly though. =P\n"
] |
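
The PreparedRequest flow pointed to in the closing comment can be sketched roughly as below. This is only an illustration: the endpoint URL is a hypothetical placeholder, the token is the one quoted in the issue, and the leading-space header only survives because it is set after `prepare()` has already run its validation.

``` python
import requests

s = requests.Session()
req = requests.Request(
    'GET',
    'https://api.example.invalid/resource',  # hypothetical endpoint, for illustration only
    headers={'Accept': 'application/json'},
)
prepped = req.prepare()

# Header validation happens inside prepare(); mutating the prepared request
# afterwards lets the (ill-advised) leading space through unchanged.
prepped.headers['Authorization'] = ' Basic q3io5yaFA='

resp = s.send(prepped)
```
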
https://api.github.com/repos/psf/requests/issues/3554
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3554/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3554/comments
|
https://api.github.com/repos/psf/requests/issues/3554/events
|
https://github.com/psf/requests/pull/3554
| 175,116,485 |
MDExOlB1bGxSZXF1ZXN0ODQwMzI2NTI=
| 3,554 |
ISO-8859-1 fallback for reason decoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-09-05T18:51:18Z
|
2021-09-08T02:10:16Z
|
2016-09-05T23:15:06Z
|
MEMBER
|
resolved
|
This implements the proposed solution to #3538 by falling back from Unicode to ISO-8859-1 for raw reason decoding.
This is a relatively trivial fix, and I wasn't sure if you wanted to waste bandwidth fixing, @mitsuhiko. I had a brief chat with @Lukasa who said to toss this up, but I'll gladly drop it in favor of #3538, if you had plans to update.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3554/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3554/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3554.diff",
"html_url": "https://github.com/psf/requests/pull/3554",
"merged_at": "2016-09-05T23:15:06Z",
"patch_url": "https://github.com/psf/requests/pull/3554.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3554"
}
| true |
[
"@sigmavirus24 are you happy with this?\n",
"Yep!\n"
] |
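
The fallback described in this PR amounts to something like the following sketch. It only illustrates the decoding order (UTF-8 first, ISO-8859-1 as the fallback); it is not the exact patch that was merged.

``` python
def decode_reason(raw_reason):
    """Decode an HTTP reason phrase, preferring UTF-8 with an ISO-8859-1 fallback."""
    if isinstance(raw_reason, bytes):
        try:
            return raw_reason.decode('utf-8')
        except UnicodeDecodeError:
            return raw_reason.decode('iso-8859-1')
    return raw_reason


# Example: a Latin-1 reason phrase that is not valid UTF-8.
print(decode_reason(b'Komponenttia ei l\xf6ydy'))  # -> 'Komponenttia ei löydy'
```
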
https://api.github.com/repos/psf/requests/issues/3553
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3553/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3553/comments
|
https://api.github.com/repos/psf/requests/issues/3553/events
|
https://github.com/psf/requests/issues/3553
| 175,107,496 |
MDU6SXNzdWUxNzUxMDc0OTY=
| 3,553 |
preserving {0} in requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1165771?v=4",
"events_url": "https://api.github.com/users/fredzannarbor/events{/privacy}",
"followers_url": "https://api.github.com/users/fredzannarbor/followers",
"following_url": "https://api.github.com/users/fredzannarbor/following{/other_user}",
"gists_url": "https://api.github.com/users/fredzannarbor/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fredzannarbor",
"id": 1165771,
"login": "fredzannarbor",
"node_id": "MDQ6VXNlcjExNjU3NzE=",
"organizations_url": "https://api.github.com/users/fredzannarbor/orgs",
"received_events_url": "https://api.github.com/users/fredzannarbor/received_events",
"repos_url": "https://api.github.com/users/fredzannarbor/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fredzannarbor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fredzannarbor/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fredzannarbor",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2016-09-05T17:13:43Z
|
2021-09-08T15:00:55Z
|
2016-09-05T17:20:54Z
|
NONE
|
resolved
|
```
def buy_fortune():
value1 = 'riddles'
payload = { 'fortunefile': value1}
url = server_url+'buy?payout_address={0}'
print(url)
r = requests.get(url, params=payload)
print(r.url)
```
produces:
```
http://localhost:5000/buy?payout_address={0}
http://localhost:5000/buy?payout_address=%7B0%7D&fortunefile=riddles
```
I want to preserve the payout address parameter as {0} rather than %7B0%7D.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1165771?v=4",
"events_url": "https://api.github.com/users/fredzannarbor/events{/privacy}",
"followers_url": "https://api.github.com/users/fredzannarbor/followers",
"following_url": "https://api.github.com/users/fredzannarbor/following{/other_user}",
"gists_url": "https://api.github.com/users/fredzannarbor/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fredzannarbor",
"id": 1165771,
"login": "fredzannarbor",
"node_id": "MDQ6VXNlcjExNjU3NzE=",
"organizations_url": "https://api.github.com/users/fredzannarbor/orgs",
"received_events_url": "https://api.github.com/users/fredzannarbor/received_events",
"repos_url": "https://api.github.com/users/fredzannarbor/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fredzannarbor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fredzannarbor/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fredzannarbor",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3553/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3553/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/3552
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3552/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3552/comments
|
https://api.github.com/repos/psf/requests/issues/3552/events
|
https://github.com/psf/requests/pull/3552
| 175,102,936 |
MDExOlB1bGxSZXF1ZXN0ODQwMjQyNTc=
| 3,552 |
pysocks 1.5.7 blacklisting, due to IPv6 problems
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2480847?v=4",
"events_url": "https://api.github.com/users/nano3k0a/events{/privacy}",
"followers_url": "https://api.github.com/users/nano3k0a/followers",
"following_url": "https://api.github.com/users/nano3k0a/following{/other_user}",
"gists_url": "https://api.github.com/users/nano3k0a/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nano3k0a",
"id": 2480847,
"login": "nano3k0a",
"node_id": "MDQ6VXNlcjI0ODA4NDc=",
"organizations_url": "https://api.github.com/users/nano3k0a/orgs",
"received_events_url": "https://api.github.com/users/nano3k0a/received_events",
"repos_url": "https://api.github.com/users/nano3k0a/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nano3k0a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nano3k0a/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nano3k0a",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2016-09-05T16:33:17Z
|
2021-09-08T02:10:32Z
|
2016-09-05T16:47:13Z
|
CONTRIBUTOR
|
resolved
|
https://github.com/kennethreitz/requests/issues/3551
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/3552/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3552/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3552.diff",
"html_url": "https://github.com/psf/requests/pull/3552",
"merged_at": "2016-09-05T16:47:13Z",
"patch_url": "https://github.com/psf/requests/pull/3552.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3552"
}
| true |
[
"Thanks @Trodis!\n",
"✨ 🍰 ✨ \n",
"Glad I could help and thanks @Lukasa \n",
"My pleasure: sorry that you bumped into this!\n"
] |
https://api.github.com/repos/psf/requests/issues/3551
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3551/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3551/comments
|
https://api.github.com/repos/psf/requests/issues/3551/events
|
https://github.com/psf/requests/issues/3551
| 175,100,781 |
MDU6SXNzdWUxNzUxMDA3ODE=
| 3,551 |
test_lowlevel.py is failing in a Python 2 environment
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2480847?v=4",
"events_url": "https://api.github.com/users/nano3k0a/events{/privacy}",
"followers_url": "https://api.github.com/users/nano3k0a/followers",
"following_url": "https://api.github.com/users/nano3k0a/following{/other_user}",
"gists_url": "https://api.github.com/users/nano3k0a/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nano3k0a",
"id": 2480847,
"login": "nano3k0a",
"node_id": "MDQ6VXNlcjI0ODA4NDc=",
"organizations_url": "https://api.github.com/users/nano3k0a/orgs",
"received_events_url": "https://api.github.com/users/nano3k0a/received_events",
"repos_url": "https://api.github.com/users/nano3k0a/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nano3k0a/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nano3k0a/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nano3k0a",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-09-05T16:16:34Z
|
2021-09-08T15:00:55Z
|
2016-09-05T17:01:12Z
|
CONTRIBUTOR
|
resolved
|
If the test file test_lowlevel.py is run by pytest in a Python 2 environment, the test fails at line 55 with an assertion error: assert len(fake_proxy.handler_results) == 1 is false because fake_proxy.handler_results is empty. The problem could be, as discussed on IRC, that a broken import is causing this failure because PySocks for Python 2 is not imported correctly.
Running the same test file within a Python 3 environment doesn't fail; the whole requests test suite passes without an issue.
System: Arch Linux x64
Requests Version for Python 2 and 3: 2.11.1
PySocks Version for Python 2 and 3: 1.5.7

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3551/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3551/timeline
| null |
completed
| null | null | false |
[
"This error is a direct result of PySocks version 1.5.7, which when a SOCKS proxy is listening on only an IPv4 or IPv6 address but has routes available for both, connection setup can fail. This is fixed in the current master of PySocks, and can also be resolved by downgrading to PySocks 1.5.6 which does not support IPv6 addresses at all.\n\nThe best fix we can make here is the same as shazow/urllib3#964. Are you interested in contributing such a fix?\n",
"Yes I would love to but, I need some direction what to do exactly\n",
"@Trodis Have you seen the fix in the linked PR?\n"
] |
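
The blacklisting referenced in this issue (and in the linked PR above) comes down to a version constraint on the optional SOCKS dependency. A minimal sketch of what such a constraint looks like in a setup.py follows; the package name and version here are placeholders, not the actual requests packaging.

``` python
from setuptools import setup

setup(
    name='example-project',      # hypothetical package, for illustration only
    version='0.0.1',
    install_requires=['requests'],
    extras_require={
        # Skip the PySocks release with the IPv6 regression discussed above.
        'socks': ['PySocks>=1.5.6, !=1.5.7'],
    },
)
```
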
https://api.github.com/repos/psf/requests/issues/3550
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3550/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3550/comments
|
https://api.github.com/repos/psf/requests/issues/3550/events
|
https://github.com/psf/requests/issues/3550
| 175,100,262 |
MDU6SXNzdWUxNzUxMDAyNjI=
| 3,550 |
ConnectionError: HTTPConnectionPool( Max retries exceeded with url) [Errno -2] Name or service not known
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10820548?v=4",
"events_url": "https://api.github.com/users/zhumingyu/events{/privacy}",
"followers_url": "https://api.github.com/users/zhumingyu/followers",
"following_url": "https://api.github.com/users/zhumingyu/following{/other_user}",
"gists_url": "https://api.github.com/users/zhumingyu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zhumingyu",
"id": 10820548,
"login": "zhumingyu",
"node_id": "MDQ6VXNlcjEwODIwNTQ4",
"organizations_url": "https://api.github.com/users/zhumingyu/orgs",
"received_events_url": "https://api.github.com/users/zhumingyu/received_events",
"repos_url": "https://api.github.com/users/zhumingyu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zhumingyu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zhumingyu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zhumingyu",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
}
] |
closed
| true | null |
[] | null | 7 |
2016-09-05T16:12:52Z
|
2017-12-19T17:04:51Z
|
2016-09-06T08:18:28Z
|
NONE
|
off-topic
|
Hey, guys!
Maybe this is not the right place to ask. I follow this project: [https://github.com/jiehua233/phoneregion](https://github.com/jiehua233/phoneregion), then run the command 'python main.py --scrapy', but I always get an error like this:
```
Traceback (most recent call last):
  File "/usr/lib64/python2.7/site-packages/gevent/greenlet.py", line 534, in run
    result = self._run(*self.args, **self.kwargs)
  File "/home/zjhxmjl/Documents/phoneregion/scrapy.py", line 55, in worker
    info = self.validate(phone)
  File "/home/zjhxmjl/Documents/phoneregion/scrapy.py", line 62, in validate
    r = requests.get(url)
  File "/usr/lib/python2.7/site-packages/requests/api.py", line 70, in get
    return request('get', url, params=params, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/api.py", line 56, in request
    return session.request(method=method, url=url, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/lib/python2.7/site-packages/requests/sessions.py", line 596, in send
    r = adapter.send(request, **kwargs)
  File "/usr/lib/python2.7/site-packages/requests/adapters.py", line 487, in send
    raise ConnectionError(e, request=request)
ConnectionError: HTTPConnectionPool(host='www.ip138.com', port=8080): Max retries exceeded with url: /search.asp?mobile=1459626&action=mobile (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x14663d0>: Failed to establish a new connection: [Errno -2] Name or service not known',))
<Greenlet at 0x140d0f0: <bound method Scrapy.worker of <scrapy.Scrapy instance at 0x12dcf38>>(['1450000', '1450001', '1450002', '1450003', '1450)> failed with ConnectionError
```
I checked the number "1459626" from [here](http://www.ip138.com:8080/search.asp?mobile=1459626&action=mobile), which is data I need but can't store in the file "phonenum.dat". Thanks in advance.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3550/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3550/timeline
| null |
completed
| null | null | false |
[
"this is requirements.txt which i had install\n[requirements.txt](https://github.com/kennethreitz/requests/files/455358/requirements.txt)\n",
"This error says that we are unable to resolve the domain name `www.ip138.com`. Can you confirm that that domain name is valid?\n",
"@Lukasa thks for reply,i confirm the domain name is valid,because i can get some datas.but sometimes i get the error \"ConnectionError\".\n**some error info:**\n`[zjhxmjl@localhost phoneregion]$ python main.py --scrapy > log.txt\nTraceback (most recent call last):\n File \"/usr/lib64/python2.7/site-packages/gevent/greenlet.py\", line 534, in run\n result = self._run(_self.args, *_self.kwargs)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 55, in worker\n info = self.validate(phone)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 62, in validate\n r = requests.get(url)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 70, in get\n return request('get', url, params=params, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 56, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 596, in send\n r = adapter.send(request, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 487, in send\n raise ConnectionError(e, request=request)\nConnectionError: HTTPConnectionPool(host='www.ip138.com', port=8080): Max retries exceeded with url: /search.asp?mobile=1459772&action=mobile (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x3970d90>: Failed to establish a new connection: [Errno -2] Name or service not known',))\n<Greenlet at 0x31ae0f0: <bound method Scrapy.worker of <scrapy.Scrapy instance at 0x307ef38>>(['1450000', '1450001', '1450002', '1450003', '1450)> failed with ConnectionError\n\nTraceback (most recent call last):\n File \"/usr/lib64/python2.7/site-packages/gevent/greenlet.py\", line 534, in run\n result = self._run(_self.args, *_self.kwargs)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 55, in worker\n info = self.validate(phone)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 62, in validate\n r = requests.get(url)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 70, in get\n return request('get', url, params=params, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 56, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 596, in send\n r = adapter.send(request, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 487, in send\n raise ConnectionError(e, request=request)\nConnectionError: HTTPConnectionPool(host='www.ip138.com', port=8080): Max retries exceeded with url: /search.asp?mobile=1459728&action=mobile (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x34df390>: Failed to establish a new connection: [Errno -2] Name or service not known',))\n<Greenlet at 0x31aef50: <bound method Scrapy.worker of <scrapy.Scrapy instance at 0x307ef38>>(['1450000', '1450001', '1450002', '1450003', '1450)> failed with ConnectionError\n\nTraceback (most recent call last):\n File \"/usr/lib64/python2.7/site-packages/gevent/greenlet.py\", line 534, in run\n result = self._run(_self.args, *_self.kwargs)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 55, in worker\n info = 
self.validate(phone)\n File \"/home/zjhxmjl/Documents/phoneregion/scrapy.py\", line 62, in validate\n r = requests.get(url)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 70, in get\n return request('get', url, params=params, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 56, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, *_send_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 596, in send\n r = adapter.send(request, *_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 487, in send\n raise ConnectionError(e, request=request)\nConnectionError: HTTPConnectionPool(host='www.ip138.com', port=8080): Max retries exceeded with url: /search.asp?mobile=1459727&action=mobile (Caused by NewConnectionError('<requests.packages.urllib3.connection.HTTPConnection object at 0x361c210>: Failed to establish a new connection: [Errno -2] Name or service not known',))\n<Greenlet at 0x306b4b0: <bound method Scrapy.worker of <scrapy.Scrapy instance at 0x307ef38>>(['1450000', '1450001', '1450002', '1450003', '1450)> failed with ConnectionError`\n\n**this is log.txt**\n[log.txt](https://github.com/kennethreitz/requests/files/455727/log.txt)\n",
"So that exception means that the DNS lookup has failed. There's very little we can do about that other than simply to retry, so I recommend adding some error handling around that block.\n",
"We just saw this with multiple domains at the same time on 2 different servers on the same network. The weird thing is that all other processes could connect to the domains without any problem and DNS was working fine at the time we attempted to diagnose this. The above error continued until the process was restarted it, it did not recover even when we knew the DNS was working properly. I'm guessing somewhere either in requests or python that the failed DNS lookup was cached and it never attempted to resolve the domain again.",
"So, Requests does not cache DNS lookups. That means that this must be a Python or OS level issue.",
"When i tried to configure HTTP/HTTPS proxy for iPython notebook server i am getting error like below. No idea what to do. Any help is appreciated. \r\n\r\nProxyError: HTTPConnectionPool(host='proxy.example.com', port=80): Max retries exceeded with url: http://google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPConnection object at 0x0B9B95D0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed',)))\r\n\r\n"
] |
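
The "error handling around that block" suggested in the discussion could look roughly like the sketch below. The retry count, timeout, and backoff values are arbitrary choices for illustration, and the URL is simply whatever the scraper is fetching.

``` python
import time

import requests


def get_with_retries(url, attempts=3, backoff=1.0):
    """Retry transient DNS/connection failures instead of letting them kill the worker."""
    for attempt in range(1, attempts + 1):
        try:
            return requests.get(url, timeout=10)
        except requests.exceptions.ConnectionError:
            if attempt == attempts:
                raise
            time.sleep(backoff * attempt)
```
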
https://api.github.com/repos/psf/requests/issues/3549
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3549/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3549/comments
|
https://api.github.com/repos/psf/requests/issues/3549/events
|
https://github.com/psf/requests/issues/3549
| 174,954,450 |
MDU6SXNzdWUxNzQ5NTQ0NTA=
| 3,549 |
requests.get doesn't fail HTTP requests with verify option
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3913223?v=4",
"events_url": "https://api.github.com/users/Findeton/events{/privacy}",
"followers_url": "https://api.github.com/users/Findeton/followers",
"following_url": "https://api.github.com/users/Findeton/following{/other_user}",
"gists_url": "https://api.github.com/users/Findeton/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Findeton",
"id": 3913223,
"login": "Findeton",
"node_id": "MDQ6VXNlcjM5MTMyMjM=",
"organizations_url": "https://api.github.com/users/Findeton/orgs",
"received_events_url": "https://api.github.com/users/Findeton/received_events",
"repos_url": "https://api.github.com/users/Findeton/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Findeton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Findeton/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Findeton",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-04T15:32:50Z
|
2021-09-08T15:00:56Z
|
2016-09-04T15:37:57Z
|
NONE
|
resolved
|
When you use the verify parameter on requests.get, the function is supposed to check that the webpage certificate is valid. Obviously, if the webpage is not HTTPS, the webpage has no certificate.
I think it would be nice if in that case the request fails. I mean, imagine we have Google's cert on /srv/google-cert:
```
$ cat /srv/google-cert
-----BEGIN CERTIFICATE-----
MIIDVDCCAjygAwIBAgIDAjRWMA0GCSqGSIb3DQEBBQUAMEIxCzAJBgNVBAYTAlVT
MRYwFAYDVQQKEw1HZW9UcnVzdCBJbmMuMRswGQYDVQQDExJHZW9UcnVzdCBHbG9i
YWwgQ0EwHhcNMDIwNTIxMDQwMDAwWhcNMjIwNTIxMDQwMDAwWjBCMQswCQYDVQQG
EwJVUzEWMBQGA1UEChMNR2VvVHJ1c3QgSW5jLjEbMBkGA1UEAxMSR2VvVHJ1c3Qg
R2xvYmFsIENBMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA2swYYzD9
9BcjGlZ+W988bDjkcbd4kdS8odhM+KhDtgPpTSEHCIjaWC9mOSm9BXiLnTjoBbdq
fnGk5sRgprDvgOSJKA+eJdbtg/OtppHHmMlCGDUUna2YRpIuT8rxh0PBFpVXLVDv
iS2Aelet8u5fa9IAjbkU+BQVNdnARqN7csiRv8lVK83Qlz6cJmTM386DGXHKTubU
1XupGc1V3sjs0l44U+VcT4wt/lAjNvxm5suOpDkZALeVAjmRCw7+OC7RHQWa9k0+
bw8HHa8sHo9gOeL6NlMTOdReJivbPagUvTLrGAMoUgRx5aszPeE4uwc2hGKceeoW
MPRfwCvocWvk+QIDAQABo1MwUTAPBgNVHRMBAf8EBTADAQH/MB0GA1UdDgQWBBTA
ephojYn7qwVkDBF9qn1luMrMTjAfBgNVHSMEGDAWgBTAephojYn7qwVkDBF9qn1l
uMrMTjANBgkqhkiG9w0BAQUFAAOCAQEANeMpauUvXVSOKVCUn5kaFOSPeCpilKIn
Z57QzxpeR+nBsqTP3UEaBU6bS+5Kb1VSsyShNwrrZHYqLizz/Tt1kL/6cdjHPTfS
tQWVYrmm3ok9Nns4d0iXrKYgjy6myQzCsplFAMfOEVEiIuCl6rYVSAlk6l5PdPcF
PseKUgzbFbS9bZvlxrFUaKnjaZC2mqUPuLk/IH2uSrW4nOQdtqvmlKXBx4Ot2/Un
hw4EbNX/3aBd7YdStysVAq45pmp06drE57xNNB6pXE0zX5IJL4hmXXeXxx12E6nV
5fEWCRE11azbJHFwLJhWC9kXtNHjUStedejV0NxPNO3CBWaAocvmMw==
-----END CERTIFICATE-----
```
Then right now this is what happens:
```
>>> import requests
>>> requests.get('https://www.google.com', verify='/srv/google-cert')
<Response [200]>
>>> requests.get('http://www.ull.es', verify='/srv/google-cert')
<Response [200]>
```
I'm proposing a change: the second get call (which is HTTP) should fail.
I've tested this on requests 2.11.1.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3549/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3549/timeline
| null |
completed
| null | null | false |
[
"Thanks for this suggestion! However, I'm afraid I disagree. =) That represents a pretty substantial change in the behaviour of the verify argument, and is well outside what is currently documented. Additionally, it raises some extremely difficult questions. For example, what if the request is redirected from HTTPS to HTTP: should that fail?\n\nBut fundamentally it just punishes users too heavily for simple errors. `verify` has a very simply defined behaviour [as shown here](http://docs.python-requests.org/en/master/user/advanced/#ssl-cert-verification): whenever set to a validation mode, requests will validate _HTTPS_ requests, but not others.\n\nYou can achieve your desired behaviour (no HTTPS support) by reconfiguring your Session somewhat:\n\n``` python\ns = requests.Session()\ndel s.adapters['https://']\n```\n\nThis will make Requests entirely unable to make HTTP requests.\n"
] |
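
A rough sketch of that session reconfiguration and of what a plain-HTTP request then does, using the URLs and certificate path quoted in the issue; the exact behaviour of the verified HTTPS call depends on the certificate bundle on disk.

``` python
import requests

s = requests.Session()
# Dropping the plain-HTTP adapter leaves the session able to speak HTTPS only.
del s.adapters['http://']

s.get('https://www.google.com', verify='/srv/google-cert')  # verified HTTPS request, as before
s.get('http://www.ull.es')  # now raises requests.exceptions.InvalidSchema
```
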
https://api.github.com/repos/psf/requests/issues/3548
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3548/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3548/comments
|
https://api.github.com/repos/psf/requests/issues/3548/events
|
https://github.com/psf/requests/issues/3548
| 174,474,807 |
MDU6SXNzdWUxNzQ0NzQ4MDc=
| 3,548 |
Error received after updating the requests package
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/15614358?v=4",
"events_url": "https://api.github.com/users/MehmetAzizYirik/events{/privacy}",
"followers_url": "https://api.github.com/users/MehmetAzizYirik/followers",
"following_url": "https://api.github.com/users/MehmetAzizYirik/following{/other_user}",
"gists_url": "https://api.github.com/users/MehmetAzizYirik/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/MehmetAzizYirik",
"id": 15614358,
"login": "MehmetAzizYirik",
"node_id": "MDQ6VXNlcjE1NjE0MzU4",
"organizations_url": "https://api.github.com/users/MehmetAzizYirik/orgs",
"received_events_url": "https://api.github.com/users/MehmetAzizYirik/received_events",
"repos_url": "https://api.github.com/users/MehmetAzizYirik/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/MehmetAzizYirik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MehmetAzizYirik/subscriptions",
"type": "User",
"url": "https://api.github.com/users/MehmetAzizYirik",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-09-01T09:16:31Z
|
2021-09-08T15:00:57Z
|
2016-09-01T12:35:07Z
|
NONE
|
resolved
|
@shazow @Lukasa @mcescalante @borbamartin
Hi guys;
I just tried to extract data using the URL of the web services with the requests package, but it returned the error "TypeError: `__str__` returned non-string (type SysCallError)". I read the section for that issue on GitHub and upgraded the requests package; however, it then returned another error. Before updating, the code worked for an hour, then stopped and gave the error I shared ("TypeError: `__str__` returned non-string (type SysCallError)"). Right now it does not work at all; it just returns this traceback:
```
Traceback (most recent call last):
  File "C:\Users\may\Desktop\Documents\BOUN-CSE\Master Thesis\Project Code Files\pubchem\PubchemMapping.py", line 35, in <module>
    activities = new_client.activity.filter(target_chembl_id__in=[target['target_chembl_id'] for target in targets])
  File "C:\Python27\lib\site-packages\chembl_webresource_client\query_set.py", line 114, in next
    self.chunk = self.query.get_page()
  File "C:\Python27\lib\site-packages\chembl_webresource_client\url_query.py", line 390, in get_page
    res = session.post(self.base_url + '.' + self.frmt, data=data, timeout=self.timeout)
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 522, in post
    return self.request('POST', url, data=data, json=json, **kwargs)
  File "C:\Python27\lib\site-packages\requests_cache\core.py", line 126, in request
    **kwargs
  File "C:\Python27\lib\site-packages\requests\sessions.py", line 475, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Python27\lib\site-packages\requests_cache\core.py", line 97, in send
    response, timestamp = self.cache.get_response_and_time(cache_key)
  File "C:\Python27\lib\site-packages\requests_cache\backends\base.py", line 70, in get_response_and_time
    if key not in self.responses:
  File "C:\Python27\lib\_abcoll.py", line 388, in __contains__
    self[key]
  File "C:\Python27\lib\site-packages\requests_cache\backends\storage\dbdict.py", line 163, in __getitem__
    return pickle.loads(bytes(super(DbPickleDict, self).__getitem__(key)))
  File "C:\Python27\lib\copy_reg.py", line 50, in _reconstructor
    obj = base.__new__(cls, state)
TypeError: ('dict.__new__(HTTPHeaderDict): HTTPHeaderDict is not a subtype of dict', <function _reconstructor at 0x03039570>, (<class 'requests.packages.urllib3._collections.HTTPHeaderDict'>, <type 'dict'>, {'date': ('date', 'Thu, 01 Sep 2016 08:24:03 GMT'), 'transfer-encoding': ('transfer-encoding', 'chunked'), 'vary': ('vary', 'Accept'), 'server': ('server', 'gunicorn/19.1.1'), 'x-chembl-retrieval-time': ('x-chembl-retrieval-time', '0.00829792022705'), 'connection': ('connection', 'Keep-Alive'), 'cache-control': ('cache-control', 's-maxage=30000000, max-age=30000000'), 'x-chembl-in-cache': ('x-chembl-in-cache', 'True'), 'content-type': ('content-type', 'application/json')}))
```
Can you help me solve this problem, please?
All the Best;
Aziz
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3548/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3548/timeline
| null |
completed
| null | null | false |
[
"This error is coming from requests_cache. I recommend contacting that project for assistance. =)\n"
] |
https://api.github.com/repos/psf/requests/issues/3546
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3546/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3546/comments
|
https://api.github.com/repos/psf/requests/issues/3546/events
|
https://github.com/psf/requests/issues/3546
| 174,260,846 |
MDU6SXNzdWUxNzQyNjA4NDY=
| 3,546 |
requote_uri modifies URI
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12715952?v=4",
"events_url": "https://api.github.com/users/mocsar/events{/privacy}",
"followers_url": "https://api.github.com/users/mocsar/followers",
"following_url": "https://api.github.com/users/mocsar/following{/other_user}",
"gists_url": "https://api.github.com/users/mocsar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mocsar",
"id": 12715952,
"login": "mocsar",
"node_id": "MDQ6VXNlcjEyNzE1OTUy",
"organizations_url": "https://api.github.com/users/mocsar/orgs",
"received_events_url": "https://api.github.com/users/mocsar/received_events",
"repos_url": "https://api.github.com/users/mocsar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mocsar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mocsar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mocsar",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-08-31T12:31:24Z
|
2021-09-08T15:00:57Z
|
2016-08-31T13:03:24Z
|
NONE
|
resolved
|
I would like to fetch
http://git.mycompany.com/api/v3/projects/my%2Ename%2Fmyproject/repository/commits?ref_name=master
but requote_uri transforms this URI to
http://git.mycompany.com/api/v3/projects/my.name%2Fmyproject/repository/commits?ref_name=master
which is an invalid URI for the GitLab REST API.
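A minimal sketch of the workaround pointed to in the reply below, using the PreparedRequest flow so the URL is sent exactly as built (host and path are the reporter's placeholders, not a real endpoint):
``` python
# Sketch only: prepare the request, then overwrite the prepared URL so
# requote_uri's normalisation of %2E is undone before sending.
import requests

target = ('http://git.mycompany.com/api/v3/projects/my%2Ename%2Fmyproject'
          '/repository/commits?ref_name=master')

session = requests.Session()
request = requests.Request('GET', target)
prepped = session.prepare_request(request)
prepped.url = target          # force the exact, already-encoded URL
response = session.send(prepped)
```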
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3546/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3546/timeline
| null |
completed
| null | null | false |
[
"Well, in the first instance you should consider contacting GitLab, because they clearly have a problem with their infrastructure. The period is a safe character in the path component of a URL, and as a result a URI that contains an unescaped period in the path must be treated exactly the same as one that contains an escaped period.\n\nAs to your issue, you can address this by using the [PreparedRequest flow](http://docs.python-requests.org/en/master/user/advanced/#prepared-requests), which will allow you to force the URL to any form you like without requote_uri getting in your way.\n",
"Thank you!\n"
] |
https://api.github.com/repos/psf/requests/issues/3545
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3545/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3545/comments
|
https://api.github.com/repos/psf/requests/issues/3545/events
|
https://github.com/psf/requests/pull/3545
| 173,813,774 |
MDExOlB1bGxSZXF1ZXN0ODMxMjYyOTU=
| 3,545 |
removing redundant logic from prepare_content_length
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2016-08-29T16:22:24Z
|
2021-09-08T02:10:14Z
|
2016-09-22T12:49:57Z
|
MEMBER
|
resolved
|
# Background
So while working on #3535 I noticed that `prepare_content_length` will be fairly redundant after the patch is merged. I was going to push the abbreviated version of `prepare_content_length` into #3535 but found some inconsistencies in how we're currently handling Content-Length.
So here's a quick rundown of what I've found, and what it seems like _should_ happen.
- #957 addressed #223 and added Content-Length to _everything_.
- #1142 amended this to NOT send Content-Length with GET/HEAD requests (Issue #1051)
- #2329 addressed the issue of not being able to override Content-Length with a custom header (Issue #2329)
From this, I'd expect:
- Content-Length to be set on all requests that are not using a HEAD/GET method.
- If a Content-Length header exists, that it remains untouched when sending a request.
# Next Steps
Currently, setting a body on a request will override the custom Content-Length header. The test implemented in #2329 doesn't test the code path with a body. I've added that test and the required logic to make it succeed. However, the solution attached here is not my preferred approach; rather, it is the one closest to the old functionality of prepare_content_length.
_Ideally_, I think any call to `prepare_content_length` should do just that: set the Content-Length. That way, if someone updates the body on a PreparedRequest, they can call `prepare_content_length` without having to delete a header or recreate the object. This approach would require moving `if self.headers.get('Content-Length') is None` up to `prepare_body` and `prepare_auth` to run the overwrite check there.
This has the issue of causing latent errors though, if someone decides to use `prepare_content_length` elsewhere without the existence check. So if we're concerned about that being a possibility, then I think `prepare_content_length` should instead have an `overwrite` param which will bypass the [check](https://github.com/kennethreitz/requests/compare/master...nateprewitt:new_prepare_content_length#diff-afd5aad80649cdfae687bee05242c8faR472) to see if the header exists.
#
This PR is two tangentially related fixes (abbreviating prepare_content_length post #3535 and fixing inconsistencies in how we handle Content-Lengths). This SHOULD NOT be merged and probably doesn't need to even be addressed until after #3535 is merged.
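For illustration, a hypothetical sketch of the `overwrite` parameter floated above (not the merged code; `builtin_str` and `super_len` are the existing requests helpers referenced elsewhere in this thread):
``` python
# Hypothetical method body: only touch Content-Length if it is absent or the
# caller explicitly asks to overwrite it.
def prepare_content_length(self, body, overwrite=False):
    if not overwrite and self.headers.get('Content-Length') is not None:
        return
    if body is not None:
        self.headers['Content-Length'] = builtin_str(super_len(body))
    elif self.method not in ('GET', 'HEAD'):
        self.headers['Content-Length'] = '0'
```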
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3545/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3545/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3545.diff",
"html_url": "https://github.com/psf/requests/pull/3545",
"merged_at": "2016-09-22T12:49:57Z",
"patch_url": "https://github.com/psf/requests/pull/3545.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3545"
}
| true |
[
"@Lukasa whenever you've got a chance, could I get your thoughts on this?\n",
"Ok, I'm going to do an abridged reframe for this PR because the first post is a bit wordy.\n\nNow that we've updated `super_len` in #3535, `prepare_content_length` is mostly redundant and a space were we could accidentally end up with differing functionality in the future. I'm proposing we consolidate the method to the code below, which will have equivalent logic:\n\n``` python\n def prepare_content_length(self, body):\n if body is not None:\n length = super_len(body)\n self.headers['Content-Length'] = builtin_str(length)\n elif self.method not in ('GET', 'HEAD') and self.headers.get('Content-Length') is None:\n self.headers['Content-Length'] = '0'\n```\n\nThe second issue is the functionality over 'Content-Length' override, which I addressed [above](https://github.com/kennethreitz/requests/pull/3545#discussion_r79694617). They seemed somewhat related at first, which is why I raised it at a single PR.\n",
"I think, given my comment above, that we should restrict ourselves to the redundancy removal for now. Overwriting Content-Length is, IMO, a severe misfeature, and I'm willing to make that case pretty forcefully to Kenneth if he needs convincing. ;)\n",
"Ok, that's good enough for me. For cataloging purposes, would you prefer I pare this PR down to just the redundancy fix, or close this and open a new one?\n",
"Let's pare this one down. =)\n",
"I'd like to be included on this review prior to its merging. We (three) have made enough serious problems in the content-length portion of requests that it needs as many reviews as we can afford.\n",
"Ok, things are pared down. The patch is now pretty simple.\n\nThe logic for file-like objects (with seek/tell) is now in [`super_len`](https://github.com/kennethreitz/requests/blob/d7227fbb7e07af35f23a0d370ab3b01661af9e40/requests/utils.py#L78-L103). I'm removing [this chunk](https://github.com/kennethreitz/requests/blob/d7227fbb7e07af35f23a0d370ab3b01661af9e40/requests/models.py#L478-L483) since it is now redundant. I'm also adding two tests to verify that Content-Length is set to 0 appropriately.\n\n@Lukasa, @sigmavirus24 feel free to take a peak when you have a moment.\n",
"@Lukasa it looks as though @nateprewitt pushed new changes after your approval. Care to re-review?\n",
"Woops, forgot to approve this. :)\n"
] |
https://api.github.com/repos/psf/requests/issues/3544
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3544/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3544/comments
|
https://api.github.com/repos/psf/requests/issues/3544/events
|
https://github.com/psf/requests/issues/3544
| 173,714,020 |
MDU6SXNzdWUxNzM3MTQwMjA=
| 3,544 |
proxies has no effect
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7439574?v=4",
"events_url": "https://api.github.com/users/ripples-alive/events{/privacy}",
"followers_url": "https://api.github.com/users/ripples-alive/followers",
"following_url": "https://api.github.com/users/ripples-alive/following{/other_user}",
"gists_url": "https://api.github.com/users/ripples-alive/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ripples-alive",
"id": 7439574,
"login": "ripples-alive",
"node_id": "MDQ6VXNlcjc0Mzk1NzQ=",
"organizations_url": "https://api.github.com/users/ripples-alive/orgs",
"received_events_url": "https://api.github.com/users/ripples-alive/received_events",
"repos_url": "https://api.github.com/users/ripples-alive/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ripples-alive/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ripples-alive/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ripples-alive",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-08-29T07:36:30Z
|
2021-09-08T16:00:23Z
|
2016-08-29T14:22:56Z
|
NONE
|
resolved
|
Suppose my IP is `1.1.1.1` and an anonymous proxy is `2.2.2.2:2222`. I write code like this:
``` python
while True:
proxy = '2.2.2.2:2222'
session = requests.Session()
session.headers['User-Agent'] = 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0'
session.proxies = {
'http': proxy,
'https': proxy,
}
print session.get('http://members.3322.org/dyndns/getip').content
print session.get('http://members.3322.org/dyndns/getip', proxies=session.proxies).content
```
The output for one loop should be:
```
2.2.2.2
2.2.2.2
```
At first, when I ran this program, the result was as expected.
However, several days later, I suddenly found that the output for one loop became:
```
1.1.1.1
2.2.2.2
```
I killed the process and ran it again; the output was still wrong (repeated several times).
Then I restarted the computer and ran it again, and the output was correct again.
In fact, I can hardly reproduce the issue, but it did happen.
```
OS: Windows 10 home (version 1511, os version 10586.545).
Python 2.7.11
requests 2.11.0 (installed by pip)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3544/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3544/timeline
| null |
completed
| null | null | false |
[
"Without the ability to reproduce this, I don't think we can really take any action on this. =(\n"
] |
https://api.github.com/repos/psf/requests/issues/3543
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3543/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3543/comments
|
https://api.github.com/repos/psf/requests/issues/3543/events
|
https://github.com/psf/requests/pull/3543
| 173,591,883 |
MDExOlB1bGxSZXF1ZXN0ODI5ODU2NDY=
| 3,543 |
Closes #3542, Wraps docstrings/comments to 72 column width
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/20254860?v=4",
"events_url": "https://api.github.com/users/nagracks/events{/privacy}",
"followers_url": "https://api.github.com/users/nagracks/followers",
"following_url": "https://api.github.com/users/nagracks/following{/other_user}",
"gists_url": "https://api.github.com/users/nagracks/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nagracks",
"id": 20254860,
"login": "nagracks",
"node_id": "MDQ6VXNlcjIwMjU0ODYw",
"organizations_url": "https://api.github.com/users/nagracks/orgs",
"received_events_url": "https://api.github.com/users/nagracks/received_events",
"repos_url": "https://api.github.com/users/nagracks/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nagracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nagracks/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nagracks",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-08-27T11:31:58Z
|
2021-09-07T00:06:37Z
|
2017-02-10T17:12:43Z
|
NONE
|
resolved
| null |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3543/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3543/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3543.diff",
"html_url": "https://github.com/psf/requests/pull/3543",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3543.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3543"
}
| true |
[
"@kennethreitz This one is for you, my friend.\n",
"needs a refactor. Closing, since it needs a rebase, but i'd merge it if it gets rebased! ",
"P.S. sorry for the delay, i didn't get the notification!"
] |
https://api.github.com/repos/psf/requests/issues/3542
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3542/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3542/comments
|
https://api.github.com/repos/psf/requests/issues/3542/events
|
https://github.com/psf/requests/issues/3542
| 173,589,749 |
MDU6SXNzdWUxNzM1ODk3NDk=
| 3,542 |
Wrap function docstrings and comments to 72 (PEP8)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/20254860?v=4",
"events_url": "https://api.github.com/users/nagracks/events{/privacy}",
"followers_url": "https://api.github.com/users/nagracks/followers",
"following_url": "https://api.github.com/users/nagracks/following{/other_user}",
"gists_url": "https://api.github.com/users/nagracks/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nagracks",
"id": 20254860,
"login": "nagracks",
"node_id": "MDQ6VXNlcjIwMjU0ODYw",
"organizations_url": "https://api.github.com/users/nagracks/orgs",
"received_events_url": "https://api.github.com/users/nagracks/received_events",
"repos_url": "https://api.github.com/users/nagracks/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nagracks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nagracks/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nagracks",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 1 |
2016-08-27T10:29:24Z
|
2021-09-08T08:00:29Z
|
2017-07-30T00:14:50Z
|
NONE
|
resolved
|
I can do this. Function docstrings that run past 72 columns look ugly.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3542/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3542/timeline
| null |
completed
| null | null | false |
[
"Depends :)\n"
] |
https://api.github.com/repos/psf/requests/issues/3541
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3541/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3541/comments
|
https://api.github.com/repos/psf/requests/issues/3541/events
|
https://github.com/psf/requests/pull/3541
| 173,451,640 |
MDExOlB1bGxSZXF1ZXN0ODI4ODY3MTQ=
| 3,541 |
Specify self.cert is used for SSL client certificates
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1461970?v=4",
"events_url": "https://api.github.com/users/RichieB2B/events{/privacy}",
"followers_url": "https://api.github.com/users/RichieB2B/followers",
"following_url": "https://api.github.com/users/RichieB2B/following{/other_user}",
"gists_url": "https://api.github.com/users/RichieB2B/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/RichieB2B",
"id": 1461970,
"login": "RichieB2B",
"node_id": "MDQ6VXNlcjE0NjE5NzA=",
"organizations_url": "https://api.github.com/users/RichieB2B/orgs",
"received_events_url": "https://api.github.com/users/RichieB2B/received_events",
"repos_url": "https://api.github.com/users/RichieB2B/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/RichieB2B/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RichieB2B/subscriptions",
"type": "User",
"url": "https://api.github.com/users/RichieB2B",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-08-26T13:21:50Z
|
2021-09-08T02:10:33Z
|
2016-08-26T14:02:22Z
|
CONTRIBUTOR
|
resolved
|
In addition to #3539, clarify that self.cert is used for SSL client certificates.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3541/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3541/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3541.diff",
"html_url": "https://github.com/psf/requests/pull/3541",
"merged_at": "2016-08-26T14:02:21Z",
"patch_url": "https://github.com/psf/requests/pull/3541.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3541"
}
| true |
[
"Looks good to me, thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/3540
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3540/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3540/comments
|
https://api.github.com/repos/psf/requests/issues/3540/events
|
https://github.com/psf/requests/issues/3540
| 173,396,880 |
MDU6SXNzdWUxNzMzOTY4ODA=
| 3,540 |
Header doesn't accept int values already
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6483469?v=4",
"events_url": "https://api.github.com/users/anchaj/events{/privacy}",
"followers_url": "https://api.github.com/users/anchaj/followers",
"following_url": "https://api.github.com/users/anchaj/following{/other_user}",
"gists_url": "https://api.github.com/users/anchaj/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/anchaj",
"id": 6483469,
"login": "anchaj",
"node_id": "MDQ6VXNlcjY0ODM0Njk=",
"organizations_url": "https://api.github.com/users/anchaj/orgs",
"received_events_url": "https://api.github.com/users/anchaj/received_events",
"repos_url": "https://api.github.com/users/anchaj/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/anchaj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/anchaj/subscriptions",
"type": "User",
"url": "https://api.github.com/users/anchaj",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-08-26T08:21:31Z
|
2021-09-08T16:00:24Z
|
2016-08-26T12:23:48Z
|
NONE
|
resolved
|
Hi,
I discovered that the new version (2.11.1, build: 0x021101) no longer accepts int values in headers.
Before the update (2.3.0, build: 0x20300) everything was fine.
```
import requests
headers = {'exampleHeader': 10}    # raises InvalidHeader: Header value 10 must be of type str or bytes...
# headers = {'exampleHeader': '10'}  # works with the new version of the library
requests.request("PUT", "http://google.com", headers=headers)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3540/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3540/timeline
| null |
completed
| null | null | false |
[
"This was an intentional change.\n\nYou will find many other closed issues discussing this since 11 came out.\n\nYou will need to convert to string before using as a header.\n\nOn Fri, 26 Aug 2016, 6:21 PM Mateusz [email protected] wrote:\n\n> Hi,\n> I discovered that new version (2.11.1, build:0x021101) doesn't accept int\n> type in headers.\n> Before update (2.3.0, build:0x20300) everything was ok.\n> \n> import requests\n> headers = {'exampleHeader': 10} # raise InvalidHeader: Header value 10 must be of type str or bytes...\n> \n> # headers = {'exampleHeader': '10'} # for new version of library\n> \n> requests.request(\"PUT\", http://google.com, headers=headers)\n> \n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> https://github.com/kennethreitz/requests/issues/3540, or mute the thread\n> https://github.com/notifications/unsubscribe-auth/AGk_7boWiOSD9tXu8NMDEBLGVkyZ9MoBks5qjqIggaJpZM4Jt2QH\n> .\n",
"Yup, that's correct. =) Thanks @TetraEtc!\n"
] |
https://api.github.com/repos/psf/requests/issues/3539
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3539/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3539/comments
|
https://api.github.com/repos/psf/requests/issues/3539/events
|
https://github.com/psf/requests/pull/3539
| 173,390,546 |
MDExOlB1bGxSZXF1ZXN0ODI4NDUzMTk=
| 3,539 |
Add persistent examples
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1461970?v=4",
"events_url": "https://api.github.com/users/RichieB2B/events{/privacy}",
"followers_url": "https://api.github.com/users/RichieB2B/followers",
"following_url": "https://api.github.com/users/RichieB2B/following{/other_user}",
"gists_url": "https://api.github.com/users/RichieB2B/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/RichieB2B",
"id": 1461970,
"login": "RichieB2B",
"node_id": "MDQ6VXNlcjE0NjE5NzA=",
"organizations_url": "https://api.github.com/users/RichieB2B/orgs",
"received_events_url": "https://api.github.com/users/RichieB2B/received_events",
"repos_url": "https://api.github.com/users/RichieB2B/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/RichieB2B/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/RichieB2B/subscriptions",
"type": "User",
"url": "https://api.github.com/users/RichieB2B",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-08-26T07:47:12Z
|
2021-09-08T03:00:44Z
|
2016-08-26T13:16:54Z
|
CONTRIBUTOR
|
resolved
|
While learning Python and Requests, it was not immediately clear that many arguments can also be made persistent using the Session class.
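A short example of the kind of persistence the added docs cover, setting a client certificate once on a Session instead of per request (the path and header are illustrative):
``` python
import requests

s = requests.Session()
s.cert = '/path/to/client.pem'   # persists across every request on this session
s.headers.update({'x-test': 'true'})
r = s.get('https://example.org/')
```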
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3539/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3539/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3539.diff",
"html_url": "https://github.com/psf/requests/pull/3539",
"merged_at": "2016-08-26T13:16:54Z",
"patch_url": "https://github.com/psf/requests/pull/3539.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3539"
}
| true |
[
"@Lukasa I was certain this was described elsewhere, am I wrong?\n",
"[The first sentence of the Session docs](http://docs.python-requests.org/en/master/user/advanced/#session-objects) reads:\n\n> The Session object allows you to persist certain parameters across requests.\n\nIs there a reason we need this change? Do we need to improve the Session docs directly?\n",
"While searching on how to use an SSL client certificate the [developer interface documentation](http://docs.python-requests.org/en/master/api/) did not help at all. SSL client certificates are only mentioned as a parameter for requests.request(). \n\nThe [Session Object documentation](http://docs.python-requests.org/en/master/user/advanced/#session-objects) does have a [section](http://docs.python-requests.org/en/master/user/advanced/#ssl-cert-verification) covering SSL client certificates but only as a parameter to requests.get(). Additional examples for using persistent parameters would have really helped me.\n",
"Well, the scope of the changes here is pretty small, so I don't object to merging this for now. =)\n",
"Thanks @RichieB2B!\n"
] |
https://api.github.com/repos/psf/requests/issues/3538
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3538/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3538/comments
|
https://api.github.com/repos/psf/requests/issues/3538/events
|
https://github.com/psf/requests/pull/3538
| 173,332,986 |
MDExOlB1bGxSZXF1ZXN0ODI4MDg3MTc=
| 3,538 |
Fix incorrect encoding for reason
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7396?v=4",
"events_url": "https://api.github.com/users/mitsuhiko/events{/privacy}",
"followers_url": "https://api.github.com/users/mitsuhiko/followers",
"following_url": "https://api.github.com/users/mitsuhiko/following{/other_user}",
"gists_url": "https://api.github.com/users/mitsuhiko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mitsuhiko",
"id": 7396,
"login": "mitsuhiko",
"node_id": "MDQ6VXNlcjczOTY=",
"organizations_url": "https://api.github.com/users/mitsuhiko/orgs",
"received_events_url": "https://api.github.com/users/mitsuhiko/received_events",
"repos_url": "https://api.github.com/users/mitsuhiko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mitsuhiko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mitsuhiko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mitsuhiko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 11 |
2016-08-25T23:10:33Z
|
2021-09-08T02:10:32Z
|
2016-09-06T08:33:02Z
|
CONTRIBUTOR
|
resolved
|
HTTP status lines are latin1 and not utf-8
> The TEXT rule is only used for descriptive field contents and values
> that are not intended to be interpreted by the message parser. Words
> of *TEXT MAY contain characters from character sets other than ISO-
> 8859-1 only when encoded according to the rules of RFC 2047
To quote the RFC. Nobody implements RFC 2047, so this can be disregarded.
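A minimal sketch of the best-effort decode later proposed in the thread (try UTF-8 first, fall back to ISO-8859-1); `raw_reason` is an assumed bytes value, not the library's variable name:
``` python
def decode_reason(raw_reason):
    # Bytes reason phrases: prefer UTF-8, fall back to ISO-8859-1, which
    # cannot fail since every byte maps to a code point.
    if isinstance(raw_reason, bytes):
        try:
            return raw_reason.decode('utf-8')
        except UnicodeDecodeError:
            return raw_reason.decode('iso-8859-1')
    return raw_reason
```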
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3538/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3538/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3538.diff",
"html_url": "https://github.com/psf/requests/pull/3538",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3538.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3538"
}
| true |
[
"The UTF-8 decode was added here very deliberately, to handle the fact that many origin servers incorrectly serve UTF-8 reason phrases. Given that all defined reason phrases are ASCII, it seemed a reasonable middle ground to handle them as UTF-8. \n\nDo you have a specific problem you have encountered with this choice?\n",
"Of course, I would rather ignore the reason phrase entirely because it's stupid. \n",
"See also: #3385.\n",
"@Lukasa it came up because Jira when served up with french locale sends french status reasons. Originally it was an exception and when we saw this was fixed I noticed that now the reason is garbled. I don't care _that_ much about it but the current code is definitely incorrect.\n",
"So I agree that the current code is incorrect per the spec, but I don't agree that it's incorrect per actual usage.\n\nFundamentally this boils down to the use cases of reason phrases. There are two schools of thought on them. The first (and I would argue the correct) one is that reason phrases are vestigial additional data that serves no meaningful purpose. Clients cannot be expected to take any action based on the reason phrase alone, and so it's basically nothing more than an opaque byte sequence with no meaning. This interpretation is supported by browsers and by the IETF: browsers ignore reason phrases, and the IETF stripped reason phrases from HTTP/2.\n\nHowever, the persistent desire to _localize_ reason phrases reveals the other interpretation. The only point of localizing reason phrases is if you expect humans to read them. If you expect that, then it's bizarre to privilege ISO-8859-1: why do [these languages alone](https://en.wikipedia.org/wiki/ISO/IEC_8859-1#Modern_languages_with_complete_coverage) get to localize their reason phrases, while everyone else is out in the cold? And, in fact, we have good evidence that reason phrases _are_ delivered in UTF-8 in some instances.\n\nSo the question becomes: given that people insist on localizing their reason phrases (which makes very little sense to me), what is the correct way to handle them? We could pass opaque byte strings around, but that doesn't work on Python 3. Decoding in either Latin 1 or UTF-8 risks mangling some reason phrases. Or we could just remove the reason phrase entirely from the debug output.\n\nFor my part, I don't think I care. UTF-8 with 'ignore' seemed like a best-effort choice to handle this insane server behaviour, but ISO-8859-1 would also do just fine. With that in mind, I think right now I'm inclined to prefer the current behaviour slightly, if for no better reason than anyone who knows enough to care about this charset choice also knows enough not to give a crap about what the reason phrase says.\n\nIncidentally, JIRA should stop localizing its reason phrases.\n\nAnyway, I'll let one of the other maintainers step in here. I don't really care, and when I don't care I tend to want to prefer the status quo. Roll on HTTP/2 where we'll never have to have this discussion again. =D\n",
"Another option is that, given that we're off the hot path, we could try to decode as UTF-8 first and then, if that fails, fall back to ISO-8859-1. That is probably the closest to best effort. \n",
"> So I agree that the current code is incorrect per the spec, but I don't agree that it's incorrect per actual usage.\n\nActual usage: there are actual servers out there sending back latin1. Are there actual servers out there sending back utf-8? Even in case the latter is correct you can go from latin1 to utf-8 without loss, you cannot go the other way.\n",
"@mitsuhiko we changed it because there are servers sending back utf-8. I can't search for it at the moment, but you should have little problems finding that pull request and issue.\n",
"I linked it above: #3385.\n",
"> Even in case the latter is correct you can go from latin1 to utf-8 without loss, you cannot go the other way.\n\nSure, which is why we don't unconditionally convert the reason phrase. We only convert it during the processing for `raise_for_status`, specifically to build the Unicode error string. That string is _strictly_ best effort.\n\nRegardless, I'm happy to remove the 'ignore' encoding directive and instead fall back to ISO-8859-1 if UTF-8 fails. That should cover the complete set of problems. \n",
"Resolved by #3554 instead. Thanks @mitsuhiko!\n"
] |
https://api.github.com/repos/psf/requests/issues/3537
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3537/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3537/comments
|
https://api.github.com/repos/psf/requests/issues/3537/events
|
https://github.com/psf/requests/issues/3537
| 173,265,416 |
MDU6SXNzdWUxNzMyNjU0MTY=
| 3,537 |
('Connection aborted.', error(105, 'No buffer space available'))
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/691783?v=4",
"events_url": "https://api.github.com/users/glennpierce/events{/privacy}",
"followers_url": "https://api.github.com/users/glennpierce/followers",
"following_url": "https://api.github.com/users/glennpierce/following{/other_user}",
"gists_url": "https://api.github.com/users/glennpierce/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/glennpierce",
"id": 691783,
"login": "glennpierce",
"node_id": "MDQ6VXNlcjY5MTc4Mw==",
"organizations_url": "https://api.github.com/users/glennpierce/orgs",
"received_events_url": "https://api.github.com/users/glennpierce/received_events",
"repos_url": "https://api.github.com/users/glennpierce/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/glennpierce/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/glennpierce/subscriptions",
"type": "User",
"url": "https://api.github.com/users/glennpierce",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-08-25T17:38:36Z
|
2021-09-08T16:00:25Z
|
2016-08-25T18:29:07Z
|
NONE
|
resolved
|
I have some code that makes calls like:
```
r = session.get(url, stream=False, timeout=1)
data = r.content
```
This is called in a loop over 150 URLs constantly. Usually this works, but after a while I get errors like:
```
('Connection aborted.', error(105, 'No buffer space available'))
```
This is over an IPsec tunnel, so I suspect the issue is with the tunnel, but I wanted to make sure.
Thanks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3537/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3537/timeline
| null |
completed
| null | null | false |
[
"Do you have a more complete traceback for that? This error _generally_ indicates a problem with the socket buffer itself: that is, the buffer is overflowing. It'd be good to see where it's coming from, but _probably_ it's related to the ipsec tunnel: certainly many similar bugs seem to be raised that involve ipsec.\n",
"I solved it in the by doing\necho \"65535\" > /proc/sys/net/ipv4/xfrm4_gc_thresh\n\nI found the solution https://wiki.strongswan.org/issues/1169\n\nThe default on centos 7 was quite low.\n\nThanks\n"
] |
https://api.github.com/repos/psf/requests/issues/3536
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3536/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3536/comments
|
https://api.github.com/repos/psf/requests/issues/3536/events
|
https://github.com/psf/requests/pull/3536
| 173,047,220 |
MDExOlB1bGxSZXF1ZXN0ODI2MDk1NjU=
| 3,536 |
Test case for requests getting stuck on post redirect with seekable streams
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7489847?v=4",
"events_url": "https://api.github.com/users/tzickel/events{/privacy}",
"followers_url": "https://api.github.com/users/tzickel/followers",
"following_url": "https://api.github.com/users/tzickel/following{/other_user}",
"gists_url": "https://api.github.com/users/tzickel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tzickel",
"id": 7489847,
"login": "tzickel",
"node_id": "MDQ6VXNlcjc0ODk4NDc=",
"organizations_url": "https://api.github.com/users/tzickel/orgs",
"received_events_url": "https://api.github.com/users/tzickel/received_events",
"repos_url": "https://api.github.com/users/tzickel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tzickel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tzickel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tzickel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2016-08-24T20:15:21Z
|
2021-09-08T02:10:22Z
|
2016-11-03T15:36:57Z
|
NONE
|
resolved
|
https://github.com/kennethreitz/requests/issues/3079
This is a test case that depends on httpbin 0.5.0 to show the bug.
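A rough sketch of the rewind behaviour an eventual fix needs: record where a seekable body starts and seek back before a redirect resends it (names are illustrative, not the merged implementation):
``` python
def rewind_body(prepared_request, body_position):
    # Seek the streamed body back to where it was when the first attempt
    # started, so the redirected request does not send an empty body.
    body_seek = getattr(prepared_request.body, 'seek', None)
    if body_seek is not None:
        body_seek(body_position)
```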
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3536/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3536/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3536.diff",
"html_url": "https://github.com/psf/requests/pull/3536",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/3536.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3536"
}
| true |
[
"Thanks for these tests! They do, as expected, fail.\n\nWhat we need now, I think, is for someone to step up and write a solution. Any volunteers?\n",
"I guess this issue can only be fixed inside urllib3 since it handles the actual redirects.\n",
"@tzickel At a quick glance, I don't think that urllib3 has any knowledge of whether our request is a redirect or not. We pass each redirect call to urllib3 as if it were a single request.\n\n#3655 addresses the problem posed in your second test, I'm not claiming it's the best solution though. Any input there or clarification on what I'm missing in a urllib3 implementation would be appreciated :)\n",
"@Lukasa tzickel's commit was merged into master with #3655, so this can probably be closed.\n",
"Yup. Thanks for the work @tzickel!\n"
] |
https://api.github.com/repos/psf/requests/issues/3535
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3535/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3535/comments
|
https://api.github.com/repos/psf/requests/issues/3535/events
|
https://github.com/psf/requests/pull/3535
| 173,019,706 |
MDExOlB1bGxSZXF1ZXN0ODI1ODk3MDI=
| 3,535 |
avoid use of getvalues in super_len
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2016-08-24T18:04:53Z
|
2021-09-08T02:10:20Z
|
2016-09-14T07:10:28Z
|
MEMBER
|
resolved
|
This is a follow up on @jseabold's work in #3339. These last minor changes should fix the issues with return values of `seek` between Python 2 and Python 3.
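An illustrative sketch of the seek/tell length probe this change is about; Python 2 file objects return None from seek(), while Python 3 io objects return the new offset:
``` python
import io

def remaining_length(fileobj):
    current = fileobj.tell()
    end = fileobj.seek(0, io.SEEK_END)   # Python 3: returns the new offset
    if end is None:                      # Python 2: seek() returns None
        end = fileobj.tell()
    fileobj.seek(current)                # restore the original position
    return end - current
```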
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3535/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3535/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3535.diff",
"html_url": "https://github.com/psf/requests/pull/3535",
"merged_at": "2016-09-14T07:10:28Z",
"patch_url": "https://github.com/psf/requests/pull/3535.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3535"
}
| true |
[
"It'd be really nice to have some more tests around this. Ideally tests that go into each of the branches. @nateprewitt, are you open to adding those?\n",
"Yep, I'll throw a few more together for `TestSuperLen`.\n",
"@Lukasa, tests are updated with custom classes. I think things should be good to go with tests.\n",
"Small notes about the tests, but this is looking really good!\n",
"Great! Changes in last notes have been updated.\n",
"Ok, cool, I like this. @sigmavirus24?\n",
"@sigmavirus24, wanted to ping in case this fell off the queue. No rush, just checking in :)\n",
"I have one suggestion but it's optional. \n",
"I don't think I've got anything else, so I think we're good to merge from my end. Let me know if you have anything else @lukasa.\n\nWe can probably start discussing #3545 too.\n",
"Ok then, let's do this! =D\n"
] |
https://api.github.com/repos/psf/requests/issues/3534
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3534/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3534/comments
|
https://api.github.com/repos/psf/requests/issues/3534/events
|
https://github.com/psf/requests/issues/3534
| 172,912,537 |
MDU6SXNzdWUxNzI5MTI1Mzc=
| 3,534 |
proxies missing [ipv6] address support
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16426974?v=4",
"events_url": "https://api.github.com/users/lvg01/events{/privacy}",
"followers_url": "https://api.github.com/users/lvg01/followers",
"following_url": "https://api.github.com/users/lvg01/following{/other_user}",
"gists_url": "https://api.github.com/users/lvg01/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/lvg01",
"id": 16426974,
"login": "lvg01",
"node_id": "MDQ6VXNlcjE2NDI2OTc0",
"organizations_url": "https://api.github.com/users/lvg01/orgs",
"received_events_url": "https://api.github.com/users/lvg01/received_events",
"repos_url": "https://api.github.com/users/lvg01/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/lvg01/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/lvg01/subscriptions",
"type": "User",
"url": "https://api.github.com/users/lvg01",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2016-08-24T10:02:35Z
|
2021-09-07T00:06:19Z
|
2017-07-30T13:53:22Z
|
NONE
|
resolved
|
For an IPv6 proxy address the proxies dict does not work and fails with temporary name-resolution errors. Changing the address to a name (with the IPv6 address added to /etc/hosts) makes the proxies dict work.
For IPv4 a literal address works.
```
proxies = {"https": "http://[<ipv6:address>]:<portnumber>"}   # fails
proxies = {"https": "http://<proxyname>:<portnumber>"}        # succeeds (/etc/hosts entry: "<ipv6:address> <proxyname>")
proxies = {"https": "http://<ipv4.address>:<portnumber>"}     # succeeds
```
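For reference, the bracketed form the reporter expects to work, as a concrete snippet; the address and port are placeholders, not a real proxy:
``` python
import requests

proxies = {"https": "http://[2001:db8::1]:3128"}
r = requests.get("https://example.org/", proxies=proxies)
```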
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3534/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3534/timeline
| null |
completed
| null | null | false |
[
"What does _fails_ mean, please? Can you provide a clearer explanation of what you expect to happen and what is actually happening?\n",
"Hello, \nI'm sorry for not being clear in this call. \nI use the proxies dict to access the internet via a proxy server.\n1 Using an ipv4 address passes the proxy\n2 Using a name resolved to either an ipv4 or ipv6 address passes the proxy\n3 Using an ipv6 address enclosed in [ ] it fails with a temporary resolving error and is not passing the proxy.\nIt seems that an ipv6 address is not being recognized as an ip address and therefore is used as a name to be resolved. \nKind regards, Luitzen woensdag, 24 augustus 2016, 04:34PM +02:00 van Cory Benfield [email protected] :\n\n> What does fails mean, please? Can you provide a clearer explanation of what you expect to happen and what is actually happening?\n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub , or mute the thread .\n",
"When you say it fails with a temporary resolving error, do you get a traceback? If you do, can you provide it please?\n",
"@lvg01 just a friendly reminder that we still need some information from you here. Thanks in advance!\n",
"At first, excuses for my late response..\n\nWith the proxies={\"https\":\"http://[abcd:0123:dcba:3210::1]:3128\"} the command fails with following traceback:\n\n```\nTraceback (most recent call last):\n File \"/usr/local/sendsms.py\", line 96, in <module>\n r = requests.post(gwURL, data=xmltree, headers=headers, proxies=proxyDict)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 108, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 464, in request\n resp = self.send(prep, **send_kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 576, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 424, in send\n raise ConnectionError(e, request=request)\nrequests.exceptions.ConnectionError: HTTPSConnectionPool(host='url.org', port=443): Max retries exceeded with url: /smssgateway/cm/gateway.ashx (Caused by ProxyError('Cannot connect to proxy.', gaierror(-3, 'Temporary failure in name resolution')))\n```\n\nWith /etc/hosts line 'abcd:0123:dcba:3210::1 proxy' and proxies={\"https\":\"http://proxy:3128\"} the command succeeds.\n",
"Yup, this looks like an error in parsing URLs. @sigmavirus24, you're the resident expert here: any thoughts?\n",
"@Lukasa I don't see the problem parsing the URL. \n\n```\nCaused by ProxyError('Cannot connect to proxy.', gaierror(-3, 'Temporary failure in name resolution'))\n```\n\nSeems like that _might_ be the problem based on that message but I'm having trouble figuring out why. We [create a ProxyManager](https://github.com/kennethreitz/requests/blob/5259b374512ec4e47785f5004ec6ad30dafe906f/requests/adapters.py#L188) from urllib3 which parses the URL using `parse_url` from it's url utility module. This parses the example without issue and has always handled IPv6 addresses correctly.\n\nIf we look at the ProxyManager it has different logic for requests to an [https url versus an http url](https://github.com/kennethreitz/requests/blob/05d90b9379f57ee5f5d0beb268c495793fb5b2d7/requests/packages/urllib3/poolmanager.py#L331). \n\nI'm failing to see where the URL parsing would fail and why an IPv6 literal address would cause a name resolution failure.\n",
"@lvg01 Can you confirm that this works for me, please?\n\n``` python\nimport socket\nsocket.getaddrinfo('[abcd:0123:dcba:3210::1]', 80)\n```\n",
"@Lukasa it should be \n\n``` py\nimport socket\nsocket.getaddrinfo('[abcd:0123:dcba:3210::1]', 3128)\n```\n\nAs that's the port specified in the proxy url that @lvg01 is showing us\n",
"sorry for the late response, I was busy for some other things...\n\nThe test gives the same error:\n\n> > > import socket\n> > > socket.getaddrinfo('[abcd:0123:dcba:3210::1]',80)\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > socket.gaierror: [Errno -3] Temporary failure in name resolution\n\nOn 06-09-16 13:34, Cory Benfield wrote:\n\n> import socket\n> socket.getaddrinfo('\n",
"as for port 3128, the same:\n\n> > > socket.getaddrinfo('[abcd:0123:dcba:3210::1]',3128)\n> > > Traceback (most recent call last):\n> > > File \"<stdin>\", line 1, in <module>\n> > > socket.gaierror: [Errno -3] Temporary failure in name resolution\n\nOn 06-09-16 14:38, Ian Cordasco wrote:\n\n> @Lukasa https://github.com/Lukasa it should be\n> \n> import socket\n> socket.getaddrinfo('[abcd:0123:dcba:3210::1]',3128)\n> \n> As that's the port specified in the proxy url that @lvg01 \n> https://github.com/lvg01 is showing us\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub \n> https://github.com/kennethreitz/requests/issues/3534#issuecomment-244936907, \n> or mute the thread \n> https://github.com/notifications/unsubscribe-auth/APqn3sQEPsDAyL0KKslJrvy1Vc1lJvUwks5qnV66gaJpZM4Jr16P.\n",
"@lvg01 And what results do you get without the square brackets?\n",
"Hé, thats a good suggestion, then the test-code works!\n\nSo there is a parsing problem... Becaus in de proxies dict the ip \naddress is surrounded by square brackets to seperate the ipv6 address \nwith the tcp port (normal convention). The square brackets have to be \nstripped somewhere in the code OR the library should be able to read \nsquare bracket enclosed ipv6 addresses...\n\nOn 07-09-16 09:20, Cory Benfield wrote:\n\n> @lvg01 https://github.com/lvg01 And what results do you get without \n> the square brackets?\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub \n> https://github.com/kennethreitz/requests/issues/3534#issuecomment-245196498, \n> or mute the thread \n> https://github.com/notifications/unsubscribe-auth/APqn3rUJ7CGgmbzHaY4zyhwxadrL8tVEks5qnmXagaJpZM4Jr16P.\n",
"So I don't think there's strictly a _parsing_ problem so much as there's a usage problem. Somewhere in urllib3/requests we are failing to strip the square brackets from the IPv6 address before we do a lookup on it.\n",
"OK, I understand. When separating the IP from the PORT the IP has to \nstripped before using the libraries beneath it.\n\nOn 07-09-16 10:42, Cory Benfield wrote:\n\n> So I don't think there's strictly a /parsing/ problem so much as \n> there's a usage problem. Somewhere in urllib3/requests we are failing \n> to strip the square brackets from the IPv6 address before we do a \n> lookup on it.\n> \n> —\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub \n> https://github.com/kennethreitz/requests/issues/3534#issuecomment-245214619, \n> or mute the thread \n> https://github.com/notifications/unsubscribe-auth/APqn3vq8B2cvKtBuIsp1rlA2Ns4MWlMSks5qnnj6gaJpZM4Jr16P.\n",
"I think this is a problem in urllib3's Proxy|Pool Manager.\n",
"I'm fairly confident we've fixed this in urllib3. I vaguely remember the fix. As such, I'm going to close this. If I'm mistaken, I'll happily reopen it just let me know."
] |
https://api.github.com/repos/psf/requests/issues/3533
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3533/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3533/comments
|
https://api.github.com/repos/psf/requests/issues/3533/events
|
https://github.com/psf/requests/issues/3533
| 172,897,667 |
MDU6SXNzdWUxNzI4OTc2Njc=
| 3,533 |
Segfault while using requests in GPIO Callback on Raspberry Pi 3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5537530?v=4",
"events_url": "https://api.github.com/users/momo-aux/events{/privacy}",
"followers_url": "https://api.github.com/users/momo-aux/followers",
"following_url": "https://api.github.com/users/momo-aux/following{/other_user}",
"gists_url": "https://api.github.com/users/momo-aux/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/momo-aux",
"id": 5537530,
"login": "momo-aux",
"node_id": "MDQ6VXNlcjU1Mzc1MzA=",
"organizations_url": "https://api.github.com/users/momo-aux/orgs",
"received_events_url": "https://api.github.com/users/momo-aux/received_events",
"repos_url": "https://api.github.com/users/momo-aux/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/momo-aux/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/momo-aux/subscriptions",
"type": "User",
"url": "https://api.github.com/users/momo-aux",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2016-08-24T08:48:41Z
|
2021-09-08T16:00:26Z
|
2016-08-24T14:33:42Z
|
NONE
|
resolved
|
I'm sometimes getting a segfault after pressing the button.
This little script can be used to reproduce the error. (If you don't get a segfault after 20 button presses, restart the script; sooner or later you'll get one.)
requests version is 2.4.3
Python version is 3.4.2
Perhaps someone could give me a hint on how to find out what the problem is.
```
import time
from time import mktime, time, sleep
import RPi.GPIO as GPIO
import requests

def button_press_isr1(channel):
    GPIO.remove_event_detect(18)
    sleep(0.0001)
    cnt = 0
    edge_start1 = time()
    while (time() - edge_start1) <= 0.035:  # 35 mSec max, 15 mSec margin
        sleep(0.0015)  # 1.5 mSec
        if GPIO.input(18) == 0:  # 0/False for Falling, 1/True for Rising
            cnt += 1
        else:  # when we have captured a glitch, start all over again
            cnt = 0
        if cnt == 7:  # 7? lucky number
            break
    if cnt == 7:
        response = requests.post(url="http://192.168.1.5:8080")
    GPIO.add_event_detect(18, GPIO.FALLING, callback=button_press_isr1)

def init():
    # GPIO.setwarnings(True)
    GPIO.setmode(GPIO.BCM)
    GPIO.setup(18, GPIO.IN)
    GPIO.add_event_detect(18, GPIO.FALLING, callback=button_press_isr1)

def main():
    try:
        while True:
            sleep(0.03)
    except KeyboardInterrupt:
        # exit with CTRL+C
        print("CTRL+C used to end Program")
    finally:
        GPIO.cleanup()

if __name__ == "__main__":
    init()
    main()
```
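Not part of the original report: since the crash happens below the Python level, one low-effort way to get a hint before reaching for GDB is the standard-library faulthandler module (available on the reporter's Python 3.4), which prints the Python-level traceback when the interpreter receives a fatal signal — a minimal sketch:

```python
import faulthandler

# Print the Python traceback of every thread to stderr if the process is
# killed by a fatal signal such as SIGSEGV; the segfault itself still has
# to be inspected in the C extension (here, RPi.GPIO) with a debugger.
faulthandler.enable()
```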
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3533/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3533/timeline
| null |
completed
| null | null | false |
[
"Thanks for reporting this!\n\nIt's not possible to segfault the interpreter from inside Requests, because Requests is written in pure-Python. That suggests that the issue is inside the GPIO calls. Unfortunately, to be more specific you'll need to use a debugger like GDB to catch the segfault and print the C stack for that call.\n\nSupport for that problem is outside the scope of this bugtracker, I'm afraid.\n",
"Ah great to know that there is only one option left :-), thank you\n"
] |
https://api.github.com/repos/psf/requests/issues/3532
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3532/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3532/comments
|
https://api.github.com/repos/psf/requests/issues/3532/events
|
https://github.com/psf/requests/issues/3532
| 172,782,893 |
MDU6SXNzdWUxNzI3ODI4OTM=
| 3,532 |
Passing seekable objects without len to super_len causes .getvalue() which copies them just to get the length
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7489847?v=4",
"events_url": "https://api.github.com/users/tzickel/events{/privacy}",
"followers_url": "https://api.github.com/users/tzickel/followers",
"following_url": "https://api.github.com/users/tzickel/following{/other_user}",
"gists_url": "https://api.github.com/users/tzickel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tzickel",
"id": 7489847,
"login": "tzickel",
"node_id": "MDQ6VXNlcjc0ODk4NDc=",
"organizations_url": "https://api.github.com/users/tzickel/orgs",
"received_events_url": "https://api.github.com/users/tzickel/received_events",
"repos_url": "https://api.github.com/users/tzickel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tzickel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tzickel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tzickel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-08-23T19:16:17Z
|
2021-09-08T16:00:26Z
|
2016-08-23T19:54:41Z
|
NONE
|
resolved
|
This is a performance issue, especially when passing large objects. It can be easily triggered with a POST data body:
bigdata = 'a' * 100000000
stream = io.BytesIO(bigdata)
a = requests.post('http://localhost', data=stream)
This will call super_len on stream, which calls stream.getvalue(), which actually copies the 100 MB in memory, gets its length, and discards the copy.
We should first check whether the object has 'seek' and 'tell' and use them (much quicker) to get the length instead.
https://github.com/kennethreitz/requests/blob/52b15f811f8ef52a144c96b8b742d734dd39d693/requests/utils.py#L59
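For illustration only (not the change that actually landed in requests), a minimal sketch of measuring a seekable stream's remaining length with tell/seek instead of copying its buffer:

```python
import io

def stream_len(fileobj):
    """Length of the remaining data in a seekable file-like object,
    computed without copying its contents."""
    current = fileobj.tell()      # remember the current position
    fileobj.seek(0, io.SEEK_END)  # jump to the end of the stream
    end = fileobj.tell()
    fileobj.seek(current)         # restore the original position
    return end - current

stream = io.BytesIO(b"a" * 100000000)
assert stream_len(stream) == 100000000  # no 100 MB copy is made
```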
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3532/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3532/timeline
| null |
completed
| null | null | false |
[
"Thanks for the report! This is a known issue and we're working on a fix: see #3339.\n"
] |
https://api.github.com/repos/psf/requests/issues/3531
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3531/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3531/comments
|
https://api.github.com/repos/psf/requests/issues/3531/events
|
https://github.com/psf/requests/issues/3531
| 172,733,759 |
MDU6SXNzdWUxNzI3MzM3NTk=
| 3,531 |
Issuing a chunked request causes warning "Connection pool is full"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7723788?v=4",
"events_url": "https://api.github.com/users/ddzialak/events{/privacy}",
"followers_url": "https://api.github.com/users/ddzialak/followers",
"following_url": "https://api.github.com/users/ddzialak/following{/other_user}",
"gists_url": "https://api.github.com/users/ddzialak/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ddzialak",
"id": 7723788,
"login": "ddzialak",
"node_id": "MDQ6VXNlcjc3MjM3ODg=",
"organizations_url": "https://api.github.com/users/ddzialak/orgs",
"received_events_url": "https://api.github.com/users/ddzialak/received_events",
"repos_url": "https://api.github.com/users/ddzialak/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ddzialak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ddzialak/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ddzialak",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2016-08-23T15:39:38Z
|
2021-09-08T16:00:27Z
|
2016-08-23T15:59:56Z
|
NONE
|
resolved
|
```
import requests
import sys
import logging

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
logger = logging.getLogger()

def produce():
    for _ in xrange(5000):
        yield "bla" * 1000 + '\0\n'

logger.info("sending....")
response = requests.request("POST", "http://localhost:30006/upload", data=produce(), stream=True)
logger.info("received respone: %s", response.json())
```
As a result, I'm receiving:
```
python test.py
INFO:root:sending....
INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): localhost
**WARNING:requests.packages.urllib3.connectionpool:Connection pool is full, discarding connection: localhost**
INFO:root:received respone: {u'estimation': 0, u'reason_code': None,
```
Everything works but there should be no warning.
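Not from the original report: according to the maintainers the underlying double-release was fixed in requests 2.11.1, but if the warning is only about the default pool size, one general workaround is to mount an adapter with a larger pool on a Session — a sketch with assumed, illustrative values:

```python
import requests
from requests.adapters import HTTPAdapter

def produce():
    # Same shape as the generator in the report, shortened for illustration.
    for _ in range(5):
        yield b"bla" * 1000 + b"\0\n"

session = requests.Session()
# pool_connections and pool_maxsize are real HTTPAdapter parameters; the
# values 10 and 20 here are arbitrary illustrations, not recommendations.
session.mount("http://", HTTPAdapter(pool_connections=10, pool_maxsize=20))

response = session.post("http://localhost:30006/upload", data=produce())
print(response.status_code)
```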
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7723788?v=4",
"events_url": "https://api.github.com/users/ddzialak/events{/privacy}",
"followers_url": "https://api.github.com/users/ddzialak/followers",
"following_url": "https://api.github.com/users/ddzialak/following{/other_user}",
"gists_url": "https://api.github.com/users/ddzialak/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ddzialak",
"id": 7723788,
"login": "ddzialak",
"node_id": "MDQ6VXNlcjc3MjM3ODg=",
"organizations_url": "https://api.github.com/users/ddzialak/orgs",
"received_events_url": "https://api.github.com/users/ddzialak/received_events",
"repos_url": "https://api.github.com/users/ddzialak/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ddzialak/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ddzialak/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ddzialak",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3531/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3531/timeline
| null |
completed
| null | null | false |
[
"That happens because `_put_conn` (from connectionpool.py line 259) is called twice, one from:\n\n```\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, **kwargs)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/adapters.py\", line 369, in send\n conn._put_conn(low_conn)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 259, in _put_conn\n```\n\n AND second time from:\n\n```\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/sessions.py\", line 596, in send\n r.content\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/models.py\", line 694, in content\n self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/models.py\", line 627, in generate\n for chunk in self.raw.stream(chunk_size, decode_content=True):\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 240, in stream\n data = self.read(amt=amt, decode_content=decode_content)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 221, in read\n self.release_conn()\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/packages/urllib3/response.py\", line 125, in release_conn\n self._pool._put_conn(self._connection)\n File \"/home/dzialak/.virtualenvs/sf3/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 267, in _put_conn\n```\n",
"It's about requests (2.3.0)\n",
"Closing that issue as it's already solved in 2.11.1\n"
] |
https://api.github.com/repos/psf/requests/issues/3530
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/3530/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/3530/comments
|
https://api.github.com/repos/psf/requests/issues/3530/events
|
https://github.com/psf/requests/pull/3530
| 172,602,163 |
MDExOlB1bGxSZXF1ZXN0ODIyOTY2NjI=
| 3,530 |
Fixed another scheme proxy over "all" priority issue
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1097666?v=4",
"events_url": "https://api.github.com/users/sentientcucumber/events{/privacy}",
"followers_url": "https://api.github.com/users/sentientcucumber/followers",
"following_url": "https://api.github.com/users/sentientcucumber/following{/other_user}",
"gists_url": "https://api.github.com/users/sentientcucumber/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sentientcucumber",
"id": 1097666,
"login": "sentientcucumber",
"node_id": "MDQ6VXNlcjEwOTc2NjY=",
"organizations_url": "https://api.github.com/users/sentientcucumber/orgs",
"received_events_url": "https://api.github.com/users/sentientcucumber/received_events",
"repos_url": "https://api.github.com/users/sentientcucumber/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sentientcucumber/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sentientcucumber/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sentientcucumber",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2016-08-23T03:03:39Z
|
2021-09-08T03:00:45Z
|
2016-08-23T11:54:35Z
|
NONE
|
resolved
|
Fixed the missing lines pointed out in #3526 to hopefully squash #3518 once and for all.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/3530/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/3530/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/3530.diff",
"html_url": "https://github.com/psf/requests/pull/3530",
"merged_at": "2016-08-23T11:54:35Z",
"patch_url": "https://github.com/psf/requests/pull/3530.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/3530"
}
| true |
[
"Thanks @shellhead! :sparkles:\n"
] |