url (string) | repository_url (string) | labels_url (string) | comments_url (string) | events_url (string) | html_url (string) | id (int64) | node_id (string) | number (int64) | title (string) | user (dict) | labels (list) | state (string) | locked (bool) | assignee (dict) | assignees (list) | milestone (dict) | comments (int64) | created_at (string) | updated_at (string) | closed_at (string, nullable) | author_association (string) | active_lock_reason (string) | body (string, nullable) | closed_by (dict) | reactions (dict) | timeline_url (string) | performed_via_github_app (null) | state_reason (string) | draft (bool) | pull_request (dict) | is_pull_request (bool) | issue_comments (list)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
https://api.github.com/repos/psf/requests/issues/2330
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2330/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2330/comments
|
https://api.github.com/repos/psf/requests/issues/2330/events
|
https://github.com/psf/requests/issues/2330
| 48,481,595 |
MDU6SXNzdWU0ODQ4MTU5NQ==
| 2,330 |
Where can I download the Python, Sphinx or PyDoctor-generated documentation?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3921062?v=4",
"events_url": "https://api.github.com/users/acgtyrant/events{/privacy}",
"followers_url": "https://api.github.com/users/acgtyrant/followers",
"following_url": "https://api.github.com/users/acgtyrant/following{/other_user}",
"gists_url": "https://api.github.com/users/acgtyrant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/acgtyrant",
"id": 3921062,
"login": "acgtyrant",
"node_id": "MDQ6VXNlcjM5MjEwNjI=",
"organizations_url": "https://api.github.com/users/acgtyrant/orgs",
"received_events_url": "https://api.github.com/users/acgtyrant/received_events",
"repos_url": "https://api.github.com/users/acgtyrant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/acgtyrant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/acgtyrant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/acgtyrant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-11-12T07:34:06Z
|
2021-09-08T23:07:02Z
|
2014-11-12T07:38:18Z
|
NONE
|
resolved
|
I would like to [use Dash to convert the official documentation to a docset](http://kapeli.com/docsets), but I cannot find any offline documentation...
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2330/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2330/timeline
| null |
completed
| null | null | false |
[
"You need to render the documentation. Checkout the code from this repository and run `pip install Sphinx`. Then, type `cd docs && make html`. That will build the documentation you can then point doc2dash at.\n",
"Done, thank you for your tip!\n"
] |
https://api.github.com/repos/psf/requests/issues/2329
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2329/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2329/comments
|
https://api.github.com/repos/psf/requests/issues/2329/events
|
https://github.com/psf/requests/issues/2329
| 48,434,475 |
MDU6SXNzdWU0ODQzNDQ3NQ==
| 2,329 |
Can't override Content-Length
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2456311?v=4",
"events_url": "https://api.github.com/users/asnelzin/events{/privacy}",
"followers_url": "https://api.github.com/users/asnelzin/followers",
"following_url": "https://api.github.com/users/asnelzin/following{/other_user}",
"gists_url": "https://api.github.com/users/asnelzin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/asnelzin",
"id": 2456311,
"login": "asnelzin",
"node_id": "MDQ6VXNlcjI0NTYzMTE=",
"organizations_url": "https://api.github.com/users/asnelzin/orgs",
"received_events_url": "https://api.github.com/users/asnelzin/received_events",
"repos_url": "https://api.github.com/users/asnelzin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/asnelzin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/asnelzin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/asnelzin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-11-11T21:03:04Z
|
2021-09-08T23:07:01Z
|
2014-11-12T18:35:39Z
|
CONTRIBUTOR
|
resolved
|
Came here from https://github.com/jakubroztocil/httpie/issues/269
Looks like this code doesn't work properly:
```
>>> import requests
>>> r = requests.post('http://httpbin.org/post', headers={'Content-Length': 'not zero'})
>>> r.request.headers
{'Content-Length': '0', 'Connection': 'keep-alive', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'User- Agent': 'python-requests/2.4.3 CPython/2.7.6 Darwin/14.0.0'}
```
However, curl handles it correctly:
```
$ curl --verbose -X POST 'http://httpbin.org/post' --header 'Content-Length:not zero'
* Hostname was NOT found in DNS cache
* Trying 50.19.91.97...
* Connected to httpbin.org (50.19.91.97) port 80 (#0)
> POST /post HTTP/1.1
> User-Agent: curl/7.37.1
> Host: httpbin.org
> Accept: */*
> Content-Length:not zero
```
Seems like a bug there.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2329/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2329/timeline
| null |
completed
| null | null | false |
[
"Eh, I'm on the fence. I don't think requests should be able to generate any invalid input provided to it by the user. We're in the position of making user's lives easier, and sometimes that means ignoring their mistakes.\n\nHowever, this violates our general principle of leaving the user's headers where they are. That means I'm +0.5 on fixing it. The fix is easy: in `PreparedRequest.prepare_content_length`, prevent falling into the last `elif` block by adding `and self.headers.get('Content-Length') is None`.\n\n@sigmavirus24, does that sound reasonable?\n",
"I always waffle on things like this. For one thing, our behaviour should absolutely not be dictated by something like curl. On the other hand, I haven't looked into this so if post with a valid content-length header is not being overridden either, that's a bug.\n",
"I agree with your opinion about the fact that overriding the correct content-length is not a bug. But, on the other hand, I would like to have a tool that I could use to do absolutely everything. \n\nSo, I guess I have to make a fork and use it in httpie. How you think, is it a right way to do it?\n",
"Content-Length should absolutely be able to be set — if not with the standard API, then with PreparedRequests. \n",
"PreparedRequests are for the \"absolutely everything\" usecase. \n",
"With #2332 merged, this should be closeable.\n"
] |
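The comments above point to `PreparedRequest` as the supported way to set `Content-Length` yourself. A minimal sketch of that route, assuming httpbin.org as an illustrative endpoint and an illustrative header value:

```python
import requests

# Sketch of the PreparedRequest approach suggested in the comments above:
# prepare the request first, then overwrite the Content-Length that
# prepare() computed. URL, body, and header value are illustrative.
session = requests.Session()
req = requests.Request('POST', 'http://httpbin.org/post', data=b'payload')
prepared = session.prepare_request(req)
prepared.headers['Content-Length'] = '7'  # override the computed value
response = session.send(prepared)
print(response.request.headers['Content-Length'])
```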
https://api.github.com/repos/psf/requests/issues/2328
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2328/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2328/comments
|
https://api.github.com/repos/psf/requests/issues/2328/events
|
https://github.com/psf/requests/issues/2328
| 48,146,830 |
MDU6SXNzdWU0ODE0NjgzMA==
| 2,328 |
Add prepare hook
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/458976?v=4",
"events_url": "https://api.github.com/users/NullSoldier/events{/privacy}",
"followers_url": "https://api.github.com/users/NullSoldier/followers",
"following_url": "https://api.github.com/users/NullSoldier/following{/other_user}",
"gists_url": "https://api.github.com/users/NullSoldier/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/NullSoldier",
"id": 458976,
"login": "NullSoldier",
"node_id": "MDQ6VXNlcjQ1ODk3Ng==",
"organizations_url": "https://api.github.com/users/NullSoldier/orgs",
"received_events_url": "https://api.github.com/users/NullSoldier/received_events",
"repos_url": "https://api.github.com/users/NullSoldier/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/NullSoldier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NullSoldier/subscriptions",
"type": "User",
"url": "https://api.github.com/users/NullSoldier",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
}
] |
closed
| true | null |
[] | null | 2 |
2014-11-08T00:23:10Z
|
2021-09-08T23:07:02Z
|
2014-11-08T08:54:44Z
|
NONE
|
resolved
|
It would be nice to have a pre_prepare_request hook.
I'm making frequent requests to an API that requires a query parameter and a custom header. I created an HTTP adapter to provide the query parameter and the header (because the session won't help me with the query parameter).
I was hoping I could also use a hook to rewrite the URL from /bar/ to foo.com/bar/, because URLs made through this adapter shouldn't require the host; otherwise you have to use something crappy like get(get_url('/bar/')) everywhere just to prepend the host in a sensible way.
If I had a pre_prepare hook I could modify the URL here.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/458976?v=4",
"events_url": "https://api.github.com/users/NullSoldier/events{/privacy}",
"followers_url": "https://api.github.com/users/NullSoldier/followers",
"following_url": "https://api.github.com/users/NullSoldier/following{/other_user}",
"gists_url": "https://api.github.com/users/NullSoldier/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/NullSoldier",
"id": 458976,
"login": "NullSoldier",
"node_id": "MDQ6VXNlcjQ1ODk3Ng==",
"organizations_url": "https://api.github.com/users/NullSoldier/orgs",
"received_events_url": "https://api.github.com/users/NullSoldier/received_events",
"repos_url": "https://api.github.com/users/NullSoldier/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/NullSoldier/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/NullSoldier/subscriptions",
"type": "User",
"url": "https://api.github.com/users/NullSoldier",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2328/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2328/timeline
| null |
completed
| null | null | false |
[
"Once upon a time requests had a hook like you describe but we removed it because almost no one was using it. You're one of the first people to request this be added back. Let me propose a different way of addressing your problem:\n\n``` python\nclass NullSoldierSession(requests.Session):\n def request(self, method, url, params=None, data=None, files=None,\n # ...\n ):\n url = get_url(url)\n return super(NullSoldierSession, self).request(method, url, params, data, files, ...)\n```\n\nThis will handle mangling the URL for you. Since every other method routes through `Session#request` by overriding that method you've pre-empted everything you need to do.\n\nI'm curious as to why the Session can't handle your query parameter for you, but this feature request is not going to be accepted with almost 100% certainty.\n",
"I just looked at the source code and realized I can use params with session to solve this problem. :|\n"
] |
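For reference, the session-level solution the last comment settles on can look roughly like this. A minimal sketch, assuming an illustrative parameter name, header, and host:

```python
import requests

# Sketch of the session-level approach from the closing comment: default
# query parameters and headers attached once to the Session are merged
# into every request it sends. Key, header, and host are illustrative.
session = requests.Session()
session.params = {'api_key': 'example-key'}
session.headers['X-Custom-Header'] = 'example-value'
response = session.get('https://foo.com/bar/')  # param and header sent automatically
```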
https://api.github.com/repos/psf/requests/issues/2327
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2327/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2327/comments
|
https://api.github.com/repos/psf/requests/issues/2327/events
|
https://github.com/psf/requests/issues/2327
| 48,106,439 |
MDU6SXNzdWU0ODEwNjQzOQ==
| 2,327 |
Feature Request: Response MaxFilesize and MaxConnectionTimeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4",
"events_url": "https://api.github.com/users/jvanasco/events{/privacy}",
"followers_url": "https://api.github.com/users/jvanasco/followers",
"following_url": "https://api.github.com/users/jvanasco/following{/other_user}",
"gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jvanasco",
"id": 204779,
"login": "jvanasco",
"node_id": "MDQ6VXNlcjIwNDc3OQ==",
"organizations_url": "https://api.github.com/users/jvanasco/orgs",
"received_events_url": "https://api.github.com/users/jvanasco/received_events",
"repos_url": "https://api.github.com/users/jvanasco/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jvanasco",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 5 |
2014-11-07T17:15:13Z
|
2021-09-08T23:06:09Z
|
2015-01-19T09:16:19Z
|
CONTRIBUTOR
|
resolved
|
Total Request Timeout is a request in #1928, but I think it could be implemented at the same time as a Filesize Max.
As a user of requests, there are 2 issues I often run into:
- I don't necessarily know how large the remote file is. Is it 100k or 2GB? Large files can create issues based on my machine's memory and filesystem. I can't trust the data in response headers, as it's not always provided or correct.
- I can often encounter slow connections or data streams, where information is continually provided. The current read/connection timeouts handle the amount of time between data being sent -- but not the overall amount of time.
Both of these issues can be handled by developers using the current requests API, but variants of these issues often appear on StackOverflow and this issue tracker -- and I think they could be more elegantly handled within requests and (mostly) a single chunk of code.
My suggestion is the following:
- new (default=None) arguments for `max_filesize` and `max_connectiontime`
- if either are set, the response is read in chunks (r.raw.read)
- if the full response is read, the data is transferred to the content stash
- if only a portion of the response is read, an Exception is raised. details about the connection and the read response data are available on the exception instance.
I think this logic would mostly go in `requests.adapters.HTTPAdapter.send`, but I could be wrong
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2327/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2327/timeline
| null |
completed
| null | null | false |
[
"There's been some discussion about max_connectiontime in the past. It is pretty difficult to do inline; the best way to implement this would probably be to put the request in a thread and put a `threading.Timer` on the thread so it either completes or fails in a specified amount of time.\n\nMaybe this could go in requests-toolbelt? Or an example of how to implement this, for people that want it?\n",
"This is an interesting proposal, that I think suffers from being for an extremely specific use-case that is relatively uncommon. Let's take it in two parts.\n\nFirst, `max_connectiontime`. `time` in this context is underspecified. What do we mean? CPU time? On a heavily loaded CPU you may not get scheduled and so this can turn into an extremely long wall-clock duration. Wall-clock time? This is more likely, but it still allows for us to be resource starved out, and to achieve it correctly requires that we register for signals from the OS, which is an extremely unpleasant thing for a library to do. For us to pick which of these the user meant is extremely presumptuous, especially when, as @kevinburke says, it's not that hard for the user to do themselves.\n\nThe max file-size is an interesting idea, but fundamentally starts adding some weirdness to the API. Firstly, what does `requests.send(url, stream=False, max_filesize=80)` mean? It should never be possible to write a mutually contradictory line of code in this manner. I'd be _more_ open to having this be a property on a `Response` object, realistically I think I'm pretty happy with the situation as is. I'd be interested to hear from the other maintainers though.\n",
"I should also mention another key aspect of API design: the more you put in, the harder it is to work with. Good design comes from restriction and constraint, and if we added an option for every way you might possibly want to work with HTTP our API documentation will be a million pages long and incomprehensible. We should make popular things easy, and provide tools for building the less popular parts, but at some stage we need to stop adding features. This means that, at this stage, I'm a default -1 on new features unless they can be extremely well justified.\n",
"I'm also a default -1 on every new feature proposed and -2 on these proposed features. (In fact I'm confused as to why https://github.com/kennethreitz/requests/issues/1928 is even still open.)\n\nThat said, why does there need to be a maximum filesize enforced by requests. Why aren't you using streaming and detecting this yourself? I feel like this is another potential use-case for something like [tee](https://github.com/sigmavirus24/requests-toolbelt/issues/41). That said, you've given us no real justification or real world examples with which to experiment. You've described a \"content stash\" that has no basis in anything this library does and did not define it. The feature request in general is poorly specified and founded in what appear to be extreme use cases.\n",
"\"Why does there need to be a maximum filesize enforced by requests. Why aren't you using streaming and detecting this yourself?\"\n\nI recently implemented streaming and chunking. It wasn't difficult for me to implement.\n\nHowever:\n- This appears to be a question/need that often occurs on StackOverflow and in this issue tracker\n- Because there are no limits to filesize in the requests library, it is easy for a user of the library to request a large file or endless stream of data and not realize this is happening. This can lead to issues with memory and disk usage. \n- Dealing with a chunked read is a bit of an advanced topic as it deals with manipulating the `raw` file-like object. While it is something that is old and familiar for people with a few years of Python experience, it's not something that is easy for junior developers. Some sort `max_filesize` argument or a `safe_read` function could encapsulate this logic and make it accessible to all.\n\n\"You've described a \"content stash\" that has no basis in anything this library does and did not define it.\"\n- The response object has a `.content` property/method (https://github.com/kennethreitz/requests/blob/master/requests/models.py#L714-L736)\n- Said property iterates through the `.iter_content()` filelike object (https://github.com/kennethreitz/requests/blob/master/requests/models.py#L639-L683) \n- The read content is then stored in an internal stash named `._content` (https://github.com/kennethreitz/requests/blob/master/requests/models.py#L539-L540)\n\nI think I was wrong about placement, and it would better function as a method in Response as @Lukasa noted. What I'm essentially suggesting, is extending the library to have a safeguard around the call to `.iter_content`. The data would be read in chunks, and stored in the object's `._content` stash. If the entirety of the data is read, everything functions as normal. If the filesize is exceeded, an exception is raised. It would make a defensive programming concept relatively accessible.\n\nThis is in line with stdlib functions like `file.read` and `httlib.read` (and consquently urllib) that all support a form of ranged reads without having to loop over chunks. It is possible to do a ranged read on the `raw` property of the response object, but it's a looped chunk read and it appears to possibly have the potential for unintentionally changing some internal flags on the response object and create issues with future calls to `.content`.\n\nI think features/recipes like this could work in requests-toolbelt if there were an ability/hook to register consumed content into the response object so that the API doesn't change for end users. For example, if I were to read everything by chunks into a variable, I could them call \"Response._register_content(content)\" which would set `Response._content`, and `Response._content_consumed`. \n"
] |
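The "stream it and enforce the limit yourself" approach the maintainers point to in this thread can look roughly like the following sketch; the one-megabyte budget and the URL are illustrative:

```python
import requests

# Sketch of enforcing a size budget without library support, as suggested
# in the comments above: stream the body in chunks and abort once a limit
# is exceeded. MAX_BYTES and the URL are illustrative.
MAX_BYTES = 1024 * 1024
response = requests.get('https://example.com/large-file', stream=True, timeout=10)
received = bytearray()
for chunk in response.iter_content(chunk_size=8192):
    received.extend(chunk)
    if len(received) > MAX_BYTES:
        response.close()
        raise RuntimeError('response exceeded the size limit')
content = bytes(received)
```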
https://api.github.com/repos/psf/requests/issues/2326
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2326/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2326/comments
|
https://api.github.com/repos/psf/requests/issues/2326/events
|
https://github.com/psf/requests/pull/2326
| 48,093,227 |
MDExOlB1bGxSZXF1ZXN0MjQwNTk2OTk=
| 2,326 |
Close sessions created in the functional API
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 6 |
2014-11-07T15:27:05Z
|
2021-09-08T09:01:15Z
|
2014-11-12T17:34:14Z
|
CONTRIBUTOR
|
resolved
|
This is related to #1882 and #1685. By calling close on the session, we
clear the PoolManager operated by the Session and close all sockets.
Fixes #1882
Partially-fixes #1685
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2326/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2326/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2326.diff",
"html_url": "https://github.com/psf/requests/pull/2326",
"merged_at": "2014-11-12T17:34:14Z",
"patch_url": "https://github.com/psf/requests/pull/2326.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2326"
}
| true |
[
"Eh, I'm +0.5. I still think that #1882 is unnecessary whining on the part of CPython, but never mind. ;)\n",
"@Lukasa it may be CPython being overly cautious but it will be a problem on PyPy. Stuff like [this](https://bugs.launchpad.net/dateutil/+bug/1376343) can cause an application to crash because there are too many file descriptors open. Others can explain why this is important to PyPy better than I can, but I'm very confident this is a worth while change.\n\nMy only qualm is worrying about users who may be accessing `r.raw._pool`. Granted they shouldn't be, but I'm worried about an influx of users who will complain that we broke X.\n",
"Yeah, that's a point. Can't really merge until 3.0.\n",
"I wouldn't think users relying on a private attribute would be given priority over a bug that could so easily affect PyPy users. If they're accessing `_pool` I'd be:\n1. surprised and confused\n2. unforgiving because they're accessing a private attribute\n\nI can't think of a reason anyone would _need_ to access a urllib3 connection pool frankly other than for the horrible debugging I've been going through and ideally no one else is really doing that.\n",
"Ship it :)\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Fri, Nov 7, 2014 at 11:53 AM, Ian Cordasco [email protected]\nwrote:\n\n> I wouldn't think users relying on a private attribute would be given\n> priority over a bug that could so easily affect PyPy users. If they're\n> accessing _pool I'd be:\n> 1. surprised and confused\n> 2. unforgiving because they're accessing a private attribute\n> \n> I can't think of a reason anyone would _need_ to access a urllib3\n> connection pool frankly other than for the horrible debugging I've been\n> going through and ideally no one else is really doing that.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2326#issuecomment-62202216\n> .\n",
"woo\n"
] |
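From a caller's point of view, what this pull request gives the functional API is roughly equivalent to managing a throwaway session explicitly, as in this sketch:

```python
import requests

# Rough caller-side equivalent of what the functional API does after this
# change: create a throwaway Session, send the request, then close the
# Session so its PoolManager releases all sockets.
session = requests.Session()
try:
    response = session.get('https://example.com/')
finally:
    session.close()
```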
https://api.github.com/repos/psf/requests/issues/2325
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2325/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2325/comments
|
https://api.github.com/repos/psf/requests/issues/2325/events
|
https://github.com/psf/requests/issues/2325
| 48,091,834 |
MDU6SXNzdWU0ODA5MTgzNA==
| 2,325 |
How to upload a single file to a specific folder in Google Drive.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5614580?v=4",
"events_url": "https://api.github.com/users/ajoys/events{/privacy}",
"followers_url": "https://api.github.com/users/ajoys/followers",
"following_url": "https://api.github.com/users/ajoys/following{/other_user}",
"gists_url": "https://api.github.com/users/ajoys/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ajoys",
"id": 5614580,
"login": "ajoys",
"node_id": "MDQ6VXNlcjU2MTQ1ODA=",
"organizations_url": "https://api.github.com/users/ajoys/orgs",
"received_events_url": "https://api.github.com/users/ajoys/received_events",
"repos_url": "https://api.github.com/users/ajoys/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ajoys/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ajoys/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ajoys",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-11-07T15:14:31Z
|
2021-09-08T23:07:03Z
|
2014-11-07T15:39:33Z
|
NONE
|
resolved
|
Hello,
I am currently in the process of trying to upload a file to a specific Google Drive folder. I have tried creating a JSON object as indicated by the Drive API and set it as the JSON attribute in the call; however, my file still gets sent to the root directory, named "Untitled".
Here is my code:
```
title = files_to_send[0].name.split('/')[-1]
request_body = {'title':title, "parents": [{"id": path_id}]}
status = self.call.post('https://www.googleapis.com/upload/drive/v2/files', data=files_to_send[0], json=request_body)
```
Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2325/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2325/timeline
| null |
completed
| null | null | false |
[
"Thanks for using requests!\n\nUnfortunately, this is a bug tracker, used to track bugs and feature requests. This is a usage question. You should direct your question to Stack Overflow.\n",
"Alright, sorry about that!\n",
"That's quite alright!\n"
] |
https://api.github.com/repos/psf/requests/issues/2324
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2324/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2324/comments
|
https://api.github.com/repos/psf/requests/issues/2324/events
|
https://github.com/psf/requests/issues/2324
| 48,083,896 |
MDU6SXNzdWU0ODA4Mzg5Ng==
| 2,324 |
IronPython vs Six : reraise() and _getframe()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1540582?v=4",
"events_url": "https://api.github.com/users/tcalmant/events{/privacy}",
"followers_url": "https://api.github.com/users/tcalmant/followers",
"following_url": "https://api.github.com/users/tcalmant/following{/other_user}",
"gists_url": "https://api.github.com/users/tcalmant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tcalmant",
"id": 1540582,
"login": "tcalmant",
"node_id": "MDQ6VXNlcjE1NDA1ODI=",
"organizations_url": "https://api.github.com/users/tcalmant/orgs",
"received_events_url": "https://api.github.com/users/tcalmant/received_events",
"repos_url": "https://api.github.com/users/tcalmant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tcalmant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tcalmant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tcalmant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-11-07T13:54:29Z
|
2021-09-08T23:07:05Z
|
2014-11-07T14:11:20Z
|
NONE
|
resolved
|
The definition of the `exec_` and `reraise()` methods in the `six` module seems to be the only thing that prevents using `requests` in IronPython.
The exact module is: requests.packages.urllib3.packages.six (around line 320)
The problem is that `sys._getframe()` is not available in IronPython.
As the `exec_` method doesn't seem to be used in `requests` or `urllib3`, I've made a small monkey patch to correct it, but I don't know whether I should open a pull request here or in the six repository.
The patch declares `reraise` at module level, as is done for Python 3, instead of doing so by calling `exec_`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2324/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2324/timeline
| null |
completed
| null | null | false |
[
"@tcalmant the `exec_` method is used in `six` though, which is used by `urllib3`. The latest version of six [still uses `sys._getframe`](https://bitbucket.org/gutworth/six/src/c17477e81e482d34bf3cda043b2eca643084e5fd/six.py?at=default#cl-647). You should raise this bug there and then **if you are successful** in getting it fixed, create a bug on [`urllib3`](/shazow/urllib3) to update the copy of six it vendors.\n"
] |
https://api.github.com/repos/psf/requests/issues/2323
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2323/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2323/comments
|
https://api.github.com/repos/psf/requests/issues/2323/events
|
https://github.com/psf/requests/pull/2323
| 48,058,734 |
MDExOlB1bGxSZXF1ZXN0MjQwMzkwNTI=
| 2,323 |
Pass strict to urllib3.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-11-07T08:22:25Z
|
2021-09-08T08:00:52Z
|
2014-11-07T14:12:24Z
|
MEMBER
|
resolved
|
This change passes `strict=True` to urllib3, and thus to httplib.
As per the documentation:
> the optional parameter strict (which defaults to a false value) causes BadStatusLine to be raised if the status line can’t be parsed as a valid HTTP/1.0 or 1.1 status line.
Merging this would mean that requests is now actively limiting itself to HTTP/1.0+, excluding HTTP/0.9 (who the hell cares?). This is _already_ requests behaviour on Python 3, it simply brings it into consistency in Python 2.
Note that, despite the fact that requests already sort of does this, this change is arguably breaking.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2323/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2323/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2323.diff",
"html_url": "https://github.com/psf/requests/pull/2323",
"merged_at": "2014-11-07T14:12:24Z",
"patch_url": "https://github.com/psf/requests/pull/2323.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2323"
}
| true |
[
"Ugh, should say that this resolves #1869.\n",
"LGTM. :+1: \n"
] |
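For users on releases that predate this change who, like the reporter of #2322, want strict parsing locally, one possible workaround is a transport adapter that passes `strict=True` when building the pool manager. This is only a sketch under the assumption that the vendored urllib3 accepts the `strict` keyword (as it did at the time); it is not the patch in this pull request:

```python
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager

# Hypothetical local workaround: rebuild the pool manager with strict=True
# so a non-HTTP status line raises BadStatusLine instead of hanging.
# The attribute bookkeeping mirrors what the stock adapter of that era did.
class StrictHTTPAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False, **kwargs):
        self._pool_connections = connections
        self._pool_maxsize = maxsize
        self._pool_block = block
        self.poolmanager = PoolManager(num_pools=connections, maxsize=maxsize,
                                       block=block, strict=True, **kwargs)

session = requests.Session()
session.mount('http://', StrictHTTPAdapter())
session.mount('https://', StrictHTTPAdapter())
```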
https://api.github.com/repos/psf/requests/issues/2322
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2322/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2322/comments
|
https://api.github.com/repos/psf/requests/issues/2322/events
|
https://github.com/psf/requests/issues/2322
| 48,043,374 |
MDU6SXNzdWU0ODA0MzM3NA==
| 2,322 |
requests is not happy with a shoutcast stream
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/204779?v=4",
"events_url": "https://api.github.com/users/jvanasco/events{/privacy}",
"followers_url": "https://api.github.com/users/jvanasco/followers",
"following_url": "https://api.github.com/users/jvanasco/following{/other_user}",
"gists_url": "https://api.github.com/users/jvanasco/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jvanasco",
"id": 204779,
"login": "jvanasco",
"node_id": "MDQ6VXNlcjIwNDc3OQ==",
"organizations_url": "https://api.github.com/users/jvanasco/orgs",
"received_events_url": "https://api.github.com/users/jvanasco/received_events",
"repos_url": "https://api.github.com/users/jvanasco/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jvanasco/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jvanasco/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jvanasco",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
}
] |
closed
| true | null |
[] | null | 6 |
2014-11-07T03:14:23Z
|
2021-09-08T23:05:56Z
|
2014-11-07T08:15:19Z
|
CONTRIBUTOR
|
resolved
|
I have a web indexer that came across this url:
```
url = 'http://voxsc1.somafm.com:8500/'
```
Requests does not like this (2.3, 2.4.3)
```
import requests
r = requests.get(url, timeout=5)
```
this causes python to just hang.
from what I can tell, this url will somehow render/redirect on a modern browser.
the first bits of data are:
ICY 200 OK
icy-notice1:<BR>This stream requires <a href="http://www.winamp.com/">Winamp</a><BR>
icy-notice2:SHOUTcast Distributed Network Audio Server/Linux v1.9.5<BR>
icy-name:Ill Street Lounge: Classic bachelor pad, playful exotica and vintage music of tomorrow. [SomaFM]
icy-genre:Exotica Lounge
icy-url:http://somafm.com
content-type:audio/mpeg
icy-pub:1
icy-br:128
`curl -I` will give an error
`curl` will show a data stream
it would be nice if requests could detect urls like this and just stop.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2322/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2322/timeline
| null |
completed
| null | null | false |
[
"> it would be nice if requests could detect urls like this and just stop.\n\nWhat exactly are we supposed to detect? Using a timeout doesn't matter because the server is continuously sending data. The connection is fine. The way to prevent this from hanging your indexer is to use `stream=True`. Passing that makes this return immediately for me.\n",
"Note that the reason requests doesn't spot this immediately is because the shape of the response is exactly the same as HTTP. That's clearly deliberate. I wonder if passing `strict` to httplib would have caught this.\n",
"Yup, if we pass `strict=True` to the pool manager when we create it, we bail on this with `BadStatusLine`.\n",
"I think this means that this bug is actually #1869, so I'm going to close this to centralise there.\n",
"Wow. Yes - I'm patching locally to catch this until the next release! Thanks!\n",
"How could I still download from a stream that returns the famous \"ICY 200 ok\" that your library doesn't understand? It throws a BadStatusLine('ICY 200 OK') and can't download at all.\nThanks.\n"
] |
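A minimal sketch of the `stream=True` workaround from the first comment, using the URL from the report:

```python
import requests

# Sketch of the workaround suggested above: with stream=True the call
# returns as soon as the response starts instead of trying to buffer an
# endless audio stream, so an indexer can inspect it and bail out.
url = 'http://voxsc1.somafm.com:8500/'
response = requests.get(url, stream=True, timeout=5)
print(response.status_code)  # inspect whatever metadata came back
response.close()             # drop the connection without consuming the stream
```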
https://api.github.com/repos/psf/requests/issues/2321
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2321/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2321/comments
|
https://api.github.com/repos/psf/requests/issues/2321/events
|
https://github.com/psf/requests/issues/2321
| 48,043,038 |
MDU6SXNzdWU0ODA0MzAzOA==
| 2,321 |
'verify' with custom ca bundle
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/426784?v=4",
"events_url": "https://api.github.com/users/remram44/events{/privacy}",
"followers_url": "https://api.github.com/users/remram44/followers",
"following_url": "https://api.github.com/users/remram44/following{/other_user}",
"gists_url": "https://api.github.com/users/remram44/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/remram44",
"id": 426784,
"login": "remram44",
"node_id": "MDQ6VXNlcjQyNjc4NA==",
"organizations_url": "https://api.github.com/users/remram44/orgs",
"received_events_url": "https://api.github.com/users/remram44/received_events",
"repos_url": "https://api.github.com/users/remram44/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/remram44/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/remram44/subscriptions",
"type": "User",
"url": "https://api.github.com/users/remram44",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
}
] |
closed
| true | null |
[] | null | 9 |
2014-11-07T03:07:31Z
|
2021-09-08T23:07:04Z
|
2014-11-07T04:54:42Z
|
CONTRIBUTOR
|
resolved
|
It seems that currently, if 'verify' is set to the filename of a ca_bundle, a certificate that doesn't match it **will be accepted without SSLError** if it matches the global bundle. This is counter-intuitive, unclear in the documentation and prevents users from actually overriding the ca_bundle with their own.
Example:
```
openssl req -nodes -x509 -days 3650 \
-newkey rsa:2048 -keyout ca-key.pem \
-out ca-cert.pem \
-subj "/C=US/ST=New York/L=New York/O=remram.fr/CN=ca.remram.fr"
python -c "import requests; print(requests.get('https://google.com/', verify='ca-cert.pem'))"
```
My version is 2.4.3, on OS X.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2321/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2321/timeline
| null |
completed
| null | null | false |
[
"``` python\n~/sandbox/requests (master) python\nPython 2.7.8 (default, Aug 24 2014, 21:26:19)\n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> r = requests.get('https://google.com', verify='../urllib3/dummyserver/certs/client_bad.pem')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"requests/api.py\", line 60, in get\n return request('get', url, **kwargs)\n File \"requests/api.py\", line 49, in request\n return session.request(method=method, url=url, **kwargs)\n File \"requests/sessions.py\", line 457, in request\n resp = self.send(prep, **send_kwargs)\n File \"requests/sessions.py\", line 569, in send\n r = adapter.send(request, **kwargs)\n File \"requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n>>> r = requests.get('https://google.com', verify='../urllib3/dummyserver/certs/client.pem')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"requests/api.py\", line 60, in get\n return request('get', url, **kwargs)\n File \"requests/api.py\", line 49, in request\n return session.request(method=method, url=url, **kwargs)\n File \"requests/sessions.py\", line 457, in request\n resp = self.send(prep, **send_kwargs)\n File \"requests/sessions.py\", line 569, in send\n r = adapter.send(request, **kwargs)\n File \"requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n```\n\nI can't reproduce this. (If it isn't obvious, I have requests and urllib3 cloned next to each other and I'm using the self-signed certificates in the bundles in `urllib3/dummyserver/certs`.)\n",
"On master (v2.4.3-28-g122c92e), with the same certificate you mention (from urllib3 git):\n\n```\n$ git clone [email protected]:shazow/urllib3.git\n...\n$ git clone [email protected]:kennethreitz/requests.git\n...\n$ virtualenv venv27\nNew python executable in venv27/bin/python\nInstalling setuptools, pip...done.\n$ . venv27/bin/activate\n(venv27)$ cd requests\n(venv27)$ git describe\nv0.6.4-3028-g122c92e\n(venv27)$ git describe --tags\nv2.4.3-28-g122c92e\n(venv27)$ python\nPython 2.7.5 (default, Mar 9 2014, 22:15:05) \n[GCC 4.2.1 Compatible Apple LLVM 5.0 (clang-500.0.68)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> requests.get('https://google.com/', verify='../urllib3/dummyserver/certs/client_bad.pem')\n<Response [200]>\n>>> ^D\n```\n\nI'm really not sure what else to tell you...\n",
"Interestingly, using Python 2.7.8 from macports (with `virtualenv -p /opt/local/bin/python`, I get the same results as you. Could this be a Python or OpenSSL bug, or a patch from Apple?\n\n```\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n```\n",
"I'm using Python 2.7.8 from homebrew. I think it's linked against openssl 1.0.1i on my laptop. It could be a difference in the version of openssl in use and it could be a difference in the version of openssl Apple ships versus an actually up-to-date one. Either way, this isn't the fault of requests and there's no way we can \"fix\" this.\n",
"On #python:\n\n> 23:52 < _habnabit> Remram, apple has a custom-patched SSL, yes\n> 23:53 < _habnabit> Remram, apple's OpenSSL does keychain cert verification\n> 23:53 < _habnabit> Remram, https://hynek.me/articles/apple-openssl-verification-surprises/\n\nAlthough this looks very serious, you are right, there is probably nothing to be done in requests. Perhaps you could mention this somewhere in the documentation, since requests will not behave as expected on OS X.\n",
"The requests documentation is not the place for these bizarre edgecases. Blog posts are. =)\n",
"I disagree. This is a security issue, affecting one major feature of requests (certificate validation, which Python's stdlib skips) on one major platform (we're both using it!). It should definitely be apparent to users who are about to use the verify='...' feature.\n",
"The correct way to phrase this security issue is that the system OpenSSL should _never_ be used on OS X. It's wildly out of date and hopelessly insecure. That's a warning I can see us putting in the documentation, but in a more general form.\n",
"I'd be happy with this. I had no idea.\n"
] |
https://api.github.com/repos/psf/requests/issues/2320
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2320/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2320/comments
|
https://api.github.com/repos/psf/requests/issues/2320/events
|
https://github.com/psf/requests/issues/2320
| 47,730,367 |
MDU6SXNzdWU0NzczMDM2Nw==
| 2,320 |
Getting a request with a BOM mark
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1440143?v=4",
"events_url": "https://api.github.com/users/GP89/events{/privacy}",
"followers_url": "https://api.github.com/users/GP89/followers",
"following_url": "https://api.github.com/users/GP89/following{/other_user}",
"gists_url": "https://api.github.com/users/GP89/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/GP89",
"id": 1440143,
"login": "GP89",
"node_id": "MDQ6VXNlcjE0NDAxNDM=",
"organizations_url": "https://api.github.com/users/GP89/orgs",
"received_events_url": "https://api.github.com/users/GP89/received_events",
"repos_url": "https://api.github.com/users/GP89/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/GP89/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GP89/subscriptions",
"type": "User",
"url": "https://api.github.com/users/GP89",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-11-04T16:09:20Z
|
2021-09-08T23:07:06Z
|
2014-11-04T17:58:22Z
|
NONE
|
resolved
|
I was using `requests` to get some JSON from a URL. I noticed that it started with a weird unicode character. The start of the response text was `u'\ufeff[{"Hos...` (the message was sent with a content-type of `text/javascript; charset=utf-8`).
After looking into it a bit, it looks like it was a [BOM mark](http://en.wikipedia.org/wiki/Byte_order_mark). Looking for solutions I came across [this post](http://stackoverflow.com/a/14532226/659346) which suggests decoding the string, which would work fine using `urllib2` as it gives back the byte string. But in `requests` the response objects `text` attribute is already decoded with the BOM mark included.
I was wondering if this was something that could be addressed automatically? Or maybe it might end up biting people who expected to see that mark in the response text?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1440143?v=4",
"events_url": "https://api.github.com/users/GP89/events{/privacy}",
"followers_url": "https://api.github.com/users/GP89/followers",
"following_url": "https://api.github.com/users/GP89/following{/other_user}",
"gists_url": "https://api.github.com/users/GP89/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/GP89",
"id": 1440143,
"login": "GP89",
"node_id": "MDQ6VXNlcjE0NDAxNDM=",
"organizations_url": "https://api.github.com/users/GP89/orgs",
"received_events_url": "https://api.github.com/users/GP89/received_events",
"repos_url": "https://api.github.com/users/GP89/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/GP89/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/GP89/subscriptions",
"type": "User",
"url": "https://api.github.com/users/GP89",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2320/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2320/timeline
| null |
completed
| null | null | false |
[
"If the BOM isn't getting stripped, the problem is that we're decoding it as non-UTF-16. In such a case, the BOM is strictly unnecessary, and a leading BOM is in fact not a BOM but a unicode zero-width non-breaking space character.\n\nNow, the content-type header seems to suggest it's not a BOM, so we're doing what it says and decoding as UTF-8. In that situation, as mentioned above, the BOM is not meant to be present and so has meaning.\n\nWe shouldn't automatically strip it off because we can't unconditionally guarantee that it shouldn't be there. If users want it stripped they can easily do so themselves by unconditionally removing leading instances of `u'\\ufeff'`.\n",
"That was my gut feeling - I didn't realise that char could mean something else other than a BOM though - so I'd agree to leave it as it is :+1:\n"
] |
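For anyone who does want the marker gone, here is a minimal sketch (not from the thread; the URL is a placeholder) of the manual stripping suggested in the comments above:

```python
import requests

resp = requests.get("https://example.com/data.json")  # placeholder URL

# Option 1: strip a leading U+FEFF from the already-decoded text.
text = resp.text.lstrip(u"\ufeff")

# Option 2: decode the raw bytes with 'utf-8-sig', which drops a UTF-8 BOM if present.
text = resp.content.decode("utf-8-sig")
```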
https://api.github.com/repos/psf/requests/issues/2319
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2319/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2319/comments
|
https://api.github.com/repos/psf/requests/issues/2319/events
|
https://github.com/psf/requests/issues/2319
| 47,622,569 |
MDU6SXNzdWU0NzYyMjU2OQ==
| 2,319 |
Expose retry settings available in urllib3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/202341?v=4",
"events_url": "https://api.github.com/users/neuroid/events{/privacy}",
"followers_url": "https://api.github.com/users/neuroid/followers",
"following_url": "https://api.github.com/users/neuroid/following{/other_user}",
"gists_url": "https://api.github.com/users/neuroid/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/neuroid",
"id": 202341,
"login": "neuroid",
"node_id": "MDQ6VXNlcjIwMjM0MQ==",
"organizations_url": "https://api.github.com/users/neuroid/orgs",
"received_events_url": "https://api.github.com/users/neuroid/received_events",
"repos_url": "https://api.github.com/users/neuroid/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/neuroid/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/neuroid/subscriptions",
"type": "User",
"url": "https://api.github.com/users/neuroid",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-11-03T17:47:04Z
|
2021-09-08T23:07:07Z
|
2014-11-03T17:50:46Z
|
NONE
|
resolved
|
Hi,
Are there any plans to expose the [retry settings available in urllib3](https://urllib3.readthedocs.org/en/latest/helpers.html#urllib3.util.retry.Retry)? The possibility of configuring the `backoff_factor` would be quite useful.
Best,
L
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/202341?v=4",
"events_url": "https://api.github.com/users/neuroid/events{/privacy}",
"followers_url": "https://api.github.com/users/neuroid/followers",
"following_url": "https://api.github.com/users/neuroid/following{/other_user}",
"gists_url": "https://api.github.com/users/neuroid/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/neuroid",
"id": 202341,
"login": "neuroid",
"node_id": "MDQ6VXNlcjIwMjM0MQ==",
"organizations_url": "https://api.github.com/users/neuroid/orgs",
"received_events_url": "https://api.github.com/users/neuroid/received_events",
"repos_url": "https://api.github.com/users/neuroid/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/neuroid/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/neuroid/subscriptions",
"type": "User",
"url": "https://api.github.com/users/neuroid",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2319/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2319/timeline
| null |
completed
| null | null | false |
[
"Yup - see https://github.com/kennethreitz/requests/pull/2216\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Mon, Nov 3, 2014 at 9:47 AM, Łukasz Kawczyński [email protected]\nwrote:\n\n> Hi,\n> \n> Are there any plans to expose the retry settings available in urllib3\n> https://urllib3.readthedocs.org/en/latest/helpers.html#urllib3.util.retry.Retry?\n> The possibility of configuring the backoff_factor would be quite useful.\n> \n> Best,\n> L\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2319.\n",
"Great! Somehow I missed that pull request -_-\n"
] |
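For reference, a minimal sketch (not from the thread) of the approach exposed by the pull request linked above: mounting an `HTTPAdapter` built with a urllib3 `Retry` object, which is where `backoff_factor` lives. The URL is a placeholder, and this assumes a requests version recent enough to accept a `Retry` for `max_retries`:

```python
import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.util.retry import Retry  # bundled urllib3

session = requests.Session()
retry = Retry(total=5, backoff_factor=0.5, status_forcelist=[500, 502, 503, 504])
session.mount("http://", HTTPAdapter(max_retries=retry))
session.mount("https://", HTTPAdapter(max_retries=retry))

response = session.get("http://example.com/")  # placeholder URL
```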
https://api.github.com/repos/psf/requests/issues/2318
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2318/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2318/comments
|
https://api.github.com/repos/psf/requests/issues/2318/events
|
https://github.com/psf/requests/issues/2318
| 47,557,987 |
MDU6SXNzdWU0NzU1Nzk4Nw==
| 2,318 |
Allow parsing str url in Python2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/561729?v=4",
"events_url": "https://api.github.com/users/jichifly/events{/privacy}",
"followers_url": "https://api.github.com/users/jichifly/followers",
"following_url": "https://api.github.com/users/jichifly/following{/other_user}",
"gists_url": "https://api.github.com/users/jichifly/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jichifly",
"id": 561729,
"login": "jichifly",
"node_id": "MDQ6VXNlcjU2MTcyOQ==",
"organizations_url": "https://api.github.com/users/jichifly/orgs",
"received_events_url": "https://api.github.com/users/jichifly/received_events",
"repos_url": "https://api.github.com/users/jichifly/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jichifly/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jichifly/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jichifly",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-11-03T02:09:29Z
|
2021-09-08T23:07:06Z
|
2014-11-03T20:05:49Z
|
NONE
|
resolved
|
Is it possible to add an option to disable decoding str URL to unicode in requests.get?
I am using Python 2.7.8. After a recent update to requests 2.4.3, I find I am no longer able to pass a SHIFT-JIS str/bytes URL to requests.get.
I am using requests to access some Japanese websites, which require a raw SHIFT-JIS-encoded string as the URL.
But requests.get always decodes my SHIFT-JIS str URL to UTF-8 at the beginning of models.py::prepare_url, which raises a decode error. If I pass a UTF-8 URL instead, I do not get the correct response from those Japanese websites.
If I restore the old code in models.py::prepare_url as follows, it works for me like before. I am wondering whether it is possible to add an option to disable modifying the URL passed to requests?
```
# if isinstance(url, bytes):
# url = url.decode('utf8')
# else:
# url = unicode(url) if is_py2 else str(url)
try:
url = unicode(url)
except NameError:
# We're on Python 3.
url = str(url)
except UnicodeDecodeError:
pass
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/561729?v=4",
"events_url": "https://api.github.com/users/jichifly/events{/privacy}",
"followers_url": "https://api.github.com/users/jichifly/followers",
"following_url": "https://api.github.com/users/jichifly/following{/other_user}",
"gists_url": "https://api.github.com/users/jichifly/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jichifly",
"id": 561729,
"login": "jichifly",
"node_id": "MDQ6VXNlcjU2MTcyOQ==",
"organizations_url": "https://api.github.com/users/jichifly/orgs",
"received_events_url": "https://api.github.com/users/jichifly/received_events",
"repos_url": "https://api.github.com/users/jichifly/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jichifly/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jichifly/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jichifly",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2318/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2318/timeline
| null |
completed
| null | null | false |
[
"Thanks for this report!\n\nWhen you say the remote website requires SHIFT-JIS in their URL, which part of the URL do they require it in? SHIFT-JIS is totally unacceptable in the host part of the URL, so presumably it's either the path or the parameters.\n\nIf it's the parameters, I believe you can pass SHIFT-JIS bytestrings in the `params` argument without difficulty, though I admit that I haven't tried it. If it's the path, this is harder.\n\nThe best way to force this is to use the [prepared request](http://docs.python-requests.org/en/latest/user/advanced/#prepared-requests) workflow. This will allow you to modify the URL _after_ we've constructed it.\n",
"Note that requiring SHIFT-JIS in a URL is extremely unwise, and I'd argue that it's strictly a server bug. I'm happy to help you work around it, but I have no desire to modify the library to make it easier to do this than it already is.\n",
"Thanks for quick response. I was writing scripts to access digiket.com, which is a popular game/soft vendor site in Japan. I just noticed that it also supports percentage encoded string which can be used to bypass the SJIS encoding. I think this issue can be closed.\n\nFYI, both the following two URLs could be used to search, but the first one has to be encoded in SJIS.\n http://www.digiket.com/soft/result/_data/A=神様l/limit=20/sort=new/\n http://www.digiket.com/soft/result/_data/A=%90_%97l/limit=20/sort=new/\n"
] |
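As a rough illustration of the prepared-request workaround mentioned in the comments above (not code from the thread; the host and keyword are placeholders), the URL can be patched by hand after requests has prepared it, using whatever byte encoding the server expects — here percent-encoded SHIFT-JIS on Python 2:

```python
# -*- coding: utf-8 -*-
import urllib

import requests

session = requests.Session()
prepared = session.prepare_request(
    requests.Request("GET", "http://www.example.jp/search")  # placeholder URL
)

# Percent-encode the SHIFT-JIS bytes ourselves and append them to the prepared URL.
keyword = u"神様".encode("shift_jis")
prepared.url = prepared.url + "?A=" + urllib.quote(keyword)

response = session.send(prepared)
```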
https://api.github.com/repos/psf/requests/issues/2317
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2317/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2317/comments
|
https://api.github.com/repos/psf/requests/issues/2317/events
|
https://github.com/psf/requests/pull/2317
| 47,473,505 |
MDExOlB1bGxSZXF1ZXN0MjM3MTQ4MzA=
| 2,317 |
Use to_native_string instead of builtin_str
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
] | null | 2 |
2014-11-01T02:20:16Z
|
2021-09-08T09:01:18Z
|
2014-11-01T14:04:09Z
|
CONTRIBUTOR
|
resolved
|
Fixes #2316
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2317/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2317/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2317.diff",
"html_url": "https://github.com/psf/requests/pull/2317",
"merged_at": "2014-11-01T14:04:09Z",
"patch_url": "https://github.com/psf/requests/pull/2317.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2317"
}
| true |
[
"LGTM. :cake:\n",
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2316
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2316/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2316/comments
|
https://api.github.com/repos/psf/requests/issues/2316/events
|
https://github.com/psf/requests/issues/2316
| 47,449,626 |
MDU6SXNzdWU0NzQ0OTYyNg==
| 2,316 |
method = builtin_str(method) problem
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1753903?v=4",
"events_url": "https://api.github.com/users/gddk/events{/privacy}",
"followers_url": "https://api.github.com/users/gddk/followers",
"following_url": "https://api.github.com/users/gddk/following{/other_user}",
"gists_url": "https://api.github.com/users/gddk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gddk",
"id": 1753903,
"login": "gddk",
"node_id": "MDQ6VXNlcjE3NTM5MDM=",
"organizations_url": "https://api.github.com/users/gddk/orgs",
"received_events_url": "https://api.github.com/users/gddk/received_events",
"repos_url": "https://api.github.com/users/gddk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gddk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gddk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gddk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-10-31T22:13:00Z
|
2021-09-08T23:07:07Z
|
2014-11-01T14:04:09Z
|
NONE
|
resolved
|
In requests/sessions.py there is this line:
method = builtin_str(method)
It converts method from
b'GET'
to
"b'GET'"
which is the literal string, no longer a binary string. When requests tries to use the method "b'GET'", it gets a 404 Not Found response.
I am using Python 3.4 and python-neutronclient (2.3.9) with requests (2.4.3). neutronclient is broken because it uses "args = utils.safe_encode_list(args)", which converts all the values to binary strings, including the method.
I'm not sure whether this is a bug in neutronclient or a bug in requests, but I'm starting here. It seems that if requests handled the method value being a binary string, we wouldn't have any problem.
Also, I tried this in Python 2.6 and the bug doesn't exist there. Some difference between 2.6 and 3.4 makes this not work right.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2316/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2316/timeline
| null |
completed
| null | null | false |
[
"Ugh. This should have been caught and replaced with `to_native_str`. This is definitely a requests bug.\n"
] |
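Until a fixed release is available, one client-side workaround (a sketch, not from the report; the URL is a placeholder) is to make sure the method is a native `str` before handing it to requests on Python 3:

```python
import requests

method = b"GET"  # e.g. what utils.safe_encode_list produces
if isinstance(method, bytes):
    method = method.decode("ascii")  # native str, so requests never sees "b'GET'"

response = requests.request(method, "http://example.com/")  # placeholder URL
```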
https://api.github.com/repos/psf/requests/issues/2315
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2315/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2315/comments
|
https://api.github.com/repos/psf/requests/issues/2315/events
|
https://github.com/psf/requests/issues/2315
| 47,349,561 |
MDU6SXNzdWU0NzM0OTU2MQ==
| 2,315 |
elapsed time for requests.get over 2 minutes per url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5601149?v=4",
"events_url": "https://api.github.com/users/jkroening/events{/privacy}",
"followers_url": "https://api.github.com/users/jkroening/followers",
"following_url": "https://api.github.com/users/jkroening/following{/other_user}",
"gists_url": "https://api.github.com/users/jkroening/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jkroening",
"id": 5601149,
"login": "jkroening",
"node_id": "MDQ6VXNlcjU2MDExNDk=",
"organizations_url": "https://api.github.com/users/jkroening/orgs",
"received_events_url": "https://api.github.com/users/jkroening/received_events",
"repos_url": "https://api.github.com/users/jkroening/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jkroening/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkroening/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jkroening",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 136589914,
"name": "Needs Info",
"node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info"
}
] |
closed
| true | null |
[] | null | 7 |
2014-10-31T00:47:45Z
|
2021-09-08T23:06:57Z
|
2014-11-18T05:13:09Z
|
NONE
|
resolved
|
I've searched high and low, but no luck on this one.
running Mac OS X Yosemite with Python 2.7.8 and Requests 2.4.3
even a simple requests.get from the command line to google.com takes 75 seconds to complete.
>>> r = requests.get('http://www.google.com')
>>> r.elapsed
datetime.timedelta(0, 75, 269509)
what's the deal? any thoughts? no matter what url, they all take forever to return.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5601149?v=4",
"events_url": "https://api.github.com/users/jkroening/events{/privacy}",
"followers_url": "https://api.github.com/users/jkroening/followers",
"following_url": "https://api.github.com/users/jkroening/following{/other_user}",
"gists_url": "https://api.github.com/users/jkroening/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jkroening",
"id": 5601149,
"login": "jkroening",
"node_id": "MDQ6VXNlcjU2MDExNDk=",
"organizations_url": "https://api.github.com/users/jkroening/orgs",
"received_events_url": "https://api.github.com/users/jkroening/received_events",
"repos_url": "https://api.github.com/users/jkroening/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jkroening/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jkroening/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jkroening",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2315/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2315/timeline
| null |
completed
| null | null | false |
[
"I just realized it doesn't happen if I run the same requests through a VPN (basically spoofing my normal IP address.) Yet my normal IP address works fine on these sites in a browser window like Chrome.\n",
"Is this consistent?\n",
"Yes. Consistent as in it always happens? Yes, every time when making the get requests from my IP address, not from any other. I've contacted my service provider but they say it's not on their end.\n",
"## The only thing I can think of is that getaddrinfo is taking too long. That's the only thing that I've seen so significantly affect the length of a request\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"@jkroening what DNS do you have configured? Could you try using Google's DNS (or someone else's)?\n",
"yeah, you're right to think that's the issue. I contacted my ISP about this because I don't run into the slow requests speed on a VPN. seems to be a problem on their end. thanks for checking up on this. it can be closed.\n\n## \n\nJonathan Kroening :: jonathankroening.com\n\nOn Mon, Nov 17, 2014 at 9:02 PM -0800, \"Ian Cordasco\" [email protected] wrote:\n\n@jkroening what DNS do you have configured? Could you try using Google's DNS (or someone else's)?\n\n—\nReply to this email directly or view it on GitHub.\n",
"Thanks @jkroening . Hope you have some luck with your ISP.\n"
] |
https://api.github.com/repos/psf/requests/issues/2314
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2314/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2314/comments
|
https://api.github.com/repos/psf/requests/issues/2314/events
|
https://github.com/psf/requests/issues/2314
| 47,259,454 |
MDU6SXNzdWU0NzI1OTQ1NA==
| 2,314 |
Ignoring SSL errors with Session objects
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3966613?v=4",
"events_url": "https://api.github.com/users/glennzw/events{/privacy}",
"followers_url": "https://api.github.com/users/glennzw/followers",
"following_url": "https://api.github.com/users/glennzw/following{/other_user}",
"gists_url": "https://api.github.com/users/glennzw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/glennzw",
"id": 3966613,
"login": "glennzw",
"node_id": "MDQ6VXNlcjM5NjY2MTM=",
"organizations_url": "https://api.github.com/users/glennzw/orgs",
"received_events_url": "https://api.github.com/users/glennzw/received_events",
"repos_url": "https://api.github.com/users/glennzw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/glennzw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/glennzw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/glennzw",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-10-30T11:35:18Z
|
2021-09-08T23:07:05Z
|
2014-11-05T01:00:14Z
|
NONE
|
resolved
|
When using a Session object the 'verify=False' option fails to ignore bad SSL:
> > > r = requests.get("https://site.com/blah", verify=False) #Works
> > >
> > > s = requests.Session()
> > > s.get('https://site.com/blah', verify=False) #Fails
> > > requests.exceptions.SSLError: [Errno 1] _ssl.c:504: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
However, this works:
>>> s = requests.Session()
>>> s.verify = False
>>> s.get('https://site.com/blah')
Request-level parameters should override session parameters.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2314/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2314/timeline
| null |
completed
| null | null | false |
[
"## What version of requests are you using? \n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"requests==2.4.1\n",
"I have reproduced it with requests==2.4.3, as expected:\n\n```\n>>> import requests\n>>> r = requests.get('https://my_wrong_cert/', verify=False)\n>>> r.status_code\n200\n>>> s = requests.Session()\n>>> s.verify = False\n>>> s.get('https://my_wrong_cert/')\n<Response [200]>\n```\n\nBut the next one has failed:\n\n```\n>>> s = requests.Session()\n>>> s.get('https://my_wrong_cert', verify=False)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/raulcd/Envs/tests/lib/python2.7/site-packages/requests/sessions.py\", line 469, in get\n return self.request('GET', url, **kwargs)\n File \"/Users/raulcd/Envs/tests/lib/python2.7/site-packages/requests/sessions.py\", line 457, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/raulcd/Envs/tests/lib/python2.7/site-packages/requests/sessions.py\", line 569, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/raulcd/Envs/tests/lib/python2.7/site-packages/requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:507: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n```\n\nI'll start working on a patch and I will submit it.\n",
"I've been able to reproduce the behavior with two tests, the problem is not with the:\n\n```\ns.get('https://my_wrong_cert', verify=False)\n```\n\nthe problem is if you are reusing the session when a SSLError has been previously raised.\nThe test below `test_wrong_session_certificate` works as expected, but the test `test_session_certificate_fail()` fails on the second attempt as the session has already failed with the validation of the SSL certificate.\n\n```\ndef test_wrong_session_certificate():\n s = requests.Session()\n r = s.get('https://my_wrong_cert', verify=False)\n assert r.status_code==200\n\ndef test_session_certificate_fail():\n s = requests.Session()\n with pytest.raises(SSLError):\n s.get('https://my_wrong_cert')\n r = s.get('https://my_wrong_cert', verify=False) # -> This line fails due to a SSLError\n assert r.status_code==200, 'The error code is not the expected one'\n```\n",
"The issue is with urllib3:\n\nWhen calling to urllib3 with the adapter we set the:\n`>>>> conn.cert_reqs\n'CERT_NONE'`\nbut inside urllib3:\n\n```\n> /Users/raulcd/Projects/requests/requests/packages/urllib3/connectionpool.py(511)urlopen()\n 510 # Request a connection from the queue.\n--> 511 conn = self._get_conn(timeout=pool_timeout)\n 512\n\nipdb> self.cert_reqs\n'CERT_NONE'\nipdb> n\nipdb> conn.cert_reqs\n'CERT_REQUIRED'\n```\n\nSo the requests should have `CERT_NONE` but the connection retrieved from the connection pool has `CERT_REQUIRED`.\n",
"I think this issue should be closed as I opened the issue with urllib3 (where it belongs).\n",
"@raulcd I agree.\n"
] |
https://api.github.com/repos/psf/requests/issues/2313
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2313/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2313/comments
|
https://api.github.com/repos/psf/requests/issues/2313/events
|
https://github.com/psf/requests/issues/2313
| 47,176,990 |
MDU6SXNzdWU0NzE3Njk5MA==
| 2,313 |
Can't use post to upload a file with Chinese characters in its name.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3011879?v=4",
"events_url": "https://api.github.com/users/sbarba/events{/privacy}",
"followers_url": "https://api.github.com/users/sbarba/followers",
"following_url": "https://api.github.com/users/sbarba/following{/other_user}",
"gists_url": "https://api.github.com/users/sbarba/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sbarba",
"id": 3011879,
"login": "sbarba",
"node_id": "MDQ6VXNlcjMwMTE4Nzk=",
"organizations_url": "https://api.github.com/users/sbarba/orgs",
"received_events_url": "https://api.github.com/users/sbarba/received_events",
"repos_url": "https://api.github.com/users/sbarba/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sbarba/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sbarba/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sbarba",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 30 |
2014-10-29T17:19:23Z
|
2021-09-06T00:06:42Z
|
2015-05-31T12:58:25Z
|
NONE
|
resolved
|
This code:
requests.post(url, files={"file": open(u"漢字.o8d", "r")})
will return a 200, but the file is never uploaded.
I can upload that file by posting in the browser, so this doesn't seem to be a server-side issue. Also, if I change the name of the file to "bob" or something ASCII, it works perfectly.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2313/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2313/timeline
| null |
completed
| null | null | false |
[
"Are you sure?\n\n``` bash\n$ echo \"file file file.\\n\" >> 漢字.o8d\n$ ls\n漢字.o8d\n```\n\n``` python\n>>> import requests\n>>> r = requests.post('http://httpbin.org/post', files={'file': open(u'漢字.o8d', 'r')})\n>>> print r.content\n{\n \"args\": {}, \n \"data\": \"\", \n \"files\": {}, \n \"form\": {\n \"file\": \"file file file.\\n\"\n }, \n \"headers\": {\n \"Accept\": \"*/*\", \n \"Accept-Encoding\": \"gzip, deflate\", \n \"Connect-Time\": \"2\", \n \"Connection\": \"close\", \n \"Content-Length\": \"180\", \n \"Content-Type\": \"multipart/form-data; boundary=3491ae0e5b6d465aaebb7bd63c9c750c\", \n \"Host\": \"httpbin.org\", \n \"Total-Route-Time\": \"0\", \n \"User-Agent\": \"python-requests/2.4.0 CPython/2.7.8 Darwin/14.0.0\", \n \"Via\": \"1.1 vegur\", \n \"X-Request-Id\": \"f05915c9-279e-4187-8425-f0b06fc64ea2\"\n }, \n \"json\": null, \n \"origin\": \"77.99.146.203\", \n \"url\": \"http://httpbin.org/post\"\n}\n```\n\nSeems like httpbin doesn't have a problem. Can you confirm what version of requests you're using?\n",
"Oh hang on. Interestingly, httpbin sees it as a form field, not a file object. Hmm.\n",
"Oh, yes, I remember now.\n\nPOSTing files with unicode filenames is awkward, because you didn't say what text encoding you want us to use. There's a spec for this, which we implement, but relatively few others do it and many servers don't understand it.\n\nMy suggested workaround would be to set the filename yourself using whatever encoding you choose. Unfortunately, that doesn't work:\n\n```\nTraceback (most recent call last):\n File \"testy.py\", line 4, in <module>\n r = requests.post('http://httpbin.org/post', files={'file': (u'漢字.o8d'.encode('utf-8'), open(u'漢字.o8d', 'r'))})\n File \"/usr/local/lib/python2.7/site-packages/requests/api.py\", line 88, in post\n return request('post', url, data=data, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/api.py\", line 44, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 434, in request\n prep = self.prepare_request(req)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 372, in prepare_request\n hooks=merge_hooks(request.hooks, self.hooks),\n File \"/usr/local/lib/python2.7/site-packages/requests/models.py\", line 299, in prepare\n self.prepare_body(data, files)\n File \"/usr/local/lib/python2.7/site-packages/requests/models.py\", line 434, in prepare_body\n (body, content_type) = self._encode_files(files, data)\n File \"/usr/local/lib/python2.7/site-packages/requests/models.py\", line 151, in _encode_files\n rf.make_multipart(content_type=ft)\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.py\", line 173, in make_multipart\n (('name', self._name), ('filename', self._filename))\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.py\", line 133, in _render_parts\n parts.append(self._render_part(name, value))\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.py\", line 113, in _render_part\n return format_header_param(name, value)\n File \"/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/fields.py\", line 37, in format_header_param\n result.encode('ascii')\nUnicodeDecodeError: 'ascii' codec can't decode byte 0xe6 in position 10: ordinal not in range(128)\n```\n\nThe problem here seems to be [this line](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/fields.py#L37). This unconditional call to encode will actually cause an implicit call to `str.decode` on Python 2, which breaks for non-ascii characters. @shazow, you prepared to consider that a bug?\n",
"Django now supports this and was appreciative of the bug report. The fact that httpbin doesn't parse this correctly is a flask/werkzeug bug I think.\n",
"Just discovered that 漢字 is Japanese Kanji and means \"Chinese Characters\". Enjoyed that, but the bug still stands. For now I'm able to automate testing of such filenames with Selenium, but it'd be nice to do it with requests too.\n",
"Except it's a bug in the server you're trying to upload to for not supporting a 10 year old RFC\n",
"Is there any other workaround different than changing the file name or changing the server backend?\n",
"I think someone percent-encoded the file name because whatever server they were communicating with understood that. That's behaviour that is not defined anywhere though so it depends on the server your using doing something incredibly bad and horribly wrong.\n",
"And @kampde thanks searching for prior issues and for not opening a new issue.\n",
"The aforementioned RFC is [RFC 5987](http://tools.ietf.org/html/rfc5987), right?\n",
"I don't believe so. No. That's for HTTP Headers, not for mime-headers\n",
"Looks like [RFC 2231](http://tools.ietf.org/html/rfc2231) then.\n",
"@kampde after a quick skim, that is the correct RFC. As you can see it is 18 years old.\n",
"I think in https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/fields.py#L37\n\n```\n try:\n result.encode('ascii')\n except UnicodeEncodeError:\n pass\n else:\n return result\n```\n\nModify to \"result.encode('utf8')\" will be better ,because most server can handle with utf8, but many of them do not support the style of \"email.utils.encode_rfc2231(value, 'utf-8')\"\n",
"@zhangchunlin What does 'most servers' mean? Which servers? Which versions of those servers? Why don't they implement RFC 2231?\n",
"@zhangchunlin if those servers do not implement a standard that is 18 years old, I fail to see why we should be forced to violate the standard.\n",
"@Lukasa OK, I didn't test so much, my statement maybe wrong.\nI just found that the behavior of requests wasn't same as browser(for example chrome), what I thought is that the method chrome using is workable.\n\n@sigmavirus24 I will try to make clear and submit issue to those server if needed.\n",
"It seems PHP is also affected by this, if you try to upload a file to a server running PHP, with the name 'fårikål.txt', it will throw a warning: \"PHP Warning: File Upload Mime headers garbled in Unknown on line 0\".\n\nThis is PHP 5.6.14.\n",
"@WishCow I'm not certain what result you expect to see if you're filing a PHP bug against another project. It seems frameworks in Perl, Ruby, and Python all appropriately support RFC 2231. If PHP 5.6.14 doesn't support an 18 year old standard, you should file a bug with PHP.\n",
"Just leaving a note here, in case other people encounter this issue, it took me a long time to find the cause.\n",
"@WishCow you'll probably have a better time putting together some minimal bit of PHP code and [filing a bug with PHP](https://bugs.php.net/). This comment will help others, but filing a bug to get this fixed in PHP would help a lot more people.\n",
"Actually I was about to do that, and I whipped up a quick example of the upload with curl, but _that_ seems to work. Now I'm confused, is there another RFC that describes how filenames should be handled, that curl (and PHP) might be implementing?\n\nSo this:\n\n```\ncurl -v -F får.txt=@/tmp/test.txt http://myserver.local\n```\n\nDoes produce the correct output from the handling PHP script.\n",
"Run netcat locally and send the curl request to that.\n\nCurl might be violating the RFC because support for the spec has lagged behind.\n",
"The command\n\n```\ncurl -F får='@/tmp/test.txt;filename=får.txt' localhost:14511\n```\n\nResults in the netcat output:\n\n```\nPOST / HTTP/1.1\nHost: localhost:14511\nUser-Agent: curl/7.45.0\nAccept: */*\nContent-Length: 198\nExpect: 100-continue\nContent-Type: multipart/form-data; boundary=------------------------fb94c2e958ada9f0\n\n--------------------------fb94c2e958ada9f0\nContent-Disposition: form-data; name=\"får\"; filename=\"får.txt\"\nContent-Type: text/plain\n\nhello world\n\n--------------------------fb94c2e958ada9f0--\n```\n\nSo curl indeed does not seem to use the `*=` format that the RFC is describing.\n",
"Yeah, so you can use `httpie` to produce a `cURL` like command that will probably trigger this for you.\n",
"You could also write some [PHP that uses RFC 2231](https://stackoverflow.com/questions/3856362/php-rfc-2231-how-to-encode-utf-8-string-as-content-disposition-filename).\n",
"The SO post describes how to send files with the correct encoding, but I need to receive files, for which there doesn't seem to be a way, since the $_FILES superglobal gets populated before the userland script runs.\n\nThanks for the help though, in case someone else wants to track this in PHP: https://bugs.php.net/bug.php?id=70794\n",
"@WishCow right, that's what I meant (instead of using curl use PHP).\n",
"So I ran into this issue with a PHP server running Zend 1 and the solution that I came up with was to import urllib and then encode the filename like so `files = {'file': (urllib.pathname2url(event.pathname), 'rb')}` and it solved the problem for me. Just adding this in case it might help someone else who runs into this.",
"That fix proved to introduce new problems because it changed the filenames in weird ways. I'm instead working on getting this [PR](https://github.com/urllib3/urllib3/pull/856) in urllib3 to use HTML5 encoding vs. rfc2231 by default reopened. Hopefully this will allow this problem to be fixed for requests as well. I managed to rewrite the request I was using with my patched version of urllib3 based upon the currently closed PR and it worked."
] |
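For servers that do not implement RFC 2231, one workaround (a sketch on Python 2, not code from the thread, assuming the server is happy with a percent-encoded ASCII filename; the upload URL is a placeholder) is to pass an explicit filename in the `files` tuple instead of letting requests derive it from the unicode path:

```python
# -*- coding: utf-8 -*-
import urllib

import requests

original_name = u"漢字.o8d"
ascii_name = urllib.quote(original_name.encode("utf-8"))  # '%E6%BC%A2%E5%AD%97.o8d'

with open(original_name, "rb") as fh:
    response = requests.post(
        "http://example.com/upload",        # placeholder URL
        files={"file": (ascii_name, fh)},   # explicit ASCII-safe filename
    )
```

As noted above, interpreting such a name is up to the server, so this only helps when the receiving application decodes it.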
https://api.github.com/repos/psf/requests/issues/2312
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2312/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2312/comments
|
https://api.github.com/repos/psf/requests/issues/2312/events
|
https://github.com/psf/requests/issues/2312
| 47,065,185 |
MDU6SXNzdWU0NzA2NTE4NQ==
| 2,312 |
SSL request hangs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9436418?v=4",
"events_url": "https://api.github.com/users/ilya112358/events{/privacy}",
"followers_url": "https://api.github.com/users/ilya112358/followers",
"following_url": "https://api.github.com/users/ilya112358/following{/other_user}",
"gists_url": "https://api.github.com/users/ilya112358/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ilya112358",
"id": 9436418,
"login": "ilya112358",
"node_id": "MDQ6VXNlcjk0MzY0MTg=",
"organizations_url": "https://api.github.com/users/ilya112358/orgs",
"received_events_url": "https://api.github.com/users/ilya112358/received_events",
"repos_url": "https://api.github.com/users/ilya112358/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ilya112358/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ilya112358/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ilya112358",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 14 |
2014-10-28T18:44:02Z
|
2021-09-08T19:00:37Z
|
2014-10-29T16:30:12Z
|
NONE
|
resolved
|
The following curl test works fine:
```
curl -q -k --cert client-2048.crt --key client-2048.key https://identitysso.betfair.com/api/certlogin -d "username=testuser&password=testpassword" -H "X-Application: TestKey"
```
But the following python code hangs indefinitely on requests.post:
```
import requests
print 'Request test'
payload = 'username=testuser&password=testpassword'
headers = {
'X-Application': 'TestKey',
'Content-Type': 'application/x-www-form-urlencoded' }
resp = requests.post(
'https://identitysso.betfair.com/api/certlogin',
data=payload,
cert=('client-2048.crt', 'client-2048.key'),
headers=headers )
print 'Status code: ', resp.status_code
```
The last line is never reached; the script hangs with no reply and no error. Pressing Ctrl-C does not abort immediately, but after several minutes it prints:
```
Traceback (most recent call last):
File "F:\LMG\test.py", line 14, in <module>
headers=headers )
File "C:\Python27\lib\site-packages\requests\api.py", line 94, in post
return request('post', url, data=data, json=json, **kwargs)
File "C:\Python27\lib\site-packages\requests\api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python27\lib\site-packages\requests\sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "C:\Python27\lib\site-packages\requests\adapters.py", line 362, in send
timeout=timeout
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 516, in urlopen
body=body, headers=headers)
File "C:\Python27\lib\site-packages\requests\packages\urllib3\connectionpool.py", line 331, in _make_request
httplib_response = conn.getresponse(buffering=True)
File "C:\Python27\lib\httplib.py", line 1030, in getresponse
response.begin()
File "C:\Python27\lib\httplib.py", line 407, in begin
version, status, reason = self._read_status()
File "C:\Python27\lib\httplib.py", line 365, in _read_status
line = self.fp.readline()
File "C:\Python27\lib\socket.py", line 447, in readline
data = self._sock.recv(self._rbufsize)
File "C:\Python27\lib\ssl.py", line 241, in recv
return self.read(buflen)
File "C:\Python27\lib\ssl.py", line 160, in read
return self._sslobj.read(len)
KeyboardInterrupt
```
Python 2.7 under Windows 7. Requests ver. 2.4.3.
I don't have much experience with network programming, and I'm not sure where to go from here. Any ideas why it doesn't work as intended?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2312/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2312/timeline
| null |
completed
| null | null | false |
[
"The server doesn't appear to be responding to the request. The handshake appears to have succeeded, but the server isn't sending anything back. I recommend sending _all_ the headers cURL sent.\n",
"Thanks for quick reaction, here is verbose reply from curl:\n\n```\n* Hostname was NOT found in DNS cache\n* Trying 84.20.200.150...\n* Connected to identitysso.betfair.com (84.20.200.150) port 443 (#0)\n* SSLv3, TLS handshake, Client hello (1):\n* SSLv3, TLS handshake, Server hello (2):\n* SSLv3, TLS handshake, CERT (11):\n* SSLv3, TLS handshake, Server finished (14):\n* SSLv3, TLS handshake, Client key exchange (16):\n* SSLv3, TLS change cipher, Client hello (1):\n* SSLv3, TLS handshake, Finished (20):\n* SSLv3, TLS change cipher, Client hello (1):\n* SSLv3, TLS handshake, Finished (20):\n* SSL connection using TLSv1.0 / AES128-SHA\n* Server certificate:\n* subject: C=GB; ST=London; L=London; O=The Sporting Exchange Limited; OU\n=IT Networks; CN=betfair.com\n* start date: 2014-10-08 09:06:00 GMT\n* expire date: 2016-10-08 09:05:51 GMT\n* issuer: C=US; O=HydrantID (Avalanche Cloud Corporation); CN=HydrantID S\nSL ICA G2\n* SSL certificate verify result: self signed certificate in certificate c\nhain (19), continuing anyway.\n> POST /api/certlogin HTTP/1.1\n> User-Agent: curl/7.38.0\n> Host: identitysso.betfair.com\n> Accept: */*\n> X-Application: TestKey\n> Content-Length: 39\n> Content-Type: application/x-www-form-urlencoded\n>\n* upload completely sent off: 39 out of 39 bytes\n* SSLv3, TLS handshake, Hello request (0):\n* SSLv3, TLS handshake, Client hello (1):\n* SSLv3, TLS handshake, Server hello (2):\n* SSLv3, TLS handshake, CERT (11):\n* SSLv3, TLS handshake, Request CERT (13):\n* SSLv3, TLS handshake, Server finished (14):\n* SSLv3, TLS handshake, CERT (11):\n* SSLv3, TLS handshake, Client key exchange (16):\n* SSLv3, TLS handshake, CERT verify (15):\n* SSLv3, TLS change cipher, Client hello (1):\n* SSLv3, TLS handshake, Finished (20):\n* SSLv3, TLS change cipher, Client hello (1):\n* SSLv3, TLS handshake, Finished (20):\n< HTTP/1.1 200 OK\n< Content-Type: text/plain;charset=ISO-8859-1\n< Content-Length: 46\n< Date: Tue, 28 Oct 2014 19:14:41 GMT\n< Vary: Accept-Encoding\n<\n{\"loginStatus\":\"INVALID_USERNAME_OR_PASSWORD\"}* Connection #0 to host identityss\no.betfair.com left intact\n```\n\nAnything wrong here?\n\nI tried adding the following fields one by one to the headers to no avail.\n\n```\n 'Content-Length': '39',\n 'Accept': '*/*',\n 'Host': 'identitysso.betfair.com',\n 'User-Agent': 'curl/7.38.0' \n```\n\nSeriously mind-boggling )\n",
"Hmm. What Python version are you using?\n",
"As a matter of fact, 2.7.3... Wait a minute! Could it be?.. Installed 2.7.8, works smoothly and efficiently as advertised. Mesa soo stupid! Somehow thought they wouldn't update 2.x versions since they have 3.x already.\n\nLukasa, thank you so much, man! You literally saved my day!\n",
"My pleasure, glad I could help!\n",
"I'm having exactly the same issue with Python 2.7.11 (Anaconda 2.4.0), Requests 2.9.0 on 64-bit Windows 7.\nPost request with certificate attached hangs. If I remove the certificate from requests.post call - then I receive 500 error from the server as expected.\n",
"You say \"post request with the certificate attached\": what exactly does that mean? What specifically are you doing?\n",
"I'm trying to submit a SOAP request. By \"certificate attached\" I mean \"cert\" parameter set in a requests.post call\n\n```\nheaders = {\n 'Accept-Encoding': 'gzip,deflate',\n 'Content-Type': 'application/soap+xml;charset=UTF-8',\n 'User-Agent': 'Apache-HttpClient/4.1.1 (java 1.5)',\n}\nresponse = requests.post(\n url, \n data=soap_request_data,\n cert=('cert.pem', 'key.pem'),\n verify=False,\n timeout=15,\n headers=headers,\n)\n```\n",
"@skvsn Is your key encrypted? That is, does it have a passphrase?\n",
"Yes, it is encrypted. Do I have to provide unencrypted one? Or is there a way to pass the passphrase to requests?\n",
"Currently we have no way for you to provide the passphrase, so you'd have to decrypt it. =(\n",
"@Lukasa Thanks a lot. It works perfectly with unencrypted key. Probably worth mentioning this in the docs.\n",
"Agreed. =)\n",
"By the way, decrypting the key is really simple:\n\nopenssl rsa -in encrypted_key.pem -out unencrypted_key.pem\n"
] |
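Putting the pieces from this thread together, a minimal sketch of the working call once the key has been decrypted (file names as in the issue; requests has no way to take a passphrase, so the key file must be unencrypted):

```python
import requests

response = requests.post(
    "https://identitysso.betfair.com/api/certlogin",
    data={"username": "testuser", "password": "testpassword"},
    cert=("client-2048.crt", "unencrypted_key.pem"),  # key decrypted with `openssl rsa`
    headers={"X-Application": "TestKey"},
)
print(response.status_code)
```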
https://api.github.com/repos/psf/requests/issues/2311
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2311/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2311/comments
|
https://api.github.com/repos/psf/requests/issues/2311/events
|
https://github.com/psf/requests/pull/2311
| 46,989,729 |
MDExOlB1bGxSZXF1ZXN0MjM0MzQ1Njc=
| 2,311 |
Update urllib3 to 25d150b3148
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-10-28T04:58:54Z
|
2021-09-08T09:01:19Z
|
2014-10-28T05:01:19Z
|
CONTRIBUTOR
|
resolved
|
Summary of changes:
- fixes #2307 by raising handshake timeouts as ReadTimeout and not SSLError
- try harder to send a HTTPS request to the server if it fails halfway
- changes to error message in the Retry class (not used by requests)
- add urllib3 function to convert Url objects back into a url string (not used by requests)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2311/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2311/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2311.diff",
"html_url": "https://github.com/psf/requests/pull/2311",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2311.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2311"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2310
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2310/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2310/comments
|
https://api.github.com/repos/psf/requests/issues/2310/events
|
https://github.com/psf/requests/issues/2310
| 46,926,078 |
MDU6SXNzdWU0NjkyNjA3OA==
| 2,310 |
TypeError: initializer for ctype must be a pointer to same type, not cdata
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9384187?v=4",
"events_url": "https://api.github.com/users/ranveer0/events{/privacy}",
"followers_url": "https://api.github.com/users/ranveer0/followers",
"following_url": "https://api.github.com/users/ranveer0/following{/other_user}",
"gists_url": "https://api.github.com/users/ranveer0/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ranveer0",
"id": 9384187,
"login": "ranveer0",
"node_id": "MDQ6VXNlcjkzODQxODc=",
"organizations_url": "https://api.github.com/users/ranveer0/orgs",
"received_events_url": "https://api.github.com/users/ranveer0/received_events",
"repos_url": "https://api.github.com/users/ranveer0/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ranveer0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ranveer0/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ranveer0",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 3 |
2014-10-27T16:22:38Z
|
2021-09-08T23:06:09Z
|
2015-01-19T09:16:30Z
|
NONE
|
resolved
|
I'm seeing an issue with requests that I couldn't find a solution to elsewhere.
I'm doing a POST request to Google with some data that contains client id and token info (both are just Python strings), but I occasionally get this error:
```
TypeError: initializer for ctype 'int(*)(int, X509_STORE_CTX *)' must be a pointer to same type, not cdata 'int(*)(int, X509_STORE_CTX *)'
```
It works some of the time and then throws this error intermittently. I tried encoding all the data and the URL as 'utf-8', but that didn't seem to help; I still get the error sometimes. If I set verify=False it works, but I'd like to keep verification on. Here is the full trace:
```
url = 'https://accounts.google.com/...'
data = {'refresh_token': ...,
'client_id': …,
'client_secret': ...,
'grant_type': 'refresh_token',
}
```
```
File "/path/to/file/file.py", line 124, in _my_function.py
response = requests.post(url, data=data)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 94, in post
return request('post', url, data=data, json=json, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 362, in send
timeout=timeout
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 516, in urlopen
body=body, headers=headers)
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 304, in _make_request
self._validate_conn(conn)
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py", line 722, in _validate_conn
conn.connect()
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connection.py", line 229, in connect
ssl_version=resolved_ssl_version)
File "/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/contrib/pyopenssl.py", line 255, in ssl_wrap_socket
ctx.set_verify(_openssl_verify[cert_reqs], _verify_callback)
File "/usr/local/lib/python2.7/dist-packages/OpenSSL/SSL.py", line 550, in set_verify
:param depth: An integer specifying the verify depth
TypeError: initializer for ctype 'int(*)(int, X509_STORE_CTX *)' must be a pointer to same type, not cdata 'int(*)(int, X509_STORE_CTX *)'
```
Using requests 2.4.2 and pyopenssl 0.14.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2310/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2310/timeline
| null |
completed
| null | null | false |
[
"This seems entirely to be a problem in PyOpenSSL. Have you tried upgrading to 0.15 to see if that fixes the issue?\n",
"I opened an issue with PyOpenSSL. I haven't tried updating, I can't seem to find that version I only find 0.14 version. \n",
"0.14 is the latest, 0.15 is stuck in development hell.\n"
] |
https://api.github.com/repos/psf/requests/issues/2309
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2309/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2309/comments
|
https://api.github.com/repos/psf/requests/issues/2309/events
|
https://github.com/psf/requests/pull/2309
| 46,860,414 |
MDExOlB1bGxSZXF1ZXN0MjMzNTg2Njc=
| 2,309 |
Add DeprecationWarnings to inform users of plans
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-10-27T00:28:03Z
|
2021-09-08T09:01:15Z
|
2014-11-12T17:36:03Z
|
CONTRIBUTOR
|
resolved
|
After a long discussion in IRC and on several issues, the developers of
requests have decided to remove specific functions from requests.utils
in version 3.0.0. To give users ample time to prepare for this, we've
added DeprecationWarnings long in advance. See also the planning of this
in issue #2266.
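For illustration, here's roughly the shape of the shim being added (the function name below is made up, not one of the actual `requests.utils` functions being deprecated):

```python
import warnings


def _legacy_helper(value):
    # Stand-in for the existing implementation, which keeps working for now.
    return value.strip()


def legacy_helper(value):
    # Deprecated public wrapper: warn, then delegate to the old code.
    warnings.warn(
        "legacy_helper() is deprecated and will be removed in requests 3.0.0.",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this shim
    )
    return _legacy_helper(value)
```

Using `stacklevel=2` points the warning at the caller's line rather than at the shim itself.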
My only concern with this is that these DeprecationWarnings are on by default in Python 2.6 and off everywhere else. Should we disable them for 2.6? A not-insignificant portion of our user base may be affected by this.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2309/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2309/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2309.diff",
"html_url": "https://github.com/psf/requests/pull/2309",
"merged_at": "2014-11-12T17:36:03Z",
"patch_url": "https://github.com/psf/requests/pull/2309.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2309"
}
| true |
[
":+1: :cake: LGTM.\n",
"I think we should optimize for user experience. I'm personally not a fan of deprecation warnings ever, so I'd lean towards disabling them for Python 2.6's userbase — but that's because I'm choosing explicitness over implicitness. Explicitness not in the enabling/disabling, but explicitness in our intention of \"our users shouldn't have to disable this\".\n\nIf we were optimizing for consistency, we should definately let Python 2.6 choose that behavior. But... that's basically exactly why Requests exists — to establish these patterns :)\n",
"I feel like we should document all these fun decisions that make Requests great. I wonder what we'd call it.\n\nMaybe it should just be a series of blog posts. idk...\n",
"it'd be like 12factor for libraries\n",
"So I toyed with the idea of disabling warnings everywhere while adding this and adding something like `requests.enable_warnings()` but I didn't think you'd want that as a top-level API call. Thoughts?\n",
"So, @kennethreitz could you give me a bit more direction regarding what you want me to do with this? Should I default disable the warning everywhere and then provide a function for users to enable them? \n",
"Hopefully no one uses these functions anyway :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2308
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2308/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2308/comments
|
https://api.github.com/repos/psf/requests/issues/2308/events
|
https://github.com/psf/requests/pull/2308
| 46,852,456 |
MDExOlB1bGxSZXF1ZXN0MjMzNTUxNDE=
| 2,308 |
Note about read timeout errors and max_retries
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-26T19:46:54Z
|
2021-09-08T09:01:16Z
|
2014-11-12T17:35:24Z
|
CONTRIBUTOR
|
resolved
|
Update HISTORY to note this. The change was made in urllib3 and not picked up here.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2308/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2308/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2308.diff",
"html_url": "https://github.com/psf/requests/pull/2308",
"merged_at": "2014-11-12T17:35:24Z",
"patch_url": "https://github.com/psf/requests/pull/2308.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2308"
}
| true |
[
"As reported/commented on in #2307.\n",
"I'm hesitant to merge this as it essentially muddies history. You're modifying a changelog after the release has been cut. This amounts to changing history under a user's feet in future releases. If a user read the release notes for 2.4.0 and then reads the release notes for the next version we release, (assuming they read back down to 2.4.0) they'll be surprised they didn't see this before and rightfully so. It's unfair to the user and to anyone else looking at the HISTORY.\n\nI'd rather this be added to the docs rather than adding them here. I haven't quite thought about the best place for this in the docs though.\n",
"Alright I moved this to the `max_retries` parameter documentation in the code\n",
"And this is going to cause a merge conflict with #2216. If @Lukasa doesn't object, we can merge this now and I'll update #2216 soon there after with a merge of master.\n"
] |
https://api.github.com/repos/psf/requests/issues/2307
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2307/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2307/comments
|
https://api.github.com/repos/psf/requests/issues/2307/events
|
https://github.com/psf/requests/issues/2307
| 46,844,193 |
MDU6SXNzdWU0Njg0NDE5Mw==
| 2,307 |
SSL connections with timeout dont seem to respect retries
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3061360?v=4",
"events_url": "https://api.github.com/users/itaiin/events{/privacy}",
"followers_url": "https://api.github.com/users/itaiin/followers",
"following_url": "https://api.github.com/users/itaiin/following{/other_user}",
"gists_url": "https://api.github.com/users/itaiin/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/itaiin",
"id": 3061360,
"login": "itaiin",
"node_id": "MDQ6VXNlcjMwNjEzNjA=",
"organizations_url": "https://api.github.com/users/itaiin/orgs",
"received_events_url": "https://api.github.com/users/itaiin/received_events",
"repos_url": "https://api.github.com/users/itaiin/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/itaiin/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/itaiin/subscriptions",
"type": "User",
"url": "https://api.github.com/users/itaiin",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2014-10-26T14:25:38Z
|
2021-09-08T23:04:45Z
|
2015-05-31T12:57:54Z
|
NONE
|
resolved
|
Sending an HTTPS request with a timeout set does not retry when the request times out.
Code:
``` python
import requests
s = requests.session()
s.mount('https://localhost:10000', requests.adapters.HTTPAdapter(max_retries = 3))
s.post('https://localhost:10000', timeout = 3)
```
Raises this exception (with no retries):
```
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 498, in post
return self.request('POST', url, data=data, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 456, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 559, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 382, in send
raise SSLError(e, request=request)
SSLError: _ssl.c:492: The handshake operation timed out
```
A similar request (even using the same session object) with `http` instead of `https` raises the exception `Timeout: HTTPConnectionPool(host='localhost', port=10000): Read timed out. (read timeout=1)` after 3 retries as expected.
This does not seem to be a general issue with SSL + retries; I've successfully used both together in other cases.
The problem was found on version 2.3.0, both on Linux and Windows, and also on the latest version, 2.4.3 (tried on Linux only).
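For reference, a rough workaround sketch (purely illustrative, not part of requests) that retries timed-out requests at the application level:

```python
import requests
from requests.exceptions import SSLError, Timeout

session = requests.Session()


def post_with_retries(url, attempts=3, **kwargs):
    # Manual retry loop; only sensible when the request is idempotent.
    # SSLError is caught as well because, as described above, a handshake
    # that times out currently surfaces as SSLError rather than Timeout.
    for attempt in range(attempts):
        try:
            return session.post(url, timeout=3, **kwargs)
        except (Timeout, SSLError):
            if attempt == attempts - 1:
                raise


# post_with_retries('https://localhost:10000')
```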
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2307/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2307/timeline
| null |
completed
| null | null | false |
[
"Taking a look now\n",
"Previously (before v2.4.0) `max_retries` meant requests (specifically, the logic in urllib3 underlying requests) would retry the request if a server failed to return a response (a read timeout). After 2.4.0 (and urllib3 1.9), read timeouts are no longer retried.\n\nI made this change because a read timeout implies the request made it to the server, and if the code retried the request (especially with a POST request as you are trying above) this means developers could un-intentionally send duplicate data. If you are trying to POST to a server to charge a credit card, and the server is accepting data but not responding in time, an auto-retry could have really bad consequences. Another reason not to retry read timeouts is to avoid overloading a server which is (possibly) having a hard time responding to requests in the specified amount of time.\n\nThe `max_retries` parameter will still retry anything implying that the connection never made it to the server - connection errors and connection timeouts - as the 3rd code block in the gist, and the logging data, indicates.\n\nI tried this with [the following gist](https://gist.github.com/kevinburke/13e10e23838361955223), and neither protocol (http or https) retried. (For the most accurate timing info run this a few times so DNS, caches are warm etc)\n\n### version 2.3.0\n\n```\n# Timeout set to 3 seconds\n2.3.0\n2014-10-26 12:39:08,697 DEBUG Added an stderr logging handler to logger: requests.packages.urllib3\nChecking request read timeout over HTTP\n2014-10-26 12:39:08,883 INFO Starting new HTTP connection (1): httpbin.org\n2014-10-26 12:39:12,420 WARNING Retrying (3 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPConnectionPool(host='httpbin.org', port=80): Read timed out. (read timeout=3)\",)': /delay/5\n2014-10-26 12:39:12,420 INFO Starting new HTTP connection (2): httpbin.org\n2014-10-26 12:39:15,629 WARNING Retrying (2 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPConnectionPool(host='httpbin.org', port=80): Read timed out. (read timeout=3)\",)': /delay/5\n2014-10-26 12:39:15,630 INFO Starting new HTTP connection (3): httpbin.org\n2014-10-26 12:39:18,835 WARNING Retrying (1 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPConnectionPool(host='httpbin.org', port=80): Read timed out. 
(read timeout=3)\",)': /delay/5\n2014-10-26 12:39:18,836 INFO Starting new HTTP connection (4): httpbin.org\nrequest took 13.3297190666 seconds to error\nChecking request read timeout over HTTPS\n2014-10-26 12:39:22,043 INFO Starting new HTTPS connection (1): httpbin.org\n2014-10-26 12:39:25,622 WARNING Retrying (3 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out.\",)': /delay/5\n2014-10-26 12:39:25,623 INFO Starting new HTTPS connection (2): httpbin.org\n2014-10-26 12:39:29,609 WARNING Retrying (2 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out.\",)': /delay/5\n2014-10-26 12:39:29,610 INFO Starting new HTTPS connection (3): httpbin.org\n2014-10-26 12:39:33,173 WARNING Retrying (1 attempts remain) after connection broken by 'ReadTimeoutError(\"HTTPSConnectionPool(host='httpbin.org', port=443): Read timed out.\",)': /delay/5\n2014-10-26 12:39:33,173 INFO Starting new HTTPS connection (4): httpbin.org\nrequest took 14.7081508636 seconds to error\nChecking request connection timeout over HTTP\n2014-10-26 12:39:36,752 INFO Starting new HTTP connection (1): httpbin.org\n2014-10-26 12:39:38,754 WARNING Retrying (3 attempts remain) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x0000000105926720>, 'Connection to httpbin.org timed out. (connect timeout=2)')': /\n2014-10-26 12:39:38,754 INFO Starting new HTTP connection (2): httpbin.org\n2014-10-26 12:39:40,756 WARNING Retrying (2 attempts remain) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x0000000105926720>, 'Connection to httpbin.org timed out. (connect timeout=2)')': /\n2014-10-26 12:39:40,757 INFO Starting new HTTP connection (3): httpbin.org\n2014-10-26 12:39:42,759 WARNING Retrying (1 attempts remain) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connectionpool.HTTPConnectionPool object at 0x0000000105926720>, 'Connection to httpbin.org timed out. (connect timeout=2)')': /\n2014-10-26 12:39:42,759 INFO Starting new HTTP connection (4): httpbin.org\nrequest took 8.01387906075 seconds to error\n```\n\n### version 2.4.3\n\n```\n# Timeout set to 3 seconds\n2.4.3\n2014-10-26 12:33:30,025 DEBUG Added a stderr logging handler to logger: requests.packages.urllib3\nChecking request read timeout over HTTP\n2014-10-26 12:33:30,623 INFO Starting new HTTP connection (1): httpbin.org\nrequest took 3.75001692772 seconds to error\n\nChecking request read timeout over HTTPS\n2014-10-26 12:33:33,787 INFO Starting new HTTPS connection (1): httpbin.org\nrequest took 3.53643298149 seconds to error\n\nChecking request connection timeout over HTTP\n2014-10-26 12:33:37,325 INFO Starting new HTTP connection (1): httpbin.org\n2014-10-26 12:33:39,328 DEBUG Incremented Retry for (url='/'): Retry(total=2, connect=None, read=False, redirect=None)\n2014-10-26 12:33:39,328 WARNING Retrying (Retry(total=2, connect=None, read=False, redirect=None)) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x0000000103664138>, 'Connection to httpbin.org timed out. 
(connect timeout=2)')': /\n2014-10-26 12:33:39,328 INFO Starting new HTTP connection (2): httpbin.org\n2014-10-26 12:33:41,331 DEBUG Incremented Retry for (url='/'): Retry(total=1, connect=None, read=False, redirect=None)\n2014-10-26 12:33:41,331 WARNING Retrying (Retry(total=1, connect=None, read=False, redirect=None)) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x0000000103664170>, 'Connection to httpbin.org timed out. (connect timeout=2)')': /\n2014-10-26 12:33:41,332 INFO Starting new HTTP connection (3): httpbin.org\n2014-10-26 12:33:43,334 DEBUG Incremented Retry for (url='/'): Retry(total=0, connect=None, read=False, redirect=None)\n2014-10-26 12:33:43,335 WARNING Retrying (Retry(total=0, connect=None, read=False, redirect=None)) after connection broken by 'ConnectTimeoutError(<requests.packages.urllib3.connection.HTTPConnection object at 0x00000001036641a8>, 'Connection to httpbin.org timed out. (connect timeout=2)')': /\n2014-10-26 12:33:43,335 INFO Starting new HTTP connection (4): httpbin.org\nrequest took 8.01690006256 seconds to error\n```\n\nTo retry even when the server doesn't return a response in time, you'll want to either catch a `Timeout` in your code and re-run the request, or wait for #2216 to get merged and then pass a custom `urllib3.util.Retry(total=3, read=3)` object to the `max_retries` parameter.\n\nI'll update the HISTORY log to indicate this change.\n",
"This was extremely subtle and something none of us caught. I'm sorry for the inconvenience @itaiin.\n\nThanks for figuring this out @kevinburke. I agree with your reasoning about not retrying a POST request for those exact reasons.\n",
"Hi, thanks for looking into it!\n\nI don't necessarily agreeing with the logic - The developer should know if their requests are made to a method which is idempotent or not. In fact - this is exactly my case, where I have some that do, for which I use retries and some that dont - so I just dont.\n\nIn any case, the behavior seems to be inconsistent between http and https, which I believe to be bad anyway.\n",
"Agreed that idempotent requests are generally safe to retry. But there are a lot of failures that Requests could probably retry and chooses not to, for example 500 Server Error, 429 Too Many Requests, 503 Service Unavailable, etc. In the hierarchy of errors, a read timeout feels closer to a 500 server error, since data made it to the server in both cases.\n\n[A `Retry(read=3, total=3)` object](https://urllib3.readthedocs.org/en/latest/helpers.html#module-urllib3.util.retry) _will_ retry failed requests made with GET, PUT, DELETE, etc (once #2216 is merged, which should be soon), and you can configure it using the `method_whitelist` parameter.\n\n> In any case, the behavior seems to be inconsistent between http and https, which I believe to be bad anyway.\n\nI wasn't able to reproduce this. I tried to reproduce the code you posted using the following gist: https://gist.github.com/kevinburke/13e10e23838361955223. When running the gist, the timeout behavior was the same for HTTP and HTTPS, in both 2.3.0 and 2.4.3 (for Python 2.7.8). If you could paste more code, log data or turn on debug logging, I can help figure out why the behaviour is inconsistent.\n",
"Well, idempotent requests are safe to retry by definition :). I agree that timeout could go either way since its the one error where you are not sure if both parties agree that there was an error, still if it was my call I think I would of vote for retries but.. ok.\n\nAbout reproducing - there's no more code needed... What you need is having a server that accepts the tcp connection but does do anything else - I used netcat for that (`nc -L -p 10000`). That also allowed me to see if retries actually happen instead of deducing that from the time passed.\nSo anyway, I've ran the code using the url from your gist and indeed there was no difference for me either. I get 14 seconds for both\n\nSo perhaps the difference is that with the netcat scenerio the timeout occures while the SSL handshake is done, and not while waiting for HTTP transmissions? (BTW: Referring back to the error hierarchy - this feels more like a connection timeout to me ;))\n",
"Okay, I tried it out and I can see the difference now.\n- in v2.3.0, requests via HTTP to this TCP-only endpoint would retry 3 times. requests via HTTPS would fail with an SSLError (and not retry)\n- in v2.4.3, requests via HTTP fail once with a read timeout, and requests via HTTPS fail with an SSLError (and not retry). \n\nIt seems bad that \"The handshake operation timed out\" is raised as an SSLError.\n\nAdded https://github.com/shazow/urllib3/issues/492 to discuss this.\n",
"@kevinburke thanks for all the effort!\n"
] |
https://api.github.com/repos/psf/requests/issues/2306
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2306/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2306/comments
|
https://api.github.com/repos/psf/requests/issues/2306/events
|
https://github.com/psf/requests/pull/2306
| 46,814,257 |
MDExOlB1bGxSZXF1ZXN0MjMzNDAwMjg=
| 2,306 |
Fix failing test test_prepare_unicode_url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/125019?v=4",
"events_url": "https://api.github.com/users/ssadler/events{/privacy}",
"followers_url": "https://api.github.com/users/ssadler/followers",
"following_url": "https://api.github.com/users/ssadler/following{/other_user}",
"gists_url": "https://api.github.com/users/ssadler/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssadler",
"id": 125019,
"login": "ssadler",
"node_id": "MDQ6VXNlcjEyNTAxOQ==",
"organizations_url": "https://api.github.com/users/ssadler/orgs",
"received_events_url": "https://api.github.com/users/ssadler/received_events",
"repos_url": "https://api.github.com/users/ssadler/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssadler/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssadler/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssadler",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 6 |
2014-10-25T15:06:09Z
|
2021-09-08T10:01:00Z
|
2014-10-27T00:32:46Z
|
CONTRIBUTOR
|
resolved
|
The test was failing because `hooks` was left undefined (None) when it reached prepare_hooks().
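For reference, a minimal sketch of the failing call (illustrative, not the exact test):

```python
from requests.models import PreparedRequest

p = PreparedRequest()
# Omitting hooks makes prepare_hooks() receive None and fail while iterating;
# passing an empty list avoids that.
p.prepare(method='GET', url='http://www.example.com/', hooks=[])
```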
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2306/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2306/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2306.diff",
"html_url": "https://github.com/psf/requests/pull/2306",
"merged_at": "2014-10-27T00:32:46Z",
"patch_url": "https://github.com/psf/requests/pull/2306.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2306"
}
| true |
[
"So one quick note. This test is failing because the _test_ is wrong, not the `prepare_hooks` method is wrong. In the test, can you pass `hooks=[]` please? That will fix this better. Sorry for the confusion. I hadn't looked closely at the commit to fix this test.\n",
"That ok? I like to avoid using [] as a default argument since it's mutable!\n",
"@ssadler No. I wanted you to fix [the test](https://github.com/kennethreitz/requests/blob/master/test_requests.py#L1514) by passing `hooks=[]` to the `prepare` method.\n",
"Excellent @ssadler ! Thanks!\n",
":sparkles: :cake: :sparkles:\n",
"Welcome!\n"
] |
https://api.github.com/repos/psf/requests/issues/2305
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2305/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2305/comments
|
https://api.github.com/repos/psf/requests/issues/2305/events
|
https://github.com/psf/requests/pull/2305
| 46,810,077 |
MDExOlB1bGxSZXF1ZXN0MjMzMzg0MTc=
| 2,305 |
Fix 1979 digest redirects
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/125019?v=4",
"events_url": "https://api.github.com/users/ssadler/events{/privacy}",
"followers_url": "https://api.github.com/users/ssadler/followers",
"following_url": "https://api.github.com/users/ssadler/following{/other_user}",
"gists_url": "https://api.github.com/users/ssadler/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ssadler",
"id": 125019,
"login": "ssadler",
"node_id": "MDQ6VXNlcjEyNTAxOQ==",
"organizations_url": "https://api.github.com/users/ssadler/orgs",
"received_events_url": "https://api.github.com/users/ssadler/received_events",
"repos_url": "https://api.github.com/users/ssadler/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ssadler/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ssadler/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ssadler",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-10-25T11:57:53Z
|
2021-09-08T10:01:01Z
|
2014-10-25T14:11:26Z
|
CONTRIBUTOR
|
resolved
|
Ran into #1979. I think it's reasonable to try the digest request again every time we encounter a 3xx. Also changed the state variable `num_401_calls` to a boolean `do_401`, since it's only used to decide whether we should try again, and decrementing it on a 3xx would be misleading. I assumed it would be safe to rename it since it looks like it's supposed to be hidden (not declared in the constructor). I also assume that the existing max-redirects handling will still work. A regression test is still to come.
Oh, this also fixes a previously failing test... I haven't looked into that one very deeply though.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2305/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2305/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2305.diff",
"html_url": "https://github.com/psf/requests/pull/2305",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2305.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2305"
}
| true |
[
"## Did we not merge the other PR we had for this? \n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Hey @ssadler thanks for this, but we already had a different [PR](https://github.com/kennethreitz/requests/pull/2253) for this, which is much simpler. If you want to send a separate PR for that test failure, please do and still add yourself to the AUTHORS file. I'll merge that ASAP\n",
"Great, done! https://github.com/kennethreitz/requests/pull/2306\n"
] |
https://api.github.com/repos/psf/requests/issues/2304
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2304/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2304/comments
|
https://api.github.com/repos/psf/requests/issues/2304/events
|
https://github.com/psf/requests/issues/2304
| 46,799,962 |
MDU6SXNzdWU0Njc5OTk2Mg==
| 2,304 |
[Feature] Add browser compatible TLS hostname validator
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/772364?v=4",
"events_url": "https://api.github.com/users/Fuzion24/events{/privacy}",
"followers_url": "https://api.github.com/users/Fuzion24/followers",
"following_url": "https://api.github.com/users/Fuzion24/following{/other_user}",
"gists_url": "https://api.github.com/users/Fuzion24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Fuzion24",
"id": 772364,
"login": "Fuzion24",
"node_id": "MDQ6VXNlcjc3MjM2NA==",
"organizations_url": "https://api.github.com/users/Fuzion24/orgs",
"received_events_url": "https://api.github.com/users/Fuzion24/received_events",
"repos_url": "https://api.github.com/users/Fuzion24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Fuzion24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Fuzion24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Fuzion24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-10-25T02:43:14Z
|
2021-09-08T23:07:07Z
|
2014-10-25T03:02:06Z
|
NONE
|
resolved
|
In browsers, a `*.domain` wildcard matches subdomains recursively. Other frameworks/languages have made this behaviour available by implementing a [compatibility mode](http://developer.android.com/reference/org/apache/http/conn/ssl/BrowserCompatHostnameVerifier.html). It would be nice to add an option here to make `*.domain.com` match recursively.
Here's an example:
```
requests.exceptions.SSLError: hostname 'android.clients.google.com' doesn't match either of '*.google.com', '*.android.com', '*.appengine.google.com', '*.cloud.google.com', '*.google-analytics.com', '*.google.ca', '*.google.cl', '*.google.co.in', '*.google.co.jp', '*.google.co.uk', '*.google.com.ar', '*.google.com.au', '*.google.com.br', '*.google.com.co', '*.google.com.mx', '*.google.com.tr', '*.google.com.vn', '*.google.de', '*.google.es', '*.google.fr', '*.google.hu', '*.google.it', '*.google.nl', '*.google.pl', '*.google.pt', '*.googleadapis.com', '*.googleapis.cn', '*.googlecommerce.com', '*.googlevideo.com', '*.gstatic.cn', '*.gstatic.com', '*.gvt1.com', '*.gvt2.com', '*.metric.gstatic.com', '*.urchin.com', '*.url.google.com', '*.youtube-nocookie.com', '*.youtube.com', '*.youtubeeducation.com', '*.ytimg.com', 'android.com', 'g.co', 'goo.gl', 'google-analytics.com', 'google.com', 'googlecommerce.com', 'urchin.com', 'youtu.be', 'youtube.com', 'youtubeeducation.com'
```
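To make the requested behaviour concrete, here's an illustrative sketch of the difference (not requests/urllib3 code):

```python
def browser_compat_match(pattern, hostname):
    # Standard matching (RFC 6125) lets '*' cover exactly one DNS label, so
    # '*.google.com' does NOT match 'android.clients.google.com'. The
    # "browser compatible" mode requested here lets it cover any depth.
    pattern, hostname = pattern.lower(), hostname.lower()
    if not pattern.startswith('*.'):
        return pattern == hostname
    return hostname.endswith(pattern[1:])  # e.g. endswith('.google.com')


print(browser_compat_match('*.google.com', 'android.clients.google.com'))  # True
```

The strict one-label rule is what standard validation enforces, which is why the hostname above is rejected when the served certificate only lists `*.google.com`.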
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2304/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2304/timeline
| null |
completed
| null | null | false |
[
"This is in fact already possible if you do `pip install requests[security]`. This has to do with a feature called SNI. In the future, please research other (open and closed) issues before opening new ones.\n",
"@sigmavirus24 I think this is different than SNI.\n@Fuzion24 wants, that requests accepts certificates for `*.example.com` when requesting `a.b.c.example.com`. The standard defines, that `*` is _not_ recursive and only matches one level.\n\nThe problem with the example is, that with SNI we get a certificate which is valid for `*.clients.google.com` which is correct according to the standard.\nWithout SNI get one which is only valid under the requested validation rules.\n\nPersonally I am -1 on this.\n",
"## If you make a temporary environment and install requests with security and make a get request against the domain they're requesting, you get a 404 instead of a SSLError. If it isn't SNI, those extra dependencies still fix it for them.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Yep, I wanted to say, that is a genuine feature request with a very unfortunate example.\n",
"Yeah, @t-8ch, we're in agreement here. =D\n"
] |
https://api.github.com/repos/psf/requests/issues/2303
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2303/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2303/comments
|
https://api.github.com/repos/psf/requests/issues/2303/events
|
https://github.com/psf/requests/issues/2303
| 46,788,564 |
MDU6SXNzdWU0Njc4ODU2NA==
| 2,303 |
Python 3.4: "ReferenceError: weakly-referenced object no longer exists" in __del__ of a class which refers to a Session object
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5781687?v=4",
"events_url": "https://api.github.com/users/shichao-an/events{/privacy}",
"followers_url": "https://api.github.com/users/shichao-an/followers",
"following_url": "https://api.github.com/users/shichao-an/following{/other_user}",
"gists_url": "https://api.github.com/users/shichao-an/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shichao-an",
"id": 5781687,
"login": "shichao-an",
"node_id": "MDQ6VXNlcjU3ODE2ODc=",
"organizations_url": "https://api.github.com/users/shichao-an/orgs",
"received_events_url": "https://api.github.com/users/shichao-an/received_events",
"repos_url": "https://api.github.com/users/shichao-an/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shichao-an/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shichao-an/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shichao-an",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2014-10-24T22:04:26Z
|
2021-09-08T08:00:43Z
|
2015-02-17T04:40:22Z
|
NONE
|
resolved
|
I got the following error when running my code in Python 3.4 (Python 2.7 and 3.3 are fine):
```
Exception ignored in: <bound method API.__del__ of <u115.api.API object at 0x109dadb00>>
Traceback (most recent call last):
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/u115/api.py", line 148, in __del__
if self.auto_logout and self.has_logged_in:
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/u115/api.py", line 195, in has_logged_in
r = self.http.get(self.passport.checkpoint_url, params=params)
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/u115/api.py", line 36, in get
r = self.session.get(url, params=params)
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/requests/sessions.py", line 469, in get
return self.request('GET', url, **kwargs)
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/requests/sessions.py", line 563, in send
adapter = self.get_adapter(url=request.url)
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/site-packages/requests/sessions.py", line 636, in get_adapter
for (prefix, adapter) in self.adapters.items():
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/_collections_abc.py", line 484, in __iter__
for key in self._mapping:
File "/Users/shichao/envs/115wangpan-dl3/lib/python3.4/collections/__init__.py", line 87, in __iter__
curr = root.next
ReferenceError: weakly-referenced object no longer exists
```
The source code is here: https://github.com/shichao-an/115wangpan/blob/master/u115/api.py#L147
I think it may be related to the feature changes in 3.4 pertaining to `__del__` and weakref:
- https://docs.python.org/3/library/gc.html#gc.garbage
- https://docs.python.org/3/library/weakref.html#comparing-finalizers-with-del-methods
I wonder if there is any workaround.
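One possible workaround sketch (purely illustrative, using a simplified stand-in for the real `API` class): register the cleanup with `weakref.finalize` instead of relying on `__del__`, so the logout request runs before interpreter teardown starts breaking weak references:

```python
import weakref

import requests


class API(object):
    def __init__(self, checkpoint_url):
        self.checkpoint_url = checkpoint_url
        self.http = requests.Session()
        # Runs _logout when the object is collected or at interpreter exit,
        # whichever comes first, while the session is still usable.
        self._finalizer = weakref.finalize(
            self, self._logout, self.http, checkpoint_url)

    @staticmethod
    def _logout(session, url):
        try:
            session.get(url, timeout=3)
        except requests.RequestException:
            pass
```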
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2303/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2303/timeline
| null |
completed
| null | null | false |
[
"I see this error too!\n",
"So this is coming out of the standard library. The standard library implementation of an OrderedDict (which is included in Python 3.4) uses a [weakref for the root reference](https://hg.python.org/cpython/file/534b26837a13/Lib/collections/__init__.py#l74). This is a bug in Python 3.4 and should be reported on bugs.python.org.\n",
"As it seems that nobody bothered to report this I did it at http://bugs.python.org/issue23841\n",
"Anyone try Python 3.5 to see if it goes away?\n",
"@vvolkman it seems as though this is \"Not a bug\" to the upstream CPython develoeprs. Their recommendation is for people to [stop using `__del__`](https://bugs.python.org/msg240246) and instead use `weakref.finalize`. I'm going to check to ensure this doesn't actually affect requests.\n",
"What are the real consequences of this bug? Is data lost? If it is just a case of a spurious warning, I'll ignore it and get back to coding my apps. Sorry for the dumb question but the implications to app that uses requests are unclear.\n",
"It shouldn't affect you unless you try to use `__del__` and do something with a session on the object where you've defined that. The \"real consequences\" are an unhandled exception that appears to come from requests but is a result of your application using `__del__`. This will affect all versions of Python after and including 3.4 (as best I can tell).\n",
"It seems that upstream (https://bugs.python.org/issue23841) has given suggestions to how requests can fix this and has closed it as \"not a bug\" in python. I'd appreciate it if this issue could be re-opened and someone could implement their suggested fix.",
"@SwartzCr The suggestion is \"don't do anything complex in `__del__`, and the source of the `__del__` method is user code. I have no problem saying that users shouldn't make HTTP requests from `__del__`.",
"oh I see - I think I got confused about where the offending code was"
] |
https://api.github.com/repos/psf/requests/issues/2302
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2302/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2302/comments
|
https://api.github.com/repos/psf/requests/issues/2302/events
|
https://github.com/psf/requests/pull/2302
| 46,734,973 |
MDExOlB1bGxSZXF1ZXN0MjMyOTQyMTA=
| 2,302 |
Improve instructions for running the tests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-24T12:57:49Z
|
2021-09-08T10:01:01Z
|
2014-10-24T13:43:19Z
|
CONTRIBUTOR
|
resolved
|
py.test must be run from inside the requests folder, otherwise it will
also try to run the tests belonging to the bundled packages.
Change-Id: Ic9ca135b2bfc08e1ada68218cee069ebfbd843fa
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2302/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2302/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2302.diff",
"html_url": "https://github.com/psf/requests/pull/2302",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2302.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2302"
}
| true |
[
"I've had no problems running py.test from the root directory. What bundled packages are you talking about?\n",
"Sorry, it's just me misreading the output. I thought it was failing on something in the packages folder, but it turns out it was because I created my virtualenv in the root of the git.\n"
] |
https://api.github.com/repos/psf/requests/issues/2301
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2301/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2301/comments
|
https://api.github.com/repos/psf/requests/issues/2301/events
|
https://github.com/psf/requests/pull/2301
| 46,734,676 |
MDExOlB1bGxSZXF1ZXN0MjMyOTQwNDI=
| 2,301 |
Fix "TypeError: 'NoneType' object is not iterable" in prepare_hooks
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-24T12:54:24Z
|
2021-09-08T10:01:02Z
|
2014-10-24T15:03:06Z
|
CONTRIBUTOR
|
resolved
|
This was causing one of the tests to fail.
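A minimal sketch of the kind of defensive change the title describes, assuming `hooks` may arrive as `None` when `prepare()` is called without any hooks; this is an illustration, not the exact patch:

``` python
def prepare_hooks(self, hooks):
    """Prepare the given hooks, tolerating a None value."""
    # Guard against None so the iteration below cannot raise
    # "TypeError: 'NoneType' object is not iterable".
    hooks = hooks or {}
    for event in hooks:
        self.register_hook(event, hooks[event])
```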
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2301/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2301/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2301.diff",
"html_url": "https://github.com/psf/requests/pull/2301",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2301.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2301"
}
| true |
[
"Which test? This should never happen, hooks should never be the none object.\n",
"@Lukasa I just ran the tests and it's arising from `test_prepare_unicode_url` because it's creating a `PreparedRequest` and then calling `prepare` without passing any hooks. This is a poorly written test, not something that should be fixed here.\n"
] |
https://api.github.com/repos/psf/requests/issues/2300
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2300/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2300/comments
|
https://api.github.com/repos/psf/requests/issues/2300/events
|
https://github.com/psf/requests/issues/2300
| 46,712,337 |
MDU6SXNzdWU0NjcxMjMzNw==
| 2,300 |
404 response for a valid unicode url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4245598?v=4",
"events_url": "https://api.github.com/users/alisufian/events{/privacy}",
"followers_url": "https://api.github.com/users/alisufian/followers",
"following_url": "https://api.github.com/users/alisufian/following{/other_user}",
"gists_url": "https://api.github.com/users/alisufian/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alisufian",
"id": 4245598,
"login": "alisufian",
"node_id": "MDQ6VXNlcjQyNDU1OTg=",
"organizations_url": "https://api.github.com/users/alisufian/orgs",
"received_events_url": "https://api.github.com/users/alisufian/received_events",
"repos_url": "https://api.github.com/users/alisufian/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alisufian/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alisufian/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alisufian",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-10-24T07:50:09Z
|
2021-09-08T23:07:08Z
|
2014-10-24T10:21:36Z
|
NONE
|
resolved
|
```
In [5]: requests.get("http://www.daleelak.tv/uploads/photos/12/%D8%B3%D8%AA%D8%A7%D9%8A%D9%84.jpg")
Out[5]: <Response [404]>
```
```
curl -v "http://www.daleelak.tv/uploads/photos/12/%D8%B3%D8%AA%D8%A7%D9%8A%D9%84.jpg"
* Hostname was NOT found in DNS cache
* Trying 108.174.158.190...
* Connected to www.daleelak.tv (108.174.158.190) port 80 (#0)
> GET /uploads/photos/12/%D8%B3%D8%AA%D8%A7%D9%8A%D9%84.jpg HTTP/1.1
> User-Agent: curl/7.38.0
> Host: www.daleelak.tv
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Fri, 24 Oct 2014 07:38:58 GMT
* Server Apache is not blacklisted
< Server: Apache
< ETag: "54355-b7f9-4d6b38fce61c0"
< Accept-Ranges: bytes
< Content-Length: 47097
< Cache-Control: max-age=864000, public, must-revalidate
< Expires: Sat, 25 Oct 2014 07:38:58 GMT
< Vary: Accept-Encoding
< Content-Type: image/jpeg
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2300/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2300/timeline
| null |
completed
| null | null | false |
[
"i am also facing the same issue, urllib seems to be working fine with these sort of urls, but requests is returning 404 error\n",
"This has nothing to do with the URL and everything to do with the crappy web server. Observe:\n\n``` python\n>>> url = \"http://www.daleelak.tv/uploads/photos/12/%D8%B3%D8%AA%D8%A7%D9%8A%D9%84.jpg\"\n>>> r = requests.get(url)\n>>> r.status_code\n404\n```\n\nCompare with:\n\n``` python\n>>> url = \"http://www.daleelak.tv/uploads/photos/12/%D8%B3%D8%AA%D8%A7%D9%8A%D9%84.jpg\"\n>>> r = requests.get(url, headers={'User-Agent': 'curl/7.38.0'})\n>>> r.status_code\n200\n```\n\nLooks like for some reason we're getting blacklisted. Alternatively, for some reason curl is getting whitelisted (this seems less likely).\n\nRegardless, this isn't our bug.\n",
"Thank You!\n"
] |
https://api.github.com/repos/psf/requests/issues/2299
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2299/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2299/comments
|
https://api.github.com/repos/psf/requests/issues/2299/events
|
https://github.com/psf/requests/pull/2299
| 46,702,204 |
MDExOlB1bGxSZXF1ZXN0MjMyNzQ5NTY=
| 2,299 |
Cap the redirect_cache size to prevent memory abuse
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/375744?v=4",
"events_url": "https://api.github.com/users/mattrobenolt/events{/privacy}",
"followers_url": "https://api.github.com/users/mattrobenolt/followers",
"following_url": "https://api.github.com/users/mattrobenolt/following{/other_user}",
"gists_url": "https://api.github.com/users/mattrobenolt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mattrobenolt",
"id": 375744,
"login": "mattrobenolt",
"node_id": "MDQ6VXNlcjM3NTc0NA==",
"organizations_url": "https://api.github.com/users/mattrobenolt/orgs",
"received_events_url": "https://api.github.com/users/mattrobenolt/received_events",
"repos_url": "https://api.github.com/users/mattrobenolt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mattrobenolt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mattrobenolt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mattrobenolt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 17 |
2014-10-24T03:49:55Z
|
2021-09-08T09:01:17Z
|
2014-11-07T20:18:41Z
|
CONTRIBUTOR
|
resolved
|
It would be easy to run out of memory in a long-running process whose session does a lot of redirects. This could also potentially be abused if something was known about how a service operated. Relatively unlikely, but it's a detail that users shouldn't have to think about, and we can enforce something better.
Caching more than 10,000 redirects is, in my opinion, unreasonable. I'm open to a different value, either more or less conservative.
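A minimal sketch of one way to bound such a cache, using a plain `OrderedDict` with first-in-first-out eviction; this is illustrative only, and the review below ends up preferring urllib3's `RecentlyUsedContainer`:

``` python
from collections import OrderedDict

class BoundedCache(OrderedDict):
    """A dict that evicts its oldest entry once it grows past max_size."""

    def __init__(self, max_size=1000, *args, **kwargs):
        self.max_size = max_size
        super(BoundedCache, self).__init__(*args, **kwargs)

    def __setitem__(self, key, value):
        super(BoundedCache, self).__setitem__(key, value)
        if len(self) > self.max_size:
            # Drop the oldest inserted key; crude FIFO rather than true LRU,
            # but enough to put an upper bound on memory use.
            self.popitem(last=False)
```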
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2299/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2299/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2299.diff",
"html_url": "https://github.com/psf/requests/pull/2299",
"merged_at": "2014-11-07T20:18:41Z",
"patch_url": "https://github.com/psf/requests/pull/2299.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2299"
}
| true |
[
"I'm +1 on this. I don't really care what the maximum value is, but clearly there should be one.\n\nIf anything, I'd like to replace the data structure entirely with something like an LRU cache that has fixed maximum occupancy. I think urllib3 may have one, it might be worth taking a look at that.\n",
"@Lukasa I'd really like to see something like `urllib3`s LRUCache used here as well.\n",
"Also thank you so very much for this @mattrobenolt \n",
"I'll switch this up in a bit. I wasn't aware there was an LRUCache as a part of urllib3 already.\n\n@sigmavirus24 do you have an opinion on the size? Is 10k alright? That number is completely arbitrary. \n",
"If we use an actual LRU, we can probably be much more aggressive.\n",
"@mattrobenolt why not cap it a bit lower? Or is 10k already \"low\"?\n",
"For what it's worth, urllib3 has a `RecentlyUsedContainer` class in `_collections`, so you'd do:\n\n``` python\nfrom .packages.urllib3._collections import RecentlyUsedContainer\n\nredirect_cache = RecentlyUsedContainer(10000)\n```\n",
"@sigmavirus24 sorry for the delay. Updated to use the `RecentlyUsedContainer` instead and I dropped the cache size down to 1000. We could also make this configurable on the `Session`.\n",
"> We could also make this configurable on the Session\n\nI'd rather we didn't.\n\n@Lukasa this looks good to me. How do you like it?\n",
"bump. :)\n",
"LGTM. @Lukasa ?\n",
":+1: :cake:\n",
"What sort of error message do you get when you try to store the 1001st redirect? Or does the 1st one get purged?\n",
"@kevinburke it's an LRU (least recently used) algorithm. So that means that the least used objects get evicted first. Active/hot objects stay in. Similar to how memcached works. So there is no error. Objects are just evicted based on LRU.\n",
"Which is why it's good! We have just set an upper bound on how much redirect information we're going to store.\n",
"Thanks everyone. :) :cake:\n",
"Thank you @mattrobenolt !\n"
] |
https://api.github.com/repos/psf/requests/issues/2298
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2298/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2298/comments
|
https://api.github.com/repos/psf/requests/issues/2298/events
|
https://github.com/psf/requests/pull/2298
| 46,701,180 |
MDExOlB1bGxSZXF1ZXN0MjMyNzQzNjk=
| 2,298 |
Minor clean ups
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 44501305,
"name": "Not Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTMwNQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge"
},
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 5 |
2014-10-24T03:25:42Z
|
2021-09-08T09:01:19Z
|
2014-10-27T01:07:27Z
|
CONTRIBUTOR
|
resolved
|
Fixing some minor issues found with pylint. I've done them in separate commits in case you don't want all of the fixes. Feel free to squash if necessary.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/836426?v=4",
"events_url": "https://api.github.com/users/dpursehouse/events{/privacy}",
"followers_url": "https://api.github.com/users/dpursehouse/followers",
"following_url": "https://api.github.com/users/dpursehouse/following{/other_user}",
"gists_url": "https://api.github.com/users/dpursehouse/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dpursehouse",
"id": 836426,
"login": "dpursehouse",
"node_id": "MDQ6VXNlcjgzNjQyNg==",
"organizations_url": "https://api.github.com/users/dpursehouse/orgs",
"received_events_url": "https://api.github.com/users/dpursehouse/received_events",
"repos_url": "https://api.github.com/users/dpursehouse/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dpursehouse/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dpursehouse/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dpursehouse",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2298/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2298/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2298.diff",
"html_url": "https://github.com/psf/requests/pull/2298",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2298.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2298"
}
| true |
[
"I'm +0 on this really. I don't mind it, but we do need to confirm that it hasn't broken anything.\n",
"Are there known problems with the test suite? I'm running it (against the head of the master branch, i.e. not including this PR) according to the instructions in todo.rst and I'm getting 27 failed.\n\nThis is on Mac OS with Python 2.7.5.\n",
"Never mind. Found the problem.\n\nCouple more PRs coming up soon...\n",
"I'm strongly -1 until the review comments are properly addressed.\n",
"closing this. after making reverts to address the review comments the remaining fixes are too trivial to warrant a PR.\n"
] |
https://api.github.com/repos/psf/requests/issues/2297
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2297/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2297/comments
|
https://api.github.com/repos/psf/requests/issues/2297/events
|
https://github.com/psf/requests/issues/2297
| 46,527,016 |
MDU6SXNzdWU0NjUyNzAxNg==
| 2,297 |
Improper escaping with '#' in url
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1779137?v=4",
"events_url": "https://api.github.com/users/rminsk/events{/privacy}",
"followers_url": "https://api.github.com/users/rminsk/followers",
"following_url": "https://api.github.com/users/rminsk/following{/other_user}",
"gists_url": "https://api.github.com/users/rminsk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rminsk",
"id": 1779137,
"login": "rminsk",
"node_id": "MDQ6VXNlcjE3NzkxMzc=",
"organizations_url": "https://api.github.com/users/rminsk/orgs",
"received_events_url": "https://api.github.com/users/rminsk/received_events",
"repos_url": "https://api.github.com/users/rminsk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rminsk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rminsk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rminsk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-22T15:58:19Z
|
2021-09-08T23:07:08Z
|
2014-10-22T20:36:29Z
|
NONE
|
resolved
|
```
import requests
result = requests.get("https://localhost:8080/url/with/a.#.in/it", params={'foo': 'bar'}, verify=False)
print result.request.url
https://localhost:8080/url/with/a.?foo=bar#.in/it
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2297/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2297/timeline
| null |
completed
| null | null | false |
[
"Hi there!\n\nThis is a very tricky situation to resolve. The problem is that it is extremely ambiguous as to whether a user passes us an encoded or an unencoded URL. It's simply not clear whether the # sign you gave us should be treated as a fragment delimeter or not. This is particularly important because requests doesn't have a fragment portion of its API (nor should it IMHO), so there's really no other way to specify a fragment.\n\nI recommend you escape the hash yourself, i.e.:\n\n``` python\n>>> import requests\n>>> result = requests.get(\"https://localhost:8080/url/with/a.%23.in/it\", params={'foo': 'bar'}, verify=False)\n>>> print result.request.url\nhttps://localhost:8080/url/with/a.%23.in/it?foo=bar\n```\n",
"Would it be useful to add a keywork argument specifying if the url is encode, unencoded, or current behavior of unknown? Does the fragment go after the parameters?\n",
"To answer my own question. I just looked at RFC 3986 and the fragment does go after the query.\n",
"> Would it be useful to add a keywork argument specifying if the url is encode, unencoded, or current behavior of unknown?\n\nNo. In spite of RFC 3986 there wouldn't be a good way for requests to always encode a URL. As it is now, you should be encoding this yourself as @Lukasa already pointed out. We will not be adding extra keyword arguments and we will not guess this for you.\n"
] |
https://api.github.com/repos/psf/requests/issues/2296
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2296/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2296/comments
|
https://api.github.com/repos/psf/requests/issues/2296/events
|
https://github.com/psf/requests/issues/2296
| 46,526,183 |
MDU6SXNzdWU0NjUyNjE4Mw==
| 2,296 |
Vendored version of chardet is buggy, can't detect utf-8 with bom
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/229502?v=4",
"events_url": "https://api.github.com/users/tgs/events{/privacy}",
"followers_url": "https://api.github.com/users/tgs/followers",
"following_url": "https://api.github.com/users/tgs/following{/other_user}",
"gists_url": "https://api.github.com/users/tgs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tgs",
"id": 229502,
"login": "tgs",
"node_id": "MDQ6VXNlcjIyOTUwMg==",
"organizations_url": "https://api.github.com/users/tgs/orgs",
"received_events_url": "https://api.github.com/users/tgs/received_events",
"repos_url": "https://api.github.com/users/tgs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tgs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tgs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tgs",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 2 |
2014-10-22T15:51:26Z
|
2021-09-08T23:06:09Z
|
2015-01-19T09:16:40Z
|
NONE
|
resolved
|
[Here's the commit](https://github.com/chardet/chardet/commit/0f9030139ceeaa4c3bf590de299fe84a5895c218) that fixed it in upstream.
Because of the bug, if I fetch a UTF-8 file that starts with a BOM, I get (e.g.):
```
<?xml version="1.0" encoding="utf-8"?>
...
```
and `response.encoding` is `ISO-8859-1` or `-2`.
I think you should be able to just pull the newest version, and it will be fixed.
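Until a newer chardet is vendored, a minimal sketch of working around it by sniffing the BOM yourself; this is illustrative, not requests' own logic:

``` python
import codecs
import requests

def text_preferring_bom(response):
    """Decode a response, using utf-8-sig when a UTF-8 BOM is present."""
    if response.content.startswith(codecs.BOM_UTF8):
        # 'utf-8-sig' also strips the BOM from the decoded text.
        response.encoding = 'utf-8-sig'
    return response.text
```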
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2296/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2296/timeline
| null |
completed
| null | null | false |
[
"@tgs thanks. We haven't had the reason to do a new release yet. When we do, we'll pull this in.\n\nUntil we do, if you know the encoding you can simply do `response.encoding = 'utf8'`\n",
"Sounds good, thank you!\n\nFor anyone else finding this, one workaround is to set response.encoding yourself:\n\n``` python\nresponse = requests.get(...)\nresponse.encoding = 'utf-8-sig' # trims off the BOM, or you could say 'utf-8' to leave it\nresponse.text\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/2295
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2295/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2295/comments
|
https://api.github.com/repos/psf/requests/issues/2295/events
|
https://github.com/psf/requests/pull/2295
| 46,463,504 |
MDExOlB1bGxSZXF1ZXN0MjMxMjk4ODc=
| 2,295 |
Adding a custom line delimiter to iter_lines()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/955663?v=4",
"events_url": "https://api.github.com/users/2deviant/events{/privacy}",
"followers_url": "https://api.github.com/users/2deviant/followers",
"following_url": "https://api.github.com/users/2deviant/following{/other_user}",
"gists_url": "https://api.github.com/users/2deviant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/2deviant",
"id": 955663,
"login": "2deviant",
"node_id": "MDQ6VXNlcjk1NTY2Mw==",
"organizations_url": "https://api.github.com/users/2deviant/orgs",
"received_events_url": "https://api.github.com/users/2deviant/received_events",
"repos_url": "https://api.github.com/users/2deviant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/2deviant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/2deviant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/2deviant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-22T01:21:26Z
|
2021-09-08T10:01:00Z
|
2014-10-25T18:33:52Z
|
CONTRIBUTOR
|
resolved
|
Initial description is in Issue #2292 however, I shall elaborate.
It is often necessary to iterate over something other than conventional newlines, for example when newline characters are part of the data itself. This is a small piece of functionality requiring a proportionally small change to the code. Wrapping `iter_content()` in a custom iterator outside of the Requests library requires significantly more code, which is why I'm proposing this change. The effect on performance is negligible.
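A minimal sketch of the behaviour being proposed, written as a standalone wrapper around `iter_content()` rather than the actual patch; the `b'<>'` delimiter is just the example from the original issue:

``` python
def iter_records(response, delimiter=b'<>', chunk_size=512):
    """Yield records from a streamed response, split on a custom byte delimiter."""
    pending = b''
    for chunk in response.iter_content(chunk_size=chunk_size):
        pending += chunk
        records = pending.split(delimiter)
        # The final piece may be an incomplete record; carry it into the next chunk.
        pending = records.pop()
        for record in records:
            yield record
    if pending:
        yield pending
```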
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2295/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2295/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2295.diff",
"html_url": "https://github.com/psf/requests/pull/2295",
"merged_at": "2014-10-25T18:33:52Z",
"patch_url": "https://github.com/psf/requests/pull/2295.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2295"
}
| true |
[
"I'm surprised we haven't seen a request for this before. I guess I'm +0.5. \n",
"i wonder if we should call it `delimeter` instead\n",
"I suppose that's a proper name for it, since `newline` does mean something specific. I have no qualms with renaming the argument to `delimiter`. Do you have any other concerns?\n",
"@2deviant there's not much else to the code. I think the only concern is changing the parameter name. Thanks!\n"
] |
https://api.github.com/repos/psf/requests/issues/2294
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2294/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2294/comments
|
https://api.github.com/repos/psf/requests/issues/2294/events
|
https://github.com/psf/requests/issues/2294
| 46,405,726 |
MDU6SXNzdWU0NjQwNTcyNg==
| 2,294 |
Connection reset by peer
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5134130?v=4",
"events_url": "https://api.github.com/users/aeos/events{/privacy}",
"followers_url": "https://api.github.com/users/aeos/followers",
"following_url": "https://api.github.com/users/aeos/following{/other_user}",
"gists_url": "https://api.github.com/users/aeos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aeos",
"id": 5134130,
"login": "aeos",
"node_id": "MDQ6VXNlcjUxMzQxMzA=",
"organizations_url": "https://api.github.com/users/aeos/orgs",
"received_events_url": "https://api.github.com/users/aeos/received_events",
"repos_url": "https://api.github.com/users/aeos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aeos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aeos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aeos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-10-21T15:15:19Z
|
2021-09-08T23:07:09Z
|
2014-10-21T15:46:48Z
|
NONE
|
resolved
|
Hi,
Hopefully I am not using the API wrong, but I am stuck on how to fix an issue I am having. We are making hundreds of HTTP requests to an endpoint in a single script, and after a while we receive a "Connection reset by peer" exception. I have put in a sleep-and-retry recovery, but I get the same connection issue. If I restart the script, everything is fine.
From what I have read, there is connection pooling, and I am worried that the second request is being dispatched on the same connection (which has been closed). I am wondering if maybe I am using the API wrong and should be closing the connection somehow. My implementation is pretty straightforward:
```
with requests.Session() as s:
s.headers['Connection'] = 'close'
rs = s.request(method, url, **kwargs)
```
kwargs includes timeout=None. Should I be catching exceptions within the with block to ensure the connection is closed properly? Or is this the correct usage?
Thanks!
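For reference, a minimal sketch of keeping one long-lived `Session` (so pooled connections are actually reused) with a simple sleep-and-retry wrapper; the function name and retry counts are illustrative, not part of requests:

``` python
import time
import requests

session = requests.Session()

def request_with_retry(method, url, retries=3, backoff=2.0, **kwargs):
    """Send a request on the shared session, retrying on connection errors."""
    for attempt in range(retries):
        try:
            return session.request(method, url, **kwargs)
        except requests.exceptions.ConnectionError:
            if attempt == retries - 1:
                raise
            time.sleep(backoff * (attempt + 1))
```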
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5134130?v=4",
"events_url": "https://api.github.com/users/aeos/events{/privacy}",
"followers_url": "https://api.github.com/users/aeos/followers",
"following_url": "https://api.github.com/users/aeos/following{/other_user}",
"gists_url": "https://api.github.com/users/aeos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/aeos",
"id": 5134130,
"login": "aeos",
"node_id": "MDQ6VXNlcjUxMzQxMzA=",
"organizations_url": "https://api.github.com/users/aeos/orgs",
"received_events_url": "https://api.github.com/users/aeos/received_events",
"repos_url": "https://api.github.com/users/aeos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/aeos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/aeos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/aeos",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2294/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2294/timeline
| null |
completed
| null | null | false |
[
"This usage is fine.\n\nAssuming this is exactly your code, I take it you re-enter the `with` block after you've caught the exception?\n",
"the with call is executed inside a function block, I catch the exception outside the function, sleep, then call the function again.\n",
"Ok, so your problem is nothing to do with connection pooling.\n\nConnections are pooled within a session. You're throwing the session away after each request, so we're throwing the connections away: you shouldn't be re-using them.\n\nThat raises the question of why the connection starts getting reset on the remote end in such a way that resetting the program fixes the problem. One option might be that you're getting rate limited. Does that sound possible?\n",
"The end point does rate limit, but that is not indicated through http codes. This may very well be an issue with the end point. Thank you for confirming that it is not a connection pooling issue! If I discover anything further in my debugging I will let you know.\n",
"No problem, glad to be of some assistance.\n"
] |
https://api.github.com/repos/psf/requests/issues/2293
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2293/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2293/comments
|
https://api.github.com/repos/psf/requests/issues/2293/events
|
https://github.com/psf/requests/issues/2293
| 46,367,258 |
MDU6SXNzdWU0NjM2NzI1OA==
| 2,293 |
Odd HTTPS issue
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/204153?v=4",
"events_url": "https://api.github.com/users/gdude2002/events{/privacy}",
"followers_url": "https://api.github.com/users/gdude2002/followers",
"following_url": "https://api.github.com/users/gdude2002/following{/other_user}",
"gists_url": "https://api.github.com/users/gdude2002/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gdude2002",
"id": 204153,
"login": "gdude2002",
"node_id": "MDQ6VXNlcjIwNDE1Mw==",
"organizations_url": "https://api.github.com/users/gdude2002/orgs",
"received_events_url": "https://api.github.com/users/gdude2002/received_events",
"repos_url": "https://api.github.com/users/gdude2002/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gdude2002/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gdude2002/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gdude2002",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2014-10-21T08:28:23Z
|
2021-09-08T23:07:09Z
|
2014-10-21T12:28:28Z
|
NONE
|
resolved
|
It appears that Requests isn't getting the correct SSL cert on my domain, for some reason.
```
➜ ~ python
Python 2.7.6 (default, Mar 22 2014, 22:59:56)
[GCC 4.8.2] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import requests
>>> r = requests.get("https://ultros.io")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 60, in get
return request('get', url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 49, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 457, in request
resp = self.send(prep, **send_kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 569, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 420, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: hostname 'ultros.io' doesn't match either of 'www.archivesmc.com', 'archivesmc.com'
>>>
```
The certificate itself is fine - if you visit [the site](https://ultros.io), then you do indeed get the correct cert.

This seems to be a fairly odd one. The server is nginx; any idea what's going on here?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2293/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2293/timeline
| null |
completed
| null | null | false |
[
"My guess is that it's SNI related. Your webserver requires SNI-enabled clients (https://www.ssllabs.com/ssltest/analyze.html?d=ultros.io).\n",
"\"This site works only in browsers with SNI support. \" - is that what you mean?\n",
"Yes. Your webserver probably serves the \"www.archivesmc.com\" certificate for clients without SNI.\n",
"I'm unsure what you're saying here. The server itself supports SNI, is what I'm getting from that message, but Requests doesn't?\n",
"Exactly. That's what I'm guessing.\n",
"Ergh, that's problematic.\n\nI'd assume it could be fixed, then, by setting ultros.io as my default nginx domain, but then Requests would fail to work on the archivesmc.com domain or the others I have on that server that use SSL.. May take some brain-crunching!\n\nIn the meantime, are there any plans to support SNI in Requests/Urllib3?\n",
"According to [this page](https://urllib3.readthedocs.org/en/latest/contrib.html), it's possible to use SNI with urllib3 - if I'm not mistaken, Requests is based on that library, right?\n",
"Confirmed fix.\n\n``` python\n>>> import urllib3.contrib.pyopenssl\n>>> urllib3.contrib.pyopenssl.inject_into_urllib3()\n>>> import requests\n>>> r = requests.get(\"https://ultros.io\")\n>>> r\n<Response [200]>\n>>>\n```\n",
"That fix is unnecessary. =)\n\nIf you simply install pyopenssl, pyasn1 and ndg-httpsclient, requests will automatically perform SNI for you. Alternatively, pip install requests with the 'security' optional dependencies: `pip install requests[security]`.\n\nThanks for your help @KoesterD!\n"
] |
https://api.github.com/repos/psf/requests/issues/2292
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2292/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2292/comments
|
https://api.github.com/repos/psf/requests/issues/2292/events
|
https://github.com/psf/requests/issues/2292
| 46,355,487 |
MDU6SXNzdWU0NjM1NTQ4Nw==
| 2,292 |
Add custom line delimiter to iter_lines()
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/955663?v=4",
"events_url": "https://api.github.com/users/2deviant/events{/privacy}",
"followers_url": "https://api.github.com/users/2deviant/followers",
"following_url": "https://api.github.com/users/2deviant/following{/other_user}",
"gists_url": "https://api.github.com/users/2deviant/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/2deviant",
"id": 955663,
"login": "2deviant",
"node_id": "MDQ6VXNlcjk1NTY2Mw==",
"organizations_url": "https://api.github.com/users/2deviant/orgs",
"received_events_url": "https://api.github.com/users/2deviant/received_events",
"repos_url": "https://api.github.com/users/2deviant/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/2deviant/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/2deviant/subscriptions",
"type": "User",
"url": "https://api.github.com/users/2deviant",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-21T05:14:22Z
|
2021-09-08T23:06:11Z
|
2015-01-18T20:13:16Z
|
CONTRIBUTOR
|
resolved
|
I saw a couple of requests that touched on this issue, but none that hit it on the nose. Has adding custom line delimiter functionality to `iter_lines()` been considered before? If so, what are the reasons for not including it? If the feature has not yet been considered, I would not mind submitting a pull request for comment.
This feature would be useful in cases when, for example, newlines are part of the data and the stream is delimited with, say, `<>`.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2292/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2292/timeline
| null |
completed
| null | null | false |
[
"I don't believe it has. It's a pretty niche feature, but I'd be open to considering a PR with it.\n",
"Fixed #2295\n"
] |
https://api.github.com/repos/psf/requests/issues/2291
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2291/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2291/comments
|
https://api.github.com/repos/psf/requests/issues/2291/events
|
https://github.com/psf/requests/issues/2291
| 46,347,589 |
MDU6SXNzdWU0NjM0NzU4OQ==
| 2,291 |
Direct TLS pinning of .pem files possible? (No CA roots)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/7875239?v=4",
"events_url": "https://api.github.com/users/HulaHoopWhonix/events{/privacy}",
"followers_url": "https://api.github.com/users/HulaHoopWhonix/followers",
"following_url": "https://api.github.com/users/HulaHoopWhonix/following{/other_user}",
"gists_url": "https://api.github.com/users/HulaHoopWhonix/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/HulaHoopWhonix",
"id": 7875239,
"login": "HulaHoopWhonix",
"node_id": "MDQ6VXNlcjc4NzUyMzk=",
"organizations_url": "https://api.github.com/users/HulaHoopWhonix/orgs",
"received_events_url": "https://api.github.com/users/HulaHoopWhonix/received_events",
"repos_url": "https://api.github.com/users/HulaHoopWhonix/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/HulaHoopWhonix/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/HulaHoopWhonix/subscriptions",
"type": "User",
"url": "https://api.github.com/users/HulaHoopWhonix",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-21T02:14:05Z
|
2021-09-08T23:07:09Z
|
2014-10-21T03:20:24Z
|
NONE
|
resolved
|
How can I pin a downloaded .pem TLS cert file for a specific server directly (self-signed certificate), without using CA roots?
Is this currently possible? If yes, can you please provide an example?
I read http://docs.python-requests.org/en/latest/user/advanced/#ssl-cert-verification but am not sure if this applies to what I'm trying to do:
> You can also specify a local cert to use as client side certificate, as a single file (containing the private key and the certificate) or as a tuple of both files' paths:
>
> ```
> requests.get('https://kennethreitz.com', cert=('/path/server.crt', '/path/key'))
> <Response [200]>
> ```
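For context, a minimal sketch of the usual way to trust a single downloaded certificate in requests: point `verify=` at its path (the `cert=` parameter quoted above configures a client-side certificate, which is a different feature). The URL and file path below are placeholders:

``` python
import requests

# verify= replaces the system CA bundle with the server's own self-signed
# certificate, so only that exact certificate will be accepted.
response = requests.get('https://example.com/', verify='/path/to/server.pem')
print(response.status_code)
```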
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2291/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2291/timeline
| null |
completed
| null | null | false |
[
"This is a question and as such belongs on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n",
"@sigmavirus24 I am asking about a pyhton-requests functionality. Can you please give me a simple yes or no answer if this is possible?\n",
"@HulaHoopWhonix We know what you're doing, we're just asking you to take the question to the appropriate location. GitHub issues are for feature requests, bug reports and code enhancements. They are not a forum for questions.\n",
"@Lukasa Ok I see what you mean. Any input for this question on stackoverflow would be appreciated. I posted it here:\n\nhttps://stackoverflow.com/questions/26479039/python-requests-direct-pem-pinning-with-self-signed-cert\n"
] |
https://api.github.com/repos/psf/requests/issues/2290
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2290/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2290/comments
|
https://api.github.com/repos/psf/requests/issues/2290/events
|
https://github.com/psf/requests/pull/2290
| 46,200,104 |
MDExOlB1bGxSZXF1ZXN0MjI5NzgyMDY=
| 2,290 |
Fix #2279. Update layout css
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1938009?v=4",
"events_url": "https://api.github.com/users/syedsuhail/events{/privacy}",
"followers_url": "https://api.github.com/users/syedsuhail/followers",
"following_url": "https://api.github.com/users/syedsuhail/following{/other_user}",
"gists_url": "https://api.github.com/users/syedsuhail/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/syedsuhail",
"id": 1938009,
"login": "syedsuhail",
"node_id": "MDQ6VXNlcjE5MzgwMDk=",
"organizations_url": "https://api.github.com/users/syedsuhail/orgs",
"received_events_url": "https://api.github.com/users/syedsuhail/received_events",
"repos_url": "https://api.github.com/users/syedsuhail/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/syedsuhail/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/syedsuhail/subscriptions",
"type": "User",
"url": "https://api.github.com/users/syedsuhail",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-19T08:39:50Z
|
2021-09-08T10:01:03Z
|
2014-10-19T09:42:28Z
|
CONTRIBUTOR
|
resolved
|
Updated the gratipay widget iframe in both sidebarintro.html and sidebarlogo.html.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2290/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2290/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2290.diff",
"html_url": "https://github.com/psf/requests/pull/2290",
"merged_at": "2014-10-19T09:42:28Z",
"patch_url": "https://github.com/psf/requests/pull/2290.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2290"
}
| true |
[
"@syedsuhail Quick question, why'd you change the width?\n",
"I didn't notice the change already made in sidebarintro.html. When I saw the gratipay widget was shorter in the requests page I tried adjusting the width. Then I understood the problem was in sidebarlogo.html. I checked the html build and didn't see any problem with the current width. So I kept it as such.\n",
"Fair enough, suits me. :cake: Thanks!\n",
"Thank you very much. :smiley: \n"
] |
https://api.github.com/repos/psf/requests/issues/2289
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2289/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2289/comments
|
https://api.github.com/repos/psf/requests/issues/2289/events
|
https://github.com/psf/requests/pull/2289
| 46,199,449 |
MDExOlB1bGxSZXF1ZXN0MjI5Nzc5MDQ=
| 2,289 |
Fix #2288 . Change urllib3 and chardet workflow
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1938009?v=4",
"events_url": "https://api.github.com/users/syedsuhail/events{/privacy}",
"followers_url": "https://api.github.com/users/syedsuhail/followers",
"following_url": "https://api.github.com/users/syedsuhail/following{/other_user}",
"gists_url": "https://api.github.com/users/syedsuhail/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/syedsuhail",
"id": 1938009,
"login": "syedsuhail",
"node_id": "MDQ6VXNlcjE5MzgwMDk=",
"organizations_url": "https://api.github.com/users/syedsuhail/orgs",
"received_events_url": "https://api.github.com/users/syedsuhail/received_events",
"repos_url": "https://api.github.com/users/syedsuhail/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/syedsuhail/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/syedsuhail/subscriptions",
"type": "User",
"url": "https://api.github.com/users/syedsuhail",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-10-19T07:52:54Z
|
2021-09-08T10:01:03Z
|
2014-10-19T07:56:36Z
|
CONTRIBUTOR
|
resolved
|
This modified workflow first tries to clone urllib3 or chardet and only removes the existing copy from requests/packages if the clone succeeds. If the cloning is unsuccessful, the packages are not removed.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2289/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2289/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2289.diff",
"html_url": "https://github.com/psf/requests/pull/2289",
"merged_at": "2014-10-19T07:56:36Z",
"patch_url": "https://github.com/psf/requests/pull/2289.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2289"
}
| true |
[
"Thanks for this @syedsuhail! :cake:\n",
"Happy to contribute. This is my first successful merge. :smiley: \n",
"@syedsuhail I'm glad you chose to contribute to requests!\n"
] |
https://api.github.com/repos/psf/requests/issues/2288
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2288/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2288/comments
|
https://api.github.com/repos/psf/requests/issues/2288/events
|
https://github.com/psf/requests/issues/2288
| 46,198,638 |
MDU6SXNzdWU0NjE5ODYzOA==
| 2,288 |
urllib3 and chardet makefile workflow bug
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1938009?v=4",
"events_url": "https://api.github.com/users/syedsuhail/events{/privacy}",
"followers_url": "https://api.github.com/users/syedsuhail/followers",
"following_url": "https://api.github.com/users/syedsuhail/following{/other_user}",
"gists_url": "https://api.github.com/users/syedsuhail/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/syedsuhail",
"id": 1938009,
"login": "syedsuhail",
"node_id": "MDQ6VXNlcjE5MzgwMDk=",
"organizations_url": "https://api.github.com/users/syedsuhail/orgs",
"received_events_url": "https://api.github.com/users/syedsuhail/received_events",
"repos_url": "https://api.github.com/users/syedsuhail/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/syedsuhail/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/syedsuhail/subscriptions",
"type": "User",
"url": "https://api.github.com/users/syedsuhail",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-10-19T06:54:05Z
|
2021-09-08T23:07:53Z
|
2014-10-19T07:56:37Z
|
CONTRIBUTOR
|
resolved
|
Currently, requests/packages/urllib3 and requests/packages/chardet are removed before cloning from their upstream repositories. A problem occurs when the cloning fails for some reason and requests is left without the urllib3 or chardet packages.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2288/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2288/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2287
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2287/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2287/comments
|
https://api.github.com/repos/psf/requests/issues/2287/events
|
https://github.com/psf/requests/issues/2287
| 46,153,670 |
MDU6SXNzdWU0NjE1MzY3MA==
| 2,287 |
Revisit concerns in #2244
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-17T23:13:31Z
|
2021-09-08T23:05:46Z
|
2015-04-07T14:31:18Z
|
CONTRIBUTOR
|
resolved
|
See https://github.com/kennethreitz/requests/pull/2244 for more context:
Specific concerns are:
- Should redirects handled by the cache count against the max number of redirects? (e.g., if we set a max of 30 and find 2 redirects in the cache, the max number of redirects a user could encounter after that would be 28)
- Should history be somehow generated by the redirect cache? (i.e., should we be making responses and adding them to the final response's history?)
- Should a separate error be raised for the case where there's a loop in the cache? (When documented, this would help the user realize they're stuck in a cached loop and that they may want to clear it to retry against the live site. Note: this also affects the API of `Session#resolve_redirects`. A minimal sketch of such loop detection follows below.)
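Purely as an illustration of the last point, a minimal sketch of detecting a cycle in a url -> url redirect cache (the cache structure here is an assumption for the example, not the actual `Session.redirect_cache`):
``` python
def follow_cached_redirects(url, cache, max_redirects=30):
    """Walk a url -> url redirect cache, failing loudly on loops or too many hops."""
    seen = set()
    hops = 0
    while url in cache:
        if url in seen:
            raise RuntimeError("redirect loop in cache at %r" % url)
        seen.add(url)
        url = cache[url]
        hops += 1
        if hops > max_redirects:
            raise RuntimeError("too many cached redirects")
    return url

# Example: a two-entry cache that loops back to the start.
cache = {"http://a.example/": "http://b.example/", "http://b.example/": "http://a.example/"}
# follow_cached_redirects("http://a.example/", cache)  # would raise "redirect loop in cache"
```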
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2287/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2287/timeline
| null |
completed
| null | null | false |
[
"@Lukasa is it still worth keeping this open?\n",
"I still hate the redirect cache, but I think we should talk about it at PyCon. In particular, we should talk about whether we should yank it out and instruct people to use CacheControl instead.\n"
] |
https://api.github.com/repos/psf/requests/issues/2286
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2286/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2286/comments
|
https://api.github.com/repos/psf/requests/issues/2286/events
|
https://github.com/psf/requests/issues/2286
| 46,094,974 |
MDU6SXNzdWU0NjA5NDk3NA==
| 2,286 |
malformed multipart form-data
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1122519?v=4",
"events_url": "https://api.github.com/users/AgDude/events{/privacy}",
"followers_url": "https://api.github.com/users/AgDude/followers",
"following_url": "https://api.github.com/users/AgDude/following{/other_user}",
"gists_url": "https://api.github.com/users/AgDude/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/AgDude",
"id": 1122519,
"login": "AgDude",
"node_id": "MDQ6VXNlcjExMjI1MTk=",
"organizations_url": "https://api.github.com/users/AgDude/orgs",
"received_events_url": "https://api.github.com/users/AgDude/received_events",
"repos_url": "https://api.github.com/users/AgDude/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/AgDude/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/AgDude/subscriptions",
"type": "User",
"url": "https://api.github.com/users/AgDude",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 2 |
2014-10-17T12:43:31Z
|
2021-09-08T23:06:08Z
|
2015-01-19T09:17:00Z
|
NONE
|
resolved
|
I was trying to use requests to post a form to AWS S3 and continually got an error about malformed multipart/form-data. I am not an expert on HTTP formatting standards, but I was able to make this work with the Python poster library instead.
Requests should support the option to format a request as multipart/form-data without breaking up the file (in my case the files are small, but S3 only supports an enctype of multipart/form-data).
For comparison, here is the request body that was generated by requests and rejected; below it is the accepted request.
```
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="AWSAccessKeyId"
AKIAJGWMPJUFZ57CRWQQ
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="success_action_status"
200
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="acl"
private
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="key"
${filename}
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="signature"
hQ2NDqs9SHLoH5RtMQn9cfQITPc=
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="policy"
eyJjb25kaXRpb25zIjogW1sic3RhcnRzLXdpdGgiLCAiJGtleSIsICIiXSwgeyJhY2wiOiAicHJpdmF0ZSJ9LCB7InN1Y2Nlc3NfYWN0aW9uX3N0YXR1cyI6ICIyMDAifSwgWyJzdGFydHMtd2l0aCIsICIkQ29udGVudC1UeXBlIiwgIiJdLCBbImNvbnRlbnQtbGVuZ3RoLXJhbmdlIiwgMCwgMTA0ODU3NjAwXSwgeyJidWNrZXQiOiAiZGF0YXN5bmMtdXBsb2FkcyJ9XSwgImV4cGlyYXRpb24iOiAiMjAxNC0xMC0xOFQxMjozMjowMFoifQ==
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="Content-Type"
application/octet-stream
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="file"
This line is new
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="file"
and so is this one
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="file"
005205 6802 30 13 33.8 3.8 034344 034727 14.1 444982000133517375 0 ???????? 000000
--b12efdfeadf54d8d9041a03966c7f8d2
Content-Disposition: form-data; name="file"; filename="8b8e6cbe185e4d20b7e0586f076bb353.bvs"
--b12efdfeadf54d8d9041a03966c7f8d2--
```
The request body as generated by the Python poster library and urllib2, which was accepted by S3:
```
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="AWSAccessKeyId"
Content-Type: text/plain; charset=utf-8
AKIAJGWMPJUFZ57CRWQQ
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="success_action_status"
Content-Type: text/plain; charset=utf-8
200
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="acl"
Content-Type: text/plain; charset=utf-8
private
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="key"
Content-Type: text/plain; charset=utf-8
${filename}
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="signature"
Content-Type: text/plain; charset=utf-8
BOanSb+x9yE0rx+Y08C7OC1bTZM=
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="policy"
Content-Type: text/plain; charset=utf-8
eyJjb25kaXRpb25zIjogW1sic3RhcnRzLXdpdGgiLCAiJGtleSIsICIiXSwgeyJhY2wiOiAicHJpdmF0ZSJ9LCB7InN1Y2Nlc3NfYWN0aW9uX3N0YXR1cyI6ICIyMDAifSwgWyJzdGFydHMtd2l0aCIsICIkQ29udGVudC1UeXBlIiwgIiJdLCBbImNvbnRlbnQtbGVuZ3RoLXJhbmdlIiwgMCwgMTA0ODU3NjAwXSwgeyJidWNrZXQiOiAiZGF0YXN5bmMtdXBsb2FkcyJ9XSwgImV4cGlyYXRpb24iOiAiMjAxNC0xMC0xOFQxMjowNDowMFoifQ==
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="Content-Type"
Content-Type: text/plain; charset=utf-8
application/octet-stream
--a6bf1e61b29f4a3980b62345e64c120f
Content-Disposition: form-data; name="file"; filename="C:/somedirectory/8b8e6cbe185e4d20b7e0586f076bb353.bvs"
Content-Type: text/plain; charset=utf-8
This line is new
and so is this one
005205 6802 30 13 33.8 3.8 034344 034727 14.1 444982000133517375 0 ???????? 000000
--a6bf1e61b29f4a3980b62345e64c120f--
```
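For reference, a minimal sketch of how a small file is typically attached with requests so that it stays in a single form-data part (the bucket URL, field values, and filename are placeholders, and exact behavior may differ between requests versions):
``` python
import requests

fields = {
    "AWSAccessKeyId": "...",
    "success_action_status": "200",
    "acl": "private",
    "key": "${filename}",
    "signature": "...",
    "policy": "...",
    "Content-Type": "application/octet-stream",
}
with open("8b8e6cbe185e4d20b7e0586f076bb353.bvs", "rb") as fh:
    # Passing a (filename, fileobj, content_type) tuple keeps the whole file in
    # one form-data part instead of splitting it across several parts.
    files = {"file": ("8b8e6cbe185e4d20b7e0586f076bb353.bvs", fh, "text/plain")}
    resp = requests.post("https://bucket.s3.amazonaws.com/", data=fields, files=files)
print(resp.status_code)
```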
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2286/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2286/timeline
| null |
completed
| null | null | false |
[
"Can you tell us what you're using to generate the last part of data where you're getting different form fields? (The part that generated the \"This line is new...\" data.) Without that we have no way of helping you.\n",
"That is simply a text file that was opened like `open(filename, 'rb')`. It is opened immediately before passing it to requests. as `requests.post(url, data=dat, files={'file':file_obj}, enctype='multipart/form-data)` . This is pseudo code, I have since removed requests from that portion of the code, and have it working with another library. I am no longer in need of a solution, but I think this is bug in the way requests handles encoding.\n\nThis is the data in the text file:\n\n```\nThis line is new\nand so is this one\n005205 6802 30 13 33.8 3.8 034344 034727 14.1 444982000133517375 0 ???????? 000000\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/2285
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2285/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2285/comments
|
https://api.github.com/repos/psf/requests/issues/2285/events
|
https://github.com/psf/requests/issues/2285
| 46,093,195 |
MDU6SXNzdWU0NjA5MzE5NQ==
| 2,285 |
res.json is not a function under Debian
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6000023?v=4",
"events_url": "https://api.github.com/users/tobiasschweizer/events{/privacy}",
"followers_url": "https://api.github.com/users/tobiasschweizer/followers",
"following_url": "https://api.github.com/users/tobiasschweizer/following{/other_user}",
"gists_url": "https://api.github.com/users/tobiasschweizer/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tobiasschweizer",
"id": 6000023,
"login": "tobiasschweizer",
"node_id": "MDQ6VXNlcjYwMDAwMjM=",
"organizations_url": "https://api.github.com/users/tobiasschweizer/orgs",
"received_events_url": "https://api.github.com/users/tobiasschweizer/received_events",
"repos_url": "https://api.github.com/users/tobiasschweizer/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tobiasschweizer/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tobiasschweizer/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tobiasschweizer",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-17T12:20:32Z
|
2021-09-08T23:07:53Z
|
2014-10-17T13:08:03Z
|
NONE
|
resolved
|
Hi
If I make an HTTP request that returns JSON, I can access the parsed result directly as an attribute on Debian 7.6:
```
res = requests.get('url')
res.json['member']
```
In fact, when I call it as a function as described in the documentation, it throws an error:
```
res.json()
TypeError: 'dict' object is not callable
```
So on my system res.json is already the parsed JSON (a dict), while on other systems this is not the case.
My Python and requests versions:
Python 3.2.3 (default, Feb 20 2013, 14:44:27)
[GCC 4.7.2] on linux2
requests==0.12.1
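For anyone hitting this today, a minimal sketch of the two call styles (in current requests releases `json` is a method; only in very old versions such as 0.12.x was it a property):
``` python
import requests

res = requests.get('https://httpbin.org/json')

# Modern requests (1.0 and later): json is a method.
data = res.json()

# Ancient requests (0.x, as shipped by some old distro packages): json was a property.
# data = res.json
print(data)
```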
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2285/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2285/timeline
| null |
completed
| null | null | false |
[
"Thanks for reporting this!\n\nThis is not a requests bug. The problem lies with your machine. The current requests version is 2.4.3, as can be seen [on PyPI](https://pypi.python.org/pypi/requests). The version you have installed, 0.12.1, was released more than two years ago. Understandably, the documentation for 2.4.3 does not apply to two year old versions.\n\nYou should update requests. If that doesn't work then Debian are packaging an extremely old version, and you should complain to them, and then install requests from pip.\n",
"thanks, works fine with the current version\n\n```\n pip install --upgrade requests\n```\n"
] |
https://api.github.com/repos/psf/requests/issues/2284
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2284/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2284/comments
|
https://api.github.com/repos/psf/requests/issues/2284/events
|
https://github.com/psf/requests/issues/2284
| 45,864,040 |
MDU6SXNzdWU0NTg2NDA0MA==
| 2,284 |
Bug in HTTP-request serialization of nested JSON-like data
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 7 |
2014-10-15T13:12:41Z
|
2021-09-08T23:06:04Z
|
2015-01-19T09:18:15Z
|
NONE
|
resolved
|
I think there's a bug in serializing a nested JSON-like dictionary.
For example, the following payload
{'username': 'bambang', 'device_info': {'os_version': '9.3.1', 'model': 'samsung', 'os_name': 'android'}}
would be serialized as:
"username=bambang&device_info=os_version&device_info=model&device_info=os_name"
However, sending the same data via AJAX in JavaScript, it would be serialized as:
username=bambang&device_info%5Bos_version%5D=9.3.1&device_info%5Bmodel%5D=samsung&device_info%5Bos_name%5D=android
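As the discussion below suggests, one workaround is to flatten the nested dictionary into the jQuery/PHP-style bracket convention yourself before handing it to requests; a minimal sketch (the helper name is made up for illustration):
``` python
def flatten(params, parent=None):
    """Flatten nested dicts into jQuery/PHP-style bracket keys."""
    flat = {}
    for key, value in params.items():
        name = "%s[%s]" % (parent, key) if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

payload = {'username': 'bambang',
           'device_info': {'os_version': '9.3.1', 'model': 'samsung', 'os_name': 'android'}}
print(flatten(payload))
# {'username': 'bambang', 'device_info[os_version]': '9.3.1',
#  'device_info[model]': 'samsung', 'device_info[os_name]': 'android'}
```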
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2284/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2284/timeline
| null |
completed
| null | null | false |
[
"I think you're a little confused about what JSON is. You cannot 'serialize' JSON. JSON is a serialization format, it's something you serialize _to_. What you are serializing there is a dictionary.\n\nThe second element of your confusion is that you aren't even serializing _to_ JSON. You're serializing to a format known as `urlencoded` or `form-encoded`. This form of data serialization does not have any inherent specified form for nesting, all that it has is convention.\n\nThere are a couple of problems here.\n\nFirstly, we're not handling the nesting well. That _might_ be a bug, but as I stated earlier, form-encoding doesn't really _do_ nested in any consistent way.\n\nSecondly, if we passed it straight through to the standard library you still wouldn't get the result you wanted:\n\n``` python\n>>> from urllib import urlencode\n>>> urlencode({'username': 'bambang', 'device_info': {'os_version': '9.3.1', 'model': 'samsung', 'os_name': 'android'}})\n'username=bambang&device_info=%7B%27os_version%27%3A+%279.3.1%27%2C+%27model%27%3A+%27samsung%27%2C+%27os_name%27%3A+%27android%27%7D'\n```\n\nWhat this fundamentally boils down to is that it's not clearly an area that requests can help you with. One approach that might be 'right' is someone else's definition of 'wrong'. What you should do is specify the form you want yourself. For example, you should transform the above dictionary to:\n\n``` python\n{'username': 'bambang', 'device_info[os_version]': '9.3.1', 'device_info[model]': 'samsung', 'device_info[os_name]': 'android'}\n```\n\nThat would get you the function you want.\n\n@sigmavirus24, @kennethreitz, do you think we should be trying to handle the nesting ourselves? I'd be interested to see if there's actually a consistent convention here, even though form-data was never intended to do this.\n",
"form-data was never intended to work this way, but sadly it does. This is not something requests should handle because there's no _defined_ or _standard_ way to do it. This can be handled by the toolbelt, and I'll be happy to handle the controversy there. That said, I can't count how many times this has come up on this project. That combined with the order of magnitude more times it has come up on StackOverflow is incredible. It is _not_ hard for users to search for this and find exactly what they're looking for in 15 minutes or (much much) less (if they don't confuse terminology).\n",
"@Lukasa, @sigmavirus24. The 'json serialization' ramadokayano is describing is exactly what jQuery $.ajax methods produce. It's been a while since I used prototype.js but IIRC they did this too. angularjs does nested form-data encoding with []. I'm not sure what ember or meteor do.\n\nThere aren't native browser methods for this other than mozilla's https://developer.mozilla.org/en-US/Add-ons/SDK/High-Level_APIs/querystring (which is unstable and not available in the latest firefox) so I've no idea what that does.\n\nPHP has been doing the nested object serialization and de serialization in exactly the way the same way as jQuery since like forever.\n\n```\n<?php\n\n$something = json_decode(\n \"{'username': 'bambang', 'device_info': {'os_version': '9.3.1', 'model': 'samsung', 'os_name': 'android'}}\"\n , true\n);\n\n// username=bambang&device_info%5Bos_version%5D=9.3.1&device_info%5Bmodel%5D=samsung&device_info%5Bos_name%5D=android\necho http_build_query($something);\n```\n\nThe node eco system isn't 100% consistent but all main modules and libs do the encoding with [].\n\n```\nvar params = {'username': 'bambang', 'device_info': {'os_version': '9.3.1', 'model': 'samsung', 'os_name': 'android'}};\n\n// working though the node modules with this functionality in order of decending popularity\n// see, https://nodejsmodules.org/tags/querystring\n\n// QS - 55.8%\nvar Qs = require('qs');\n// username=bambang&device_info%5Bos_version%5D=9.3.1&device_info%5Bmodel%5D=samsung&device_info%5Bos_name%5D=android\nQs.stringify(params);\n\n// form2json - 6.8%\n// Is getting very old at this point and isn't under active development. It does object decoding only.\n// Uses object properties separated by '.'\n\n// querystring - 5.8%\nquerystring = require('querystring');\n// username=bambang&device_info=\n// which I'm led to believe is intentional - https://github.com/joyent/node/issues/1665\nquerystring.stringify(params)\n\n// perry - 1.3%\nvar Perry = require('perry');\n// username=bambang&device_info[os_version]=9.3.1&device_info[model]=samsung&device_info[os_name]=android\nPerry.stringify(params);\n\n// connect-queryparser -- 0.1 but was used in a lot of frameworks. Now depreciated.\n// only did object decoding. When passed the output of QS, Perry or jQuery is would return params\n```\n\nI'm not sure why nodejs' native querystring lacks support for any composite objects but all the main frameworks (restify, express, connect-middleware, ...) do nested objects with []. I suppose you trade the not being able to put unescaped literal `[` in your object property names (who does this anyway?) without escaping vs allowing nested objects with keys go through a querystring-ing and unquerystring-ing progress unscathed which seems like a excellent trade off to make to me.\n\nFor good or bad, jQuery has now utterly dominated in the browser. I think it's fair to say with all the major players doing nested object serialisation in browser with [] it's basically a defacto standard at this point.\n",
"> I'm not sure why nodejs' native querystring lacks support for any composite objects\n\nI know why! It's not the _actual_ standard so there's no reason to support it, just like the Python standard library doesn't support it for the _exact same reason_.\n\n> I think it's fair to say with all the major players doing nested object serialisation in browser with [] it's basically a defacto standard at this point.\n\nI think you meant it's the defacto standard for how jQuery and JS frameworks work in the browser. I can't find an example of a browser that would construct something like this without form input ids being pre-formatted to this structure.\n\nIf you think Python as a whole should support this, I suggest you either open a new issue at bugs.python.org (preferably after searching for already open/closed bugs that match the request). Until Python supports it, the toolbelt will support it but I'm pretty confident this doesn't belong in requests-core.\n",
"> It's not the actual standard\n\nI agree. There isn't a official standard to follow.\n\n> there no reason to support it\n\nOf course python isn't doing anything contrary to a official standard because, well, there isn't one but I don't think you can go so far as to say there's _no_ reason to support it. People writing PHP, ruby, node, .NET and stuff in browsers have seemingly all come to see this way of encoding as a defacto standard. Which ever way you cut it python is clearly the odd-one-out in this not-small group and it doesn't look like this is going to change. This seems like a bad place for python to be. You said yourself \"I can't count how many times this has come up on this project\"\n\n> I can't find an example of a browser that would construct something like this\n\nThat's right. To do something like this in a browsers you need javascript and the native javascript api doesn't cover this _at all_ so there's no standard to follow other than the frameworks which are now near universal. I imagine the OP is comparing python's url query string builder to one the output of pretty much any of these.\n\nI think there are good reasons to not do any of this is requests e.g. this belongs in python core, it breaks backward compatibility, ... but at this point I think there-is-no-standard argument doesn't hold up.\n\n> preferably after searching for already open/closed bugs that match the request\n\nTouché\n",
"While I understand both sides of this debate... I do recall Urllib handles this fine as nested Json structure per contract binding was common place in prior jobs. \n",
"@DavidHwu \n1. We're not talking about JSON. This isn't JSON. I'll repeat this until I'm blue in the face.\n2. If urllib behaved as @squareproton suggests already, requests would behave the same way since we're almost certainly using the same libraries.\n"
] |
https://api.github.com/repos/psf/requests/issues/2283
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2283/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2283/comments
|
https://api.github.com/repos/psf/requests/issues/2283/events
|
https://github.com/psf/requests/issues/2283
| 45,809,552 |
MDU6SXNzdWU0NTgwOTU1Mg==
| 2,283 |
Vulnerability in SSLv3 (POODLE)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 12 |
2014-10-14T22:43:31Z
|
2021-09-08T23:07:54Z
|
2014-10-15T17:37:01Z
|
CONTRIBUTOR
|
resolved
|
Described here: http://googleonlinesecurity.blogspot.com/2014/10/this-poodle-bites-exploiting-ssl-30.html and here: http://chat.stackexchange.com/transcript/message/18153650#18153650
I am not sure what implications this has for requests (potentially none) but it is a good idea to look at it.
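For context, the usual way to pin the protocol version on the client side is a custom transport adapter; a minimal sketch that forces TLSv1 and so refuses SSLv3 (the adapter name is illustrative, and depending on the requests version `urllib3` may be importable directly rather than via `requests.packages`):
``` python
import ssl

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager

class Tlsv1Adapter(HTTPAdapter):
    """Transport adapter that pins TLSv1 instead of allowing SSLv3."""
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections, maxsize=maxsize,
                                       block=block, ssl_version=ssl.PROTOCOL_TLSv1)

session = requests.Session()
session.mount('https://', Tlsv1Adapter())
```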
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2283/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2283/timeline
| null |
completed
| null | null | false |
[
"## I'm 100% in favor of forcing double checking that we don't use this by default and disabling it if we do\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"Checking now.\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Tue, Oct 14, 2014 at 4:04 PM, Ian Cordasco [email protected]\nwrote:\n\n> I'm 100% in favor of forcing double checking that we don't use this by\n> \n> ## default and disabling it if we do\n> \n> Sent from my Android device with K-9 Mail. Please excuse my brevity.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2283#issuecomment-59131916\n> .\n",
"I'm not sure that not using it by default is enough - if an attacker can interrupt the TLS v1 protocol and force the connection to downgrade to ssl v3, it's enough to read the plaintext.\n\nI am not sure how to test this but I am reading through the code to try and understand it. Would appreciate some help from @t-8ch @alex @reaperhulk @dreid .. :)\n",
"This is such a PITA for requests, we'll break a lot of people's stuff when we do this.\n\nWe should disable by default (as in OP_NO_SSLV3), but we need to have a good way to turn that limitation off.\n",
"The attacker also needs the means to control the data the client sends. Which is quite a big 'if' for users of requests, I think.\n",
"@t-8ch I haven't read up on POODLE yet: are cookies enough?\n",
"Also, really what we actually want is people to use versions of OpenSSL new enough to support `TLS_FALLBACK_SCSV`.\n",
"Note also that this approach is _more_ extreme than what Chrome have done. From [Adam Langley's blog post](https://www.imperialviolet.org/2014/10/14/poodle.html):\n\n> The changes that I've just landed in Chrome only disable fallback to SSLv3 – a server that correctly negotiates SSLv3 can still use it.\n\nNow, they do plan to remove SSLv3 entirely:\n\n> A little further down the line, perhaps in about three months, we hope to disable SSLv3 completely.\n\nBut that's Chrome, and they support a fundamentally different use-case from us. We need to be extremely careful about what we do here and how we expose it, because this will break a lot of people's code and we _will_ get complaints.\n",
"Personally I'd rather be secure than backwards compatible with a broken protocol by default \n",
"Sure, but that breaks people's stuff. We have to accept that, and behave appropriately. We also need a fairly easy (e.g. transport adapter) way of turning it back on.\n",
"Comment away at https://github.com/shazow/urllib3/issues/487\n",
"For the record `TLS_FALLBACK_SCSV` has zero use for requests or Python except as a server. TLS has secure negotiation of the highest protocol version. However browsers, trying to be more compatible than secure, work around this secure negotiation and implement an _insecure_ negotiation. `TLS_FALLBACK_SCSV` is an extension to TLS that attempts to make their version of negotiation secure against downgrade attacks. \n"
] |
https://api.github.com/repos/psf/requests/issues/2282
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2282/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2282/comments
|
https://api.github.com/repos/psf/requests/issues/2282/events
|
https://github.com/psf/requests/pull/2282
| 45,760,041 |
MDExOlB1bGxSZXF1ZXN0MjI3MTc0NjY=
| 2,282 |
Update sidebarintro.html
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8458546?v=4",
"events_url": "https://api.github.com/users/morganthrapp/events{/privacy}",
"followers_url": "https://api.github.com/users/morganthrapp/followers",
"following_url": "https://api.github.com/users/morganthrapp/following{/other_user}",
"gists_url": "https://api.github.com/users/morganthrapp/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/morganthrapp",
"id": 8458546,
"login": "morganthrapp",
"node_id": "MDQ6VXNlcjg0NTg1NDY=",
"organizations_url": "https://api.github.com/users/morganthrapp/orgs",
"received_events_url": "https://api.github.com/users/morganthrapp/received_events",
"repos_url": "https://api.github.com/users/morganthrapp/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/morganthrapp/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/morganthrapp/subscriptions",
"type": "User",
"url": "https://api.github.com/users/morganthrapp",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-10-14T14:56:11Z
|
2021-09-08T10:01:04Z
|
2014-10-14T15:22:48Z
|
CONTRIBUTOR
|
resolved
|
Fix issue #2279. "Gittip" changed to "Gratipay". Tip button extended from 48px to 60px.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2282/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2282/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2282.diff",
"html_url": "https://github.com/psf/requests/pull/2282",
"merged_at": "2014-10-14T15:22:48Z",
"patch_url": "https://github.com/psf/requests/pull/2282.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2282"
}
| true |
[
"@kennethreitz you okay with this?\n",
"<3\n",
"Yay, my first merge. :) \n",
"Thanks @morganthrapp! :cake: \n",
"Shouldn't the URLs be updated to https://gratipay.com/kennethreitz/ ?\n\nRedirect from old domain works for now for the link, not sure about the iframe.\n",
"It probably should.\nOn Oct 15, 2014 7:47 AM, \"Arjen\" [email protected] wrote:\n\n> Shouldn't the URL be updated to https://gratipay.com/kennethreitz/\n> (redirect from old domain works for now) ?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2282#issuecomment-59193588\n> .\n",
"Gittip redirects to Gratipay so it isn't urgent\n"
] |
https://api.github.com/repos/psf/requests/issues/2281
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2281/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2281/comments
|
https://api.github.com/repos/psf/requests/issues/2281/events
|
https://github.com/psf/requests/issues/2281
| 45,672,985 |
MDU6SXNzdWU0NTY3Mjk4NQ==
| 2,281 |
Transfer-Encoding: chunked; API to get trailers?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8687679?v=4",
"events_url": "https://api.github.com/users/davidabarrett/events{/privacy}",
"followers_url": "https://api.github.com/users/davidabarrett/followers",
"following_url": "https://api.github.com/users/davidabarrett/following{/other_user}",
"gists_url": "https://api.github.com/users/davidabarrett/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/davidabarrett",
"id": 8687679,
"login": "davidabarrett",
"node_id": "MDQ6VXNlcjg2ODc2Nzk=",
"organizations_url": "https://api.github.com/users/davidabarrett/orgs",
"received_events_url": "https://api.github.com/users/davidabarrett/received_events",
"repos_url": "https://api.github.com/users/davidabarrett/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/davidabarrett/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/davidabarrett/subscriptions",
"type": "User",
"url": "https://api.github.com/users/davidabarrett",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-10-13T18:41:56Z
|
2021-09-08T23:07:54Z
|
2014-10-13T18:43:57Z
|
NONE
|
resolved
|
Is there a way to process trailers using Requests?
http://tools.ietf.org/html/rfc7230#section-4.1.2
Adding a trailer (e.g. Content-MD5) is especially useful for detecting incomplete streamed responses from an Apache server; otherwise it's difficult (impossible?) to write a robust client for large server responses.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2281/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2281/timeline
| null |
completed
| null | null | false |
[
"I'm afraid `httplib`, upon which requests is built, provides no API for obtaining trailers. This means we can't provide one either.\n\nWe are thinking of rewriting `httplib`, which would enable us to provide such support. In the meantime, I recommend you request the feature from the CPython developers.\n\nSorry we can't be of more help!\n",
"Is there an existing thread somewhere on the status of\nchunked trailers support for httplib in CPython and where\nto post the request for it?\n",
"I did a quick search but found nothing. Trailers are a relatively uncommon feature, and I could find no request for the feature absolutely anywhere.\n\nI recommend opening it on bugs.python.org, or possibly sending a mail to python-list.\n"
] |
https://api.github.com/repos/psf/requests/issues/2280
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2280/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2280/comments
|
https://api.github.com/repos/psf/requests/issues/2280/events
|
https://github.com/psf/requests/issues/2280
| 45,666,766 |
MDU6SXNzdWU0NTY2Njc2Ng==
| 2,280 |
Requests is not properly escaping cookies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/84711?v=4",
"events_url": "https://api.github.com/users/erickt/events{/privacy}",
"followers_url": "https://api.github.com/users/erickt/followers",
"following_url": "https://api.github.com/users/erickt/following{/other_user}",
"gists_url": "https://api.github.com/users/erickt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/erickt",
"id": 84711,
"login": "erickt",
"node_id": "MDQ6VXNlcjg0NzEx",
"organizations_url": "https://api.github.com/users/erickt/orgs",
"received_events_url": "https://api.github.com/users/erickt/received_events",
"repos_url": "https://api.github.com/users/erickt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/erickt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/erickt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/erickt",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 2 |
2014-10-13T17:31:02Z
|
2021-09-08T23:06:08Z
|
2015-01-19T09:18:41Z
|
NONE
|
resolved
|
Requests does not appear to be escaping cookies appropriately, which triggers this bug:
``` python
>>> import requests
>>> r = requests.get('http://httpbin.org/cookies',
... cookies={
... 'escaped': '"a=b;b=c"',
... 'unescaped': 'x=y;y=z',
... })
>>> print r.text
{
"cookies": {
"escaped": "a=b;b=c",
"unescaped": "x=y",
"y": "z"
}
}
```
According to the RFC 2109 spec, cookies with reserved characters like `;` should be escaped with quotes. Here are the relevant definitions from that spec:
```
From RFC 2109:
av-pairs = av-pair *(";" av-pair)
av-pair = attr ["=" value] ; optional value
attr = token
value = word
word = token | quoted-string
cookie = "Cookie:" cookie-version
1*((";" | ",") cookie-value)
cookie-value = NAME "=" VALUE [";" path] [";" domain]
cookie-version = "$Version" "=" value
NAME = attr
VALUE = value
path = "$Path" "=" value
domain = "$Domain" "=" value
set-cookie = "Set-Cookie:" cookies
cookies = 1#cookie
cookie = NAME "=" VALUE *(";" cookie-av)
NAME = attr
VALUE = value
cookie-av = "Comment" "=" value
| "Domain" "=" value
| "Max-Age" "=" value
| "Path" "=" value
| "Secure"
| "Version" "=" 1*DIGIT
From RFC 2068:
token = 1*<any CHAR except CTLs or tspecials>
tspecials = "(" | ")" | "<" | ">" | "@"
| "," | ";" | ":" | "\" | <">
| "/" | "[" | "]" | "?" | "="
| "{" | "}" | SP | HT
quoted-string = ( <"> *(qdtext) <"> )
qdtext = <any TEXT except <">>
```
If I read the requests source correctly, I believe this may actually be a bug in the upstream `cookielib` code, which doesn't appear to escape cookies when they are added to the cookie jar. The `Cookie` module, however, handles the quoting fine:
``` python
>>> import Cookie
>>> c=Cookie.SimpleCookie()
>>> c['escaped'] = '"a=b;b=c"'
>>> c['unescaped'] = 'a=b;b=c'
>>> print c.output()
Set-Cookie: escaped="\"a=b\073b=c\""
Set-Cookie: unescaped="a=b\073b=c"
```
You can find the method that does the quoting [here](https://hg.python.org/cpython/file/c0224ff67cdd/Lib/http/cookies.py#l225).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2280/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2280/timeline
| null |
completed
| null | null | false |
[
"If you tracked it upstream to the standard library, why did you open a bug here? Did you also open a bug [upstream](https://bugs.python.org)?\n",
"I wasn't completely positive if it was an upstream bug, so I wanted to\ncheck in with you first. cookielib doesn't really to be friendly to using\nit to prepare cookies, so I could see them aguing that requests should be\nusing Cookie/http.cookies instead.\n\nThe main reason why I filed this here though is that unfortunately my\napplication needs to be compatible with any Python 2.6 or above. So even if\nit's fixed upstream I can't count on it being fixed for my customers. It is\nmuch easier for me to update a requests dependency.\n\nOn Monday, October 13, 2014, Ian Cordasco [email protected] wrote:\n\n> If you tracked it upstream to the standard library, why did you open a bug\n> here? Did you also open a bug upstream?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2280#issuecomment-58930226\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/2279
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2279/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2279/comments
|
https://api.github.com/repos/psf/requests/issues/2279/events
|
https://github.com/psf/requests/issues/2279
| 45,663,906 |
MDU6SXNzdWU0NTY2MzkwNg==
| 2,279 |
Gittip renamed to Gratipay, layout needs updating
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/46775?v=4",
"events_url": "https://api.github.com/users/mjpieters/events{/privacy}",
"followers_url": "https://api.github.com/users/mjpieters/followers",
"following_url": "https://api.github.com/users/mjpieters/following{/other_user}",
"gists_url": "https://api.github.com/users/mjpieters/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mjpieters",
"id": 46775,
"login": "mjpieters",
"node_id": "MDQ6VXNlcjQ2Nzc1",
"organizations_url": "https://api.github.com/users/mjpieters/orgs",
"received_events_url": "https://api.github.com/users/mjpieters/received_events",
"repos_url": "https://api.github.com/users/mjpieters/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mjpieters/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mjpieters/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mjpieters",
"user_view_type": "public"
}
|
[
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
},
{
"color": "fad8c7",
"default": false,
"description": null,
"id": 136616769,
"name": "Documentation",
"node_id": "MDU6TGFiZWwxMzY2MTY3Njk=",
"url": "https://api.github.com/repos/psf/requests/labels/Documentation"
}
] |
closed
| true | null |
[] | null | 0 |
2014-10-13T17:00:37Z
|
2021-09-08T23:07:53Z
|
2014-10-19T09:42:28Z
|
CONTRIBUTOR
|
resolved
|
Gittip [was renamed to Gratipay](https://github.com/gratipay/inside.gratipay.com/issues/73), and now the sidebar button is cut off:
<kbd>

</kbd>
Looks like some CSS needs adjusting.
For extra bonus points: adjust the link text in the sidebar too.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2279/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2279/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2278
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2278/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2278/comments
|
https://api.github.com/repos/psf/requests/issues/2278/events
|
https://github.com/psf/requests/issues/2278
| 45,647,239 |
MDU6SXNzdWU0NTY0NzIzOQ==
| 2,278 |
apparent_encoding() hangs with high CPU utilization
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/349269?v=4",
"events_url": "https://api.github.com/users/cazzerson/events{/privacy}",
"followers_url": "https://api.github.com/users/cazzerson/followers",
"following_url": "https://api.github.com/users/cazzerson/following{/other_user}",
"gists_url": "https://api.github.com/users/cazzerson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cazzerson",
"id": 349269,
"login": "cazzerson",
"node_id": "MDQ6VXNlcjM0OTI2OQ==",
"organizations_url": "https://api.github.com/users/cazzerson/orgs",
"received_events_url": "https://api.github.com/users/cazzerson/received_events",
"repos_url": "https://api.github.com/users/cazzerson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cazzerson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cazzerson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cazzerson",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-13T14:21:06Z
|
2021-09-08T23:05:55Z
|
2014-10-13T14:26:07Z
|
NONE
|
resolved
|
While processing a large set of requests, `apparent_encoding()` works well for quite a while but eventually hangs. All I have at this point is a traceback:
```
File "/opt/wsgi/.../ENV/lib/python2.7/site-packages/requests/models.py", line 637, in apparent_encoding
return chardet.detect(self.content)['encoding']
File "/opt/wsgi/.../ENV/lib/python2.7/site-packages/requests/packages/chardet/__init__.py", line 30, in detect
u.feed(aBuf)
File "/opt/wsgi/.../ENV/lib/python2.7/site-packages/requests/packages/chardet/universaldetector.py", line 128, in feed
if prober.feed(aBuf) == constants.eFoundIt:
File "/opt/wsgi/.../ENV/lib/python2.7/site-packages/requests/packages/chardet/charsetgroupprober.py", line 64, in feed
st = prober.feed(aBuf)
File "/opt/wsgi/.../ENV/lib/python2.7/site-packages/requests/packages/chardet/sjisprober.py", line 74, in feed
self._mContextAnalyzer.feed(aBuf[i + 1 - charLen:i + 3
```
I appreciate anything you can do to help.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2278/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2278/timeline
| null |
completed
| null | null | false |
[
"This is a duplicate of https://github.com/chardet/chardet/issues/29 which no one (who is remotely interested in the bug) has had the time to investigate or fix yet. This isn't a bug in requests.\n",
"This is also creating problems for me.\nAny work around?\n",
"@dav009 two options:\n1. Use cChardet instead of `apparent_encoding`\n2. Set a custom encoding if you know it to prevent requests from needing to use `apparent_encoding`\n",
"@dav009 one other alternative, help chardet fix the bug on their side.\n"
] |
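The second workaround suggested in that thread — setting a known encoding up front so requests never falls back to chardet-based detection — might look like the following minimal sketch; the URL and the assumed charset are placeholders, not details from the report:

``` python
import requests

# Hedged sketch of the "set a custom encoding" workaround: assigning
# r.encoding before touching r.text means apparent_encoding (and therefore
# chardet) is never consulted.
r = requests.get("https://example.com/some-page")  # hypothetical URL
r.encoding = "utf-8"  # assumed charset; use whatever the server actually sends
text = r.text
```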
https://api.github.com/repos/psf/requests/issues/2277
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2277/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2277/comments
|
https://api.github.com/repos/psf/requests/issues/2277/events
|
https://github.com/psf/requests/issues/2277
| 45,600,135 |
MDU6SXNzdWU0NTYwMDEzNQ==
| 2,277 |
make .raw more file like
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23225?v=4",
"events_url": "https://api.github.com/users/CarlFK/events{/privacy}",
"followers_url": "https://api.github.com/users/CarlFK/followers",
"following_url": "https://api.github.com/users/CarlFK/following{/other_user}",
"gists_url": "https://api.github.com/users/CarlFK/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CarlFK",
"id": 23225,
"login": "CarlFK",
"node_id": "MDQ6VXNlcjIzMjI1",
"organizations_url": "https://api.github.com/users/CarlFK/orgs",
"received_events_url": "https://api.github.com/users/CarlFK/received_events",
"repos_url": "https://api.github.com/users/CarlFK/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CarlFK/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CarlFK/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CarlFK",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2014-10-13T01:24:42Z
|
2021-09-08T23:07:54Z
|
2014-10-13T02:35:23Z
|
NONE
|
resolved
|
This is kinda half baked, but maybe there is a pattern that fits what I am looking for:
``` python
>>> r = requests.get(url)
>>> r.raw.name  # AttributeError: 'HTTPResponse' object has no attribute 'name'
```

just like:

``` python
>>> fd = open('foo.txt')
>>> fd.name
'foo.txt'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2277/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2277/timeline
| null |
completed
| null | null | false |
[
"@CarlFK what would that even return?\n",
"Beyond that, this belongs on urllib3. The `raw` attribute is a urllib3 HTTPResponse. I also see no reason to provide this. It may have a few file-like methods, but there's no reason for it to be more file-like.\n",
"The easy answer is url.\nmaybe parse it to look for something file namy like.\n\nor look in the headers for something - like my server side code:\nresponse['Content-Disposition'] = 'inline; filename=playlist.m3u'\n\n> > > r = requests.get(url)\n> > > r.headers\n> > > CaseInsensitiveDict({'content-disposition': 'inline; filename=playlist.m3u',\n\nBut I see your point about urllib3, so yeah, never mind.\n",
"You realize you could easily write your own wrapper class for this right?\n\n``` python\nclass FileLikeHTTPResponse(object):\n def __init__(self, raw):\n self.raw = raw\n self.name = self.find_name()\n # etc.\n def read(self, amount=None):\n return self.raw.read(amount)\n # etc\n```\n"
] |
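The "look in the headers / parse the URL" idea floated in that exchange could be sketched roughly like this; the header parsing is deliberately naive and the URL is a placeholder:

``` python
import os
import requests
from urllib.parse import urlsplit

# Rough sketch: derive a file-like "name" for a response by preferring
# Content-Disposition and falling back to the last URL path segment.
def response_name(resp):
    disposition = resp.headers.get("content-disposition", "")
    if "filename=" in disposition:
        return disposition.split("filename=")[-1].strip('"; ')
    return os.path.basename(urlsplit(resp.url).path) or "response"

r = requests.get("https://example.com/playlist.m3u")  # hypothetical URL
print(response_name(r))
```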
https://api.github.com/repos/psf/requests/issues/2276
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2276/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2276/comments
|
https://api.github.com/repos/psf/requests/issues/2276/events
|
https://github.com/psf/requests/pull/2276
| 45,598,273 |
MDExOlB1bGxSZXF1ZXN0MjI2MjIzMjE=
| 2,276 |
Attempt at #1956 DO NOT MERGE
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9095841?v=4",
"events_url": "https://api.github.com/users/mikecool1000/events{/privacy}",
"followers_url": "https://api.github.com/users/mikecool1000/followers",
"following_url": "https://api.github.com/users/mikecool1000/following{/other_user}",
"gists_url": "https://api.github.com/users/mikecool1000/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mikecool1000",
"id": 9095841,
"login": "mikecool1000",
"node_id": "MDQ6VXNlcjkwOTU4NDE=",
"organizations_url": "https://api.github.com/users/mikecool1000/orgs",
"received_events_url": "https://api.github.com/users/mikecool1000/received_events",
"repos_url": "https://api.github.com/users/mikecool1000/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mikecool1000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mikecool1000/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mikecool1000",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-13T00:19:29Z
|
2021-09-08T10:01:04Z
|
2014-10-13T00:20:15Z
|
CONTRIBUTOR
|
resolved
|
My attempt at fixing this issue; I'm not sure it's great.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9095841?v=4",
"events_url": "https://api.github.com/users/mikecool1000/events{/privacy}",
"followers_url": "https://api.github.com/users/mikecool1000/followers",
"following_url": "https://api.github.com/users/mikecool1000/following{/other_user}",
"gists_url": "https://api.github.com/users/mikecool1000/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mikecool1000",
"id": 9095841,
"login": "mikecool1000",
"node_id": "MDQ6VXNlcjkwOTU4NDE=",
"organizations_url": "https://api.github.com/users/mikecool1000/orgs",
"received_events_url": "https://api.github.com/users/mikecool1000/received_events",
"repos_url": "https://api.github.com/users/mikecool1000/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mikecool1000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mikecool1000/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mikecool1000",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2276/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2276/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2276.diff",
"html_url": "https://github.com/psf/requests/pull/2276",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2276.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2276"
}
| true |
[
"@mikecool1000 why did you close this?\n",
"I just didn't want try to submit work I am unsure of\n\nSent from my iPhone\n\n> On Oct 12, 2014, at 7:16 PM, Ian Cordasco [email protected] wrote:\n> \n> @mikecool1000 why did you close this?\n> \n> —\n> Reply to this email directly or view it on GitHub.\n"
] |
https://api.github.com/repos/psf/requests/issues/2275
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2275/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2275/comments
|
https://api.github.com/repos/psf/requests/issues/2275/events
|
https://github.com/psf/requests/issues/2275
| 45,586,595 |
MDU6SXNzdWU0NTU4NjU5NQ==
| 2,275 |
Content-Length header not checked by requests if not enough data is sent
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5478285?v=4",
"events_url": "https://api.github.com/users/squareproton/events{/privacy}",
"followers_url": "https://api.github.com/users/squareproton/followers",
"following_url": "https://api.github.com/users/squareproton/following{/other_user}",
"gists_url": "https://api.github.com/users/squareproton/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/squareproton",
"id": 5478285,
"login": "squareproton",
"node_id": "MDQ6VXNlcjU0NzgyODU=",
"organizations_url": "https://api.github.com/users/squareproton/orgs",
"received_events_url": "https://api.github.com/users/squareproton/received_events",
"repos_url": "https://api.github.com/users/squareproton/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/squareproton/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/squareproton/subscriptions",
"type": "User",
"url": "https://api.github.com/users/squareproton",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 4 |
2014-10-12T17:11:19Z
|
2021-09-08T23:06:08Z
|
2015-01-19T09:19:03Z
|
NONE
|
resolved
|
This is a feature request.
In our application we've noticed ~ 1 in 2500 GET requests is truncated early. The HTTP responses are typically 20K-2MB. This is certainly caused by network issues or dodgy HTTP servers and isn't a requests problem. By default requests 2.3.0 doesn't give any indication if the response body has a length smaller than the 'Content-Length' header indicated. The response object is presented as if nothing is amiss.
Requests is already doing some checks on Content-Length. When a response body is larger than the Content-Length header describes, requests silently truncates the body. I've yet to find a way of detecting this case.
I would completely understand if the requests developers thought this type of check was outside the scope of the library - after all it is pretty easy to do this check in user code - but it seemed kind of anomalous given the super succinct high level api.
In a similar vein a Content-MD5 header check would be useful in our application but this is probably a lot more niche.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2275/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2275/timeline
| null |
completed
| null | null | false |
[
"Hi @squareproton \n\nLet me start off by saying that we've had this requested in the past (see: #1938) so you would do well to read that discussion to understand why we will not ever add content-length checking to requests. (This is also vaguely related to #2255 and https://github.com/sigmavirus24/requests-toolbelt/issues/41.)\n\nIf you read the thread that resulted from the last time this was suggested, you'll see that we do not add arbitrary or niche features to requests. If there's a high likelihood of users needed it (something greater than 90% of our current userbase) then we'll add it. You might understand how this makes requests far more maintainable, especially given we don't do much in the way of header checks.\n\nWhat I am going to address here is the following:\n\n> Requests is already doing some checks on Content-Length. When a response body is larger than a Content-Length header describes requests silently truncates the body.\n\nI'm fairly certain you're misunderstanding what you're receiving. Of course, if you can provide an example that reliably reproduces that, this would be most appreciated. I have never personally seen this behaviour, and I'd be surprised to hear that we're truncating large bodies. If that's happening anywhere, it might be httplib but I doubt it happens there either.\n",
"For that matter, it was my belief that short bodies threw an IncompleteRead error.\n",
"@Lukasa I was under the same impression frankly\n",
"Thanks for your responses. #1938 has convinced me, a Content-Length check doesn't belong in requests, and the check I'm currently doing in the application is broken.\n\nI've got test cases for the missing IncompleteRead error and the silent truncation of the request bodies.\n\n**IncompleteRead error**\n\nnodejs server 0.10.31 on Ubuntu server 12.04 LTS\n\n```\nvar http = require('http');\n\nhttp.createServer(function (req, res) {\n var buffer = new Buffer(10);\n buffer.fill('A');\n res.writeHead(200, {'Content-Length': buffer.length*2});\n res.write(buffer);\n // agressively trash the socket, calling res.end() doesn't work if node\n // believes there are still outstanding bytes to be sent\n res.socket.destroy();\n}).listen(4321);\n```\n\nMaking a request with curl\n\n```\n$ curl -v http://localhost:4321\n\n* About to connect() to localhost port 4321 (#0)\n* Trying 127.0.0.1... connected\n> GET / HTTP/1.1\n> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3\n> Host: localhost:4321\n> Accept: */*\n> \n< HTTP/1.1 200 OK\n< Content-Length: 20\n< Date: Mon, 13 Oct 2014 15:01:30 GMT\n< Connection: keep-alive\n< \n* transfer closed with 10 bytes remaining to read\n* Closing connection #0\ncurl: (18) transfer closed with 10 bytes remaining to read\nAAAAAAAAAA\n```\n\nIn python doing \n\n```\nimport requests\nresponse = requests.get(\"http://192.168.2.19:4321\")\nprint(response.content, len(response.content))\n```\n\ngenerates\n\n```\nb'AAAAAAAAAA' 10\n```\n\nSame behaviour on windows and linux. The versions of the software installed is as follows.\n\nPython 3.4.0 (v3.4.0:04f714765c13, Mar 16 2014, 19:24:06) [MSC v.1600 32 bit (Intel)] on win32\nRequests v2.3.0\n\nPython 3.4.1 (default, May 25 2014, 22:33:14) [GCC 4.6.3] on linux\nRequests v2.4.1\n\nI'm pretty new to python (though not programming in general) so it's possible I'm missing something in a python config somewhere which causes the error to be suppressed. I just not very familiar with the python environment. I didn't personally setup or configure the python install on the Windows machine (this was scripted by another developer) but as far as I am aware it doesn't do anything exotic. I did do the Ubuntu server though and that's totally vanilla.\n\n**Silent Truncation of Excess Content**\n\nChanging the node server to output a Content-Length header which is too small, i.e. `buffer.length/2`\n\n```\n* About to connect() to localhost port 4321 (#0)\n* Trying 127.0.0.1... connected\n> GET / HTTP/1.1\n> User-Agent: curl/7.22.0 (x86_64-pc-linux-gnu) libcurl/7.22.0 OpenSSL/1.0.1 zlib/1.2.3.4 libidn/1.23 librtmp/2.3\n> Host: localhost:4321\n> Accept: */*\n> \n< HTTP/1.1 200 OK\n< Content-Length: 5\n< Date: Mon, 13 Oct 2014 15:28:56 GMT\n< Connection: keep-alive\n< \n* Excess found in a non pipelined read: excess = 5, size = 5, maxdownload = 5, bytecount = 0\n* Connection #0 to host localhost left intact\n* Closing connection #0\nAAAAA\n```\n\nSometimes curl doesn't doesn't display the error message \\* Excess found in a non pipelines read ... This happens about 20% of the time. I don't know why it's not 100% reproducible. I've checked with wireshark and I'm pretty sure node is always sending the excess data.\n\nThe same python script always produces the same output on both Windows and Ubuntu.\n\n```\nb'AAAAA' 5\n```\n\nI've also tested some of the different ways of getting the data with stream=True and never been able to read the excess content.\n\nHope this helps.\n"
] |
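Since the maintainers note that this kind of check is easy to do in user code, a minimal user-side sketch might look like the following; the URL is a placeholder and, for brevity, it ignores chunked and compressed responses where Content-Length does not describe the decoded body:

``` python
import requests

# Hedged sketch of a user-side truncation check: compare the declared
# Content-Length with the number of raw bytes actually received.
resp = requests.get("https://example.com/large-file", stream=True)  # hypothetical URL
raw_body = resp.raw.read(decode_content=False)
declared = resp.headers.get("Content-Length")
if declared is not None and len(raw_body) != int(declared):
    raise IOError("truncated response: got %d of %s bytes" % (len(raw_body), declared))
```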
https://api.github.com/repos/psf/requests/issues/2274
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2274/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2274/comments
|
https://api.github.com/repos/psf/requests/issues/2274/events
|
https://github.com/psf/requests/issues/2274
| 45,439,236 |
MDU6SXNzdWU0NTQzOTIzNg==
| 2,274 |
update python-requests.org to SSL
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[
{
"color": "02e10c",
"default": false,
"description": null,
"id": 76800,
"name": "Feature Request",
"node_id": "MDU6TGFiZWw3NjgwMA==",
"url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 3 |
2014-10-10T02:10:40Z
|
2021-09-06T00:06:42Z
|
2014-10-11T04:25:37Z
|
CONTRIBUTOR
|
resolved
|
it's now free if you use Cloudflare for DNS
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2274/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2274/timeline
| null |
completed
| null | null | false |
[
"Technically we have SSL, but it's only for ReadTheDocs.org so there's a Certificate Name mismatch. We could proxy a certificate from Heroku https://docs.readthedocs.org/en/latest/alternate_domains.html#cname-ssl \n\n```\n21:17.03 ericholscher he can proxy back to RTD from heroku as well\n21:17.06 ericholscher if ya'll want\n21:17.12 ericholscher for the ssl term\n```\n",
"I don't think this really matters. \n",
"@kennethreitz: it matters because of https://github.com/requests/requests/issues/4885"
] |
https://api.github.com/repos/psf/requests/issues/2273
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2273/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2273/comments
|
https://api.github.com/repos/psf/requests/issues/2273/events
|
https://github.com/psf/requests/pull/2273
| 45,439,100 |
MDExOlB1bGxSZXF1ZXN0MjI1MzYyOTU=
| 2,273 |
Add Release History to the sidebar
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 3 |
2014-10-10T02:07:26Z
|
2021-09-08T10:01:05Z
|
2014-10-12T14:55:34Z
|
CONTRIBUTOR
|
resolved
|
Closes #2269
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2273/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2273/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2273.diff",
"html_url": "https://github.com/psf/requests/pull/2273",
"merged_at": "2014-10-12T14:55:34Z",
"patch_url": "https://github.com/psf/requests/pull/2273.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2273"
}
| true |
[
"Since @kennethreitz is very passionate about the docs, I want to make sure he's okay with this before we merge it @alex \n",
"Much needed :)\n",
":cake: \n"
] |
https://api.github.com/repos/psf/requests/issues/2272
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2272/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2272/comments
|
https://api.github.com/repos/psf/requests/issues/2272/events
|
https://github.com/psf/requests/issues/2272
| 45,362,630 |
MDU6SXNzdWU0NTM2MjYzMA==
| 2,272 |
chunked encoding bug when release connection
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/32542?v=4",
"events_url": "https://api.github.com/users/ilovenwd/events{/privacy}",
"followers_url": "https://api.github.com/users/ilovenwd/followers",
"following_url": "https://api.github.com/users/ilovenwd/following{/other_user}",
"gists_url": "https://api.github.com/users/ilovenwd/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ilovenwd",
"id": 32542,
"login": "ilovenwd",
"node_id": "MDQ6VXNlcjMyNTQy",
"organizations_url": "https://api.github.com/users/ilovenwd/orgs",
"received_events_url": "https://api.github.com/users/ilovenwd/received_events",
"repos_url": "https://api.github.com/users/ilovenwd/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ilovenwd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ilovenwd/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ilovenwd",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
}
] |
closed
| true | null |
[] | null | 11 |
2014-10-09T12:56:06Z
|
2021-09-08T23:06:07Z
|
2015-01-19T09:19:32Z
|
NONE
|
resolved
|
https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L404
I encountered a bug that goes away when this line is removed.
Since preload_content=False is used in HTTPResponse.from_httplib, we cannot _put_conn the connection back to the pool yet (where it's closed).
``` python
low_conn = conn._get_conn(timeout=timeout)
try:
low_conn.putrequest(request.method,
url,
skip_accept_encoding=True)
.....
r = low_conn.getresponse()
resp = HTTPResponse.from_httplib(
r,
pool=conn,
connection=low_conn,
preload_content=False,
decode_content=False
)
except:
# If we hit any problems here, clean up the connection.
# Then, reraise so that we can handle the actual exception.
low_conn.close()
raise
else:
# All is well, return the connection to the pool.
conn._put_conn(low_conn)
```
the code that triggers this bug (just to illustrate):
```
monkey.patch_all()
def data_stream(n):
for i in range(n):
x = str(i)
gevent.sleep(0)
yield x
gevent.spawn(requests.post('xxx', data=data_stream(100)))
gevent.spawn(requests.post('xxx', data=data_stream(100)))
gevent.joinall()
```
the first post gets a normal response, while the second gets an empty response (the connection is closed before the content is read),
but the result is OK if we don't use gevent.spawn
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2272/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2272/timeline
| null |
completed
| null | null | false |
[
"Why aren't you using grequests?\n",
"I look grequests code and find that the code is indeed what I do now use gevent.\nAlso, I need more control with the concurrency, the example code is just a simple demo.\nI think this is a bug of requests.\n",
"Can you provide something that actually allows us to reproduce this. We aren't going to just remove code because you asked us to. I modified your example a bit so that I could test it, and it in fact produces no problems for me.\n\n``` python\nimport requests\nimport gevent\nimport gevent.monkey\n\ngevent.monkey.patch_all()\n\n\ndef data_stream(n):\n for i in range(n):\n x = str(i)\n gevent.sleep(0)\n yield x\n\nurl = 'https://httpbin.org/post'\n\ngreenlets = [gevent.spawn(requests.post, url, data=data_stream(100)),\n gevent.spawn(requests.post, url, data=data_stream(100))]\nr = gevent.joinall(greenlets)\n```\n",
"I don't means remove the code is the correct way, just say where the bug is.\n\nI change your code a little and produce the bug.\nNote the connection pool size=2\n\n``` python\nimport sys\nimport requests\nimport gevent\nimport gevent.monkey\n\ngevent.monkey.patch_all()\n\n\ndef data_stream(n, mark='.'):\n for i in range(n):\n x = str(i)*100\n #gevent.sleep(0)\n if i%(n/10)==0:\n sys.stdout.write(mark)\n sys.stdout.flush()\n yield x\n\nsleep = 0\n\nsess = requests.Session()\nadapter = requests.adapters.HTTPAdapter(pool_connections=2, pool_maxsize=2)\nsess.mount('http://', adapter)\n\ndef g(n, mark='.'):\n global sleep\n sleep += 0.2\n #gevent.sleep(sleep)\n\n url = 'http://httpbin.org/post'\n r = sess.post(url, data=data_stream(n, mark))\n #gevent.sleep(4)\n print '\\n>>>>>>', mark, len(r.content)\n\nn = 100\ngreenlets = [\n gevent.spawn(g, n, '.'),\n gevent.spawn(g, n, 'x'),\n gevent.spawn(g, n, 'o'),\n #gevent.spawn(g, n, '-'),\n #gevent.spawn(g, n, '='),\n #gevent.spawn(g, n, '*'),\n ]\nr = gevent.joinall(greenlets)\nprint 'done'\n```\n\noutput:\n\n```\nxxxxxxxxoooooooo........xx\n>>>>>> x 466\n..\n>>>>>> . 0\noo\n>>>>>> o 0\ndone\n```\n\nthe 2nd and 3rd response is lost\n",
"I haven't run the code yet, but this can (probably) be solved in a few ways:\n1. Use grequests because it looks to do The Right Thing\n2. Figure out a way to make gevent monkey patch the `queue` module with it's [own](http://www.gevent.org/gevent.queue.html) (although I have yet to test that this will work)\n3. (And this will be the easiest) Use a separate session for each \"thread\". The problem is that the `LifoQueue` from the standard library is **threadsafe** but that does not imply gevent-safe. This will ensure separate queues for your program.\n\nWe _could_ add something like\n\n``````\n```python\nLOCK = threading.Lock()\n# ...\n with LOCK:\n conn._put_conn(low_conn)\n```\n``````\n\nBut we probably won't. The reason this might work is because when using `gevent.monkey.patch_all()` you're patching the threading library and so gevent will ensure that this will not hang. I don't have the time to test all of this, but connection pooling is almost certainly the problem here. @shazow do you have any thoughts on this matter?\n",
"So actually, the catch here is import/monkey patch order.\n\nI'm willing to be the following will fix this:\n\n``` python\nimport gevent\nimport gevent.monkey\n\ngevent.monkey.patch_all()\n\nimport sys\nimport requests\n```\n\n@ilovenwd if you don't `patch_all` before import requests, it probably won't replace any usage of `RLock` used in the `LifoQueue` which is imported from `threading`.\n",
"I change the code, but still the same issue.\n\n``` python\nimport gevent\nimport gevent.monkey\ngevent.monkey.patch_all()\nimport requests\n```\n\nthe output is exactly the same.\n\nI trace the code line by line, the real issue is that:\nbecause preload_content=False, so the low_conn is put back to pool BEFORE content is consumed,\nso other greenlet/thread may re-use this low_conn (and reset it maybe).\nafter that, the original client of the low_conn read nothing.\n",
"you can compare the chunked code with non-chunked code https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L566\nby default, it didn't release the conn (by default), so maybe we should not release low_conn in chunked case, but I'm not sure, since I'm not familiar with requests code base.\n",
"Right so the solution is to not use the same session. This is the same recommendation we make in the case of threading. We never want to preload the content into memory immediately. If we do it unconditionally that will have some very real memory problems for users.\n",
"Let me know if I can still be of help, sounds like you figured it out.\n",
"Maybe I should use the diff session per greenlet solution. Thanks for the explanation.\nHowever, I wish requests doc could have sth regards this case with gevents.\n"
] |
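The resolution arrived at above — don't share one Session across greenlets — could be sketched like this; the httpbin URL comes from the thread and the rest is illustrative:

``` python
import gevent
import gevent.monkey
gevent.monkey.patch_all()

import requests

# Hedged sketch of the "one Session per greenlet" recommendation: each
# greenlet owns its Session, so its connection pool is never handed a
# half-read connection by a sibling greenlet.
def post_with_own_session(mark, n=100):
    sess = requests.Session()  # not shared with any other greenlet
    r = sess.post('http://httpbin.org/post', data=(str(i) * 100 for i in range(n)))
    return mark, len(r.content)

greenlets = [gevent.spawn(post_with_own_session, m) for m in '.xo']
gevent.joinall(greenlets)
print([g.value for g in greenlets])
```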
https://api.github.com/repos/psf/requests/issues/2271
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2271/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2271/comments
|
https://api.github.com/repos/psf/requests/issues/2271/events
|
https://github.com/psf/requests/pull/2271
| 45,320,725 |
MDExOlB1bGxSZXF1ZXN0MjI0NjgyNzg=
| 2,271 |
Fixed #2250
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9095841?v=4",
"events_url": "https://api.github.com/users/mikecool1000/events{/privacy}",
"followers_url": "https://api.github.com/users/mikecool1000/followers",
"following_url": "https://api.github.com/users/mikecool1000/following{/other_user}",
"gists_url": "https://api.github.com/users/mikecool1000/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mikecool1000",
"id": 9095841,
"login": "mikecool1000",
"node_id": "MDQ6VXNlcjkwOTU4NDE=",
"organizations_url": "https://api.github.com/users/mikecool1000/orgs",
"received_events_url": "https://api.github.com/users/mikecool1000/received_events",
"repos_url": "https://api.github.com/users/mikecool1000/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mikecool1000/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mikecool1000/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mikecool1000",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-10-09T02:28:13Z
|
2021-09-08T10:01:06Z
|
2014-10-10T18:30:18Z
|
CONTRIBUTOR
|
resolved
|
Fixed #2250 with #2271
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2271/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2271/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2271.diff",
"html_url": "https://github.com/psf/requests/pull/2271",
"merged_at": "2014-10-10T18:30:18Z",
"patch_url": "https://github.com/psf/requests/pull/2271.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2271"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/2270
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2270/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2270/comments
|
https://api.github.com/repos/psf/requests/issues/2270/events
|
https://github.com/psf/requests/issues/2270
| 45,300,413 |
MDU6SXNzdWU0NTMwMDQxMw==
| 2,270 |
form url encoding data
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/555399?v=4",
"events_url": "https://api.github.com/users/3goats/events{/privacy}",
"followers_url": "https://api.github.com/users/3goats/followers",
"following_url": "https://api.github.com/users/3goats/following{/other_user}",
"gists_url": "https://api.github.com/users/3goats/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/3goats",
"id": 555399,
"login": "3goats",
"node_id": "MDQ6VXNlcjU1NTM5OQ==",
"organizations_url": "https://api.github.com/users/3goats/orgs",
"received_events_url": "https://api.github.com/users/3goats/received_events",
"repos_url": "https://api.github.com/users/3goats/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/3goats/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/3goats/subscriptions",
"type": "User",
"url": "https://api.github.com/users/3goats",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-08T21:37:04Z
|
2021-09-08T23:07:55Z
|
2014-10-08T22:03:25Z
|
NONE
|
resolved
|
Hi, I have the following data structure which needs to be sent to a web service as form-urlencoded:
```
data = {'Properties': [{'key': 'KeyLength', 'value': '512'}], 'Category': 'keysets', 'Offset': '0', 'Limit': '100'}
```
I'm posting it like this:
```
req = requests.post('http://server1/ws1/api/data/filters/', data=data)
```
This is how it should look when received, after it's been encoded and decoded:
```
Properties[0][key]=KeyLength&Properties[0][value]=768&Category=keysets&Offset=0&Limit=100
```
However it seems to end up like this:
```
Category=keysets&Limit=100&Properties=[{'value': '512', 'key': 'KeyLength'}]&Offset=0
```
Here's how it looks in the Chrome developer tools, where it works fine from the browser.

Can somebody please advise what I'm doing wrong.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2270/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2270/timeline
| null |
completed
| null | null | false |
[
"You already asked this question on [StackOverflow](https://stackoverflow.com/questions/26266664/requests-form-urlencoded-data) which is the appropriate forum. This is a common problem and since I don't have the time to answer your question there now (and didn't look since you posted it 40 minutes ago), I'm sure someone else will link you to the relevant answer. Please be more patient with StackOverflow. Our community there is vibrant and will usually answer your question in less than a day.\n",
"OK - Thanks. I re-wrote the question on StackOverflow to be more descriptive and because I've sort of worked out what going wrong. However, still can't get it to work. I also tried 'werkzeug.MultiDict' but couldn't get that to work either. \n"
] |
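For what it's worth, one way to get the bracketed encoding shown in the question is to flatten the nested structure by hand before handing it to requests; this is a hedged sketch reusing the endpoint from the issue, not a general solution:

``` python
import requests

# Sketch: pre-flatten the nested Properties list into bracketed keys so the
# standard form-urlencoding produces Properties[0][key]=... style pairs.
data = {
    'Properties[0][key]': 'KeyLength',
    'Properties[0][value]': '512',
    'Category': 'keysets',
    'Offset': '0',
    'Limit': '100',
}
req = requests.post('http://server1/ws1/api/data/filters/', data=data)
```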
https://api.github.com/repos/psf/requests/issues/2269
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2269/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2269/comments
|
https://api.github.com/repos/psf/requests/issues/2269/events
|
https://github.com/psf/requests/issues/2269
| 45,062,943 |
MDU6SXNzdWU0NTA2Mjk0Mw==
| 2,269 |
Link to the changelog in the sidebar of the documentation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/772?v=4",
"events_url": "https://api.github.com/users/alex/events{/privacy}",
"followers_url": "https://api.github.com/users/alex/followers",
"following_url": "https://api.github.com/users/alex/following{/other_user}",
"gists_url": "https://api.github.com/users/alex/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alex",
"id": 772,
"login": "alex",
"node_id": "MDQ6VXNlcjc3Mg==",
"organizations_url": "https://api.github.com/users/alex/orgs",
"received_events_url": "https://api.github.com/users/alex/received_events",
"repos_url": "https://api.github.com/users/alex/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alex/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alex",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2014-10-07T03:13:48Z
|
2021-09-04T00:06:28Z
|
2014-10-12T14:55:34Z
|
MEMBER
|
resolved
|
Right now it's unfortunately a bit hard to find, "Community updates" is not necessarily where I would usually look for the changelog / release notes. Right now the way I find it is to search "CVE", because I know that word is on the page.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2269/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2269/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/2268
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2268/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2268/comments
|
https://api.github.com/repos/psf/requests/issues/2268/events
|
https://github.com/psf/requests/pull/2268
| 44,933,787 |
MDExOlB1bGxSZXF1ZXN0MjIyNjA4MzE=
| 2,268 |
Re-order params for backwards compat
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-10-05T23:56:27Z
|
2021-09-08T10:01:08Z
|
2014-10-06T09:40:03Z
|
CONTRIBUTOR
|
resolved
|
Address @kevinburke's concerns
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2268/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2268/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2268.diff",
"html_url": "https://github.com/psf/requests/pull/2268",
"merged_at": "2014-10-06T09:40:03Z",
"patch_url": "https://github.com/psf/requests/pull/2268.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2268"
}
| true |
[
"LGTM\n"
] |
https://api.github.com/repos/psf/requests/issues/2267
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2267/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2267/comments
|
https://api.github.com/repos/psf/requests/issues/2267/events
|
https://github.com/psf/requests/pull/2267
| 44,933,052 |
MDExOlB1bGxSZXF1ZXN0MjIyNjA1NTQ=
| 2,267 |
allow unicode URLs on Python 2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/151929?v=4",
"events_url": "https://api.github.com/users/minrk/events{/privacy}",
"followers_url": "https://api.github.com/users/minrk/followers",
"following_url": "https://api.github.com/users/minrk/following{/other_user}",
"gists_url": "https://api.github.com/users/minrk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/minrk",
"id": 151929,
"login": "minrk",
"node_id": "MDQ6VXNlcjE1MTkyOQ==",
"organizations_url": "https://api.github.com/users/minrk/orgs",
"received_events_url": "https://api.github.com/users/minrk/received_events",
"repos_url": "https://api.github.com/users/minrk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/minrk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/minrk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/minrk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-10-05T23:30:52Z
|
2021-09-08T10:01:07Z
|
2014-10-06T09:39:54Z
|
CONTRIBUTOR
|
resolved
|
on Python 2 `u'é'.decode('utf8')` fails with UnicodeEncodeError, but only AttributeError is caught.
This only calls decode on known bytes objects.
includes a test that succeeds on 2.4.1 and fails on 2.4.2
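A minimal sketch of that guard (not the actual diff from this pull request) might read:

``` python
# -*- coding: utf-8 -*-
# Hedged sketch, Python 2 semantics: only decode when the URL is really a
# bytes object, so unicode URLs are passed through untouched instead of
# triggering a UnicodeEncodeError on an implicit encode-then-decode.
def normalize_url(url):
    if isinstance(url, bytes):
        return url.decode('utf8')
    return url
```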
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2267/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2267/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2267.diff",
"html_url": "https://github.com/psf/requests/pull/2267",
"merged_at": "2014-10-06T09:39:54Z",
"patch_url": "https://github.com/psf/requests/pull/2267.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2267"
}
| true |
[
"cc @buttscicles \n",
"FWIW this works for me.\n"
] |
https://api.github.com/repos/psf/requests/issues/2266
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2266/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2266/comments
|
https://api.github.com/repos/psf/requests/issues/2266/events
|
https://github.com/psf/requests/issues/2266
| 44,922,005 |
MDU6SXNzdWU0NDkyMjAwNQ==
| 2,266 |
Move utility functions from requests.utils to requests-toolbelt
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 6 |
2014-10-05T17:08:58Z
|
2021-09-08T07:00:40Z
|
2017-07-30T14:06:03Z
|
CONTRIBUTOR
|
resolved
|
# Reasoning
requests is an _HTTP_ library, not an HTML library. Some of these utility functions are far more focused on HTML and a Response's body than on HTTP. With that said, they don't belong in requests proper. Instead they belong in a third-party library and currently the best place for that is requests-toolbelt.
# Path to removal
Step 1:
- [x] Add `DeprecationWarning`s to all of these functions in the very next possible release of requests (either 2.4.3 or 2.5.0) (See PR https://github.com/kennethreitz/requests/pull/2309)
Step 2:
Prior to releasing 3.0, remove the following functions and migrate them to [requests-toolbelt](/sigmavirus24/requests-toolbelt). Allow requests-toolbelt to release a version first and then release 3.0.
- [x] `get_encodings_from_content`
- [x] `get_unicode_from_response`
- [ ] more functions?
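Referring back to Step 1, a hypothetical sketch of what the deprecation shim could look like (the warning text and the simplified regex are assumptions; see PR #2309 for the actual change):
``` python
import re
import warnings

def get_encodings_from_content(content):
    """Return encodings declared in HTML meta tags (deprecated)."""
    warnings.warn(
        'get_encodings_from_content will be removed in requests 3.0; '
        'use the requests-toolbelt instead.',
        DeprecationWarning,
    )
    # Simplified stand-in for the real meta-tag scan.
    charset_re = re.compile(r'<meta.*?charset=["\']*(.+?)["\'>]', flags=re.I)
    return charset_re.findall(content)
```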
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2266/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2266/timeline
| null |
completed
| null | null | false |
[
"+1000000\n",
"Why `get_unicode_from_response`?\n",
"We already provide the unicode in the response as `r.text`.\n",
"Back in the day, we had a function that would convert any Request object into a curl command string.\n\n...lol\n",
"Yeah, people have recreated that time and again. That's not ending up in the toolbelt frankly. On the other hand, if someone wrote a function to convert it to an HTTPie command string... I might consider that ;)\n",
"I think we can close this."
] |
https://api.github.com/repos/psf/requests/issues/2265
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2265/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2265/comments
|
https://api.github.com/repos/psf/requests/issues/2265/events
|
https://github.com/psf/requests/pull/2265
| 44,903,523 |
MDExOlB1bGxSZXF1ZXN0MjIyNTEwMjI=
| 2,265 |
Remove timeout from __attrs__
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-10-05T04:04:57Z
|
2021-09-08T10:01:08Z
|
2014-10-05T08:05:29Z
|
CONTRIBUTOR
|
resolved
|
We no longer allow the user to set the timeout value on the Session, so this
attribute is extraneous.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2265/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2265/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2265.diff",
"html_url": "https://github.com/psf/requests/pull/2265",
"merged_at": "2014-10-05T08:05:29Z",
"patch_url": "https://github.com/psf/requests/pull/2265.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2265"
}
| true |
[
"LGTM. :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2264
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2264/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2264/comments
|
https://api.github.com/repos/psf/requests/issues/2264/events
|
https://github.com/psf/requests/issues/2264
| 44,885,750 |
MDU6SXNzdWU0NDg4NTc1MA==
| 2,264 |
ImportError: cannot import name urlencode
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1220668?v=4",
"events_url": "https://api.github.com/users/evanemolo/events{/privacy}",
"followers_url": "https://api.github.com/users/evanemolo/followers",
"following_url": "https://api.github.com/users/evanemolo/following{/other_user}",
"gists_url": "https://api.github.com/users/evanemolo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/evanemolo",
"id": 1220668,
"login": "evanemolo",
"node_id": "MDQ6VXNlcjEyMjA2Njg=",
"organizations_url": "https://api.github.com/users/evanemolo/orgs",
"received_events_url": "https://api.github.com/users/evanemolo/received_events",
"repos_url": "https://api.github.com/users/evanemolo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/evanemolo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evanemolo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/evanemolo",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-10-04T15:22:44Z
|
2021-09-08T23:07:57Z
|
2014-10-05T01:27:58Z
|
NONE
|
resolved
|
I am running Python 2.7.8. I get the following error from `import requests`:
```
Traceback (most recent call last):
File "request_get.py", line 1, in <module>
import requests
File "/usr/local/lib/python2.7/site-packages/requests/__init__.py", line 58, in <module>
from . import utils
File "/usr/local/lib/python2.7/site-packages/requests/utils.py", line 25, in <module>
from .compat import parse_http_list as _parse_list_header
File "/usr/local/lib/python2.7/site-packages/requests/compat.py", line 7, in <module>
from .packages import chardet
File "/usr/local/lib/python2.7/site-packages/requests/packages/__init__.py", line 3, in <module>
from . import urllib3
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/__init__.py", line 10, in <module>
from .connectionpool import (
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 37, in <module>
from .request import RequestMethods
File "/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/request.py", line 4, in <module>
from urllib import urlencode
ImportError: cannot import name urlencode
```
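As the comments below note, the usual cause of this traceback is a local file shadowing the standard library's `urllib` module. A quick way to check (run from the same directory as `request_get.py`):
``` python
# If __file__ points at a local urllib.py instead of the stdlib copy,
# rename the local file (and delete any stale urllib.pyc).
import urllib
print(urllib.__file__)
```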
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1220668?v=4",
"events_url": "https://api.github.com/users/evanemolo/events{/privacy}",
"followers_url": "https://api.github.com/users/evanemolo/followers",
"following_url": "https://api.github.com/users/evanemolo/following{/other_user}",
"gists_url": "https://api.github.com/users/evanemolo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/evanemolo",
"id": 1220668,
"login": "evanemolo",
"node_id": "MDQ6VXNlcjEyMjA2Njg=",
"organizations_url": "https://api.github.com/users/evanemolo/orgs",
"received_events_url": "https://api.github.com/users/evanemolo/received_events",
"repos_url": "https://api.github.com/users/evanemolo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/evanemolo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/evanemolo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/evanemolo",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2264/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2264/timeline
| null |
completed
| null | null | false |
[
"``` pycon\n$ python\nPython 2.7.8 (default, Aug 24 2014, 21:26:19)\n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests\n>>> from urllib import urlencode\n>>>\n```\n\nCan you give us more information?\n- How did you install requests?\n- What system are you running on?\n- What happens when you do `python -c 'from urllib import urlencode; print(\"Imported\")'`?\n- Do you have a file in the same directory as `request_get.py` called `urllib.py`?\n",
"@sigmavirus24 Thanks for giving me pointers for troubleshooting. The file I was originally using was haphazardly called 'urllib.py'. Once I ran Requests from a different directory it worked fine.\n",
"@evanemolo it was a lucky guess. :)\n",
"The error message for this is really bad. I wish they'd make it more clear\nthat you're shadowing a module name.\n\nOn Saturday, October 4, 2014, Ian Cordasco [email protected] wrote:\n\n> @evanemolo https://github.com/evanemolo it was a lucky guess. :)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2264#issuecomment-57923709\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n",
"@kevinburke it's doubtful that this will be updated in 2.7 and in 3.x it's a non-issue. I tried adding `from __future__ import absolute_import` and that doesn't fix it either. There's not much we can do here.\n"
] |
https://api.github.com/repos/psf/requests/issues/2263
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2263/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2263/comments
|
https://api.github.com/repos/psf/requests/issues/2263/events
|
https://github.com/psf/requests/pull/2263
| 44,834,329 |
MDExOlB1bGxSZXF1ZXN0MjIyMjIwODA=
| 2,263 |
Fixed #2205
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3482660?v=4",
"events_url": "https://api.github.com/users/daftshady/events{/privacy}",
"followers_url": "https://api.github.com/users/daftshady/followers",
"following_url": "https://api.github.com/users/daftshady/following{/other_user}",
"gists_url": "https://api.github.com/users/daftshady/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/daftshady",
"id": 3482660,
"login": "daftshady",
"node_id": "MDQ6VXNlcjM0ODI2NjA=",
"organizations_url": "https://api.github.com/users/daftshady/orgs",
"received_events_url": "https://api.github.com/users/daftshady/received_events",
"repos_url": "https://api.github.com/users/daftshady/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/daftshady/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/daftshady/subscriptions",
"type": "User",
"url": "https://api.github.com/users/daftshady",
"user_view_type": "public"
}
|
[
{
"color": "e11d21",
"default": false,
"description": null,
"id": 78002701,
"name": "Do Not Merge",
"node_id": "MDU6TGFiZWw3ODAwMjcwMQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 6 |
2014-10-03T18:48:43Z
|
2021-09-08T09:01:16Z
|
2014-11-12T17:36:20Z
|
CONTRIBUTOR
|
resolved
|
Hello.
I made some fixes to `get_unicode_from_response`, fixing #2205.
It now tries every encoding found in the content's meta tags by calling `get_encodings_from_content`.
Because charset names are case-insensitive per RFC 2616, each charset is appended to `tried_encodings` in lower case.
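A rough sketch of that bookkeeping (a set and `errors='replace'` are assumptions for illustration, not the actual patch):
``` python
tried_encodings = set()
for encoding in get_encodings_from_content(response.content):
    enc = encoding.lower()            # charset names compare case-insensitively
    if enc in tried_encodings:
        continue                      # skip encodings we have already attempted
    tried_encodings.add(enc)
    try:
        return str(response.content, enc, errors='replace')
    except (LookupError, TypeError):
        pass
```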
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2263/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2263/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2263.diff",
"html_url": "https://github.com/psf/requests/pull/2263",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2263.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2263"
}
| true |
[
"This looks great @daftshady, thanks! :cake: I'll let @sigmavirus24 review, but I'm :+1:.\n",
"I have one concern. Once that's addressed I'll be :+1:.\n",
"@sigmavirus24 I agree with you. It's fairly reasonable to use a set here. I added another commit.\n",
"Looks great now! Thanks @daftshady!\n",
"This functionality was actually intentionally removed from Requests, and the documentation was accidentally not updated appropriately. \n\nLet's fully remove that function in v3.0.0.\n",
"#2266\n"
] |
https://api.github.com/repos/psf/requests/issues/2262
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2262/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2262/comments
|
https://api.github.com/repos/psf/requests/issues/2262/events
|
https://github.com/psf/requests/issues/2262
| 44,750,686 |
MDU6SXNzdWU0NDc1MDY4Ng==
| 2,262 |
Proper encoding for application/json Content-Type
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/827573?v=4",
"events_url": "https://api.github.com/users/noseworthy/events{/privacy}",
"followers_url": "https://api.github.com/users/noseworthy/followers",
"following_url": "https://api.github.com/users/noseworthy/following{/other_user}",
"gists_url": "https://api.github.com/users/noseworthy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/noseworthy",
"id": 827573,
"login": "noseworthy",
"node_id": "MDQ6VXNlcjgyNzU3Mw==",
"organizations_url": "https://api.github.com/users/noseworthy/orgs",
"received_events_url": "https://api.github.com/users/noseworthy/received_events",
"repos_url": "https://api.github.com/users/noseworthy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/noseworthy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noseworthy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/noseworthy",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-10-02T23:19:13Z
|
2021-09-08T23:07:58Z
|
2014-10-03T14:10:38Z
|
NONE
|
resolved
|
Hello,
I've looked through open and closed issues but haven't seen this raised before. Sorry if I'm bringing up something that's already been addressed. That said, here we go:
I'm running a flask-restful app and returning JSON responses to requests. Flask-restful helpfully encodes the response body as UTF-8 and sets the Content-Type to "application/json". When I make a request to my server using requests, requests can't properly detect the encoding, and it looks like this is because I haven't set the charset on the Content-Type header. However, according to [RFC 4627](http://www.ietf.org/rfc/rfc4627.txt) the encoding for application/json responses is either UTF-8, UTF-16, or UTF-32. The default is UTF-8, but you can detect which one of the three encodings is being used by looking at the first four bytes as mentioned in [RFC 4627](http://www.ietf.org/rfc/rfc4627.txt).
When requests receives a response with a Content-Type header of application/json shouldn't it try to detect the encoding as UTF-8, UTF-16, or UTF-32 and set the encoding on the response object accordingly? This would help greatly with using `response.text`. Right now I'm setting the encoding on the response object manually.
Just to note: `response.json()` works perfectly so I realize this is a bit of a corner case since users will probably just deserialize the response anyway.
Thanks for your time, and all your awesome work. I absolutely love requests.
Mike
Possibly Related Issues: #1665 #1467
Reference to @mitsuhiko's opinion on the matter: https://github.com/mitsuhiko/flask/issues/454#issuecomment-4578200
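For reference, the four-byte check from RFC 4627 section 3 is roughly this (a standalone sketch, not requests code; requests applies a similar guess internally when you call `response.json()`, which is presumably why that already works):
``` python
def sniff_json_encoding(data):
    # RFC 4627, section 3: since the first two characters of a JSON text are
    # ASCII, the NUL-byte pattern of the first four octets reveals the UTF
    # flavour; UTF-8 is the default.
    sample = data[:4]
    if sample[:3] == b'\x00\x00\x00':
        return 'utf-32-be'
    if sample[::2] == b'\x00\x00':
        return 'utf-16-be'
    if sample[1:] == b'\x00\x00\x00':
        return 'utf-32-le'
    if sample[1::2] == b'\x00\x00':
        return 'utf-16-le'
    return 'utf-8'
```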
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/827573?v=4",
"events_url": "https://api.github.com/users/noseworthy/events{/privacy}",
"followers_url": "https://api.github.com/users/noseworthy/followers",
"following_url": "https://api.github.com/users/noseworthy/following{/other_user}",
"gists_url": "https://api.github.com/users/noseworthy/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/noseworthy",
"id": 827573,
"login": "noseworthy",
"node_id": "MDQ6VXNlcjgyNzU3Mw==",
"organizations_url": "https://api.github.com/users/noseworthy/orgs",
"received_events_url": "https://api.github.com/users/noseworthy/received_events",
"repos_url": "https://api.github.com/users/noseworthy/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/noseworthy/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/noseworthy/subscriptions",
"type": "User",
"url": "https://api.github.com/users/noseworthy",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2262/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2262/timeline
| null |
completed
| null | null | false |
[
"> requests can't properly detect the encoding\n\nWhat does `response.apparent_encoding` return? This sounds l like a bug in chardet.\n\n> The default is UTF-8, but you can detect which one of the three encodings is being used by looking at the first four bytes\n\nrequests _could_ do this, but won't because this is delegated to chardet.\n\n> When requests receives a response with a Content-Type header of application/json shouldn't it try to detect the encoding as UTF-8, UTF-16, or UTF-32 and set the encoding on the response object accordingly?\n\nWe are an _HTTP_ library, not JSON RPC, nor are we a library that specifically conforms to RFC 4627. It's easy to see how having `r.json()` could make that less obvious _but_ it's a convenience method, like `r.encoding` and `r.apparent_encoding` are there as conveniences.\n\nIf you look at RFCs 7230-7235, you'll note that we should only be using what's provided by the server in the headers for `r.encoding`. The fact that 4627 doesn't allow for the server to set an explicit charset is a flaw in the design in my opinion. Users of JSON transmitted over HTTP shouldn't have to inspect the first 4 bytes if there are (and have been) other mechanisms for determining the encoding.\n\nDoes this explain it well enough?\n",
"Thanks for your fast response.\n\n`response.apparent_encoding` returns`'ISO-8859-2'`\n\n``` python\nIn [1]: import requests\n\nIn [3]: resp = requests.get(my_api_endpoint)\n\nIn [4]: resp.headers\nOut[4]: CaseInsensitiveDict({'content-length': '213', 'server': 'Werkzeug/0.9.4 Python/2.7.8', 'date': 'Fri, 03 Oct 2014 05:01:51 GMT', 'x-server-time': '0.003', 'content-type': 'application/json'})\n\nIn [5]: resp.encoding\n\nIn [6]: resp.apparent_encoding\nOut[6]: 'ISO-8859-2'\n```\n\nchardet doesn't have a problem with the utf-16 or utf-32 encoding however (probably due to the byte order marks):\n\n``` python\nIn [7]: from chardet import detect\n\nIn [8]: detect(resp.content.decode('utf-8').encode('utf-16'))\nOut[8]: {'confidence': 1.0, 'encoding': 'UTF-16LE'}\n\nIn [9]: detect(resp.content.decode('utf-8').encode('utf-32'))\nOut[9]: {'confidence': 1.0, 'encoding': 'UTF-32LE'}\n```\n\nI just noticed as well that RFC 4627 states that the encoding for application/json will be UTF-8 and if UTF-16 or UTF-32 are being used the Content-Transfer-Encoding header must be set to binary.\n\n> Encoding considerations: 8bit if UTF-8; binary if UTF-16 or UTF-32\n> \n> JSON may be represented using UTF-8, UTF-16, or UTF-32. When JSON is written in UTF-8, JSON is 8bit compatible. When JSON is written in UTF-16 or UTF-32, the binary content-transfer-encoding must be used.\n\nMaybe requests could check for Content-Type == application/json and set `response.encoding = 'utf-8'` iff the Content-Transfer-Encoding header isn't set to binary? `response.encoding` could be left as None in the case of the binary Content-Transfer-Encoding and chardet should be easily able to differentiate between UTF-16 and UTF-32?\n\n> ... nor are we a library that specifically conforms to RFC 4627\n\nI understand. Your documentation says:\n\n> Requests is intended to be compliant with all relevant specifications and RFCs where that compliance will not cause difficulties for users\n\nSo I guess RFC 4627 is deemed irrelevant? The documentation also says this regarding `response.text`:\n\n> The encoding of the response content is determined based solely on HTTP headers, following RFC 2616 to the letter. If you can take advantage of non-HTTP knowledge to make a better guess at the encoding, you should set r.encoding appropriately before accessing this property.\n\nSo with all that said, I completely understand your position. It does feel a little gross to have requests inspect response content, and chardet should be handling this. And yeah, it's a major bummer that RFC 4627 doesn't allow for charset declarations. It would certainly make things a lot easier. Clients shouldn't have to try to interpret encodings. A better spec might say that UTF-8 is to be assumed unless UTF-16 or UTF-32 is specified in a charset parameter.\n\nIt'd be nice to have the convenience property `response.encoding` handle the most common case of UTF-8, but I can understand your apprehension. Thanks again for reviewing this.\n",
"> Just to note: response.json() works perfectly so I realize this is a bit of a corner case since users will probably just deserialize the response anyway.\n\nThis is the relevant bit of your message. We have conformed to the relevant RFCs where you provide us out-of-band information on what you are expecting. Basically, when you call `response.json()` you are saying to us \"yes, this is definitely JSON, please apply JSON-specific logic here\". When you _don't_ call that, we aren't going to apply JSON-specific logic, we're going to restrict ourselves to HTTP.\n",
"This should be fixed in chardet now (I just remembered why this seemed familiar, it's because of https://github.com/chardet/chardet/issues/30). That said, this really is a chardet issue, as we've already said and as @Lukasa said, we do the right thing when you tell us you want JSON from the body. Other than that, we have to assume it falls under the domain of HTTP and that explicitly requires a `charset` parameter in `Content-Type`. We have no intention of every doing anything more than using that or trying to guess the encoding with chardet. We already do not introspect the body automatically (although we provide that as a convenience function) for meta tags or anything else that would hint the character encoding and we provide you with the ability to tell us what encoding to use.\n\nThere's enough degrees of freedom to do what you want to do. Further, if you're only ever using a session to talk to your API, you could (if you desired) do something like:\n\n``` python\ndef force_utf8(response, *args):\n response.encoding = 'utf8'\n return None\n\ns.hooks['response'].append(force_utf8)\nr = s.get('http://10.0.0.1:5000/my/endpoint')\nassert r.encoding == 'utf8'\n```\n\nIf you're not using a session, you can do:\n\n``` python\nrequests.get(my_api_endpoint, hooks={'response': [force_utf8]})\n```\n",
"Good to know it's fixed in chardet, thanks @sigmavirus24. Yeah, I have ways of detecting the encoding in my client code. I'll close this now.\n\nThanks for your time, guys.\n",
"@Opethil Our pleasure, thanks for taking the time to talk this through with us. =) Users like you are a pleasure to work with, keep it up!\n"
] |
https://api.github.com/repos/psf/requests/issues/2261
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2261/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2261/comments
|
https://api.github.com/repos/psf/requests/issues/2261/events
|
https://github.com/psf/requests/issues/2261
| 44,729,047 |
MDU6SXNzdWU0NDcyOTA0Nw==
| 2,261 |
There is no straightforward way to enable debug or show the raw http request + response
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/94108?v=4",
"events_url": "https://api.github.com/users/dequis/events{/privacy}",
"followers_url": "https://api.github.com/users/dequis/followers",
"following_url": "https://api.github.com/users/dequis/following{/other_user}",
"gists_url": "https://api.github.com/users/dequis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dequis",
"id": 94108,
"login": "dequis",
"node_id": "MDQ6VXNlcjk0MTA4",
"organizations_url": "https://api.github.com/users/dequis/orgs",
"received_events_url": "https://api.github.com/users/dequis/received_events",
"repos_url": "https://api.github.com/users/dequis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dequis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dequis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dequis",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-10-02T19:30:54Z
|
2021-09-08T23:07:59Z
|
2014-10-02T19:40:26Z
|
NONE
|
resolved
|
There's this stack overflow question: http://stackoverflow.com/q/10588644/2195033
The question is about something that should be simple: print the whole HTTP request.
Often it's much easier to debug something by seeing it exactly as it was sent over the wire (figuratively; obviously I'd prefer the unencrypted form in the case of https).
I've visited this page way too many times. Unsurprisingly, it's always the same two answers:
- "Copypaste these 20 lines into your project to enable debugging"
- "Inspect these 20 attributes one by one, or assemble the whole thing yourself"
In the comments of the second one, someone asks 'Which of these gives me "the entire request, headers included"?'. No replies to that other than the assumption that reassembling the request manually is okay. The first one kinda does the job, but it feels like the times when I copy-pasted ActiveState recipes to be able to do multipart file uploads.
Does it really have to be this way? Why not provide built-in helpers for these things?
```
requests.enable_basic_debugging_globally()
```
Or:
```
print(req.raw_request_string_as_it_is_going_to_get_sent_over_the_wire)
print(resp.raw_response_string_with_headers_and_status_line)
print(resp.raw_response_string_with_headers_and_status_line_and_also_the_request_before_it)
```
Ok, I'm not great at api design. You get the idea.
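For completeness, the "20 lines" answer referred to above boils down to the standard httplib/logging toggle; a sketch (the logger name matches requests 2.x, where urllib3 is vendored):
``` python
import logging

try:
    import httplib                       # Python 2
except ImportError:
    import http.client as httplib        # Python 3

httplib.HTTPConnection.debuglevel = 1    # httplib prints request/response lines

logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
urllib3_log = logging.getLogger('requests.packages.urllib3')
urllib3_log.setLevel(logging.DEBUG)
urllib3_log.propagate = True
```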
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2261/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2261/timeline
| null |
completed
| null | null | false |
[
"Thanks for this!\n\nThis is a not entirely uncommon request. Unfortunately, I sincerely doubt it's going to happen. I've tackled a variant of this request before [here](https://github.com/kennethreitz/requests/pull/1802#issuecomment-30644444). The most relevant part is this:\n\n> This gets even worse if you consider extending this to serialize HTTP requests (which is almost certainly the next step here, being infinitely more useful in debugging weird failures), since Requests cannot possibly reconstruct the HTTP request: we just don't have all the information.\n\nWhat's important to note is that requests is a long, _long_ way from what actually gets written in the wire. Requests builds a set of information which it passes to `urllib3`, which mutates it and passes it to `httplib`, which mutates it and passes it to the socket itself. To debug appropriately we'd need to be able to intercept that final socket write. We'd also need to be able to correlate socket writes with their appropriate request, which is hard to do because `httplib` has a non-trivial state machine in place. This represents reaching far, far into `httplib` and very far down the stack.\n\nThe problem is, all of that work is still less useful than cracking out tcpdump or Wireshark and using that to perform a packet capture. In addition to showing you the actual bytes sent and received from the perspective of the NIC driver, it can show you packet flows. This can help debug some extremely awkward problems that occasionally occur in network programming.\n\nTo sum up, I don't think there's no value in this, but I also don't think it's the right way to debug this problem. If and when we rewrite `httplib`, we can consider whether the new library can do this logging (I believe it can). Until that stage, the work required is substantial and there's an easier and better way to do it, so I doubt we'll do it.\n",
"Here I was writing a nice response and @Lukasa said it faster and better... again. ;)\n\nThanks for the suggestion @dequis and taking the time to file the bug. That said, you said one thing that @Lukasa didn't address and which bothered me. \n\nWe should take the following conversation off this bug because it's entirely tangential. _Why are you copying recipes from ActiveState to do `multipart/form-data` file uploads?_ Requests provides that for you and the requests-toolbelt ensures you don't need to think about it in the case where your file is too big to be loaded into memory all at once. Feel free to email me and reply to this. \n",
"> Why are you copying recipes from ActiveState to do multipart/form-data file uploads?\n\nHahah, sorry if that was unclear, I don't do that anymore. Not with requests. I was referring to the past, when urllib2 was the best thing I had.\n\nRegarding the rest... welp.\n",
"Easy, just bundle wireshark as part of requests\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Oct 2, 2014 at 12:45 PM, dx [email protected] wrote:\n\n> Why are you copying recipes from ActiveState to do multipart/form-data\n> file uploads?\n> \n> Hahah, sorry if that was unclear, I don't do that anymore. Not with\n> requests. I was referring to the past, when urllib2 was the best thing I\n> had.\n> \n> Regarding the rest... welp.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2261#issuecomment-57694165\n> .\n",
"May want to take a look at https://gist.github.com/kevinburke/7990326 which\ndoes a lot of this and pretty prints it etc\n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n\nOn Thu, Oct 2, 2014 at 1:04 PM, Kevin Burke [email protected] wrote:\n\n> Easy, just bundle wireshark as part of requests\n> \n> ## \n> \n> Kevin Burke\n> phone: 925.271.7005 | twentymilliseconds.com\n> \n> On Thu, Oct 2, 2014 at 12:45 PM, dx [email protected] wrote:\n> \n> > Why are you copying recipes from ActiveState to do multipart/form-data\n> > file uploads?\n> > \n> > Hahah, sorry if that was unclear, I don't do that anymore. Not with\n> > requests. I was referring to the past, when urllib2 was the best thing I\n> > had.\n> > \n> > Regarding the rest... welp.\n> > \n> > —\n> > Reply to this email directly or view it on GitHub\n> > https://github.com/kennethreitz/requests/issues/2261#issuecomment-57694165\n> > .\n"
] |
https://api.github.com/repos/psf/requests/issues/2260
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2260/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2260/comments
|
https://api.github.com/repos/psf/requests/issues/2260/events
|
https://github.com/psf/requests/issues/2260
| 44,663,731 |
MDU6SXNzdWU0NDY2MzczMQ==
| 2,260 |
Ignore proxy or always use 80 port
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1228001?v=4",
"events_url": "https://api.github.com/users/obelyakov/events{/privacy}",
"followers_url": "https://api.github.com/users/obelyakov/followers",
"following_url": "https://api.github.com/users/obelyakov/following{/other_user}",
"gists_url": "https://api.github.com/users/obelyakov/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/obelyakov",
"id": 1228001,
"login": "obelyakov",
"node_id": "MDQ6VXNlcjEyMjgwMDE=",
"organizations_url": "https://api.github.com/users/obelyakov/orgs",
"received_events_url": "https://api.github.com/users/obelyakov/received_events",
"repos_url": "https://api.github.com/users/obelyakov/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/obelyakov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/obelyakov/subscriptions",
"type": "User",
"url": "https://api.github.com/users/obelyakov",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-10-02T08:53:21Z
|
2021-09-08T23:07:58Z
|
2014-10-02T12:49:05Z
|
NONE
|
resolved
|
Hello!
I use the library to make GET requests through a proxy on port 8080.
Example:
r=requests.get('http://www.rambler.ru/', proxies={'http:':'http://login:[email protected]:8080', 'https:':'https://login:[email protected]:8080'}, timeout=5)
Outgoing connections to port 80 are closed on this server, and the request freezes there, but it works on a server where port 80 is open.
I can also see the firewall blocking a connection on port 80.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2260/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2260/timeline
| null |
completed
| null | null | false |
[
"The proxies parameter should be:\n\n``` python\nproxies = { 'http': 'http://login:[email protected]:8080', 'https': 'https://login:[email protected]:8080' }\n```\n\nYou need to move the colons out of the first and third fields.\n",
"@Hasimir is correct. Please consult the documentation before raising bugs. =)\n",
"Sorry)\n"
] |
https://api.github.com/repos/psf/requests/issues/2259
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2259/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2259/comments
|
https://api.github.com/repos/psf/requests/issues/2259/events
|
https://github.com/psf/requests/issues/2259
| 44,534,684 |
MDU6SXNzdWU0NDUzNDY4NA==
| 2,259 |
HTTP Authentication Any
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1127463?v=4",
"events_url": "https://api.github.com/users/tarzanjw/events{/privacy}",
"followers_url": "https://api.github.com/users/tarzanjw/followers",
"following_url": "https://api.github.com/users/tarzanjw/following{/other_user}",
"gists_url": "https://api.github.com/users/tarzanjw/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tarzanjw",
"id": 1127463,
"login": "tarzanjw",
"node_id": "MDQ6VXNlcjExMjc0NjM=",
"organizations_url": "https://api.github.com/users/tarzanjw/orgs",
"received_events_url": "https://api.github.com/users/tarzanjw/received_events",
"repos_url": "https://api.github.com/users/tarzanjw/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tarzanjw/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tarzanjw/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tarzanjw",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2014-10-01T07:15:19Z
|
2021-09-08T23:08:00Z
|
2014-10-01T07:51:46Z
|
NONE
|
resolved
|
Should `requests` support the "ANY" authentication method for HTTP authentication?
PHP's curl supports this method: http://php.net/manual/en/function.curl-setopt.php (search for CURLOPT_HTTPAUTH).
This method supports HTTP authentication for both Basic and Digest: it sends the Basic header first, and tries Digest if the server asks for Digest.
Because requests is for humans, the client normally should not have to care about Basic or Digest.
What do you think?
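A usage sketch of the toolbelt's GuessAuth, which the first reply below points to (the import path may vary between toolbelt releases; endpoint is illustrative):
``` python
import requests
# Older toolbelt releases expose this as: from requests_toolbelt import GuessAuth
from requests_toolbelt.auth.guess import GuessAuth

auth = GuessAuth('user', 'passwd')
r = requests.get('http://httpbin.org/digest-auth/auth/user/passwd', auth=auth)
print(r.status_code)
```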
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2259/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2259/timeline
| null |
completed
| null | null | false |
[
"It's not clear to me that it's worthwhile adding this to requests proper. However, the [toolbelt project has exactly what you're looking for](http://toolbelt.readthedocs.org/en/latest/user.html#guessauth).\n",
"Thank you, this is exactly what I am looking for.\n\nP/S: In toolbelt's documentation, I think there is a typo:\n\n> This requires an additional request in case of **basic auth**, as usually basic auth is sent preemptively.\n\nThis should be _Digest auth_.\n\nThanks,\n",
"Nope, that's not a typo, it's just a little unclear. It means _when compared to requests default basic auth_, it requires an additional request. This is because normally requests doesn't wait for a 401 for basic auth, it just emits the header straight away. GuessAuth doesn't do that.\n",
"Oh, that way will make 2 requests for both of Basic and Digest auth?\n\nIf it's true, I think I should create an issue for GuessAuth.\n\nDo you know is there a special reason that GuessAuth has to do that?\n",
"@tarzanjw The short answer is because it doesn't know which header to send until it gets the 401 challenge. It's not _necessary_ that it do this, it could send Basic by default.\n",
"It _could_ but it won't (wearing my hat as toolbelt maintainer). Also, this isn't the place to discuss this.\n",
"Thanks for you reply,\n\nIf I use GuessAuth, it will sent the WWW-Authorization header at 2nd request (the one after 401 response), right?\n\nIf the server support Basic authentication, and the GuessAuth send the WWW-Authorization header at the first request, it will take us just 1 request only.\n\nDo you think that 1 is better than 2 ?\n",
"It _might_ save us one round trip, but it might also cause the server to return a 403 Forbidden. That would be bad. \n",
"It's okay, thanks for you reply.\n"
] |
https://api.github.com/repos/psf/requests/issues/2258
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2258/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2258/comments
|
https://api.github.com/repos/psf/requests/issues/2258/events
|
https://github.com/psf/requests/pull/2258
| 44,464,628 |
MDExOlB1bGxSZXF1ZXN0MjIwMTQ4MTY=
| 2,258 |
Add json parameter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
] | null | 6 |
2014-09-30T15:59:29Z
|
2021-09-07T00:06:40Z
|
2014-10-05T16:46:09Z
|
CONTRIBUTOR
|
resolved
|
Closes #2025
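Once merged, usage looks roughly like this (endpoint is illustrative); the json= keyword serialises the object and sets the Content-Type header, much like passing data=json.dumps(payload) plus the header yourself:
``` python
import requests

payload = {'key': 'value'}
r = requests.post('https://httpbin.org/post', json=payload)
print(r.request.headers['Content-Type'])   # application/json
```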
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2258/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2258/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2258.diff",
"html_url": "https://github.com/psf/requests/pull/2258",
"merged_at": "2014-10-05T16:46:09Z",
"patch_url": "https://github.com/psf/requests/pull/2258.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2258"
}
| true |
[
"@Lukasa @kevinburke care to take another look at this?\n",
"LGTM\n",
"omg I'm so excited \n\n:heart: :sparkles: :cake: :rose: :cake: :sparkles: :heart:\n",
"@willingc thank you so much for your hard work on this. :cake: (Those emoji from @kennethreitz are for you ;))\n",
"Thank you @sigmavirus24 for your support and mentoring :guitar: :sunrise: :evergreen_tree: , @lukasa for the review :cat: :cat2: :smile_cat: , and @kennethreitz for the community atmosphere :camera: :v: :guitar:\n",
"@willingc thank YOU, and you're welcome :)\n"
] |
https://api.github.com/repos/psf/requests/issues/2257
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2257/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2257/comments
|
https://api.github.com/repos/psf/requests/issues/2257/events
|
https://github.com/psf/requests/pull/2257
| 44,463,621 |
MDExOlB1bGxSZXF1ZXN0MjIwMTQyMTY=
| 2,257 |
Add json upload
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1894766?v=4",
"events_url": "https://api.github.com/users/akhomchenko/events{/privacy}",
"followers_url": "https://api.github.com/users/akhomchenko/followers",
"following_url": "https://api.github.com/users/akhomchenko/following{/other_user}",
"gists_url": "https://api.github.com/users/akhomchenko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akhomchenko",
"id": 1894766,
"login": "akhomchenko",
"node_id": "MDQ6VXNlcjE4OTQ3NjY=",
"organizations_url": "https://api.github.com/users/akhomchenko/orgs",
"received_events_url": "https://api.github.com/users/akhomchenko/received_events",
"repos_url": "https://api.github.com/users/akhomchenko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akhomchenko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akhomchenko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akhomchenko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2014-09-30T15:52:07Z
|
2021-09-08T10:01:11Z
|
2014-09-30T16:08:07Z
|
CONTRIBUTOR
|
resolved
|
Closes https://github.com/kennethreitz/requests/issues/2025
TODO:
- [ ] check `TODO`s
- [ ] more tests?
- [ ] `application/json` and files upload
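A short usage sketch of the feature being added here (the httpbin endpoint is just a convenient echo service, not part of this PR): the `json=` parameter serializes the dict and sets the `Content-Type` header automatically.
``` python
import requests

# httpbin echoes the request body back under the 'json' key.
r = requests.post('https://httpbin.org/post', json={'key': 'value'})
print(r.json()['json'])  # {'key': 'value'}
```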
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1894766?v=4",
"events_url": "https://api.github.com/users/akhomchenko/events{/privacy}",
"followers_url": "https://api.github.com/users/akhomchenko/followers",
"following_url": "https://api.github.com/users/akhomchenko/following{/other_user}",
"gists_url": "https://api.github.com/users/akhomchenko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/akhomchenko",
"id": 1894766,
"login": "akhomchenko",
"node_id": "MDQ6VXNlcjE4OTQ3NjY=",
"organizations_url": "https://api.github.com/users/akhomchenko/orgs",
"received_events_url": "https://api.github.com/users/akhomchenko/received_events",
"repos_url": "https://api.github.com/users/akhomchenko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/akhomchenko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/akhomchenko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/akhomchenko",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2257/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2257/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2257.diff",
"html_url": "https://github.com/psf/requests/pull/2257",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2257.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2257"
}
| true |
[
"@gagoman someone had been working on this for a while and stalled. I've opened a PR on their behalf. Did you read the entirety of the issue?\n",
"Yes, I had read an issue and there were no progress since June, so I've decided to implement it.\n",
"@sigmavirus24 reopen if it is still required.\n",
"@gagoman the discussion said the work would be slow but that two people were owning the work. The proper etiquette would have been to ask if they were still working in it. Thanks for your contribution, but we'll be continuing forward with their work.\n",
"@sigmavirus24 got it.\n"
] |
https://api.github.com/repos/psf/requests/issues/2256
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2256/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2256/comments
|
https://api.github.com/repos/psf/requests/issues/2256/events
|
https://github.com/psf/requests/issues/2256
| 44,423,818 |
MDU6SXNzdWU0NDQyMzgxOA==
| 2,256 |
connectionpool:Setting: read timeout to None & "'Task got bad yield: <Response [200]>'"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3073209?v=4",
"events_url": "https://api.github.com/users/marcoippolito/events{/privacy}",
"followers_url": "https://api.github.com/users/marcoippolito/followers",
"following_url": "https://api.github.com/users/marcoippolito/following{/other_user}",
"gists_url": "https://api.github.com/users/marcoippolito/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/marcoippolito",
"id": 3073209,
"login": "marcoippolito",
"node_id": "MDQ6VXNlcjMwNzMyMDk=",
"organizations_url": "https://api.github.com/users/marcoippolito/orgs",
"received_events_url": "https://api.github.com/users/marcoippolito/received_events",
"repos_url": "https://api.github.com/users/marcoippolito/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/marcoippolito/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/marcoippolito/subscriptions",
"type": "User",
"url": "https://api.github.com/users/marcoippolito",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-09-30T10:45:18Z
|
2021-09-08T23:08:00Z
|
2014-09-30T15:50:30Z
|
NONE
|
resolved
|
Hi,
I'm porting the already-working Python 3.4 code
(https://github.com/KeepSafe/aiohttp/blob/master/examples/crawl.py ) to Python 2.7.6,
using trollius, urllib and requests.
But the output I get is:
time python crawl_requests.py http://www.ilsole24ore.com/english-version/front-page.shtml
DEBUG:trollius:Using selector: EpollSelector
('url to do = ', 'http://www.ilsole24ore.com/english-version/front-page.shtml')
('processing:', 'http://www.ilsole24ore.com/english-version/front-page.shtml')
INFO:urllib3.connectionpool:Starting new HTTP connection (1): www.ilsole24ore.com
DEBUG:urllib3.connectionpool:Setting read timeout to None
DEBUG:urllib3.connectionpool:"GET /english-version/front-page.shtml HTTP/1.1" 200 13699
('...', 'http://www.ilsole24ore.com/english-version/front-page.shtml', 'has error', "'Task got bad yield: <Response [200]>'")
('done:', 1, '; ok:', 0)
Here is my code:
http://ipaste.org/fLj
Could you please give me some hints on solving this issue?
Looking forward to your helpful hints and suggestions.
Kind regards.
Marco
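A common workaround on Python 3's asyncio, assuming the goal is simply to keep the event loop responsive while requests blocks (this is an illustration, not something from the report, and it is not directly applicable to the trollius/Python 2 setup described above), is to push the blocking call onto a thread pool:
``` python
import asyncio
import requests

async def fetch(url):
    # Run the blocking requests call in the default thread pool so the
    # event loop keeps running while we wait on the network.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, requests.get, url)

async def main():
    r = await fetch('http://www.ilsole24ore.com/english-version/front-page.shtml')
    print(r.status_code)

asyncio.run(main())
```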
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2256/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2256/timeline
| null |
completed
| null | null | false |
[
"This is not a requests bug. =) We do not support asyncio.\n\nThe fact that you attempted this suggests you don't have a full understanding of how asyncio functions, which is understandable because asyncio is a nightmare of complexity. The short statement is that asyncio uses an event loop, combined with futures and coroutines (in the form of generators) to perform explicit concurrency.\n\nNote that it performs _explicit_ concurrency. You cannot drop a random library into an `asyncio` project and expect it to function correctly. The project needs to have been written from the ground up to understand `asyncio`, or at least to have been wrapped in `asyncio` goodness. `aiohttp` has been written in exactly that manner: requests has not. This means that requests never yields for other coroutines to run, and so they never will. This totally eliminates the advantage of `asyncio`.\n\nThere is no plan currently to rewrite requests to use `asyncio`: support is simply not there yet.\n\nIf you would like to use something that looks like requests' API, you have one obvious option.:[async-requests](https://github.com/inglesp/async-requests), which has been written for `asyncio`. I don't know, however, whether it is Python 3.4 only or not.\n\nUltimately, this is not a requests bug. =)\n",
"Hi Cory,\nthank you very much for your kind explanation and help.\n\nActually I'm struggling in finding the solution of this problem.\nKind regards.\nMarco\n\n2014-09-30 17:50 GMT+02:00 Cory Benfield [email protected]:\n\n> This is not a requests bug. =) We do not support asyncio.\n> \n> The fact that you attempted this suggests you don't have a full\n> understanding of how asyncio functions, which is understandable because\n> asyncio is a nightmare of complexity. The short statement is that asyncio\n> uses an event loop, combined with futures and coroutines (in the form of\n> generators) to perform explicit concurrency.\n> \n> Note that it performs _explicit_ concurrency. You cannot drop a random\n> library into an asyncio project and expect it to function correctly. The\n> project needs to have been written from the ground up to understand\n> asyncio, or at least to have been wrapped in asyncio goodness. aiohttp\n> has been written in exactly that manner: requests has not. This means that\n> requests never yields for other coroutines to run, and so they never will.\n> This totally eliminates the advantage of asyncio.\n> \n> There is no plan currently to rewrite requests to use asyncio: support is\n> simply not there yet.\n> \n> If you would like to use something that looks like requests' API, you have\n> one obvious option.:async-requests\n> https://github.com/inglesp/async-requests, which has been written for\n> asyncio. I don't know, however, whether it is Python 3.4 only or not.\n> \n> Ultimately, this is not a requests bug. =)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2256#issuecomment-57335553\n> .\n"
] |
https://api.github.com/repos/psf/requests/issues/2255
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2255/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2255/comments
|
https://api.github.com/repos/psf/requests/issues/2255/events
|
https://github.com/psf/requests/issues/2255
| 44,247,749 |
MDU6SXNzdWU0NDI0Nzc0OQ==
| 2,255 |
Session requests should respect verify=False
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/585003?v=4",
"events_url": "https://api.github.com/users/isaulv/events{/privacy}",
"followers_url": "https://api.github.com/users/isaulv/followers",
"following_url": "https://api.github.com/users/isaulv/following{/other_user}",
"gists_url": "https://api.github.com/users/isaulv/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/isaulv",
"id": 585003,
"login": "isaulv",
"node_id": "MDQ6VXNlcjU4NTAwMw==",
"organizations_url": "https://api.github.com/users/isaulv/orgs",
"received_events_url": "https://api.github.com/users/isaulv/received_events",
"repos_url": "https://api.github.com/users/isaulv/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/isaulv/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/isaulv/subscriptions",
"type": "User",
"url": "https://api.github.com/users/isaulv",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
}
] |
closed
| true | null |
[] | null | 20 |
2014-09-29T01:06:48Z
|
2021-09-08T22:00:52Z
|
2015-08-31T09:45:01Z
|
NONE
|
resolved
|
Using a website that has a self-signed cert, the following sequence fails even when verify=False is passed:
import requests
s = requests.Session()
r = s.get('https://selfsignedsite')
_traceback_
r = s.get('https://selfsignedsite', verify=False)
_traceback is the same, about SSL3 not finding a valid cert_
requests.get('https://selfsignedsite', verify=False)
_traceback as above_
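For reference, a minimal sketch of disabling verification once for a whole Session instead of per request (the hostname is the placeholder from above, not a real site):
``` python
import requests

s = requests.Session()
s.verify = False  # every request made through this session skips verification

r = s.get('https://selfsignedsite')
print(r.status_code)
```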
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2255/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2255/timeline
| null |
completed
| null | null | false |
[
"fails in what way?\n\nOn Sunday, September 28, 2014, Isaul Vargas [email protected]\nwrote:\n\n> Using a website that has a self-signed cert, the following use of\n> verify=False fails:\n> import requests\n> s = requests.Session()\n> r = s.get('https://selfsignedsite')\n> \n> r = s.get('https://selfsidnedsite', verify=False)\n> \n> requests.get('https://selfsignedsite', verify=False)\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2255.\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n",
"This traceback:\n\n<pre>Traceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 463, in get\n return self.request('GET', url, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 451, in request\n resp = self.send(prep, **send_kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 557, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed</pre>\n",
"What version of requests? \n",
"The latest release version 2.4.1\n",
"I can't reproduce this at all. Can you share the site you're trying to make a request against?\n",
"also does chrome or Firefox load the site?\n\nOn Sunday, September 28, 2014, Ian Cordasco [email protected]\nwrote:\n\n> I can't reproduce this at all. Can you share the site you're trying to\n> make a request against?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2255#issuecomment-57108647\n> .\n\n## \n\n## \n\nKevin Burke\nphone: 925.271.7005 | twentymilliseconds.com\n",
"Yes firefox and Chrome load the site fine.\n",
"The site I was connecting to was <redacted>\n",
"Using the regular functional API:\n\n``` pycon\n>>> import requests\n>>> r = requests.get('https://my.qaa.sailthru-qa.com')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/usr/local/lib/python2.7/site-packages/requests/api.py\", line 59, in get\n return request('get', url, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/api.py\", line 48, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 451, in request\n resp = self.send(prep, **send_kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/sessions.py\", line 557, in send\n r = adapter.send(request, **kwargs)\n File \"/usr/local/lib/python2.7/site-packages/requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n>>> r = requests.get('https://my.qaa.sailthru-qa.com', verify=False)\n/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py:730: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html (This warning will only appear once by default.)\n InsecureRequestWarning)\n>>> r.status_code\n200\n```\n\nUsing a Session:\n\n``` pycon\n>>> import requests\n>>> s = requests.session()\n>>> r2 = s.get('https://my.qaa.sailthru-qa.com', verify=False)\n/usr/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py:730: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.org/en/latest/security.html (This warning will only appear once by default.)\n InsecureRequestWarning)\n>>> r2.status_code\n200\n```\n",
"In otherwords, we do respect `verify=False`. You may be confused by the warning we emit when you disable certificate verification.\n",
"I am not confused by the warning. Just follow the sequence of steps I outlined. Although it is not something I would do in writing the code that I want, I can see instances where if someone writes code to automate connections, and they want to drop back on not using verification, that this would bite them.\n",
"Ah, I see what you mean. So this even had bizarre behaviour on 2.3.0:\n\n``` pycon\n>>> import requests\n>>> s = requests.session()\n>>> s.get('https://my.qaa.sailthru-qa.com')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 468, in get\n return self.request('GET', url, **kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/adapters.py\", line 382, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed\n>>> s.get('https://my.qaa.sailthru-qa.com', verify=False)\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 468, in get\n return self.request('GET', url, **kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 456, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/sessions.py\", line 559, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/icordasc/virtualenv/tmp-a188018336f3351d/lib/python2.7/site-packages/requests/adapters.py\", line 375, in send\n raise ConnectionError(e, request=request)\nrequests.exceptions.ConnectionError: HTTPSConnectionPool(host='my.qaa.sailthru-qa.com', port=443): Max retries exceeded with url: / (Caused by <class 'httplib.CannotSendRequest'>: )\n>>> s.get('https://my.qaa.sailthru-qa.com', verify=False)\n<Response [200]>\n```\n\nThis looks like it might be related to connection pooling. If on 2.4.1 I do this:\n\n``` pycon\n>>> s.close()\n>>> s.get('https://my.qaa.sailthru-qa.com', verify=False)\n<Response [200]>\n```\n\nAfter making the request with `verify=True` you can see that it works just fine (iirc because it clears the pool manager).\n",
"Yeah, this is almost certainly connection pooling. urllib3 puts the connection back when SSL errors are raised, but I really don't know if we can safely do that. @shazow, @t-8ch, what are your thoughts about not re-using connections that raise SSL errors?\n",
"This should already happen here: https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L542\n",
"[Not according to my reading](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connectionpool.py#L566-L570).\n",
"Yeah, you are right, but this is the place where it should be done :-)\n",
"My question was really whether you think it _should_ be done. =)\n",
"Either that, or the connectionpool should include the ssl parameters in the lookup process, so it does not reuse the wrong connection like in this issue.\n",
"Hmm, the configuration of sockets in a ConnectionPool should be homogenous. If they're not, then I'd consider this a bug. (That is to say, the configuration of a socket should probably not be affected by request-time flags but rather pool-construction-time flags.)\n",
"This appears to be resolved in 2.7.0.\n"
] |
https://api.github.com/repos/psf/requests/issues/2254
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2254/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2254/comments
|
https://api.github.com/repos/psf/requests/issues/2254/events
|
https://github.com/psf/requests/issues/2254
| 44,219,057 |
MDU6SXNzdWU0NDIxOTA1Nw==
| 2,254 |
SSLError (hostname mismatch) on forge.ocamlcore.org
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14172?v=4",
"events_url": "https://api.github.com/users/timbertson/events{/privacy}",
"followers_url": "https://api.github.com/users/timbertson/followers",
"following_url": "https://api.github.com/users/timbertson/following{/other_user}",
"gists_url": "https://api.github.com/users/timbertson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/timbertson",
"id": 14172,
"login": "timbertson",
"node_id": "MDQ6VXNlcjE0MTcy",
"organizations_url": "https://api.github.com/users/timbertson/orgs",
"received_events_url": "https://api.github.com/users/timbertson/received_events",
"repos_url": "https://api.github.com/users/timbertson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/timbertson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/timbertson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/timbertson",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-09-28T06:51:45Z
|
2021-09-08T23:07:58Z
|
2014-10-04T05:38:07Z
|
NONE
|
resolved
|
Using the latest version (2.4.1):
```
>>> requests.get('https://forge.ocamlcore.org/', stream=True)
```
Throws:
```
SSLError: hostname 'forge.ocamlcore.org' doesn't match either of 'lists.ocamlcore.org', 'ocamlcore.org'
```
The same URL shows no SSL issues in firefox or chrome, and curl also thinks it's fine:
```
$ curl -Iv 'https://forge.ocamlcore.org/'
( ... )
* Connected to forge.ocamlcore.org (87.98.154.45) port 443 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* CAfile: /etc/pki/tls/certs/ca-bundle.crt
CApath: none
* SSL connection using TLS_DHE_RSA_WITH_AES_128_CBC_SHA
* Server certificate:
* subject: E=<redacted>@gmail.com,CN=forge.ocamlcore.org,C=CH,OID.2.5.4.13=ngEw1eS9i0H4lis9
* start date: Oct 01 20:21:04 2013 GMT
* expire date: Oct 02 00:48:22 2014 GMT
* common name: forge.ocamlcore.org
* issuer: CN=StartCom Class 1 Primary Intermediate Server CA,OU=Secure Digital Certificate Signing,O=StartCom Ltd.,C=IL
> HEAD / HTTP/1.1
> User-Agent: curl/7.32.0
> Host: forge.ocamlcore.org
> Accept: */*
>
< HTTP/1.1 200 OK
HTTP/1.1 200 OK
( ... )
```
I'm afraid I don't know enough about SSL certs to debug this any further, but I'm assuming that requests is missing something if Firefox, Chrome, and curl all accept the cert.
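One quick check worth running, assuming (as the discussion below suggests) that the cause is missing SNI support in the interpreter's ssl module, which older Python 2 builds lack:
``` python
import ssl

# HAS_SNI exists on Python 2.7.9+/3.2+; if this prints False, the standard
# library alone cannot send the server name during the TLS handshake.
print(getattr(ssl, 'HAS_SNI', False))
```
If it prints False, installing the `security` extra (`pip install requests[security]`), as suggested in the discussion, enables SNI through pyOpenSSL.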
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/14172?v=4",
"events_url": "https://api.github.com/users/timbertson/events{/privacy}",
"followers_url": "https://api.github.com/users/timbertson/followers",
"following_url": "https://api.github.com/users/timbertson/following{/other_user}",
"gists_url": "https://api.github.com/users/timbertson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/timbertson",
"id": 14172,
"login": "timbertson",
"node_id": "MDQ6VXNlcjE0MTcy",
"organizations_url": "https://api.github.com/users/timbertson/orgs",
"received_events_url": "https://api.github.com/users/timbertson/received_events",
"repos_url": "https://api.github.com/users/timbertson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/timbertson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/timbertson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/timbertson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2254/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2254/timeline
| null |
completed
| null | null | false |
[
"This is almost certainly an SNI issue. I don't have access to a shell right now to confirm it, but I suspect you're using Python 2.\n\nIn that case you need to install three new packages. You can either do that by reinstalling requests with the `security` extension (`pip install requests[security]`) or by installing them explicitly: `pip install PyOpenSSL pyasn1 ndg-httpsclient`.\n",
"Thanks for the tips - I had no idea about the security extension. I'm afraid it took me too long getting the native extensions installed, and in that time the certificate has expired anyway! But I'll take your word for this, and reopen if I need to.\n"
] |
https://api.github.com/repos/psf/requests/issues/2253
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2253/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2253/comments
|
https://api.github.com/repos/psf/requests/issues/2253/events
|
https://github.com/psf/requests/pull/2253
| 44,184,573 |
MDExOlB1bGxSZXF1ZXN0MjE4OTIzNzA=
| 2,253 |
A fix for #1979: repeat HTTP digest authentication after redirect.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1481195?v=4",
"events_url": "https://api.github.com/users/yossigo/events{/privacy}",
"followers_url": "https://api.github.com/users/yossigo/followers",
"following_url": "https://api.github.com/users/yossigo/following{/other_user}",
"gists_url": "https://api.github.com/users/yossigo/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/yossigo",
"id": 1481195,
"login": "yossigo",
"node_id": "MDQ6VXNlcjE0ODExOTU=",
"organizations_url": "https://api.github.com/users/yossigo/orgs",
"received_events_url": "https://api.github.com/users/yossigo/received_events",
"repos_url": "https://api.github.com/users/yossigo/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/yossigo/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/yossigo/subscriptions",
"type": "User",
"url": "https://api.github.com/users/yossigo",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true | null |
[] | null | 5 |
2014-09-27T18:19:36Z
|
2021-09-08T09:01:18Z
|
2014-11-01T14:04:29Z
|
CONTRIBUTOR
|
resolved
|
This solves the following scenario:
(1) 401 (Digest authentication begins)
(2) 302 (Authenticated, response redirects to another endpoint)
(3) 401 (New endpoint also requires authentication)
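A minimal sketch of exercising this scenario from the client side (the URL and credentials are placeholders; `allow_redirects=True` is already the default for GET and is shown only for emphasis):
``` python
import requests
from requests.auth import HTTPDigestAuth

# The server answers 401, then 302, then 401 again on the redirect target.
r = requests.get('https://example.com/protected',
                 auth=HTTPDigestAuth('user', 'pass'),
                 allow_redirects=True)

print(r.history)      # should contain the intermediate 302
print(r.status_code)  # 200 once the second challenge is answered as well
```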
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2253/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2253/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2253.diff",
"html_url": "https://github.com/psf/requests/pull/2253",
"merged_at": "2014-11-01T14:04:29Z",
"patch_url": "https://github.com/psf/requests/pull/2253.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2253"
}
| true |
[
"Hey @yossigo thanks for the pull request!\n\nWhile you're here, could you fix something that I think is a bug? On [this line](https://github.com/yossigo/requests/commit/c28da22e9c42e22b303bb07da434ce65e10c0cb2#diff-9f3a95293a5d26032b1c588167760362R193) could you change it to\n\n``` python\nsetattr(self, 'num_401_calls', num_401_calls + 1)\n```\n\nWe still want to check to prevent users from being caught in an endless challenge state with a server.\n\nRight now, I'm trying to decide if I think `handle_302` should check the response code before resetting the attribute because it will be called regardless of whether a 302 was received or not.\n\nAlso, @yossigo what do you think about setting `num_401_calls` to 1 in `handle_302` instead of deleting the attribute? It would be much simpler. Also, there's no need to return `r` from `handle_302`.\n",
"It seems like the new 302 handler will actually zero out the count every request. That might be deliberate, but we should probably make sure it only gets called on redirects.\n",
"I've cleaned up a bit the original commit (which was too hasty). Now avoiding possible looping around 401s and will handle all redirect responses.\n",
"I think this is ready to merge. Sorry I didn't see your update @yossigo \n\nThanks for this!\n",
":sparkles: :cake: :sparkles:\n"
] |
https://api.github.com/repos/psf/requests/issues/2252
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2252/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2252/comments
|
https://api.github.com/repos/psf/requests/issues/2252/events
|
https://github.com/psf/requests/pull/2252
| 44,182,917 |
MDExOlB1bGxSZXF1ZXN0MjE4OTE5NzA=
| 2,252 |
To use SSL connections, the ssl variable must not be None.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3129353?v=4",
"events_url": "https://api.github.com/users/kostyll/events{/privacy}",
"followers_url": "https://api.github.com/users/kostyll/followers",
"following_url": "https://api.github.com/users/kostyll/following{/other_user}",
"gists_url": "https://api.github.com/users/kostyll/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kostyll",
"id": 3129353,
"login": "kostyll",
"node_id": "MDQ6VXNlcjMxMjkzNTM=",
"organizations_url": "https://api.github.com/users/kostyll/orgs",
"received_events_url": "https://api.github.com/users/kostyll/received_events",
"repos_url": "https://api.github.com/users/kostyll/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kostyll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kostyll/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kostyll",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-27T17:42:35Z
|
2021-09-08T10:01:12Z
|
2014-09-27T19:20:15Z
|
NONE
|
resolved
|
It's my solution for #2251.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2252/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2252/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2252.diff",
"html_url": "https://github.com/psf/requests/pull/2252",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/2252.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2252"
}
| true |
[
"Hi @kostyll thanks for the change! Unfortunately this is a change that belongs upstream in [urllib3](/shazow/urllib3).\n"
] |
https://api.github.com/repos/psf/requests/issues/2251
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2251/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2251/comments
|
https://api.github.com/repos/psf/requests/issues/2251/events
|
https://github.com/psf/requests/issues/2251
| 44,182,878 |
MDU6SXNzdWU0NDE4Mjg3OA==
| 2,251 |
To use SSL connections, the ssl variable must not be None.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3129353?v=4",
"events_url": "https://api.github.com/users/kostyll/events{/privacy}",
"followers_url": "https://api.github.com/users/kostyll/followers",
"following_url": "https://api.github.com/users/kostyll/following{/other_user}",
"gists_url": "https://api.github.com/users/kostyll/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kostyll",
"id": 3129353,
"login": "kostyll",
"node_id": "MDQ6VXNlcjMxMjkzNTM=",
"organizations_url": "https://api.github.com/users/kostyll/orgs",
"received_events_url": "https://api.github.com/users/kostyll/received_events",
"repos_url": "https://api.github.com/users/kostyll/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kostyll/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kostyll/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kostyll",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-09-27T17:41:46Z
|
2021-09-08T23:08:01Z
|
2014-09-29T01:46:25Z
|
NONE
|
resolved
|
Hello, I've got the following problem:
response = requests.post(self.url("translate"), data=data)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/api.py", line 92, in post
return request('post', url, data=data, *_kwargs)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/api.py", line 48, in request
return session.request(method=method, url=url, *_kwargs)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/sessions.py", line 451, in request
resp = self.send(prep, *_send_kwargs)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/sessions.py", line 562, in send
r = adapter.send(request, *_kwargs)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/adapters.py", line 362, in send
timeout=timeout
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/packages/urllib3/connectionpool.py", line 516, in urlopen
body=body, headers=headers)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/packages/urllib3/connectionpool.py", line 304, in _make_request
self._validate_conn(conn)
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/packages/urllib3/connectionpool.py", line 722, in _validate_conn
conn.connect()
File "/home/andrew/.config/sublime-text-3/Packages/RusVariablesTranslator/requests/requests/packages/urllib3/connection.py", line 169, in connect
self.sock = ssl.wrap_socket(conn, self.key_file, self.cert_file)
AttributeError: 'NoneType' object has no attribute 'wrap_socket'
....
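A quick diagnostic for embedded interpreters such as the one bundled with Sublime Text (an illustrative check, not a fix): confirm the ssl module is importable before attempting HTTPS requests.
``` python
try:
    import ssl
    print('ssl available:', ssl.OPENSSL_VERSION)
except ImportError:
    print('ssl module missing; HTTPS requests cannot work in this interpreter')
```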
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2251/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2251/timeline
| null |
completed
| null | null | false |
[
"Hey @kostyll can you tell us what version of Python you're using?\n",
"py3.3.3 - interpreter in sublime text 3. but I don't know how to import ssl(_ssl) module .... in sublime text 3 under linux python interpreter is compliled without this module ...\n",
"This is not really a bug, it's just what happens when you don't have an SSL module in your Python distribution. It looks like ST3 compiles without it, which isn't particularly surprising.\n",
"Thanks. I've already read about it. But how can I get this module (_ssl) supported in my ST3 ? I've read http://sublimetext.userecho.com/topic/50801-bundle-python-ssl-module/#comment_165820 but I've not understood the mechanism of including this module manualy yet.\n",
"This is not a bug in requests or urllib3. If you need help with this @kostyll you should see if there are any ST3 users who can help you with this. Better yet, I would advise not using the vendored copy of Python with ST3.\n",
"Thank you guys!) But I went another way - I've fount repo https://bitbucket.org/klorenz/sublimessl which contains implementation of binding of linking ssl library. The example of usage is you can find in my repo : https://github.com/kostyll/my-sublime-text-3-ya-translate-plugin which contains simple ST3 plugin. Good luck !)\n"
] |
https://api.github.com/repos/psf/requests/issues/2250
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2250/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2250/comments
|
https://api.github.com/repos/psf/requests/issues/2250/events
|
https://github.com/psf/requests/issues/2250
| 43,960,734 |
MDU6SXNzdWU0Mzk2MDczNA==
| 2,250 |
Link header parsing breaks if attribute value has commas
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1693859?v=4",
"events_url": "https://api.github.com/users/hariharshankar/events{/privacy}",
"followers_url": "https://api.github.com/users/hariharshankar/followers",
"following_url": "https://api.github.com/users/hariharshankar/following{/other_user}",
"gists_url": "https://api.github.com/users/hariharshankar/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hariharshankar",
"id": 1693859,
"login": "hariharshankar",
"node_id": "MDQ6VXNlcjE2OTM4NTk=",
"organizations_url": "https://api.github.com/users/hariharshankar/orgs",
"received_events_url": "https://api.github.com/users/hariharshankar/received_events",
"repos_url": "https://api.github.com/users/hariharshankar/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hariharshankar/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hariharshankar/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hariharshankar",
"user_view_type": "public"
}
|
[
{
"color": "e10c02",
"default": false,
"description": null,
"id": 117744,
"name": "Bug",
"node_id": "MDU6TGFiZWwxMTc3NDQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Bug"
},
{
"color": "0b02e1",
"default": false,
"description": null,
"id": 191274,
"name": "Contributor Friendly",
"node_id": "MDU6TGFiZWwxOTEyNzQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly"
}
] |
closed
| true | null |
[] | null | 5 |
2014-09-25T20:01:41Z
|
2021-09-08T23:07:55Z
|
2014-10-10T18:30:18Z
|
NONE
|
resolved
|
The function parse_header_links does not parse Link headers correctly if an attribute value for a link contains commas or semicolons.
https://github.com/kennethreitz/requests/blob/master/requests/utils.py#L562
The Memento protocol (RFC 7089) allows HTTP datetime in the link attributes and the link header parser cannot parse this reliably.
For example:
``` python
>>> r = requests.get('http://mementoweb.org/timegate/http://www.google.com', allow_redirects=False)
>>> r.headers['link']
"""<http://www.google.com>;rel="original" ,
<http://mementoweb.org/timemap/link/1/http://www.google.com>;rel="timemap"; type="application/link-format",
<http://mementoweb.org/timegate/http://www.google.com>;rel="timegate",
<http://web.archive.org/web/20140911065756/http://www.google.com/>;rel="memento"; datetime="Thu, 11 Sep 2014 06:57:56 GMT",
<http://web.archive.org/web/20131004112325/http://www.google.com/>;rel="memento first"; datetime="Sat, 04 Oct 1997 22:20:27 GMT",
<http://web.archive.org/web/20121011204401/https://www.google.com/>;rel="memento last"; datetime="Sat, 11 Oct 2014 10:05:51 GMT""""
>>> r.links
{'04 Oct 1997 22:20:27 GMT': {'url': '04 Oct 1997 22:20:27 GMT'},
'11 Oct 2014 10:05:51 GMT': {'url': '11 Oct 2014 10:05:51 GMT'},
'11 Sep 2014 06:57:56 GMT': {'url': '11 Sep 2014 06:57:56 GMT'},
'memento': {'datetime': 'Thu',
'rel': 'memento',
'url': 'http://web.archive.org/web/20140911065756/http://www.google.com/'},
'memento first': {'datetime': 'Sat',
'rel': 'memento first',
'url': 'http://web.archive.org/web/20131004112325/http://www.google.com/'},
'memento last': {'datetime': 'Sat',
'rel': 'memento last',
'url': 'http://web.archive.org/web/20121011204401/https://www.google.com/'},
'original': {'rel': 'original', 'url': 'http://www.google.com'},
'timegate': {'rel': 'timegate',
'url': 'http://mementoweb.org/timegate/http://www.google.com'},
'timemap': {'rel': 'timemap',
'type': 'application/link-format',
'url': 'http://mementoweb.org/timemap/link/1/http://www.google.com'}}
```
There are third-party Link header parsers at the URLs below, but it would be very convenient if requests' parse_header_links function could parse these headers correctly.
https://bitbucket.org/azaroth42/linkheaderparser/src/c2321bf3349b94a12a37ed8c41d4e4785006ada7/parse_link.py?at=default
https://gist.github.com/mnot/210535
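A rough sketch of splitting a Link header only on commas that begin a new `<uri-reference>`, so that commas inside quoted values such as HTTP dates are preserved (this is an illustration, not requests' implementation):
``` python
import re

def split_links(value):
    # Split only on commas that are followed by the start of a new
    # <uri-reference>, leaving commas inside quoted values untouched.
    return [part.strip() for part in re.split(r',\s*(?=<)', value) if part.strip()]

header = ('<http://example.org/a>;rel="memento"; '
          'datetime="Thu, 11 Sep 2014 06:57:56 GMT", '
          '<http://example.org/b>;rel="timegate"')
for link in split_links(header):
    print(link)
```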
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2250/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2250/timeline
| null |
completed
| null | null | false |
[
"Yeah, we're aware of this and have had an open issue in the past for it. It's not high-priority work for us, so I'm tagging it as 'Contributor Friendly' and leaving it open for others.\n\nIf someone finds this, it was last looked at in part of #1980.\n",
"Actually your bug report is inaccurate and a bit misleading. You'll note that we accurately (and correctly) parse Link headers that conform to [RFC 5988](http://tools.ietf.org/html/rfc5988) (and only have one link per relation). You can verify this with your own example, we parse:\n\n```\n<http://mementoweb.org/timegate/http://www.google.com>;rel=\"timegate\",\n```\n\ninto \n\n``` python\n 'timegate': {'rel': 'timegate',\n 'url': 'http://mementoweb.org/timegate/http://www.google.com'},\n```\n\nand \n\n```\n<http://mementoweb.org/timemap/link/1/http://www.google.com>;rel=\"timemap\"; type=\"application/link-format\",\n```\n\ninto \n\n``` python\n 'timemap': {'rel': 'timemap',\n 'type': 'application/link-format',\n 'url': 'http://mementoweb.org/timemap/link/1/http://www.google.com'}\n```\n\nMemento (RFC 7089) is an extension of 5988 (or so it would seem) and I'm not entirely convinced that we're required to support it. This also doesn't seem related to #1980 since this isn't in relation to having multiple links for a single relation.\n\nThat said, I agree our parsing could be better here (e.g., we shouldn't create entries in `links` for `dates`). We should parse out _what we are supposed to [1]_ and only that.\n\n1: \"supposed to\" meaning what is supported in 5988 (on a quick scan I don't see support for datetimes but I think we're already sort of parsing them and I can think of a fix for this)\n",
"My bug report indeed focused on problems in parsing attribute values that contain commas and semicolons. Having looked at https://github.com/kennethreitz/requests/pull/1980 , I now see that there are several reasons why Link headers that are used in the Memento protocol will not be parsed correctly by the Requests Link header parser, as they often contain:\n\n[a] Multiple links with the same relation type\n[b] Multiple relation types for a link\n[c] Links with extension attributes\n\nA Memento Link header that exemplifies all three is shown in the Memento RFC at http://www.mementoweb.org/guide/rfc/#Pattern1.2\n\n[a], [b], and [c] are fully compliant with RFC 5988 (Web Linking). Mark Nottingham, author of RFC 5988 was actually a reviewer of the Memento RFC on behalf of the IETF. While [a] and [b] are\nstraightforward uses of RFC 5988, admittedly, [c] builds on the extension mechanism provided by the BNF of section \"5. The Link Header Field\" of RFC 5988. In addition, in Memento, these extension\nattributes are used to provide HTTP datetime values, which unfortunately contain a comma. But since Memento is all about resources and time, and since it is an extension of HTTP, there was no\nway to avoid this in the protocol specification. Note, BTW, that the use of commas and semicolons in attributes is legitimate in RFC 5988 as long as the attribute values are quoted.\n\nIt seems beneficial to the community to make the Requests Link header parser compliant with RFC 5988. Once that is done, Link headers as used in the Memento protocol will also parse correctly. I am willing to provide help to achieve that goal.\n",
"@hariharshankar if you can solve the sole issue here without also addressing #1980 that will be fine. If you can fix the parsing so extensions of 5988 to handle case [c], then that's great. As you can tell from #1980 cases [a] and [b] are not cases we're willing to tackle as a core part of the library.\n",
"Simple fix:\nchange the line:\n`for val in value.split(\",\"):`\nto:\n`for val in re.split(\",\\ *<\",value):`\n\nI have tested this on the given cases with success\n"
] |
https://api.github.com/repos/psf/requests/issues/2249
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2249/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2249/comments
|
https://api.github.com/repos/psf/requests/issues/2249/events
|
https://github.com/psf/requests/pull/2249
| 43,959,582 |
MDExOlB1bGxSZXF1ZXN0MjE4MDc5MDE=
| 2,249 |
fix #2247
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/717901?v=4",
"events_url": "https://api.github.com/users/t-8ch/events{/privacy}",
"followers_url": "https://api.github.com/users/t-8ch/followers",
"following_url": "https://api.github.com/users/t-8ch/following{/other_user}",
"gists_url": "https://api.github.com/users/t-8ch/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/t-8ch",
"id": 717901,
"login": "t-8ch",
"node_id": "MDQ6VXNlcjcxNzkwMQ==",
"organizations_url": "https://api.github.com/users/t-8ch/orgs",
"received_events_url": "https://api.github.com/users/t-8ch/received_events",
"repos_url": "https://api.github.com/users/t-8ch/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/t-8ch/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/t-8ch/subscriptions",
"type": "User",
"url": "https://api.github.com/users/t-8ch",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2014-09-25T19:50:35Z
|
2021-09-08T10:01:08Z
|
2014-10-05T17:19:14Z
|
CONTRIBUTOR
|
resolved
|
We have to pass urllib3 the URL without the authentication information;
otherwise it will be parsed by httplib as part of the netloc and included in the request line
and the Host header.
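A minimal illustration of the idea using the standard-library URL parser (Python 3 shown; Python 2 would import from the `urlparse` module). This is only a sketch of stripping the userinfo component, not the patch itself:
``` python
from urllib.parse import urlparse, urlunparse

def strip_userinfo(url):
    parts = urlparse(url)
    netloc = parts.netloc.rsplit('@', 1)[-1]  # drop any 'user:pass@' prefix
    return urlunparse((parts.scheme, netloc) + tuple(parts[2:]))

print(strip_userinfo('http://user:secret@example.com/path'))
# prints: http://example.com/path
```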
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2249/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2249/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2249.diff",
"html_url": "https://github.com/psf/requests/pull/2249",
"merged_at": "2014-10-05T17:19:14Z",
"patch_url": "https://github.com/psf/requests/pull/2249.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2249"
}
| true |
[
"This _looks_ right, but it's hard for me to know because URLs are a crazy mess. I'll let @sigmavirus24 review this, he knows URLs better than I do.\n",
"So if we used a better (read: less forgiving) URL parser library we wouldn't need to split on `@` (like we do [here](https://github.com/kennethreitz/requests/pull/2249/files#diff-5956087d5835a57d9ef6fff974f6fd9bR687)). In principle this works fine. In reality, we should get something like `rfc3986` into acceptable shape for @shazow or just make the `Url` object in `urllib3` more RFC compliant (whichever works better) and use that for parsing the URL and reconstructing it. In short, you have 5 major components of the URL:\n\n```\n{scheme}://{authority}{/path}{?query}{#fragment}\n```\n\nAnd `authority` which we're dealing with right now has 3 sub-components so a URL would look like:\n\n```\n{scheme}://{userinfo@}{hostname}{:port}{/path}{?query}{#fragment}\n```\n\n`rfc3986` would split this up and allow you to replace `userinfo` with `None` and then reconstruct the URL. For a quick fix, this is great. I'd rather not have so much URL/URI parsing logic in requests though, this is an HTTP library not an HTTP + URL + ... library.\n",
"+∞. I think it would be best to have URL objects that are immutable and an API like URL.replace(userinfo=None, fragment=None)\n",
"@t-8ch that's roughly `rfc3986`'s API, except that I don't think I call it `replace` but maybe I do.... I'll double check later.\n",
"@sigmavirus24 Please open the specific issue with urllib3's url parser, no need to be backhanded. :)\n",
"Sorry @shazow, it wasn't meant to be back handed. I'll pull together the list of things the object is missing and make an issue with it tonight.\n",
"Thanks. :)\n",
"What's the status of this? \n",
"@kennethreitz :+1: \n"
] |
https://api.github.com/repos/psf/requests/issues/2248
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2248/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2248/comments
|
https://api.github.com/repos/psf/requests/issues/2248/events
|
https://github.com/psf/requests/pull/2248
| 43,949,232 |
MDExOlB1bGxSZXF1ZXN0MjE4MDE0NzM=
| 2,248 |
docs: Clarify how to pass a custom set of CAs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2631925?v=4",
"events_url": "https://api.github.com/users/jktjkt/events{/privacy}",
"followers_url": "https://api.github.com/users/jktjkt/followers",
"following_url": "https://api.github.com/users/jktjkt/following{/other_user}",
"gists_url": "https://api.github.com/users/jktjkt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jktjkt",
"id": 2631925,
"login": "jktjkt",
"node_id": "MDQ6VXNlcjI2MzE5MjU=",
"organizations_url": "https://api.github.com/users/jktjkt/orgs",
"received_events_url": "https://api.github.com/users/jktjkt/received_events",
"repos_url": "https://api.github.com/users/jktjkt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jktjkt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jktjkt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jktjkt",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-25T18:12:30Z
|
2021-09-08T10:01:13Z
|
2014-09-25T18:28:05Z
|
CONTRIBUTOR
|
resolved
|
This new wording will hopefully make it easier to find out how to override the
system-provided list of trusted CAs. I failed to find this through googling and
previously had to resort to asking on IRC.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2248/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2248/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2248.diff",
"html_url": "https://github.com/psf/requests/pull/2248",
"merged_at": "2014-09-25T18:28:05Z",
"patch_url": "https://github.com/psf/requests/pull/2248.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2248"
}
| true |
[
"Thanks! :cake:\n"
] |
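For reference, the wording clarified above concerns the `verify` parameter, which accepts a path to a CA bundle in place of the system-provided list of trusted CAs. A small usage sketch (the host and bundle path are placeholders):

``` python
import requests

# Verify the server against a custom CA bundle instead of the system store.
r = requests.get('https://internal.example.com/', verify='/path/to/ca-bundle.pem')
print(r.status_code)
```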
https://api.github.com/repos/psf/requests/issues/2247
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2247/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2247/comments
|
https://api.github.com/repos/psf/requests/issues/2247/events
|
https://github.com/psf/requests/issues/2247
| 43,923,889 |
MDU6SXNzdWU0MzkyMzg4OQ==
| 2,247 |
requests sends HTTP auth details in Host header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/56778?v=4",
"events_url": "https://api.github.com/users/cool-RR/events{/privacy}",
"followers_url": "https://api.github.com/users/cool-RR/followers",
"following_url": "https://api.github.com/users/cool-RR/following{/other_user}",
"gists_url": "https://api.github.com/users/cool-RR/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cool-RR",
"id": 56778,
"login": "cool-RR",
"node_id": "MDQ6VXNlcjU2Nzc4",
"organizations_url": "https://api.github.com/users/cool-RR/orgs",
"received_events_url": "https://api.github.com/users/cool-RR/received_events",
"repos_url": "https://api.github.com/users/cool-RR/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cool-RR/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cool-RR/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cool-RR",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 22 |
2014-09-25T14:15:55Z
|
2021-09-08T23:07:56Z
|
2014-10-05T17:19:15Z
|
NONE
|
resolved
|
I noticed that the Host header of my HTTP requests generated by `requests` contains the HTTP auth credentials, like this:
```
Host: admin:[email protected]
```
(This is when I specified them in the URL.)
Is this standard? It looks weird to me.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2247/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2247/timeline
| null |
completed
| null | null | false |
[
"Wow, that's wrong! I don't have my laptop right now so I can't investigate, but can anyone repro with the latest version of requests?\n",
"``` pycon\n>>> import requests\n>>> r = requests.get('https://username:[email protected]/get')\n>>> r.json()['headers']\n{'Accept': '*/*', 'Accept-Encoding': 'gzip, deflate', 'User-Agent': 'python-requests/2.4.1 CPython/3.3.2 Darwin/13.3.0', 'Authorization': 'Basic dXNlcm5hbWU6Zm9v', 'X-Request-Id': 'fb06edf3-14c8-42ad-b3c5-fefa801ddd9a', 'Connection': 'close', 'Host': 'httpbin.org'}\n>>> r.request.headers\n{'Authorization': 'Basic dXNlcm5hbWU6Zm9v', 'Accept': '*/*', 'Accept-Encoding': 'gzip, deflate', 'Connection': 'keep-alive', 'User-Agent': 'python-requests/2.4.1 CPython/3.3.2 Darwin/13.3.0'}\n```\n\nNote that this doesn't seem to be reproducable. @cool-RR can you share the code and requests version that caused this?\n",
"Ian, can you please check using something like Fiddler the true Host of the\nrequest?\n\nOn Thu, Sep 25, 2014 at 6:44 PM, Ian Cordasco [email protected]\nwrote:\n\n> > > > import requests>>> r = requests.get('https://username:[email protected]/get')>>> r.json()['headers']{'Accept': '_/_', 'Accept-Encoding': 'gzip, deflate', 'User-Agent': 'python-requests/2.4.1 CPython/3.3.2 Darwin/13.3.0', 'Authorization': 'Basic dXNlcm5hbWU6Zm9v', 'X-Request-Id': 'fb06edf3-14c8-42ad-b3c5-fefa801ddd9a', 'Connection': 'close', 'Host': 'httpbin.org'}>>> r.request.headers{'Authorization': 'Basic dXNlcm5hbWU6Zm9v', 'Accept': '_/_', 'Accept-Encoding': 'gzip, deflate', 'Connection': 'keep-alive', 'User-Agent': 'python-requests/2.4.1 CPython/3.3.2 Darwin/13.3.0'}\n> \n> Note that this doesn't seem to be reproducable. @cool-RR\n> https://github.com/cool-RR can you share the code and requests version\n> that caused this?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2247#issuecomment-56838438\n> .\n",
"@cool-RR I'll do that when you can give me the version and the code. The true host received by httpbin is what is printed on the 4th line. Also, it would look as though we're stripping the auth out and creating an Authorization header as we should be, before we even pass the URL on to urllib3.\n",
"Python 2.7.6 (default, Nov 10 2013, 19:24:24) [MSC v.1500 64 bit (AMD64)]\non win32\nType \"copyright\", \"credits\" or \"license()\" for more information.\nDreamPie 1.2.1\n\n> > > import requests\n> > > requests.**version**\n> > > 0: '2.4.1'\n> > > requests.get('http://yo:[email protected]')\n> > > 1: <Response [400]>\n\nRequest in Fiddler raw view:\n\nGET http://yo:[email protected]/ HTTP/1.1\nHost: yo:[email protected]\nConnection: keep-alive\nAccept: _/_\nAccept-Encoding: gzip, deflate\nAuthorization: Basic eW86cHc=\nUser-Agent: python-requests/2.4.1 CPython/2.7.6 Windows/7\n\nAlso Fiddler shows another warning about the port not being specified. Also\nshould the credentials show in the first line too?\n\nOn Thu, Sep 25, 2014 at 6:47 PM, Ian Cordasco [email protected]\nwrote:\n\n> @cool-RR https://github.com/cool-RR I'll do that when you can give me\n> the version and the code. The true host received by httpbin is what is\n> printed on the 4th line. Also, it would look as though we're stripping the\n> auth out and creating an Authorization header as we should be, before we\n> even pass the URL on to urllib3.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2247#issuecomment-56838948\n> .\n",
"This bug is quite clearly proxy specific. =) Our generating of Host headers and Request URLs is clearly wrong there. \n",
"Sorry, I'm confused. Why is this clearly proxy-specific?\n\nOn Thu, Sep 25, 2014 at 7:15 PM, Cory Benfield [email protected]\nwrote:\n\n> This bug is quite clearly proxy specific. =) Our generating of Host\n> headers and Request URLs is clearly wrong there.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2247#issuecomment-56843117\n> .\n",
"The full request URL is only sent when sent to a proxy. I presume you have your `$HTTP_PROXY` environment variable set to Fiddler. We've detected that and used our proxy-specific handling. This is why @sigmavirus24 couldn't reproduce your behaviour. \n",
"I'm seeing this with mitmproxy (and using explicity proxy settings). If I use netcat then I see this:\n\n\n\nSo I can confirm this is a proxy-specific issue.\n",
"I suggest comparing the output from the proxy-specific request to the\nnormal one to find any other diversions (like maybe port number missing or\nanything else)\n\nOn Thu, Sep 25, 2014 at 7:27 PM, Ian Cordasco [email protected]\nwrote:\n\n> I'm seeing this with mitmproxy (and using explicity proxy settings). If I\n> use netcat then I see this:\n> \n> [image: screen shot 2014-09-25 at 11 26 45 am]\n> https://cloud.githubusercontent.com/assets/240830/4408147/d623579a-44d0-11e4-9147-0a0a7b700612.png\n> \n> So I can confirm this is a proxy-specific issue.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2247#issuecomment-56844956\n> .\n",
"I blame https://github.com/shazow/urllib3/blob/f28732e73e9af6656d32584221ccc8b4bdb83920/urllib3/poolmanager.py#L242\n(Investigating, there was a bugreport about this in the past)\n",
"That explains why it's in the Host header, but not why it's in the request URL.\n\nBoth are likely to be my fault though, that's code I wrote.\n",
"Na, I was wrong, I mixed up the auth info and the port\n",
"https://hg.python.org/cpython/file/bfdb995e8d7d/Lib/http/client.py#l1061\n`urlsplit` from the stdlib includes the auth stuff in the netloc.\nWe should probably remove the auth stuff from the url before passing the URL down to httplib.\n",
"We have to remove the auth info here:\nhttps://github.com/kennethreitz/requests/blob/a718a81d273503bd2ffae8e6cb036a8516eb426a/requests/adapters.py#L273\n",
"Did we decide if/where the bug was in urllib3? I'd like to have it fixed in urllib3 also.\n",
"I assumed, urllib3 expected users to not include auth stuff in URLs but set them in the headers\n",
"Mmm, I guess that's one interpretation. Not sure I'd expect that, at least with `proxy_from_url`.\n\nBut either way, I'm still confused about how auth was getting into `Url.netloc`? Was it all because of non-`://` schemas?\n",
"It was in the URL which was passed as is down to httplib which used it in the request line and Host header\n",
"To clarify: It was never in Url.netloc, this was a mistake on my part. Urllib3 never touches the URL in the codepath used by requests\n",
"Ah k, thanks.\n",
"Probably redundant, but here's an example where I faced the problem: http://stackoverflow.com/questions/26109264/pip-proxy-authentication-and-not-supported-proxy-scheme\n"
] |
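A practical workaround related to the discussion above: passing credentials via `auth=` instead of embedding them in the URL keeps them out of the proxy request line and the Host header entirely. Illustrative sketch:

``` python
import requests

# Credentials supplied via auth= never appear in the URL, so they cannot
# leak into the request target or Host header when a proxy is in use.
r = requests.get('http://httpbin.org/get', auth=('yo', 'pw'))
print(r.request.headers['Authorization'])  # Basic eW86cHc=
```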
https://api.github.com/repos/psf/requests/issues/2246
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2246/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2246/comments
|
https://api.github.com/repos/psf/requests/issues/2246/events
|
https://github.com/psf/requests/issues/2246
| 43,598,962 |
MDU6SXNzdWU0MzU5ODk2Mg==
| 2,246 |
speed limit and time out
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1589540?v=4",
"events_url": "https://api.github.com/users/ibroomcorn/events{/privacy}",
"followers_url": "https://api.github.com/users/ibroomcorn/followers",
"following_url": "https://api.github.com/users/ibroomcorn/following{/other_user}",
"gists_url": "https://api.github.com/users/ibroomcorn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ibroomcorn",
"id": 1589540,
"login": "ibroomcorn",
"node_id": "MDQ6VXNlcjE1ODk1NDA=",
"organizations_url": "https://api.github.com/users/ibroomcorn/orgs",
"received_events_url": "https://api.github.com/users/ibroomcorn/received_events",
"repos_url": "https://api.github.com/users/ibroomcorn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ibroomcorn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ibroomcorn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ibroomcorn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-23T07:32:49Z
|
2021-09-08T23:08:03Z
|
2014-09-23T09:48:35Z
|
NONE
|
resolved
|
Does requests have a speed (rate) limit option?
curl has one: --limit-rate
wget has one: --limit-rate
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2246/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2246/timeline
| null |
completed
| null | null | false |
[
"Correct.\n\nRequests simply doesn't have this low-level of control over the HTTP stack except in one specific situation: when sending a chunked upload, which we handle ourselves. In all other cases `httplib` is responsible for sending our data, and _it_ is the one that lacks a speed limit.\n\nWithout a patch to `httplib` we cannot add such a feature. Even with a patch, that is a sufficiently low-level feature that we probably wouldn't add it anyway.\n\nI'm sorry we can't be more helpful. =(\n"
] |
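Although requests itself has no `--limit-rate` equivalent, downloads (not uploads, which go through `httplib`) can be throttled client-side with `stream=True`. A rough illustrative sketch, not a requests feature:

``` python
import time
import requests

def download_rate_limited(url, dest, limit_bytes_per_sec=64 * 1024, chunk_size=8192):
    # Sleep after each chunk so the average download rate stays near the limit.
    r = requests.get(url, stream=True)
    start = time.time()
    received = 0
    with open(dest, 'wb') as fh:
        for chunk in r.iter_content(chunk_size):
            fh.write(chunk)
            received += len(chunk)
            expected = received / float(limit_bytes_per_sec)
            elapsed = time.time() - start
            if expected > elapsed:
                time.sleep(expected - elapsed)
```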
https://api.github.com/repos/psf/requests/issues/2245
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2245/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2245/comments
|
https://api.github.com/repos/psf/requests/issues/2245/events
|
https://github.com/psf/requests/pull/2245
| 43,582,412 |
MDExOlB1bGxSZXF1ZXN0MjE2MjE3Nzg=
| 2,245 |
Correct redirection introduction
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-23T02:08:53Z
|
2021-09-08T10:01:14Z
|
2014-09-23T06:25:26Z
|
CONTRIBUTOR
|
resolved
|
The history attribute contains responses, not requests.
Thanks to Yang Yang for reporting this to me via email.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2245/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2245/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2245.diff",
"html_url": "https://github.com/psf/requests/pull/2245",
"merged_at": "2014-09-23T06:25:26Z",
"patch_url": "https://github.com/psf/requests/pull/2245.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2245"
}
| true |
[
":+1: :cake:\n"
] |
https://api.github.com/repos/psf/requests/issues/2244
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2244/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2244/comments
|
https://api.github.com/repos/psf/requests/issues/2244/events
|
https://github.com/psf/requests/pull/2244
| 43,582,135 |
MDExOlB1bGxSZXF1ZXN0MjE2MjE2NzM=
| 2,244 |
Avoid getting stuck in a loop
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 16 |
2014-09-23T02:04:04Z
|
2021-09-08T03:01:07Z
|
2014-09-25T02:57:38Z
|
CONTRIBUTOR
|
resolved
|
This prevents a case where we make a request to URL A, which 301s to B, which
would then 301 back to A. For less simple chains, this will likewise prevent us
from getting stuck in a loop; e.g., it will prevent the following from causing
an endless loop:
```
A -> B -> C -> D -> E -> F --
^ \
| /
---<------------<----------<-
```
Fixes #2231.
I tested this by cloning httpbin and hard coding a permanent redirect loop from `/relative-redirect/1` to `/relative-redirect/2` so that we could trigger this. This pull request fixes it.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2244/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2244/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2244.diff",
"html_url": "https://github.com/psf/requests/pull/2244",
"merged_at": "2014-09-25T02:57:38Z",
"patch_url": "https://github.com/psf/requests/pull/2244.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2244"
}
| true |
[
"This no longer gets us stuck in a loop by causing us to actually hit the URLs that we're being forcibly redirected to, starting with the _end_ of the redirect chain.\n\nIs this what we want? Or do we want to throw an exception, e.g. `RedirectLoopError`?\n",
"## I don't think we should be adding a new exception for this. This will preserve the previous behavior of having hotting a Max Redirects exception thrown. It's no worse than the behavior we had before the redirect cache was introduced.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"But yes, I suspect we could subclass `RedirectLoopError` from `TooManyRedirects` and raise that so as to save the calls to the network that would otherwise cause us to exceed the max number of retries. I'm not opposed to that. I'd like to hear other opinions though from @fcosantos and @RuudBurger (and anyone else that has one).\n\n@kennethreitz thoughts on saving people from hitting the network more than necessary when we can detect an endless redirect loop?\n",
"I should also have mentioned that this saves us from loops like this too:\n\n```\n A -> B -> C -> D -> E -> F --\n ^ \\\n | /\n ------------<----------<-\n\n```\n\nOr any other case that essentially results in an endless loop\n",
"The `RedirectLoopError` seems like a good idea.\n\nIn my case, the url works (and redirects properly) in the browser. So maybe there is another issue on how the cache works.\nI can provide you with a test url, if you have an email address I can send it to. As it is a url containing an API key.\n",
"@RuudBurger both @Lukasa and I have our emails available on our GitHub profiles. If you'd rather use PGP, you can find my PGP key (and associated email address) by searching https://pgp.mit.edu/ for my real name.\n",
"I see absolutely no reason why permanent redirects in the redirect cache should not count against the max redirects limit. They are still redirects, we're just not hitting the wire to do them. That behaviour will also fix our infinite redirects problem.\n",
"In my opinion an exception needs to be raised and `RedirectLoopError` is always going to be a `TooManyRedirects` at the end, if you want to subclass it is fine but maybe unnecessary.\n\nI like @sigmavirus24 set() solution, maybe you want to pack it a bit more:\n\n```\nchecked_urls = set()\nwhile request.url in redirect_cache:\n checked_urls.add(request.url)\n request.url = self.redirect_cache.get(request.url)\n if request.url in checked_urls:\n raise *whatever*\n```\n",
"> RedirectLoopError is always going to be a TooManyRedirects at the end\n\n@fcosantos I don't quite understand what you mean. Could you clarify this for me?\n",
"I think the key point is that there's no need for a `RedirectLoopError`, we can just have `TooManyRedirects`. I still think permanent redirects should count against the redirect limit though, with a loop being a special case where we immediately know that we'll hit `TooManyRedirects`.\n",
"@Lukasa I can think of a couple ways to make it influence the max_redirects count.\n1. Move the logic into `Session#resolve_redirects`. This unfortunately would mean actually using the network for the first (possibly permanent) redirect.\n2. Keep count along with the set of how many times we go through that loop and add a new optional parameter to `Session#resolve_redirects` along the lines of `redirects_already_followed=0`. We can then initialize `i` in `Session#resolve_redirects` with that and that will affect the max number of redirects possible (including using the cache).\n\nOption 2 seems most practical, but I just loathe adding more arguments (that could confuse a user) to a public API like this.\n\nAlso, I think there's value in using a subclass of `TooManyRedirects`. I can see an instance where this might cause confusion because of a case like @RuudBurger has. In a browser, it might very well work just fine, but because of oddities in the usage of requests the loop is caused by the redirect cache. I think providing users a way to disambiguate where the error is actually coming from is very useful (and a better experience).\n",
"I wonder if my concern about the redirect cache leading to reproducible behaviour is just my specific problem. I just realised that the other thing this redirect cache changes is the behaviour of `Response.history`, which is now not guaranteed to be the same for each request (the redirect cache doesn't populate it).\n",
"@Lukasa good point. I'm not sure we can actually reconstruct the history accurately unless we also cache responses in the redirect cache. In other words, we'd have a cache something like:\n\n``` python\n{'http://example.com/': ('http://www.example.com', <Response [301]>)}\n```\n\nAnd a big problem with that would be timestamps in headers and such (e.g., cookies). All of this which makes me wonder exactly how good an idea it is to keep the redirect cache around.\n",
"I don't know that we should necessarily throw out the redirect cache, but we should at the very least document the hell out of how it is going to behave.\n",
"+1\n",
"Uh... this wasn't exactly ready to merge. (Not that it breaks anything, but we were discussing alternative solutions.)\n"
] |
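A standalone sketch of the loop guard debated above: walk the permanent-redirect cache with a `set` of already-seen URLs and bail out on a repeat, so the A -> B -> A case never touches the network. Names and the exact exception used here are assumptions, not the merged patch:

``` python
from requests.exceptions import TooManyRedirects

def resolve_from_cache(url, redirect_cache, max_redirects=30):
    seen = set()
    while url in redirect_cache:
        if url in seen or len(seen) >= max_redirects:
            raise TooManyRedirects('redirect cache loop detected at %r' % url)
        seen.add(url)
        url = redirect_cache[url]
    return url

try:
    resolve_from_cache('http://a/', {'http://a/': 'http://b/', 'http://b/': 'http://a/'})
except TooManyRedirects as exc:
    print(exc)  # the A -> B -> A loop is caught without any network calls
```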
https://api.github.com/repos/psf/requests/issues/2243
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2243/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2243/comments
|
https://api.github.com/repos/psf/requests/issues/2243/events
|
https://github.com/psf/requests/issues/2243
| 43,552,857 |
MDU6SXNzdWU0MzU1Mjg1Nw==
| 2,243 |
Source code clarification sought
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/70234?v=4",
"events_url": "https://api.github.com/users/philip-goh/events{/privacy}",
"followers_url": "https://api.github.com/users/philip-goh/followers",
"following_url": "https://api.github.com/users/philip-goh/following{/other_user}",
"gists_url": "https://api.github.com/users/philip-goh/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/philip-goh",
"id": 70234,
"login": "philip-goh",
"node_id": "MDQ6VXNlcjcwMjM0",
"organizations_url": "https://api.github.com/users/philip-goh/orgs",
"received_events_url": "https://api.github.com/users/philip-goh/received_events",
"repos_url": "https://api.github.com/users/philip-goh/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/philip-goh/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/philip-goh/subscriptions",
"type": "User",
"url": "https://api.github.com/users/philip-goh",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-22T21:05:16Z
|
2021-09-08T23:08:04Z
|
2014-09-22T21:12:35Z
|
NONE
|
resolved
|
I'd like some clarification as to whether the following code in [requests/api.py](https://github.com/kennethreitz/requests/blob/master/requests/api.py#L47-L48) is correct.
```
def request(method, url, **kwargs):
"""Constructs and sends a :class:`Request <Request>`.
Returns :class:`Response <Response>` object.
:param method: method for the new :class:`Request` object.
:param url: URL for the new :class:`Request` object.
:param params: (optional) Dictionary or bytes to be sent in the query string for the :class:`Request`.
:param data: (optional) Dictionary, bytes, or file-like object to send in the body of the :class:`Request`.
:param headers: (optional) Dictionary of HTTP Headers to send with the :class:`Request`.
:param cookies: (optional) Dict or CookieJar object to send with the :class:`Request`.
:param files: (optional) Dictionary of ``'name': file-like-objects`` (or ``{'name': ('filename', fileobj)}``) for multipart encoding upload.
:param auth: (optional) Auth tuple to enable Basic/Digest/Custom HTTP Auth.
:param timeout: (optional) How long to wait for the server to send data
before giving up, as a float, or a (`connect timeout, read timeout
<user/advanced.html#timeouts>`_) tuple.
:type timeout: float or tuple
:param allow_redirects: (optional) Boolean. Set to True if POST/PUT/DELETE redirect following is allowed.
:type allow_redirects: bool
:param proxies: (optional) Dictionary mapping protocol to the URL of the proxy.
:param verify: (optional) if ``True``, the SSL cert will be verified. A CA_BUNDLE path can also be provided.
:param stream: (optional) if ``False``, the response content will be immediately downloaded.
:param cert: (optional) if String, path to ssl client cert file (.pem). If Tuple, ('cert', 'key') pair.
Usage::
>>> import requests
>>> req = requests.request('GET', 'http://httpbin.org/get')
<Response [200]>
"""
session = sessions.Session()
return session.request(method=method, url=url, **kwargs)
```
It creates a temporary session but does not call `session.close()` when the session goes out of scope. Is this intended? Will this leak resources, given that the adapters associated with the session are never closed explicitly?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2243/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2243/timeline
| null |
completed
| null | null | false |
[
"This is correct.\n\n`session.close()` is not mandatory. The only resource that needs to be worried about are sockets, and sockets that are leaked are closed by the Python garbage collector. There is no need to explicitly clear a `session`.\n"
] |
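As a complement to the answer above: callers who want the pooled sockets released deterministically (rather than waiting for the garbage collector) can use a `Session` as a context manager. Minimal sketch:

``` python
import requests

with requests.Session() as session:
    r = session.get('http://httpbin.org/get')
    print(r.status_code)
# session.close() has run here, so the adapters' connection pools are released.
```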
https://api.github.com/repos/psf/requests/issues/2242
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2242/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2242/comments
|
https://api.github.com/repos/psf/requests/issues/2242/events
|
https://github.com/psf/requests/issues/2242
| 43,510,896 |
MDU6SXNzdWU0MzUxMDg5Ng==
| 2,242 |
HTTPS Session Resume!!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/218767?v=4",
"events_url": "https://api.github.com/users/xnulinu/events{/privacy}",
"followers_url": "https://api.github.com/users/xnulinu/followers",
"following_url": "https://api.github.com/users/xnulinu/following{/other_user}",
"gists_url": "https://api.github.com/users/xnulinu/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xnulinu",
"id": 218767,
"login": "xnulinu",
"node_id": "MDQ6VXNlcjIxODc2Nw==",
"organizations_url": "https://api.github.com/users/xnulinu/orgs",
"received_events_url": "https://api.github.com/users/xnulinu/received_events",
"repos_url": "https://api.github.com/users/xnulinu/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xnulinu/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xnulinu/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xnulinu",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-09-22T16:18:45Z
|
2021-09-08T23:05:00Z
|
2014-09-22T16:46:36Z
|
NONE
|
resolved
|
This question might not be in the right project. It seems to me that the requests library does not do SSL session resumption during HTTPS communication. It would be nice to have an SSL resume feature as an optimization. Maybe I should ask this question on the urllib3 forum?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2242/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2242/timeline
| null |
completed
| null | null | false |
[
"I'd love this feature, it'd be awesome. Unfortunately, we're not in a place to do it at the moment.\n\nThe standard library has an [open feature request](http://bugs.python.org/issue8106) to add support for session resumption. No-one has yet done the work here, so the standard library won't support it. PyOpenSSL also doesn't have support, so we can't do it outside the standard library either.\n\nIf support ends up in either of those projects we can absolutely optionally use it, and I'd love to have it, but we need to get it supported there first I'm afraid.\n",
"The new release version (0.14) of pyOpenSSL has opened number of callbacks from the subset of OpenSSL library. The particular API called 'set_session' can be use in pyOpenSSL.py (/requests/packages/urllib3/contrib/pyopenssl.py) to enable SSL session reuse. There are two tasks we need to do to make it happen I think\n1. Store the initiated established session for particular host:port so that we can reuse the same session for the subsequent request.\n2. Use global SSL context or Context associated for particular host:port. I don't know which way to go\n\nLet me know if there are any plans in near future to add SSL resume support.\n",
"If you're interested in this feature, I recommend opening a feature request on urllib3, which is where the work needs to be done. =)\n",
"Thanks, will do.\n",
"@xnulinu would you mind linking the feature request here?\n",
"The feature request link I have opened on urllib3 library is : https://github.com/shazow/urllib3/issues/590\n"
] |
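For context on the `set_session` API mentioned above, a minimal pyOpenSSL (>= 0.14) sketch of TLS session reuse, entirely outside requests/urllib3; the method constant and hostname are assumptions, and certificate verification is omitted for brevity:

``` python
import socket
from OpenSSL import SSL

ctx = SSL.Context(SSL.SSLv23_METHOD)  # no verification configured: sketch only

def handshake(host, session=None):
    sock = socket.create_connection((host, 443))
    conn = SSL.Connection(ctx, sock)
    conn.set_tlsext_host_name(host.encode('ascii'))
    if session is not None:
        conn.set_session(session)  # offer the cached session for resumption
    conn.set_connect_state()
    conn.do_handshake()
    return conn

first = handshake('www.example.com')
resumed = handshake('www.example.com', session=first.get_session())
```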
https://api.github.com/repos/psf/requests/issues/2241
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2241/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2241/comments
|
https://api.github.com/repos/psf/requests/issues/2241/events
|
https://github.com/psf/requests/pull/2241
| 43,509,530 |
MDExOlB1bGxSZXF1ZXN0MjE1OTAxNTg=
| 2,241 |
raise RuntimeError when a single streamed request calls *iter methods th...
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1518884?v=4",
"events_url": "https://api.github.com/users/tijko/events{/privacy}",
"followers_url": "https://api.github.com/users/tijko/followers",
"following_url": "https://api.github.com/users/tijko/following{/other_user}",
"gists_url": "https://api.github.com/users/tijko/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/tijko",
"id": 1518884,
"login": "tijko",
"node_id": "MDQ6VXNlcjE1MTg4ODQ=",
"organizations_url": "https://api.github.com/users/tijko/orgs",
"received_events_url": "https://api.github.com/users/tijko/received_events",
"repos_url": "https://api.github.com/users/tijko/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/tijko/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/tijko/subscriptions",
"type": "User",
"url": "https://api.github.com/users/tijko",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2014-09-22T16:10:24Z
|
2021-09-08T10:01:09Z
|
2014-10-05T16:48:40Z
|
CONTRIBUTOR
|
resolved
|
In response to issue #2240, add a check to see if the content was "consumed"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2241/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2241/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2241.diff",
"html_url": "https://github.com/psf/requests/pull/2241",
"merged_at": "2014-10-05T16:48:40Z",
"patch_url": "https://github.com/psf/requests/pull/2241.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2241"
}
| true |
[
"Let's have a subclass of `RequestException`. My proposed name is `BodyConsumedError`: @sigmavirus24, do you have a better one?\n",
"EOFError?\n",
"I think `BodyConsumedError` is preferable to `EOFError` and `StreamConsumedError` might be even more obvious in this case.\n",
"Thoughts on making whichever error name we choose multiply inherit from `RequestException` and `TypeError`? People who are doing this may already be catching `TypeError`, if we magically change this on them, it's a breaking change\n",
"Yeah, I can accept that. And `StreamConsumedError` is the best of the bunch so far.\n",
"@tijko so could you update this PR to add the described exception to `requests/exceptions.py` and then use it here where you're raising the `RuntimeError`? Thanks in advance\n",
"@sigmavirus24 looks like @tijko updated this as requested\n",
"@kennethreitz looks good to me\n"
] |
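The exception shape settled on above, multiply inheriting so existing `except TypeError` handlers keep working, looks roughly like this:

``` python
from requests.exceptions import RequestException

class StreamConsumedError(RequestException, TypeError):
    """Raised when iter_content/iter_lines is called on an already-consumed stream."""
```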
https://api.github.com/repos/psf/requests/issues/2240
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2240/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2240/comments
|
https://api.github.com/repos/psf/requests/issues/2240/events
|
https://github.com/psf/requests/issues/2240
| 43,416,436 |
MDU6SXNzdWU0MzQxNjQzNg==
| 2,240 |
Multiple calls to iter* fail with unhelpful error.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-09-22T06:41:32Z
|
2021-09-08T23:08:05Z
|
2014-09-22T16:36:53Z
|
MEMBER
|
resolved
|
If you call the `iter*` methods more than once for a single streamed request, it fails with an extremely unhelpful error:
``` python
>>> r = requests.get('http://www.google.com/', stream=True)
>>> [x for x in r.iter_content(1024)]
>>> [x for x in r.iter_content(1024)]
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib/python2.7/site-packages/requests/utils.py", line 341, in iter_slices
while pos < len(string):
TypeError: object of type 'bool' has no len()
```
I think it's fine that you can't call the iter\* methods twice on a streamed response, but we can easily catch this situation so we should.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2240/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2240/timeline
| null |
completed
| null | null | false |
[
"I'm making a pull request with a `RuntimeError` being raised. What kind of exception would you ideally like to throw here? \n",
"I'm going to close this to centralise on #2241.\n"
] |
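The supported pattern implied above is to consume the stream once and keep the result around, rather than calling `iter_content` a second time. Illustrative sketch:

``` python
import requests

r = requests.get('http://www.google.com/', stream=True)

# Consume the stream exactly once; cache the chunks if the body is needed again.
chunks = list(r.iter_content(1024))
body = b''.join(chunks)   # reuse as often as needed
for chunk in chunks:      # a second "pass" works on the cached list
    pass
```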
https://api.github.com/repos/psf/requests/issues/2239
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2239/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2239/comments
|
https://api.github.com/repos/psf/requests/issues/2239/events
|
https://github.com/psf/requests/issues/2239
| 43,320,398 |
MDU6SXNzdWU0MzMyMDM5OA==
| 2,239 |
ssl and multiple subdomains
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/128454?v=4",
"events_url": "https://api.github.com/users/seanjensengrey/events{/privacy}",
"followers_url": "https://api.github.com/users/seanjensengrey/followers",
"following_url": "https://api.github.com/users/seanjensengrey/following{/other_user}",
"gists_url": "https://api.github.com/users/seanjensengrey/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/seanjensengrey",
"id": 128454,
"login": "seanjensengrey",
"node_id": "MDQ6VXNlcjEyODQ1NA==",
"organizations_url": "https://api.github.com/users/seanjensengrey/orgs",
"received_events_url": "https://api.github.com/users/seanjensengrey/received_events",
"repos_url": "https://api.github.com/users/seanjensengrey/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/seanjensengrey/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/seanjensengrey/subscriptions",
"type": "User",
"url": "https://api.github.com/users/seanjensengrey",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2014-09-20T21:38:55Z
|
2021-09-08T23:07:10Z
|
2014-09-20T22:16:29Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2239/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2239/timeline
| null |
completed
| null | null | false |
[
"When I attempt to retrieve a url like, https://ia700306.us.archive.org/11/items/MITclassical_mech/\n\n```\nrequests.get(\"https://ia700306.us.archive.org/11/items/MITclassical_mech/\")\n```\n\nI am getting an exception\n\n```\nlib/python2.7/site-packages/requests/adapters.pyc in send(self, request, stream, timeout, verify, cert, proxies)\n 418 except (_SSLError, _HTTPError) as e:\n 419 if isinstance(e, _SSLError):\n--> 420 raise SSLError(e, request=request)\n 421 elif isinstance(e, ReadTimeoutError):\n 422 raise ReadTimeout(e, request=request)\n\nSSLError: hostname 'ia700306.us.archive.org' doesn't match either of '*.archive.org', 'archive.org'\n```\n\nBut Chrome shows the domain as authenticated with SSL. Is this a bug or by design? \n",
"It's neither. =)\n\nWe are correctly validating the certificate that was passed to us, but `archive.org` passed the wrong certificate. This is because it uses an extension to TLS called SNI (Server Name Indication) to work out which TLS certificate to present. Python 2.7's standard library doesn't have support for this extension, so out of the box requests can't use it.\n\nIf you're on the most recent version of requests (2.4.1) you can re-install requests with `pip install requests[security]` to get these features. If not, you need to install the following extra libraries:\n- `pyOpenSSL`\n- `ndg-httpsclient`\n- `pyasn1`\n",
"Excellent. That worked\n\nCould this SNI issue be detected and have the above instructions added as part of the exception?\n",
"The best we could do is catch the verification error and say, on Python 2, \"this may be a result of lacking SNI\". We can't know for sure because it looks like a standard certificate validation error.\n",
"Could a friendly bootup message alert the user that they are running on Python 2 without SNI enabled and that many sites will have certificate errors unless they install the following packages ... ?\n\nIPython did something similar for platforms that had a misconfigured readline/libedit.\n",
"## Unlikely. Urllib3 added on-by-default warnings that significantly angered our users. We won't be reproducing that mistake at all\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"@sigmavirus24 is right. We are still arguing about whether or not that was ok.\n"
] |
|
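A quick way to check whether the running interpreter has native SNI support (illustrative; on Python 2.7 without the `pyOpenSSL`/`ndg-httpsclient`/`pyasn1` extras this typically prints `False`):

``` python
import ssl

# Older Pythons lack the attribute entirely, hence the getattr default.
print(getattr(ssl, 'HAS_SNI', False))
# If False, `pip install requests[security]` pulls in the extras listed above.
```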
https://api.github.com/repos/psf/requests/issues/2238
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2238/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2238/comments
|
https://api.github.com/repos/psf/requests/issues/2238/events
|
https://github.com/psf/requests/pull/2238
| 43,319,264 |
MDExOlB1bGxSZXF1ZXN0MjE1NDE0NjE=
| 2,238 |
Support bytestring URLs on Python 3.x
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1097349?v=4",
"events_url": "https://api.github.com/users/joealcorn/events{/privacy}",
"followers_url": "https://api.github.com/users/joealcorn/followers",
"following_url": "https://api.github.com/users/joealcorn/following{/other_user}",
"gists_url": "https://api.github.com/users/joealcorn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/joealcorn",
"id": 1097349,
"login": "joealcorn",
"node_id": "MDQ6VXNlcjEwOTczNDk=",
"organizations_url": "https://api.github.com/users/joealcorn/orgs",
"received_events_url": "https://api.github.com/users/joealcorn/received_events",
"repos_url": "https://api.github.com/users/joealcorn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/joealcorn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joealcorn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/joealcorn",
"user_view_type": "public"
}
|
[
{
"color": "009800",
"default": false,
"description": null,
"id": 44501218,
"name": "Ready To Merge",
"node_id": "MDU6TGFiZWw0NDUwMTIxOA==",
"url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge"
},
{
"color": "207de5",
"default": false,
"description": null,
"id": 60620163,
"name": "Minion Seal of Approval",
"node_id": "MDU6TGFiZWw2MDYyMDE2Mw==",
"url": "https://api.github.com/repos/psf/requests/labels/Minion%20Seal%20of%20Approval"
}
] |
closed
| true |
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
[
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
] | null | 6 |
2014-09-20T20:47:05Z
|
2021-09-08T10:01:10Z
|
2014-10-02T16:20:05Z
|
CONTRIBUTOR
|
resolved
|
Hi there folks.
Currently `prepare_url` will call `unicode` or `str` on the url arg depending on the Python version. This works fine for most cases, but the one case it trips up on is bytestrings on Python 3.x, as the string representation of these is `"b'http://httpbin.org'"`. Eventually this surfaces as an `InvalidSchema` exception.
I find this completely unexpected, and I'd imagine it's not something that was done intentionally.
Technically this is a breaking change. The possibility of passing non-strings to `prepare_url` is undocumented and untested, but regardless it may be better to fix this in a different way; that's your call.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2238/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2238/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2238.diff",
"html_url": "https://github.com/psf/requests/pull/2238",
"merged_at": "2014-10-02T16:20:05Z",
"patch_url": "https://github.com/psf/requests/pull/2238.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2238"
}
| true |
[
"Thanks for this!\n\nI think we like the ability to pass non-strings to `prepare_url`. It allows people to use custom url-building classes without running into trouble. It's worth noting that you change also breaks Python 2 behaviour: previously a Python 2 `str` would get lifted to a Python 2 `unicode`, which it now doesn't do.\n\nI think we can get around this by simply special-casing string types. The logic is really:\n\n```\nif type is unicode, leave unchanged\nelse, if type is bytes, decode bytestring to unicode\nelse, if type is anything else, call the 'to unicode' method\n```\n\nI think that's really the logic we want here. @sigmavirus24, thoughts?\n",
"I agree with @Lukasa that this is the behaviour we want. We absolutely want `unicode` urls because on Python 3, we can handle IRIs. We would need to adopt something that implements RFC 3987 to support it on Python 2, but once we did, we would be able to support them there too. For example, http://☃.net should be supported. (Your browser, and Python 3 should properly \"encode\" that to http://xn--n3h.net/.)\n",
"Ah good point, forgot about that use case.\nHave updated the branch, I believe it's doing what it should be now, how's that?\n",
":heart: @buttscicles \n\n@Lukasa this looks okay to me. Thoughts?\n",
":cake: Make it so.\n",
"Let's not document this :)\n"
] |
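The coercion order sketched in the first comment above can be written out as follows. This is a simplified Python 3 illustration of the discussed logic, not the code that was actually merged; `coerce_url` and the UTF-8 choice for bytestrings are illustrative assumptions.

``` python
def coerce_url(url):
    # Text is left unchanged.
    if isinstance(url, str):
        return url
    # Bytestrings are decoded; calling str() on them in Python 3 would
    # produce "b'http://...'" and later surface as an InvalidSchema error.
    # (The encoding choice here is an assumption made for the sketch.)
    if isinstance(url, bytes):
        return url.decode('utf-8')
    # Anything else (e.g. a custom url-building class) is converted via
    # its text representation.
    return str(url)
```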
https://api.github.com/repos/psf/requests/issues/2237
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2237/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2237/comments
|
https://api.github.com/repos/psf/requests/issues/2237/events
|
https://github.com/psf/requests/pull/2237
| 43,318,390 |
MDExOlB1bGxSZXF1ZXN0MjE1NDEwNjM=
| 2,237 |
Remove invoke from requirements.txt, docs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1097349?v=4",
"events_url": "https://api.github.com/users/joealcorn/events{/privacy}",
"followers_url": "https://api.github.com/users/joealcorn/followers",
"following_url": "https://api.github.com/users/joealcorn/following{/other_user}",
"gists_url": "https://api.github.com/users/joealcorn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/joealcorn",
"id": 1097349,
"login": "joealcorn",
"node_id": "MDQ6VXNlcjEwOTczNDk=",
"organizations_url": "https://api.github.com/users/joealcorn/orgs",
"received_events_url": "https://api.github.com/users/joealcorn/received_events",
"repos_url": "https://api.github.com/users/joealcorn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/joealcorn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/joealcorn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/joealcorn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2014-09-20T20:02:07Z
|
2021-09-08T10:01:15Z
|
2014-09-20T22:12:38Z
|
CONTRIBUTOR
|
resolved
|
Invoke seems to be an old, unneeded dependency
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2237/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2237/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/2237.diff",
"html_url": "https://github.com/psf/requests/pull/2237",
"merged_at": "2014-09-20T22:12:38Z",
"patch_url": "https://github.com/psf/requests/pull/2237.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/2237"
}
| true |
[
"You appear to be correct. Thanks for this! :cake: \n",
":cake: Thanks @buttscicles \n"
] |
https://api.github.com/repos/psf/requests/issues/2236
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2236/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2236/comments
|
https://api.github.com/repos/psf/requests/issues/2236/events
|
https://github.com/psf/requests/issues/2236
| 43,233,966 |
MDU6SXNzdWU0MzIzMzk2Ng==
| 2,236 |
Would it make sense to merge responses into requests ?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/25111?v=4",
"events_url": "https://api.github.com/users/cournape/events{/privacy}",
"followers_url": "https://api.github.com/users/cournape/followers",
"following_url": "https://api.github.com/users/cournape/following{/other_user}",
"gists_url": "https://api.github.com/users/cournape/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cournape",
"id": 25111,
"login": "cournape",
"node_id": "MDQ6VXNlcjI1MTEx",
"organizations_url": "https://api.github.com/users/cournape/orgs",
"received_events_url": "https://api.github.com/users/cournape/received_events",
"repos_url": "https://api.github.com/users/cournape/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cournape/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cournape/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cournape",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2014-09-19T12:43:28Z
|
2021-09-08T23:08:05Z
|
2014-09-19T12:52:30Z
|
NONE
|
resolved
|
Hi,
requests is great, but testing code using it is a bit difficult without scaffolding. The [responses project](https://github.com/dropbox/responses) is fairly neat, but since it needs to access internals of requests, wouldn't it make sense to include the feature in requests ?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2236/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2236/timeline
| null |
completed
| null | null | false |
[
"## No. There are plenty of libraries that access our internals. Most (if not all) do not belong in requests core.\n\nSent from my Android device with K-9 Mail. Please excuse my brevity.\n",
"As @sigmavirus24 says, lots and lots of projects hook into requests and we very definitely don't want to include them in requests. =)\n",
"If you want to discuss this further, email me\n"
] |
https://api.github.com/repos/psf/requests/issues/2235
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2235/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2235/comments
|
https://api.github.com/repos/psf/requests/issues/2235/events
|
https://github.com/psf/requests/issues/2235
| 43,205,423 |
MDU6SXNzdWU0MzIwNTQyMw==
| 2,235 |
chunk size error for unicode content
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/32542?v=4",
"events_url": "https://api.github.com/users/ilovenwd/events{/privacy}",
"followers_url": "https://api.github.com/users/ilovenwd/followers",
"following_url": "https://api.github.com/users/ilovenwd/following{/other_user}",
"gists_url": "https://api.github.com/users/ilovenwd/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ilovenwd",
"id": 32542,
"login": "ilovenwd",
"node_id": "MDQ6VXNlcjMyNTQy",
"organizations_url": "https://api.github.com/users/ilovenwd/orgs",
"received_events_url": "https://api.github.com/users/ilovenwd/received_events",
"repos_url": "https://api.github.com/users/ilovenwd/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ilovenwd/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ilovenwd/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ilovenwd",
"user_view_type": "public"
}
|
[
{
"color": "e102d8",
"default": false,
"description": null,
"id": 117745,
"name": "Planned",
"node_id": "MDU6TGFiZWwxMTc3NDU=",
"url": "https://api.github.com/repos/psf/requests/labels/Planned"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 6 |
2014-09-19T05:27:53Z
|
2014-10-10T16:46:04Z
| null |
NONE
| null |
I found this code in requests/adapters.py (latest version installed by pip):
https://github.com/kennethreitz/requests/blob/master/requests/adapters.py#L383
``` python
for i in request.body:
low_conn.send(hex(len(i))[2:].encode('utf-8'))
```
if `i` is a unicode string, `low_conn` sends a UTF-8 encoded byte string, but the chunk size is wrong.
I think it should change to:
``` python
if isinstance(i, unicode):
i = i.encode('utf8')
low_conn.send(hex(len(i))[2:].encode('utf-8'))
```
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2235/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2235/timeline
| null | null | null | null | false |
[
"Thanks for raising this!\n\nI don't think we should do that, however. If you've passed us a unicode string we should not be guessing at what text encoding you want to use in the body. I think I'd be happier not accepting unicode at all in this case, rather than guessing that 'UTF-8' is what is meant.\n\nThis is a bit of a thorny issue though, because that reduces our compatibility: we've implicitly allowed it in the past. Maybe force a decode to ASCII instead? (A choice which is almost certain to work.)\n",
"@sigmavirus24, can I get your thoughts here?\n",
"So while my instinct is to insist the user give us everything as a bytes object (and I don't think it's entirely unreasonable), we actively encourage users to do:\n\n``` python\nrequests.post(url,\n data=json.dumps({'my': 'json', 'data': 'here'}),\n headers={'Content-Type': 'application/json'})\n```\n\nIf we don't handle this in requests, at least for some deprecation period, we will be forcing users to do:\n\n``` python\nrequests.post(url,\n data=json.dumps({'my': 'json', 'data': 'here'}).encode('utf-8'),\n headers={'Content-Type': 'application/json'})\n```\n\nI'm sure the number of people passing JSON to `data` is not insignificant. I guess I'm in favor of using a Warning and transitioning to forcing this. This use case I outlined will also become obsolete soon because requests will be handling `json.dumps` for users. Which reminds me...\n",
"UTF8 is the most reasonable default.\nBesides, python3 string defaults to unicode, many data read from db/http is auto convert to unicode(default utf8).\nso, why not accept utf8 as default unicode encoding?\nThe python standard library ALREADY AUTO convert unicode to utf8 when write to socket.\n(that why the chunk size is wrong, but the chunk body is ok)\n\n@sigmavirus24 this bug only appears when using generator as data (chunked encoding)\npost data=unicode works because \n\n> The python standard library ALREADY AUTO convert unicode to utf8 when write to socket.\n",
"> The python standard library ALREADY AUTO convert unicode to utf8 when write to socket. (that why the chunk size is wrong, not the chunk body)\n\nNot in Python 3 it doesn't:\n\n``` python\nPython 3.4.1 (default, Aug 25 2014, 11:56:02) \n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import socket\n>>> s = socket.create_connection(('mkcert.org', 80))\n>>> s.write(\"unicode string\")\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nTypeError: 'str' does not support the buffer interface\n```\n\nIn fact, it doesn't even work in Python 2 on my machine:\n\n``` python\nPython 2.7.8 (default, Aug 25 2014, 11:53:26) \n[GCC 4.2.1 Compatible Apple LLVM 5.1 (clang-503.0.40)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import socket\n>>> s = socket.create_connection(('mkcert.org', 80))\n>>> s.send(u\"unicode string with ÜBİTAK\")\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nUnicodeEncodeError: 'ascii' codec can't encode character u'\\xdc' in position 20: ordinal not in range(128)\n```\n\nThe answer to 'why not accept utf8 as default encoding' is because that mistake is exactly what causes this problem in the first place. There is no 'default encoding', there's only right and wrong. We cannot and should not guess in this regard. It makes no sense to send unicode bytes on a socket.\n\nSometimes, we can guess. JSON has a set of well-defined text encodings, so we can pick one of those. But you could be sending text in _any_ encoding, and we have no way to guess. Getting weird server errors is worse than us blowing up and saying \"you have to give us binary data!\"\n",
"> Getting weird server errors is worse than us blowing up and saying \"you have to give us binary data!\"\n\nYeah I'm surprised we haven't had more bug reports about this frankly. Like I said, I think we should follow a deprecation pattern for this behaviour for 2.5 and 2.6, then make it default in 2.7 (or 3.0).\n- I think we should issue a `DeprecationWarning` when we receive `data` whose type is not `bytes`. We should then immediately try to encode the data for the user.\n- For the case that @ilovenwd is encountering (using a generator) we should issue _1_ deprecation warning after the first chunk and then encode the data for the user.\n- In the case of the user passing a file(-like) object to `data`, we should check the mode to ensure it was opened with `'b'` or is an instance/subclass of `BytesIO`. This case is tougher because some portions of it may be handled by the generator case (i.e., some users don't define `__len__` on `BytesIO` subclasses and so they're treated as generators.).\n\nOnce we have a `json` parameter, we can confidently handle that ourselves, for the user.\n"
] |
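As a standalone illustration of the size mismatch described above (not requests code): for non-ASCII text the character count and the UTF-8 byte count differ, so a chunk-size line computed from the unicode string understates the bytes actually written to the socket.

``` python
chunk = u'h\u00e9llo'            # a text chunk yielded by a generator body
encoded = chunk.encode('utf-8')

print(hex(len(chunk))[2:])       # '5' -- what the buggy chunk-size line reports
print(hex(len(encoded))[2:])     # '6' -- the number of bytes actually sent
```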
https://api.github.com/repos/psf/requests/issues/2234
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2234/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2234/comments
|
https://api.github.com/repos/psf/requests/issues/2234/events
|
https://github.com/psf/requests/issues/2234
| 43,193,961 |
MDU6SXNzdWU0MzE5Mzk2MQ==
| 2,234 |
when setting the Accept-Encoding header to None, it is still sent as 'identity'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/643587?v=4",
"events_url": "https://api.github.com/users/vdanen/events{/privacy}",
"followers_url": "https://api.github.com/users/vdanen/followers",
"following_url": "https://api.github.com/users/vdanen/following{/other_user}",
"gists_url": "https://api.github.com/users/vdanen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vdanen",
"id": 643587,
"login": "vdanen",
"node_id": "MDQ6VXNlcjY0MzU4Nw==",
"organizations_url": "https://api.github.com/users/vdanen/orgs",
"received_events_url": "https://api.github.com/users/vdanen/received_events",
"repos_url": "https://api.github.com/users/vdanen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vdanen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vdanen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vdanen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2014-09-19T00:40:01Z
|
2017-10-24T17:19:40Z
|
2014-09-19T13:26:24Z
|
NONE
| null |
I am passing these headers to request:
headers = {'Content-Type': 'application/something+xml',
'Accept': 'application/something+xml',
'Host': host,
'Expect': '100-continue',
'Connection': None,
'Accept-Encoding': None}
and then using:
r = requests.put(my_url, data=strata_xml, headers=headers, auth=(user, pwd), verify=False)
the REST service on the other end does not like this header at all. Using nc, I see it sending:
PUT /myurl HTTP/1.1
Accept-Encoding: identity
Content-Length: 1200
Accept: application/something+xml
User-Agent: python-requests/2.4.1 CPython/2.7.8 Darwin/13.3.0
Host: localhost
Expect: 100-continue
Content-Type: application/something+xml
I believe that https://hg.python.org/cpython/file/d047928ae3f6/Lib/http/client.py#l1086 may offer some clue there as to why this header is not being deleted when I set it to None, but I don't know how to work-around that.
Thanks.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/643587?v=4",
"events_url": "https://api.github.com/users/vdanen/events{/privacy}",
"followers_url": "https://api.github.com/users/vdanen/followers",
"following_url": "https://api.github.com/users/vdanen/following{/other_user}",
"gists_url": "https://api.github.com/users/vdanen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vdanen",
"id": 643587,
"login": "vdanen",
"node_id": "MDQ6VXNlcjY0MzU4Nw==",
"organizations_url": "https://api.github.com/users/vdanen/orgs",
"received_events_url": "https://api.github.com/users/vdanen/received_events",
"repos_url": "https://api.github.com/users/vdanen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vdanen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vdanen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vdanen",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2234/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2234/timeline
| null |
completed
| null | null | false |
[
"Essentially http.client adds \"Accept-Encoding: identity\" if there is no \"Accept-Encoding\" header present in the prepared request and this skip_accept_encoding option is not set (to clarify further)\n",
"@vdanen we remove `None` before passing the headers on to `urllib3`. `urllib3` does not do the same processing that we do on headers before passing them on to `httplib`/`http.client` (as it shouldn't). There's no way to pass this on to urllib3 in such a way that it can also tell `httplib` that. Is there any `Accept-Encoding` value that this server won't balk at?\n",
"Not that I have found, no. It doesn't like that header at all it seems. I've also tried to use a generator to force it to be chunked (which is the only way this header doesn't get sent it seems), but the server really didn't like that.\n\nI've asked if there is a way they can fix their end to accept/discard/ignore that header without dying, but I probably won't hear back on that for a bit, and I may not be the only one to run into this problem which is I reported it.\n",
"That's really fairly terrible. `Accept-Encoding: identity` is _always_ valid, the RFCs say so. It should be utterly harmless to send it along.\n\nOtherwise, removing this requires us to replace httplib. That's a substantial bit of work. =(\n",
"Yes, I see that. And I certainly wouldn't expect to have to gut this just to make it work for my use-case which seems to not be that standard. Thanks for the info! If it's not possible to work-around this in any reasonable way, then there probably isn't a point in keeping this open.\n",
"😢 ",
"Need the same thing for Connection:keep-alive - don't send it. :-(",
"This is an issue from 2014 that is not related to the current comments. To avoid reviving a long dead issue, I'm locking the conversation.\r\n\r\n@forgetso because your comment has actionable information, I'll inform you that `Connection: keep-alive` is the default for HTTP/1.1. If you don't want keep-alive behaviour, you can specify `Connection: close` instead."
] |
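For context on the behaviour discussed here: setting a header's value to `None` removes it from the prepared request at the requests level, but the standard library's `http.client` still injects `Accept-Encoding: identity` on the wire when no such header is present. A minimal sketch of what was attempted (requests 2.x behaviour; httpbin.org is used only as a stand-in target):

``` python
import requests

r = requests.get('http://httpbin.org/get',
                 headers={'Accept-Encoding': None})

# The header is gone from the prepared request...
print('Accept-Encoding' in r.request.headers)  # False
# ...but http.client adds "Accept-Encoding: identity" at the socket layer,
# which is why the server in this report still saw it.
```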
https://api.github.com/repos/psf/requests/issues/2233
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2233/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2233/comments
|
https://api.github.com/repos/psf/requests/issues/2233/events
|
https://github.com/psf/requests/issues/2233
| 43,191,940 |
MDU6SXNzdWU0MzE5MTk0MA==
| 2,233 |
Refactoring exception hierarchy
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/234019?v=4",
"events_url": "https://api.github.com/users/kevinburke/events{/privacy}",
"followers_url": "https://api.github.com/users/kevinburke/followers",
"following_url": "https://api.github.com/users/kevinburke/following{/other_user}",
"gists_url": "https://api.github.com/users/kevinburke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kevinburke",
"id": 234019,
"login": "kevinburke",
"node_id": "MDQ6VXNlcjIzNDAxOQ==",
"organizations_url": "https://api.github.com/users/kevinburke/orgs",
"received_events_url": "https://api.github.com/users/kevinburke/received_events",
"repos_url": "https://api.github.com/users/kevinburke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kevinburke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kevinburke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kevinburke",
"user_view_type": "public"
}
|
[
{
"color": "fbca04",
"default": false,
"description": null,
"id": 44501249,
"name": "Needs BDFL Input",
"node_id": "MDU6TGFiZWw0NDUwMTI0OQ==",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20BDFL%20Input"
},
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] |
{
"closed_at": null,
"closed_issues": 29,
"created_at": "2013-11-17T11:29:34Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
},
"description": null,
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/20",
"id": 487518,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels",
"node_id": "MDk6TWlsZXN0b25lNDg3NTE4",
"number": 20,
"open_issues": 12,
"state": "open",
"title": "3.0.0",
"updated_at": "2024-05-19T18:43:00Z",
"url": "https://api.github.com/repos/psf/requests/milestones/20"
}
| 5 |
2014-09-19T00:00:11Z
|
2016-01-26T10:17:01Z
| null |
CONTRIBUTOR
| null |
This was mentioned in #2230 as a project for requests 3.0, but it probably should get its own thread.
I wrote a lot of words about what I'd like exception hierarchy to look like, here:
https://gist.github.com/kevinburke/b98e053a4bf9835c67bb
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2233/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2233/timeline
| null | null | null | null | false |
[
"I am in principle a strong +1 on this work. It's most definitely a breaking change, however.\n",
"I'm starting to collect issues and features for a potential 3.0 release.\n",
"Since it seems like a 3.0 release is drawing closer I'm just raising awareness to this issue which seems rather interesting. \n",
"Thanks @jonathan-s. Those of us interested in this were already aware of requests 3.0 drawing close.\n",
"Ah, sorry I missed that the 3.0.0 milestone had already been added that's keeping track of the issues for 3.0.0. \n"
] |
https://api.github.com/repos/psf/requests/issues/2232
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2232/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2232/comments
|
https://api.github.com/repos/psf/requests/issues/2232/events
|
https://github.com/psf/requests/issues/2232
| 43,125,798 |
MDU6SXNzdWU0MzEyNTc5OA==
| 2,232 |
Content returned still encoded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8092009?v=4",
"events_url": "https://api.github.com/users/mryan82/events{/privacy}",
"followers_url": "https://api.github.com/users/mryan82/followers",
"following_url": "https://api.github.com/users/mryan82/following{/other_user}",
"gists_url": "https://api.github.com/users/mryan82/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mryan82",
"id": 8092009,
"login": "mryan82",
"node_id": "MDQ6VXNlcjgwOTIwMDk=",
"organizations_url": "https://api.github.com/users/mryan82/orgs",
"received_events_url": "https://api.github.com/users/mryan82/received_events",
"repos_url": "https://api.github.com/users/mryan82/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mryan82/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mryan82/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mryan82",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2014-09-18T12:14:53Z
|
2021-09-08T23:08:06Z
|
2014-09-18T12:26:59Z
|
NONE
|
resolved
|
When I try the following, the content returned in the response is still encoded. Is this an issue with the requests module?
result = requests.get('http://aceabio.com')
result.content:
'\xff\xfe<\x00!\x00D\x00O\x00C\x00T\x00Y\x00P\x00E\x00 \x00h\x00t\x00m\x00l\x00 \xx......./\x00h\x00t\x00m\x00l\x00>\x00'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4",
"events_url": "https://api.github.com/users/Lukasa/events{/privacy}",
"followers_url": "https://api.github.com/users/Lukasa/followers",
"following_url": "https://api.github.com/users/Lukasa/following{/other_user}",
"gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lukasa",
"id": 1382556,
"login": "Lukasa",
"node_id": "MDQ6VXNlcjEzODI1NTY=",
"organizations_url": "https://api.github.com/users/Lukasa/orgs",
"received_events_url": "https://api.github.com/users/Lukasa/received_events",
"repos_url": "https://api.github.com/users/Lukasa/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lukasa",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2232/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2232/timeline
| null |
completed
| null | null | false |
[
"`result.content` is always a bytestring, it does not decode the text from whatever text encoding it uses.\n\nThe server has not set a text encoding in the headers, so you have three options:\n1. Set the encoding yourself. I can see that it's UTF-16, so use that:\n \n ``` python\n result.encoding = 'utf-16'\n result.text\n ```\n2. Decode it yourself\n \n ``` python\n result.content.decode('utf-16')\n ```\n3. Use `requests.utils.get_encodings_from_content`. This also won't work, for two reasons. Firstly, `get_encodings_from_content` assumes an 8-bit encoding is being used for the meta tag, which is untrue. Second, even if it was true, the upstream server has _lied_ about the text encoding it's using:\n \n ```\n <meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\" />\n ```\n\nThis is not a bug in requests, this is the upstream server not giving us enough information to work with.\n"
] |
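The first two options from the reply above, written out (assuming, as in the report, that the server still serves UTF-16 content without declaring a charset in its headers):

``` python
import requests

result = requests.get('http://aceabio.com')

# Option 1: tell requests which encoding to use, then read .text
result.encoding = 'utf-16'
text_via_requests = result.text

# Option 2: decode the raw bytes yourself
text_via_decode = result.content.decode('utf-16')
```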
https://api.github.com/repos/psf/requests/issues/2231
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/2231/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/2231/comments
|
https://api.github.com/repos/psf/requests/issues/2231/events
|
https://github.com/psf/requests/issues/2231
| 43,116,545 |
MDU6SXNzdWU0MzExNjU0NQ==
| 2,231 |
Endless loop in sessions.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8166442?v=4",
"events_url": "https://api.github.com/users/fcosantos/events{/privacy}",
"followers_url": "https://api.github.com/users/fcosantos/followers",
"following_url": "https://api.github.com/users/fcosantos/following{/other_user}",
"gists_url": "https://api.github.com/users/fcosantos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fcosantos",
"id": 8166442,
"login": "fcosantos",
"node_id": "MDQ6VXNlcjgxNjY0NDI=",
"organizations_url": "https://api.github.com/users/fcosantos/orgs",
"received_events_url": "https://api.github.com/users/fcosantos/received_events",
"repos_url": "https://api.github.com/users/fcosantos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fcosantos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fcosantos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fcosantos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 6 |
2014-09-18T10:08:51Z
|
2021-09-08T23:08:02Z
|
2014-09-25T02:57:38Z
|
NONE
|
resolved
|
> (gdb) py-list
> 544 # Guard against that specific failure case.
> 545 if not isinstance(request, PreparedRequest):
> 546 raise ValueError('You can only send PreparedRequests.')
> 547
> 548 while request.url in self.redirect_cache:
> -549 request.url = self.redirect_cache.get(request.url)
> 550
> 551 # Set up variables needed for resolve_redirects and dispatching of hooks
> 552 allow_redirects = kwargs.pop('allow_redirects', True)
> 553 stream = kwargs.get('stream')
> 554 timeout = kwargs.get('timeout')
The redirect_cache dict could have complementary key/values that force an endless loop.
> (gdb) py-down
> # 2 Frame 0x7f645c00acd0, for file /test/libs/requests/sessions.py, line 549, in send (self=<Session(cookies=<RequestsCookieJar(_now=1410944130, _policy=<DefaultCookiePolicy(strict_rfc2965_unverifiable=True, strict_ns_domain=0, _allowed_domains=None, rfc2109_as_netscape=None, rfc2965=False, strict_domain=False, _now=1410944130, strict_ns_set_path=False, strict_ns_unverifiable=False, strict_ns_set_initial_dollar=False, hide_cookie2=False, _blocked_domains=(), netscape=True) at remote 0x7f651070c4d0>, _cookies={}, _cookies_lock=<_RLock(_Verbose__verbose=False, _RLock__owner=None, _RLock__block=<thread.lock at remote 0x7f6530772dd0>, _RLock__count=0) at remote 0x7f65105a5450>) at remote 0x7f65105a5e10>, stream=False, hooks={'response': []}, redirect_cache={'http://www.hent aiwe blog.com/': 'http://hent aiwe blog.com/', 'http://hent aiwe blog.com/': 'http://www.hent aiwe blog.com/'}, auth=None, trust_env=True, headers=<CaseInsensitiveDict(_store={'accept-encoding': ('Accept-Encoding', 'gzip, deflate'), 'accep...(truncated)
>
> ```
> request.url = self.redirect_cache.get(request.url)
> ```
Current request chain with wget:
> $ wget www.hent aiwe blog.com
> --2014-09-18 11:52:20-- http://www.hent aiwe blog.com/
> Resolving www.hent aiwe blog.com (www.hent aiwe blog.com)... 109.72.81.172
> Connecting to www.hent aiwe blog.com (www.hent aiwe blog.com)|109.72.81.172|:80... connected.
> HTTP request sent, awaiting response... 301 Moved Permanently
> Location: http://hent aiwe blog.com/ [following]
> --2014-09-18 11:52:20-- http://hent aiwe blog.com/
> Resolving hent aiwe blog.com (hent aiwe blog.com)... 109.72.81.172
> Reusing existing connection to www.hent aiwe blog.com:80.
> HTTP request sent, awaiting response... 301 Moved Permanently
> Location: http://www.hent aiwe blog.com/ [following]
> --2014-09-18 11:52:20-- http://www.hent aiwe blog.com/
> Reusing existing connection to www.hent aiwe blog.com:80.
> HTTP request sent, awaiting response... 301 Moved Permanently
> Location: http://hent aiwe blog.com/ [following]
> ...
Suggested resolution:
> ```
> redirect_count = 0
> while request.url in self.redirect_cache:
> redirect_count += 1
> if redirect_count > self.max_redirects:
> raise TooManyRedirects
> request.url = self.redirect_cache.get(request.url)
> ```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4",
"events_url": "https://api.github.com/users/kennethreitz/events{/privacy}",
"followers_url": "https://api.github.com/users/kennethreitz/followers",
"following_url": "https://api.github.com/users/kennethreitz/following{/other_user}",
"gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kennethreitz",
"id": 119893,
"login": "kennethreitz",
"node_id": "MDQ6VXNlcjExOTg5Mw==",
"organizations_url": "https://api.github.com/users/kennethreitz/orgs",
"received_events_url": "https://api.github.com/users/kennethreitz/received_events",
"repos_url": "https://api.github.com/users/kennethreitz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kennethreitz",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/2231/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/2231/timeline
| null |
completed
| null | null | false |
[
"Please search for issues before raising new ones. This was raised in #2207 and fixed in #2210: the fix is already in `master`.\n",
"I apologise, we didn't quite fix this. We actually have a separate bug, in that we can have a redirect loop of size greater than 1. \n\nIdeally we ought to be able to spot redirect loops, regardless of whether they use the redirect cache. \n",
"So this is a pretty serious problem. We could do:\n\n``` python\nchecked_urls = set()\nwhile request.url in self.redirect_cache:\n checked_urls.add(request.url)\n new_url = self.redirect_cache.get(request.url)\n if new_url in checked_urls:\n break\n request.url = new_url\n```\n\nThe break will mean that we use the existing request URL which should, ideally, trigger a `TooManyRedirects` exception, ideally this will prevent us from having to duplicate logic for that exception here. **I have not tested this yet though.**\n\nThe idea is that if we followed a redirect in the cache to something else, it will find that in `checked_urls`. This way if the redirect cache reads more like\n\n```\nA -> B -> C -> D -> E -> F -\n^ \\\n| /\n-----------------------------\n```\n\nWe will catch it as soon as `F` redirects to `A`. We could (subclassing `TooManyRedirects`) raise a `RedirectLoopFound` exception too from here as soon as we catch this instead of forcing the user to wait for 30 redirects to happen (or whatever I max currently is).\n",
"I'm having the same issue, just wanted to chip in. For a faulty url that loops to itself, the max_redirects counter never goes beyond 1.\nThe self.redirect_cache looks like this:\n`{'http://www.hostname.com/': 'http://hostname.com/', 'http://hostname.com/': 'http://www.hostname.com/'}`\nThe www redirects to the non-www, and visa versa.\nOnly thing that fixes it now is setting the max_redirects to 1.\n",
"The patch mentioned above works for me btw. https://github.com/RuudBurger/CouchPotatoServer/commit/3338b72d1f6845f6a44e6407d62d4ba27deee266\n",
"PR #2244 has a fix for this.\n"
] |
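A self-contained sketch of the cycle detection proposed in the third comment, applied to a two-entry cache like the one in the report (hostnames replaced with example.com; this is an illustration, not the patch that landed in #2244):

``` python
redirect_cache = {
    'http://www.example.com/': 'http://example.com/',
    'http://example.com/': 'http://www.example.com/',
}

def resolve_cached(url, cache):
    """Follow cached permanent redirects, bailing out when a loop is found."""
    seen = set()
    while url in cache:
        seen.add(url)
        nxt = cache[url]
        if nxt in seen:
            # A cycle: stop here and let the normal redirect handling
            # (and its TooManyRedirects limit) take over.
            break
        url = nxt
    return url

print(resolve_cached('http://www.example.com/', redirect_cache))
# -> 'http://example.com/' instead of looping forever
```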