url | repository_url | labels_url | comments_url | events_url | html_url | id | node_id | number | title | user | labels | state | locked | assignee | assignees | milestone | comments | created_at | updated_at | closed_at | author_association | active_lock_reason | body | closed_by | reactions | timeline_url | performed_via_github_app | state_reason | draft | pull_request | is_pull_request | issue_comments
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
https://api.github.com/repos/psf/requests/issues/5798
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5798/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5798/comments
|
https://api.github.com/repos/psf/requests/issues/5798/events
|
https://github.com/psf/requests/issues/5798
| 865,744,201 |
MDU6SXNzdWU4NjU3NDQyMDE=
| 5,798 |
`resuests.post` working in JupyterLab but failing in python script
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/11691571?v=4",
"events_url": "https://api.github.com/users/msank00/events{/privacy}",
"followers_url": "https://api.github.com/users/msank00/followers",
"following_url": "https://api.github.com/users/msank00/following{/other_user}",
"gists_url": "https://api.github.com/users/msank00/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/msank00",
"id": 11691571,
"login": "msank00",
"node_id": "MDQ6VXNlcjExNjkxNTcx",
"organizations_url": "https://api.github.com/users/msank00/orgs",
"received_events_url": "https://api.github.com/users/msank00/received_events",
"repos_url": "https://api.github.com/users/msank00/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/msank00/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/msank00/subscriptions",
"type": "User",
"url": "https://api.github.com/users/msank00",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-04-23T05:02:00Z
|
2021-08-27T00:08:25Z
|
2021-04-23T11:04:45Z
|
NONE
|
resolved
|
# Summary
Facing unusual behavior: `requests.post` works in a JupyterLab notebook for the code snippet below, but the same code fails when run as a Python script. I saw a similar Python `requests` issue on GitHub, but people mistakenly posted it on the JavaScript `request` package's issue page [here](https://github.com/request/request/issues/3147).
## Expected Result
In JupyterLab the request reaches the API endpoint and returns the response below.
```py
{'input_id': 'b36e1cb8-a3e6-11eb-ae01-00505688397d',
 'prediction': 12389.1817}
```
The above response is the actual behavior in the notebook. I then copied the code into a script, where it fails.
- Both JupyterLab and the script run under the same conda environment.
## Actual Result
The script fails with the traceback below.
```py
urllib3.exceptions.ProtocolError: ("Connection broken: TimeoutError(110, 'Connection timed out')", TimeoutError(110, 'Connection timed out'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "webapp/debug_request.py", line 37, in <module>
response = rq.post(
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/api.py", line 119, in post
return request('post', url, data=data, json=json, **kwargs)
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/sessions.py", line 697, in send
r.content
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/models.py", line 831, in content
self._content = b''.join(self.iter_content(CONTENT_CHUNK_SIZE)) or b''
File "/home/s1bfwd/miniconda3/envs/respai/lib/python3.8/site-packages/requests/models.py", line 756, in generate
raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ("Connection broken: TimeoutError(110, 'Connection timed out')", TimeoutError(110, 'Connection timed out'))
```
## Reproduction Steps
```python
import requests as rq
import json
payload = {
"test_input": [
{
"age": 61,
"sex": 0,
"bmi": 28.2,
"children": 0,
"smoker": 0,
"region": 3
}
],
"background_data": "/path/to/data",
"feature_columns": [
"age",
"sex",
"bmi",
"children",
"smoker",
"region"
],
"model_id": "model_id"
}
headers = {
"Content-Type": "application/json",
"Accept": "application/json",
}
url = "URL_to_HIT"
response = rq.post(
url,
data=json.dumps(payload),
verify=False,
headers=headers
)
json.loads(response.content)
```
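The `ChunkedEncodingError` above wraps an OS-level `TimeoutError(110)`, which usually means the script's network path to the host differs from the notebook's (for example, proxy environment variables picked up by one but not the other). One way to make such failures surface quickly is to pass an explicit timeout to `requests.post`. A minimal sketch, assuming a hypothetical `build_request_kwargs` helper and illustrative 5/30-second values that are not from the original report:

```python
import json

# Hypothetical helper: builds keyword arguments for requests.post with an
# explicit (connect, read) timeout so an unreachable route fails fast with
# ConnectTimeout instead of hanging until the kernel-level TCP timeout.
# All names and numbers here are illustrative.
def build_request_kwargs(payload, connect=5.0, read=30.0):
    return {
        "data": json.dumps(payload),
        "headers": {
            "Content-Type": "application/json",
            "Accept": "application/json",
        },
        # requests accepts a (connect, read) tuple for fine-grained timeouts
        "timeout": (connect, read),
    }

kwargs = build_request_kwargs({"model_id": "model_id"})
print(kwargs["timeout"])  # → (5.0, 30.0)
```

With these kwargs, `rq.post(url, **kwargs)` would raise `requests.exceptions.ConnectTimeout` promptly when the script's environment cannot reach the host, which is much easier to diagnose than the broken chunked read shown above.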
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": "3.3.1"
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.5"
},
"platform": {
"release": "4.15.0-142-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "1010109f",
"version": "20.0.1"
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "101010bf"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": true
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5798/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5798/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please use [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here."
] |
https://api.github.com/repos/psf/requests/issues/5797
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5797/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5797/comments
|
https://api.github.com/repos/psf/requests/issues/5797/events
|
https://github.com/psf/requests/pull/5797
| 864,683,496 |
MDExOlB1bGxSZXF1ZXN0NjIwODc1NzM2
| 5,797 |
Switch LGPL'd chardet for MIT licensed charset_normalizer
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/34150?v=4",
"events_url": "https://api.github.com/users/ashb/events{/privacy}",
"followers_url": "https://api.github.com/users/ashb/followers",
"following_url": "https://api.github.com/users/ashb/following{/other_user}",
"gists_url": "https://api.github.com/users/ashb/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ashb",
"id": 34150,
"login": "ashb",
"node_id": "MDQ6VXNlcjM0MTUw",
"organizations_url": "https://api.github.com/users/ashb/orgs",
"received_events_url": "https://api.github.com/users/ashb/received_events",
"repos_url": "https://api.github.com/users/ashb/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ashb/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ashb/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ashb",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 105 |
2021-04-22T08:25:50Z
|
2021-10-26T15:05:48Z
|
2021-07-06T23:55:03Z
|
CONTRIBUTOR
|
resolved
|
At least for Python 3 -- charset_normalizer doesn't support Python2, so for that chardet is still used -- this means the "have chardet" path is also still tested.
Although using the (non-vendored) chardet library is fine for requests itself, the story is a lot less clear for downstream projects that take on an LGPL dependency, particularly ones that might like to bundle requests (and thus chardet) into a single binary -- think something similar to what docker-compose is doing. By including an LGPL'd module it is no longer clear whether the resulting artefact must also be LGPL'd.
By changing out this dependency for one under MIT we remove all license ambiguity.
As an "escape hatch" I have made the code so that it will use chardet first if it is installed, but we no longer depend upon it directly, although there is a new extra added, `requests[lgpl]`. This should minimize the impact to users, and give them an escape hatch if charset_normalizer turns out to be not as good. (In my non-exhaustive tests it detects the same encoding as chartdet in every case I threw at it)
I've read https://github.com/psf/requests/pull/4115, https://github.com/psf/requests/issues/3389, and https://github.com/chardet/chardet/issues/36#issuecomment-768281452 so I'm aware of the history, but I hope that the approach in this PR will allow this to be merged, as right now, the Apache Software Foundation doesn't allow projects to depend upon LGPL'd code (this is something I'm trying to get changed, but it is a _very_ slow process)
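The backwards-compatibility question raised in the review discussion below is essentially an agreement-rate measurement between the two detectors. A minimal sketch of how such a comparison could be scored; `agreement_rate` is a hypothetical helper and the sample pairs are invented for illustration:

```python
def agreement_rate(pairs):
    """Fraction of samples where two detectors named the same encoding.

    pairs: list of (detector_a_guess, detector_b_guess); None means the
    detector made no guess. Comparison is case-insensitive because
    detectors differ in casing (e.g. "UTF-8" vs "utf-8").
    """
    if not pairs:
        return 0.0
    same = sum(
        1 for a, b in pairs
        if a is not None and b is not None and a.lower() == b.lower()
    )
    return same / len(pairs)

# Invented sample results, for illustration only
samples = [("UTF-8", "utf-8"), ("ascii", "ascii"),
           ("ISO-8859-1", "cp1252"), ("EUC-JP", None)]
print(agreement_rate(samples))  # → 0.5
```

Running a harness like this over a corpus of pages that do not declare an encoding would produce the kind of ">75% compatible" evidence the maintainers ask for later in the thread.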
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 19,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 1,
"laugh": 0,
"rocket": 0,
"total_count": 20,
"url": "https://api.github.com/repos/psf/requests/issues/5797/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5797/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5797.diff",
"html_url": "https://github.com/psf/requests/pull/5797",
"merged_at": "2021-07-06T23:55:02Z",
"patch_url": "https://github.com/psf/requests/pull/5797.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5797"
}
| true |
[
"That looks like an easy fix to a long-standing problem and licencing problem. It would be really great if this change gets merged and released.\n\nFor the maintainers of requests - I know You have limited time for testing but If you think there is any need for extra testing, I am happy to help to get it approved. ",
"Dear maintainers of requests library. \r\n\r\nWould it be possible to hear from you what you think about this idea? It's super-important for us - basically for the whole Apache Software Foundation but for Apache Airflow and Apache Liminal projects particularly. It would be great if we know whether this or similar change will be possible to accept by you or not and whether we should try some alternatives.\r\n\r\nImplementing non-LGPL bound dependency by `requests` would be by-far easiest way to proceed for us, but In case it's not - we will have to start thinking about alternatives before we release next version, because this problem effectively blocks us from releasing our software. \r\n\r\nThis is just a kind request for feedbck and comments, rather than asking to merge it immediately, we just need to know what our options are.\r\n",
"The library is under a rather restrictive (for the maintainers' own sanity) feature-freeze at the moment and unfortunately changing a dependency which can have a ripple-effect on the usage of the library is very unlikely to be accepted - regardless of licensing",
"> The library is under a rather restrictive (for the maintainers' own sanity) feature-freeze at the moment and unfortunately changing a dependency which can have a ripple-effect on the usage of the library is very unlikely to be accepted - regardless of licensing\r\n\r\nIs there anything we can help with to make this happen ? Maybe we can somehow step-in and help you with testing/changes and step-up gradually to become maintainers eventually? I think we could probably try to bring it to the attention of more experienced devs that could help with that?\r\n\r\nOne alternative I can think of is to maintain fork of requests in the Apache-owned organization (and any of the library that uses that uses requests that we use) but obviously this is not something that we'd love - might be good for short-term, tactical approach but not really great for long-term maintenance/strategic solution. \r\n\r\nIs there any timeline for the feature freeze you mentioned ?",
"> The library is under a rather restrictive (for the maintainers' own sanity) feature-freeze at the moment and unfortunately changing a dependency which can have a ripple-effect on the usage of the library is very unlikely to be accepted - regardless of licensing\r\n\r\nCan we also know the reason for the feature-freeze, we like Ash & Jarek said would be happy to help in any testing and maintaining if that is one of the main concerns. This change should help most of the downstream projects which are used at lot of organization who have strict licensing policies.",
"> The library is under a rather restrictive (for the maintainers' own sanity) feature-freeze at the moment and unfortunately changing a dependency which can have a ripple-effect on the usage of the library is very unlikely to be accepted - regardless of licensing\r\n\r\n@sigmavirus24 I can appreciate that. Is this a feature freeze \"for ever\", or is there some planned time frame?",
"Hi everyone,\r\n\r\nLike @sigmavirus24 said, this is highly unlikely to happen. The absence of any kind of answer is a good index.\r\n\r\nI worked in several companies that have a license check mechanism, sometimes used right sometimes not. In those companies, I proposed a really dirty way to fix this by using their internal proxy switching chardet to charset-normalizer via a hacky package. Not a good way to go, but worked.\r\n\r\nThe only concern I would have regarding this PR is the deployment scale, although this alternative package is stable and has been used by many it is no match to requests usage. But we have to ponder this with the actual usage of the 'apparent_encoding' property.\r\n\r\nThis concern can be eliminated with many solutions at the maintainers' disposal like using the 'pre-release' mark from Pypi, using a separate branch automatically synced with the master one, etc...\r\nWe are available if you (maintainers) needed to verify if any regression shows up.\r\n\r\nIf this project has any future major release I would vouch in favor of removing charset detection from it as it is not 'HTTP Client' related. Like httpx did sometime this year.\r\n\r\nOtherwise, technically, correct me if I am wrong, I feel confident to say that charset-normalizer is way more reliable than chardet. And can be proven easily. (ex. i. poor charset coverage, ii. sometimes return encoding that cannot decode given bytes)\r\n\r\nMaintainer @nateprewitt did already take a risk when bumping chardet from 3.x to 4.0 earlier this year. (In a matter of ~80 hours after the tag (4.0 chardet) was published to Github) So one could argue that such a change is not impossible.\r\n\r\nIn case this PR would fail, I like the idea of @potiuk having a fork in the Apache org with this patch and synced with upstream at all times.\r\n\r\nRegards,",
"@sigmavirus24 - what do you think? Any chance we can avoid forking? I think we will need to solve it before the next release of Airflow which is likely going to happen next week, so we do not have a lot of time.",
"@potiuk This is unlikely to be accepted and released within a week.",
"Within a week is tight, yeah 🙂 but even just a \"yes, we are working to accept this\" is probably enough for us to not have to fork it in the short term (which we'd clearly like to avoid.)",
"Hey @sigmavirus24 . FYI: We have started to discuss this in LEGAL JIRA of the Apache Software Foundation. Seems that in the ASF we have now ~50 projects that are using requests library and the discussion came to the conclusion that basically we have to migrate out of the requests library because of the `chardet` dependency.\r\n\r\nHere is the discussion: https://issues.apache.org/jira/browse/LEGAL-572\r\n \r\nWe are discussing about the solution. Forking requests and asking others to use the fork is one thing.\r\n\r\nOne more thing that came out in the discussion is to propose you to donate `requests` to the ASF (and go through incubation process). I believe if you have problems with feature-freeze/maintenance effort needed, becoming part of an established organisation might be a path you might be willing to take. \r\n\r\nWhat do you think? Would you be willing to have a second thought about getting rid of the LGPL dependency and merging the PR ? I think we are really close to start a bigger effort of not only converting 50 ASF projects but also asking a number of 3rd-party libraries to switch to our non-LGPL depending fork that we are planning to make.\r\n\r\nWe really want to play it nicely, please don't treat it as a hostile move, but I think we have no other choice here. ",
"> Is this a feature freeze \"for ever\", or is there some planned time frame?\r\n\r\nThis started with the release of 2.0. The idea was to slow down development to a reasonable pace. Lots of features were being thrown over the wall at us and the library's reach was beginning to sprawl way too much. It's been a feature freeze for quite a few years.\r\n\r\n> Maybe we can somehow step-in and help you with testing/changes and step-up gradually to become maintainers eventually? I think we could probably try to bring it to the attention of more experienced devs that could help with that?\r\n\r\nTo be clear, you're suggesting becoming maintainers to stop a feature freeze created to keep the surface area minimal, to keep churn down to provide a stable library, and not because we're inexperienced as developers or maintainers. I'm sure of course you weren't trying to call us inexperienced because we have a default policy of \"No\" that doesn't belong in Requests.\r\n\r\nThat said, this appears to have been sent as a PR in sudden urgency that feels manufactured as chardet has been a dependency of this library almost as long as the library has existed. It was also created with the message of \"We can do testing if needed\" as in you did no testing expecting us to just merge it because the ASF needed it, which defeats the purpose of designing for stability.\r\n\r\n> One more thing that came out in the discussion is to propose you to donate `requests` to the ASF (and go through incubation process). I believe if you have problems with feature-freeze/maintenance effort needed, becoming part of an established organisation might be a path you might be willing to take.\r\n\r\nSo, `requests` isn't mine to donate by any stretch of the imagination.\r\n\r\nThe PSF is fairly established but doesn't force things.\r\n\r\n> Would you be willing to have a second thought about getting rid of the LGPL dependency and merging the PR ?\r\n\r\nAs I said, I don't trust the PR. 
Y'all have given me 0 confidence that this actually doesn't break things for users. So that's a hard pass for me on merging this.\r\n\r\nNow that Pip is off of Python 2, however, I think a Requests 3 that's Python 3 only is well within sight and just dropping the character detection altogether is the _right_ choice. I don't think it works well either way with any dependency in particular.\r\n\r\nAs for when Requests 3 might ship, who knows. I don't particularly have a lot of time for that and neither does Nate. Also it probably won't be requests 3. But that's a separate thing altogether",
"Hi @sigmavirus24 \r\n\r\nI am truly amazed by the situation. Lots of things learned in the past month.\r\n\r\nYou do even realize that the frustration lived by the community is born out of those kinds of weird situations?\r\nHow much would it cost to answer that in the beginning? How much PR would be closed already if there were fewer non-said things? This debate could have been avoided in the first place, It appears that maintainers already made their decision the second that PR showed up.\r\n\r\n- You said that maintainers' time is valuable without considering the time of your fellows.\r\n- You said that 'requests' is SO critical that no sudden changes are tolerated YET chardet 4.0 was used in less time than you answering a hard NO to this. How did maintainers verify that the new release was safe? But denying idna latest major.\r\n- When answering a PR about the PSF CoC and clearly dismissing others, how would you expect reasonable people to react?\r\n- And that is a small portion of what there is to say, unfortunately. Lots of contradictions.\r\n\r\nOpenSource can only grow and evolve by confronting each other's opinions, being as truthful as possible, only together that we will succeed.\r\n\r\nI am truly convinced that honesty is gold. Even if brutal and cold as long as said cordially.\r\nNo one said that current maintainers are inexperienced, contributions speak for themselves. Really I was so much inspired by what requests brought that I started composing things on my own. So when others reach to propose help in any form, it should not be considered hostile.\r\nToday I feel a little bit disappointed in requests/maintainers.\r\n\r\n> As for when Requests 3 might ship, who knows. I don't particularly have a lot of time for that and neither does Nate. Also it probably won't be requests 3. But that's a separate thing altogether\r\n\r\nYes, maybe requests3 is meant to be created by others. Who knows.\r\n\r\n> As I said, I don't trust the PR. 
Y'all have given me 0 confidence that this actually doesn't break things for users. So that's a hard pass for me on merging this.\r\n> Now that Pip is off of Python 2, however, I think a Requests 3 that's Python 3 only is well within sight and just dropping the character detection altogether is the right choice. I don't think it works well either way with any dependency in particular.\r\n\r\nThat is AN opinion, feel free to check out actual facts.\r\n\r\nFinally, thank you for giving us a response even if negative. Others are still hoping.\r\n\r\nHopefully, there is a better future for requests and for us all.\r\nRegards,",
"> How much would it cost to answer that in the beginning? How much PR would be closed already if there were fewer non-said things? This debate could have been avoided in the first place, It appears that maintainers already made their decision the second that PR showed up.\r\n\r\nContrary to the story you seem to have constructed about my opinion (and only mine, I don't speak for Nate), I was keeping an open mind that someone would say \"We did these tests to verify this library is easily swapped for chardet, we think this is safe as an alternative\". The whole conversation hasn't been about backwards compatibility but instead about a license which has been present since before Requests 1.0.\r\n\r\n> You said that 'requests' is SO critical that no sudden changes are tolerated YET chardet 4.0 was used in less time than you answering a hard NO to this. How did maintainers verify that the new release was safe? But denying idna latest major.\r\n\r\nchardet has had consistently top-notch quality releases. It's much like we upgrade urllib3 pretty quickly. I feel like I'm missing something in this conversation though since you seem very hung up on this. 4.0, to the best of my knowledge (which is likely incomplete) didn't cause any issues. Dropping Python 2 will get us onto the latest idna but once again, no one has done any testing to indicate that they've found it to have backwards compatibility. Like this PR, folks just send it and expect it to get merged or expect us to do that testing ourselves. If we were to write that kind of testing into our CI, we'd get low quality bug reports from linux distro maintainers about those tests talking to the open internet or even worse, failing as they package incompatible versions together. Just smashing merge doesn't save us any time and only irritates users who are broken by those changes.\r\n\r\n> That is AN opinion, feel free to check out actual facts.\r\n\r\nI'm genuinely confused by this. 
Are you arguing that the automatic character detection works well? I have years of issues, stackoverflow questions, emails, and blog posts indicating that it's terrible for a great number of people. Maybe not 100% of users, but without any kind of telemetry I can only look at the data available to me and make the determination that users are liable to be less confused by Requests' behaviour if something like chardet wasn't used. That's also orthogonal to the only concern the ASF seems to have which is the license.\r\n\r\n",
"Text is not a good form of exchange, with too much incomprehension.\r\n\r\n> Contrary to the story you seem to have constructed about my opinion (and only mine, I don't speak for Nate), I was keeping an open mind.\r\n\r\nI don't doubt your good intentions. What I am saying today is that something could have been handled in a better way. Many things you oppose to this PR are stated as things that could have been saying earlier. Not admitting that is not the way to go I think.\r\n\r\n> chardet has had consistently top-notch quality releases.\r\n\r\nComparing chardet to urllib3 is a bit of a stretch. Many open-source programs are well released. Ok for urllib3 but not chardet, I have studied the code and it does not answer the \"top quality\" safe content.\r\nurllib3 release-process is in fact a well-oiled machine.\r\n\r\n> didn't cause any issues.\r\n\r\nFuture tense. Took the risk and waited. Could have been a disaster. As you said, I hung on that. The release calendar is abrupt. This is reasonable to raise the question.\r\n\r\n> Like this PR, folks just send it and expect it to get merged or expect us to do that testing ourselves.\r\n\r\nI think one of the main issues is your feeling regarding those PRs. A contribution could be meet halfway, if you wish nothing to be done to your end, that okay. Be a mentor, guide them. Let them do the actual work, share your knowledge, expand your horizon with them. If you wish to be a solo reviewer/merger, that okay for everyone.\r\n\r\n> Just smashing merge doesn't save us any time and only irritates users who are broken by those changes.\r\n\r\nEveryone knows that and agrees with it. You saw in people's message only that. But let me assure you that it wasn't just about that.\r\n\r\n> Are you arguing that the automatic character detection works well?\r\n\r\nNo, never said that. Some solution brings more stability than other for sure. Chardet is far behind cchardet/uchardet, should be completely blind to not see it. 
I am as well placed as you to answer things on this matter.\r\n\r\n> but without any kind of telemetry I can only look at the data available to me\r\n\r\nYou really need to make some more research. There is more data than \"what you have\". You would know it if you looked at what was made. \r\n\r\nOtherwise, I insist that if you opened up earlier. No one would have insisted for weeks...\r\n\r\n> but instead about a license which has been present since before Requests 1.0.\r\n\r\nYes, this is true. But alternatives were not there until recently and would have been ridiculous to propose it sooner. Needed more time to gain maturity across usage. cchardet excluded due to binding/build constraint.\r\n\r\nRegards,",
"@Ousret I think you've interpreted things and added context to my actions that simply isn't there and made assumptions that have led you to believe I've wasted time or disrespected others' time.\r\n\r\nYou mention the CoC PR. I did all of that while on my phone picking up groceries or walking my dog because it was easy and didn't need nuanced communication. This issue, however, I have wanted to provide clear communication about and doing that from my phone is nearly impossible. I had time today at a computer, but genuinely most of my time at my computer these days is not spent on Open Source or even this website. What little time my personal computer is on it is in search of something I or my family needs or some other task that needs to be done and then I'm away from it rather immediately.\r\n\r\nYou don't get to see that. This 'social' platform doesn't give you that context. Some things I can handle quickly without needing to communicate clearly, others not so much. Early on, I misread that folks were planning to do more testing. Only later on a re-read after the umpteenth direct ping about this did I realize I'd misread it and that was an afterthought on the author's part. Any contribution I've made to another project I've ensured was tested and I've tried my best to test. I've even reached out to the author's in advance of sending the PR to see if they have advice or if my tests would be sufficient.\r\n\r\n> Chardet is far behind cchardet/uchardet, should be completely blind to not see it.\r\n\r\n\"Far behind\" and the three, from the last several times I tested it returned wildly different results. Swapping out dependencies that produce different results is not backwards compatible. But apparently I'm blind because I have constraints that those projects haven't bothered to research or consider or even talk to me about before declaring themselves the clearly superior project in all cases, especially this one.",
"@sigmavirus24 I'm sorry for all the raised tempers that this PR has envoked, this was _far_ from my intent.\r\n\r\nMy understanding is then that the requests library is in critical fixes only mode, and that you don't view this as a critical fix, so I'll close this PR as it's clear you won't accept it because you don't want to break code for existing users, which is something I can entirely appreciate!\r\n\r\nIf there is _any_ chance you would accept this PR I will do the work to make it happen, but I don't think it'll happen. Please tell me if I'm mistaken.\r\n\r\n",
"@ashb I'm sorry for the miscommunications and how little time I have to test this.\n\nOne thing I can think of is that if the ASF can run a test against some of the most popular websites and determine for the ones that don't declare an encoding does this dependency and chardet agree?\n\nThere are open lists of these websites, sadly I expect most of them to actually declare their encoding but it's worth a shot as a sample. This is one of the tests I've run in the past but neither have sponsored hardware to use nor time to orchestrate",
"@sigmavirus24 No problem, I understand the pressures of being an opensource maintainer!\r\n\r\nPerhaps another test we can do is to capture some sites in various encodings and _strip out_ the encoding headers/meta tags?",
"Yes. The main thing we need is reasonable confidence this is mostly backwards compatible (I don't expect full backwards compatibility because I know chardet's algorithm is far from perfect and I know the alternatives are also far from perfect). I do expect something greater than 75% compatibility roughly. The last time I tested things the compatibility was roughly 60% even accounting for the fact that chardet reports some encodings differently (because the models are a little more dated). Sadly when Rackspace killed their F/OSS credit program I lost that data because I hadn't backed it up elsewhere",
"> Yes. The main thing we need is reasonable confidence this is mostly backwards compatible (I don't expect full backwards compatibility because I know chardet's algorithm is far from perfect and I know the alternatives are also far from perfect). I do except something greater than 75% compatibility roughly. The last time I tested things the compatibility was roughly 60% even accounting for the fact that chardet reports some encodings differently (because the models are a little more dated). Sadly when Rackspace killed their F/OSS credit program I lost that data because I hadn't backed it up elsewhere\r\n\r\nThis is great that we can discuss now how we can help with testing :). I would be very happy to help. Especially that better part of yesterday and today I implemented and tested replacing requests with httpx for Apache Airflow (we need it in order to release it). \r\n\r\nActually one other thing that the ASF projects can do - if we have alpha/beta release of `requests` with optional chardet dependency - we could ask all the projects that are using requests to test it it works for them with chardet removed. ",
"I ran a quick test using both the Alexa top 50 domains and a list of 500 domains I grabbed from [moz.com](https://moz.com/top500). The results were fairly positive, though it would be worth actually grabbing the full Alexa list for a few countries (US, JP, RU etc.) to repeat the test. \r\n\r\nQuick and dirty script is here: https://gist.github.com/da1910/79c168294a8dfe2957a8cbc61daa1710\r\n\r\nThe reported encoding, chardet's and charset-normalizer's determinations are included in the csv files, and the table below shows the results where the two packages differ.\r\n\r\n### Alexa Top 50 (2 timed out, 12 reported no encoding, 3 had a different result):\r\n| URL | Chardet | Charset_Normalizer |\r\n| --- | ------- | ------------------ |\r\n| Wikipedia.org | Windows-1254 | utf_8 |\r\n| Twitch.tv | Windows-1252 | utf_8 |\r\n| Amazon.co.jp | SHIFT_JIS | cp932 |\r\n\r\n[all_results_50.txt](https://github.com/psf/requests/files/6459691/all_results_50.txt)\r\n\r\n### Moz's top 500 domains (17 timed out, 92 reported no encoding, 11 had different results):\r\n| URL | Chardet | Charset_Normalizer |\r\n| --- | ------- | ------------------ |\r\n| brandbucket.com | ISO-8859-1 | utf_8 |\r\n| cdc.gov | utf-8 | utf_8 |\r\n| nasa.gov | ISO-8859-1 | utf_8 |\r\n| youronlinechoices.com | Windows-1252 | iso8859_10 |\r\n| amazon.co.jp | SHIFT_JIS | cp932 |\r\n| twitch.tv | Windows-1252 | utf_8 |\r\n| m.wikipedia.org | Windows-1254 | utf_8 |\r\n| wikipedia.org | Windows-1254 | utf_8 |\r\n| wn.com | Windows-1254 | utf_8 |\r\n| gutenberg.org | Windows-1252 | utf_8 |\r\n| photos1.blogger.com | Windows-1252 | None |\r\n\r\nNote: photos1.blogger.com returned a gif for me, so it's most definitely not windows-1252...\r\n\r\n[all_results_500.txt](https://github.com/psf/requests/files/6459815/all_results_500.txt)\r\n\r\n",
"I will try to run more tests tomorrow with more sites.\r\n\r\nJust for @sigmavirus24 and others who read that - so that you was aware why it is important for us and what is the problem we are trying to address. Maybe you simply are not aware what is the extent of the problem.\r\n\r\nFor Airflow - we are just about to release 2.1, and I just finished the PR (took me more than a day) where I removed requests as core dependency (basically we replaced `requests` with `httpx`, we vendored-in `connexion` (we have API build with `connexion`) and replaced requests with `httpx` for it) and changed HTTP provider (and it's derived classes) to use `httpx`. Here is the PR: https://github.com/apache/airflow/pull/15781 . This PR still needs to be reviewed, corrected after review thoroughly tested and merged. This is a monster change: 86 files, +7,035 −684 lines of code.\r\n\r\n\r\n\r\nBut as PMC of Airflow we have no choice now - we are obliged to do it to release new version (now that we are aware of the problem with `chardet` it would be conscious violating the policy of ASF if we keep `chardet` as mandatory requirement). \r\n\r\nLuckily we could do it rather \"quickly\" because we've already split airflow into core and optional parts and this OK with the policy of ASF to have optional dependency on LGPL. But we cannot have mandatory one.\r\n\r\nBut this does not even touch all the optional dependencies and transitive ones. 
We have 43 3rd-party packages that use requests (https://issues.apache.org/jira/browse/LEGAL-572?focusedCommentId=17341767&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17341767) where requests is used (GCP/Azure/Docker/Kubernetes - they are optional but it's hard to imagine anyone using Airflow without those).\r\n\r\nUnfortunately some other projects of ASF are using Airflow as a dependency (for example [Apache Liminal](https://incubator.apache.org/clutch/liminal.html)) and they require the docker and Kubernetes optional parts of Airflow. So they are in a much worse situation - because for them pretty much a chain of dependencies would have to be updated. So for now Liminal is pretty much blocked.\r\n\r\nThose ~50 projects of ASF could either do the same as I just did, or test requests with `charset_normalizer` when it is ready for testing in alpha/beta. ",
"Hi,\r\n\r\nFirst of all, I would like to thanks @sigmavirus24 \r\n\r\nThen @ashb \r\nI created two tests playground using docker-compose with python 2.7 and 3.8 using your branch. Sources available at https://github.com/Ousret/requests-x-charset_normalizer\r\n\r\nThe results are also fairly positives. The remaining only problem is that you missed filtering out the warning about \"trying to detect from..\" as did @da1910 \r\n\r\nHere are what I got from running `playground27` and `playground38`. (Diff only)\r\n\r\nI took into account that `ISO-8859-7` == `cp1253`, as it is the same code page under a different name.\r\n\r\nWe have a firm **78 %** backward compatibility exact results. Based on +400 files. And these numbers increase if we tolerate different encoding that produces equal Unicode output. Even more, if we tolerate minor differences.\r\n\r\n```csv\r\nfile;chardet;charset_normalizer\r\n/raw/utf-8-sig/bom-utf-8.srt;UTF-8-SIG;utf_8\r\n/raw/windows-1250-slovak/_ude_1.txt;Windows-1254;latin_1\r\n/raw/TIS-620/opentle.org.xml;TIS-620;iso8859_11\r\n/raw/iso-8859-2-hungarian/hirtv.hu.xml;ISO-8859-1;cp1250\r\n/raw/Johab/iyagi-readme.txt;None;johab\r\n/raw/windows-1250-czech/_ude_2.txt;Windows-1252;cp850\r\n/raw/iso-8859-2-hungarian/shamalt.uw.hu.mv.xml;ISO-8859-1;cp1250\r\n/raw/windows-1250-hungarian/bbc.co.uk.hu.forum.xml;ISO-8859-1;cp1250\r\n/raw/windows-1250-slovak/_ude_3.txt;Windows-1254;latin_1\r\n/raw/iso-8859-2-hungarian/shamalt.uw.hu.mk.xml;ISO-8859-1;cp1250\r\n/raw/utf-8/sample.3.ar.srt;UTF-8-SIG;utf_8\r\n/raw/TIS-620/pharmacy.kku.ac.th.centerlab.xml;TIS-620;iso8859_11\r\n/raw/IBM866/blog.mlmaster.com.xml;IBM866;cp1125\r\n/raw/iso-8859-2-hungarian/auto-apro.hu.xml;ISO-8859-1;cp1250\r\n/raw/GB2312/jjgod.3322.org.xml;GB2312;gb18030\r\n/raw/windows-1254-turkish/_chromium_windows-1254_with_no_encoding_specified.html;ISO-8859-1;hp_roman8\r\n/raw/GB2312/cnblog.org.xml;GB2312;gb18030\r\n/raw/GB2312/cindychen.com.xml;GB2312;gb18030\r\n/raw/windows-1252/_mozilla_bug4
21271_text.html;ISO-8859-1;cp437\r\n/raw/windows-1256-arabic/sample.2.ar.srt;MacCyrillic;cp1256\r\n/raw/GB2312/chen56.blogcn.com.xml;GB2312;gb18030\r\n/raw/TIS-620/_mozilla_bug488426_text.html;TIS-620;iso8859_11\r\n/raw/iso-8859-2-czech/_ude_1.txt;ISO-8859-1;iso8859_15\r\n/raw/iso-8859-2-hungarian/shamalt.uw.hu.mr.xml;ISO-8859-1;cp1250\r\n/raw/TIS-620/pharmacy.kku.ac.th.analyse1.xml;TIS-620;iso8859_11\r\n/raw/IBM866/newsru.com.xml;IBM866;cp1125\r\n/raw/GB2312/pda.blogsome.com.xml;GB2312;gb18030\r\n/raw/GB2312/luciferwang.blogcn.com.xml;GB2312;gb18030\r\n/raw/IBM866/music.peeps.ru.xml;IBM866;cp1125\r\n/raw/GB2312/softsea.net.xml;GB2312;gb18030\r\n/raw/windows-1252/_ude_2.txt;Windows-1252;cp437\r\n/raw/windows-1250-slovene/_ude_1.txt;Windows-1252;cp850\r\n/raw/GB2312/_chromium_gb18030_with_no_encoding_specified.html.xml;GB2312;euc_jis_2004\r\n/raw/iso-8859-2-hungarian/cigartower.hu.xml;ISO-8859-1;cp1250\r\n/raw/windows-1256-arabic/sample.1.ar.srt;MacCyrillic;cp1256\r\n/raw/iso-8859-2-hungarian/honositomuhely.hu.xml;ISO-8859-1;cp1250\r\n/raw/GB2312/_mozilla_bug171813_text.html;GB2312;big5hkscs\r\n/raw/IBM866/aug32.hole.ru.xml;IBM866;cp1125\r\n/raw/IBM866/aif.ru.health.xml;IBM866;cp1125\r\n/raw/iso-8859-2-hungarian/shamalt.uw.hu.xml;ISO-8859-1;cp1250\r\n/raw/IBM866/kapranoff.ru.xml;IBM866;cp1125\r\n/raw/windows-1256-arabic/_chromium_windows-1256_with_no_encoding_specified.html;MacCyrillic;cp1256\r\n/raw/windows-1250-slovak/_ude_2.txt;Windows-1254;hp_roman8\r\n/raw/windows-1250-hungarian/_ude_2.txt;Windows-1252;iso8859_10\r\n/raw/iso-8859-2-hungarian/escience.hu.xml;ISO-8859-1;cp1250\r\n/raw/Johab/hlpro-readme.txt;None;johab\r\n/raw/utf-8/_ude_3.txt;utf-8;None\r\n/raw/GB2312/godthink.blogsome.com.xml;GB2312;gb18030\r\n/raw/IBM866/intertat.ru.xml;IBM866;cp1125\r\n/raw/SHIFT_JIS/_ude_1.txt;SHIFT_JIS;shift_jis_2004\r\n/raw/utf-8-sig/_ude_4.txt;UTF-8-SIG;utf_8\r\n/raw/iso-8859-2-hungarian/ugyanmar.blogspot.com.xml;ISO-8859-1;cp1250\r\n/raw/iso-8859-6-arabic/_chromium_ISO-885
9-6_with_no_encoding_specified.html;MacCyrillic;iso8859_6\r\n/raw/windows-1252/_ude_1.txt;Windows-1252;cp850\r\n/raw/IBM866/money.rin.ru.xml;IBM866;cp1125\r\n/raw/windows-1250-hungarian/objektivhir.hu.xml;ISO-8859-1;cp1250\r\n/raw/windows-1252/github_bug_9.txt;Windows-1252;cp437\r\n/raw/windows-1250-hungarian/bbc.co.uk.hu.pressreview.xml;Windows-1252;cp1250\r\n/raw/IBM866/forum.template-toolkit.ru.4.xml;IBM866;cp1125\r\n/raw/iso-8859-9-turkish/divxplanet.com.xml;ISO-8859-1;cp1254\r\n/raw/SHIFT_JIS/_ude_4.txt;SHIFT_JIS;shift_jis_2004\r\n/raw/IBM866/janulalife.blogspot.com.xml;IBM866;cp1125\r\n/raw/EUC-KR/_chromium_windows-949_with_no_encoding_specified.html;EUC-KR;gb2312\r\n/raw/iso-8859-7-greek/disabled.gr.xml;windows-1253;iso8859_7\r\n/raw/iso-8859-2-polish/_ude_1.txt;ISO-8859-1;hp_roman8\r\n/raw/iso-8859-2-hungarian/saraspatak.hu.xml;ISO-8859-1;cp1250\r\n/raw/windows-1250-hungarian/bbc.co.uk.hu.xml;Windows-1252;cp1250\r\n/raw/IBM866/forum.template-toolkit.ru.8.xml;IBM866;cp1125\r\n/raw/GB2312/coverer.com.xml;GB2312;gb18030\r\n/raw/windows-1250-romanian/_ude_1.txt;Windows-1252;iso8859_15\r\n/raw/IBM866/_ude_1.txt;IBM866;cp1125\r\n/raw/iso-8859-1/_ude_1.txt;ISO-8859-1;hp_roman8\r\n/raw/TIS-620/trickspot.boxchart.com.xml;TIS-620;iso8859_11\r\n/raw/IBM866/forum.template-toolkit.ru.6.xml;IBM866;cp1125\r\n/raw/windows-1254-turkish/_ude_1.txt;Windows-1252;iso8859_15\r\n/raw/windows-1250-polish/_ude_1.txt;Windows-1252;hp_roman8\r\n/raw/IBM866/susu.ac.ru.xml;IBM866;cp1125\r\n/raw/GB2312/w3cn.org.xml;GB2312;gb18030\r\n/raw/EUC-TW/_ude_euc-tw1.txt;EUC-TW;gb18030\r\n/raw/IBM866/greek.ru.xml;IBM866;cp1125\r\n/raw/IBM866/forum.template-toolkit.ru.9.xml;IBM866;cp1125\r\n/raw/GB2312/cappuccinos.3322.org.xml;GB2312;gb18030\r\n/raw/windows-1250-czech/_ude_1.txt;Windows-1254;cp850\r\n/raw/windows-1250-hungarian/_ude_1.txt;Windows-1252;mac_latin2\r\n/raw/windows-1250-croatian/_ude_1.txt;Windows-1252;cp850\r\n/raw/IBM866/forum.template-toolkit.ru.1.xml;IBM866;cp1125\r\n/raw/Johab/mdir
-doc.txt;None;johab\r\n/raw/TIS-620/pharmacy.kku.ac.th.healthinfo-ne.xml;TIS-620;iso8859_11\r\n/raw/GB2312/eighthday.blogspot.com.xml;GB2312;gb18030\r\n/raw/ascii/_mozilla_bug638318_text.html;ascii;None\r\n/raw/windows-1250-hungarian/bbc.co.uk.hu.learningenglish.xml;ISO-8859-1;cp1250\r\n/raw/GB2312/14.blog.westca.com.xml;GB2312;gb18030\r\n/raw/utf-8-sig/sample-english.bom.txt;UTF-8-SIG;utf_8\r\n/raw/iso-8859-1/_ude_6.txt;ISO-8859-1;cp1250\r\n/raw/EUC-JP/_mozilla_bug431054_text.html;EUC-JP;cp1252\r\n/raw/windows-1250-hungarian/torokorszag.blogspot.com.xml;ISO-8859-1;cp1250\r\n/raw/windows-1256-arabic/sample.4.ar.srt;MacCyrillic;cp1256\r\n```\r\n\r\nThere is the question of `UTF-8-SIG` and `UTF-8`. CharsetNormalizer return 'UTF-8'.\r\n\r\nYou may find my JSON outputs : \r\n- Python 2.7 (Chardet): https://pastebin.com/JirQLXi8\r\n- Python 3.8 (Charset-Normalizer): https://pastebin.com/BufNHD8B\r\n\r\nTo retrieve your own outputs: Run `check_compat.py` after generating required JSONs in the ./results directory.\r\n\r\nRegards,",
"I have patched the legacy function `detect` to comply more. Starting from v1.3.7 it will return 'utf-8-sig' when the SIG is present in the payload. More explanations/details in the PR description.\r\n\r\n```\r\nplayground38_1 | /raw/utf-8-sig/sample-english.bom.txt;utf_8_sig\r\nplayground38_1 | /raw/utf-8-sig/bom-utf-8.srt;utf_8_sig\r\nplayground38_1 | /raw/utf-8-sig/_ude_4.txt;utf_8_sig\r\n```\r\n\r\nThe compatibility reaches 79 % now. ",
"I run some first tests on 33138 URLs which are unique URLS among top 1000 from 80 countries in the world (took them from https://dataforseo.com/top-1000-websites) \r\n\r\nI modified slightly the solution of @da1910 to add parallelism. I think I have to take a closer look as I had to handle the exception (more below). The full run takes ~ 1.5h on my PC I will run it tomorrow again with some fixes but maybe someone will find it useful:\r\n\r\nThe 33138 urls are here: https://github.com/potiuk/test-charset-normalizer/blob/main/URLS.csv\r\n\r\nI got 163 exceptions during running the test and I had no time to look at the cause so I simply logged the count for now to see the first result and how long it takes to run. You can see the results in \"res\" folder here:\r\nhttps://github.com/potiuk/test-charset-normalizer/tree/main/res\r\n\r\nFor now I refrain from interpretation before I take a closer look - but feel free.\r\n\r\nThe exceptions look always the same. I will take a look tomorrow more closely (got first COVID vaccination today YAY!) 
but maybe someone can shed some light on that one:\r\n\r\n```\r\nlocal:34/0/100%/0.0s Trying url 'occ.pt'\r\nTrying url 'occhionotizie.it'\r\nTrying url 'oceaniabiz.com'\r\nTraceback (most recent call last):\r\n File \"./test_encoding.py\", line 53, in <module>\r\n normalizer_encoding = charset_normalizer.detect(response.content)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/legacy.py\", line 24, in detect\r\n 'language': r.language if r is not None and r.language != 'Unknown' else '',\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/cached_property.py\", line 36, in __get__\r\n value = obj.__dict__[self.func.__name__] = self.func(obj)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 141, in language\r\n return 'English' if len(self.alphabets) == 1 and self.alphabets[0] == 'Basic Latin' else 'Unknown'\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/cached_property.py\", line 36, in __get__\r\n value = obj.__dict__[self.func.__name__] = self.func(obj)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 86, in alphabets\r\n return list(self.ranges.keys())\r\nAttributeError: 'list' object has no attribute 'keys'\r\nlocal:33/1/100%/17.0s Trying url 'ul.pt'\r\n\tFailed https, trying http\r\nTrying url 'ula.ve'\r\n\tFailed https, trying http\r\nTrying url 'ulac.lt'\r\n\tFailed https, trying http\r\nTrying url 'ulapland.fi'\r\n\tFailed https, trying http\r\nTrying url 'ulasantempat.com'\r\nTraceback (most recent call last):\r\n File \"./test_encoding.py\", line 53, in <module>\r\n normalizer_encoding = charset_normalizer.detect(response.content)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/legacy.py\", line 24, in detect\r\n 'language': r.language if r is not 
None and r.language != 'Unknown' else '',\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/cached_property.py\", line 36, in __get__\r\n value = obj.__dict__[self.func.__name__] = self.func(obj)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 141, in language\r\n return 'English' if len(self.alphabets) == 1 and self.alphabets[0] == 'Basic Latin' else 'Unknown'\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/cached_property.py\", line 36, in __get__\r\n value = obj.__dict__[self.func.__name__] = self.func(obj)\r\n File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 86, in alphabets\r\n return list(self.ranges.keys())\r\nAttributeError: 'list' object has no attribute 'keys'\r\n\r\n```\r\n\r\n",
"@sigmavirus24 It would be good to get your feedback on the tests conducted whenever you get time. Looks like from @Ousret 's & @potiuk 's tests it is more than 75% compatibility. However, I want to make sure that it aligns with the tests you had in mind or something else needs to be tested too.\r\n\r\n",
"Nice testing scenario @potiuk \r\nIndeed, the exception happens because I think your client may end up getting empty responses from some hosts.\r\nYou may try to run these again.",
"> Indeed, the exception happens because I think your client may end up getting empty responses from some hosts.\r\n> You may try to run these again.\r\n\r\nFantastic news! Will do tomorrow!. Getting to sleep now :)\r\n",
"Hey @Ousret ! Thanks for such quick fixing. I am rerunning it all with more verbose output and improved results and found another exception: \r\n\r\n```\r\nxai\tTraceback (most recent call last):\r\nxai\t File \"./test_encoding.py\", line 73, in <module>\r\nxai\t normalizer_result = charset_normalizer.detect(response.content)\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/legacy.py\", line 20, in detect\r\nxai\t r = CnM.from_bytes(byte_str).best().first()\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 517, in from_bytes\r\nxai\t fingerprint_tests = [el.fingerprint == cnm.fingerprint for el in matches]\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 517, in <listcomp>\r\nxai\t fingerprint_tests = [el.fingerprint == cnm.fingerprint for el in matches]\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/cached_property.py\", line 36, in __get__\r\nxai\t value = obj.__dict__[self.func.__name__] = self.func(obj)\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 259, in fingerprint\r\nxai\t return sha256(self.output()).hexdigest()\r\nxai\t File \"/home/jarek/.pyenv/versions/test_encoding/lib/python3.8/site-packages/charset_normalizer/normalizer.py\", line 267, in output\r\nxai\t return str(self).encode(encoding)\r\nxai\tUnicodeEncodeError: 'utf-8' codec can't encode character '\\udb82' in position 3057: surrogates not allowed\r\n```\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5796
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5796/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5796/comments
|
https://api.github.com/repos/psf/requests/issues/5796/events
|
https://github.com/psf/requests/issues/5796
| 861,968,459 |
MDU6SXNzdWU4NjE5Njg0NTk=
| 5,796 |
Request.params causes invalid response with OpenID authorize call with response_type=code
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22151742?v=4",
"events_url": "https://api.github.com/users/caffeinatedMike/events{/privacy}",
"followers_url": "https://api.github.com/users/caffeinatedMike/followers",
"following_url": "https://api.github.com/users/caffeinatedMike/following{/other_user}",
"gists_url": "https://api.github.com/users/caffeinatedMike/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/caffeinatedMike",
"id": 22151742,
"login": "caffeinatedMike",
"node_id": "MDQ6VXNlcjIyMTUxNzQy",
"organizations_url": "https://api.github.com/users/caffeinatedMike/orgs",
"received_events_url": "https://api.github.com/users/caffeinatedMike/received_events",
"repos_url": "https://api.github.com/users/caffeinatedMike/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/caffeinatedMike/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/caffeinatedMike/subscriptions",
"type": "User",
"url": "https://api.github.com/users/caffeinatedMike",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2021-04-19T21:46:34Z
|
2021-11-26T03:00:37Z
|
2021-08-28T02:43:52Z
|
NONE
|
resolved
|
When attempting to call an OpenID authorize endpoint with a response_type of "code" the `params` property seems to be causing the calls to fail.
## Expected Result
The `params` should properly supply the OpenID-related fields as url parameters, so the authorization can proceed
## Actual Result
The `params` cause the call to fail with an error on the webserver, stopping the authorization process
## Reproduction Steps
```python
import requests
from urllib.parse import urlencode, quote
from base64 import urlsafe_b64encode
import hashlib
import random
import string
import os
import re
sess = requests.Session()
code_verifier = re.sub(r'[^a-zA-Z0-9]+', '', urlsafe_b64encode(os.urandom(40)).decode())
code_challenge = urlsafe_b64encode(
hashlib.sha256(code_verifier.encode()).digest()
).decode().replace('=', '') # hash code_verifier, then base64 urlencode and remove padding
url_params = dict(
client_id="connect-supplier-web",
redirect_uri="https://site.com/callback",
response_type="code",
scope="openid profile vendor-pricing-mgmt-api power-bi-report-api customer-api site-api",
state=''.join(random.choice(string.ascii_lowercase + string.digits) for _ in range(32)),
code_challege=code_challenge,
code_challenge_method="S256",
response_mode="query"
)
###
# FIRST ATTEMPT
###
resp = sess.get("https://connect-identity-server.site.com/connect/authorize", params=url_params) # does not work
print(resp.url) # resp.url is now https://connect-identity-server.site.com/home/error?errorId=CfD......X95
###
# SECOND ATTEMPT - Thinking maybe it was because of the "scope" field being quote_plused instead of quoted
###
resp = sess.get("https://connect-identity-server.site.com/connect/authorize", params=urlencode(url_params, quote_via=quote)) # does not work
print(resp.url) # resp.url is still https://connect-identity-server.site.com/home/error?errorId=CfD......X95
###
# THIRD ATTEMPT - Forgoing requests' params and providing the full, already-constructed url
###
resp = sess.get(
"https://connect-identity-server.site.com/connect/authorize?"
"client_id=connect-supplier-web&"
"redirect_uri=https%3A%2F%2Fsite.com%2Fcallback&"
"response_type=code&"
"scope=openid%20profile%20vendor-pricing-mgmt-api%20power-bi-report-api%20customer-api%20site-api&"
f"state={''.join(random.choice(string.ascii_lowercase + string.digits) for _ in range(32))}&"
f"code_challenge={code_challenge}&"
"code_challenge_method=S256&"
"response_mode=query"
)
print(resp.url) # resp.url is now correct - https://connect-identity-server.site.com/Account/Login?ReturnUrl=%2Fconnect%2Fauthorize%2Fcallback%3Fclient_id%3Dconnect-supplier-...%3DS256%26response_mode%3Dquery
```
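A quick way to compare the two quoting behaviours suspected in the second attempt (this snippet is illustrative, not part of the failing flow): `urlencode` defaults to `quote_plus`, so spaces in `scope` become `+`, while `quote_via=quote` yields `%20`.

```python
from urllib.parse import urlencode, quote

params = {"scope": "openid profile customer-api"}
print(urlencode(params))                   # default quote_plus: scope=openid+profile+customer-api
print(urlencode(params, quote_via=quote))  # quote: scope=openid%20profile%20customer-api
```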
## System Information
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.7.8"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.24.0"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.25.9"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5796/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5796/timeline
| null |
completed
| null | null | false |
[
"What do the _request_ url's look like? And what do they look like initially? I.e., what is in `[req.url for req in resp.history]` and does it match what you build yourself? What's different there? Also we're using the standard library to encode the query parameters. Does your server require a specific ordering for some reason? What have you done to debug this? \r\n\r\nThis doesn't smell like a bug in requests",
"@sigmavirus24 Looking at the history of the `params` request, the url is the same. See below:\r\n```\r\n# Produced using Request.params = urlencode(url_params, quote_via=quote)\r\nhttps://connect-identity-server.site.com/connect/authorize?\r\n\tclient_id=connect-supplier-web&\r\n\tredirect_uri=https%3A%2F%2Fconnectsupplier.site.com%2Fcallback&\r\n\tresponse_type=code&\r\n\tscope=openid%20profile%20vendor-pricing-mgmt-api%20power-bi-report-api%20customer-api%20site-api&\r\n\tstate=baic4eph7ax2ul79y6piimvwoyv3i8tv&\r\n\tcode_challege=NtyvKJEji0dC_ozSALJOMX7DqiMlLbKDLv38WSh-GV0&\r\n\tcode_challenge_method=S256&\r\n\tresponse_mode=query\r\n\t\r\n\r\n# Produced via one large url string with f-strings\r\n\"https://connect-identity-server.site.com/connect/authorize?\"\r\n\t\"client_id=connect-supplier-web&\"\r\n\t\"redirect_uri=https%3A%2F%2Fconnectsupplier.site.com%2Fcallback&\"\r\n\t\"response_type=code&\"\r\n\t\"scope=openid%20profile%20vendor-pricing-mgmt-api%20power-bi-report-api%20customer-api%20site-api&\"\r\n\tf\"state={generate_random_string(32).lower()}&\" => state=baic4eph7ax2ul79y6piimvwoyv3i8tv&\r\n\tf\"code_challenge={code_challenge}&\" => code_challege=NtyvKJEji0dC_ozSALJOMX7DqiMlLbKDLv38WSh-GV0&\r\n\t\"code_challenge_method=S256&\"\r\n\t\"response_mode=query\"\r\n```\r\nThe only difference I see is how the `params` version causes a 301 redirect to an error page. \r\n\r\nHere are some screenshots of the various properties between the two versions\r\n# Response Headers\r\n### Params Version\r\n\r\n### String Version\r\n\r\n# History Headers\r\n### Params Version\r\n\r\n### String Version\r\n\r\n# History Request\r\n### Params Version\r\n\r\n### String Version\r\n\r\n\r\nIf you need to see any other info, I'm happy to provide it.",
"Screenshots are inaccessible for people with screen readers and are not searchable. Please copy and paste plain text",
"Tell me what you specifically want to see from the requests and I'll happily do so. But, thus far you've been vague in what to provide.",
"There isn't enough info in here to work out what's going on. The URL being generated by the f-string is clearly different than what params is generating given that it works. My assumption is something is happening with urlencoding and the params dict. The spaces in `scope` may be using `+` instead of `%20` for example.\r\n\r\nI'd recommend writing your script to output the _exact_ computed f-string url and then checking response.history for the original request. You can access that from the original response in the history and checking `response.request.url`. That is the easiest way to determine a difference.\r\n\r\nIf you find the class of encoding differences, we can discuss things further but it's unlikely we can change this behavior because we're relying on the standard library `urlencode`. The best bet in that case is to encode the query string yourself for this case using the [PreparedRequest](https://docs.python-requests.org/en/master/user/advanced/#prepared-requests) flow. \r\n\r\nI'm going to resolve this for now since it hasn't had traction in several months. Feel free to reopen if you have more targeted information on a defect not discussed above."
] |
https://api.github.com/repos/psf/requests/issues/5795
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5795/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5795/comments
|
https://api.github.com/repos/psf/requests/issues/5795/events
|
https://github.com/psf/requests/pull/5795
| 860,150,203 |
MDExOlB1bGxSZXF1ZXN0NjE3MTA1NzIx
| 5,795 |
Replace getattr(...) is not None with hasattr(...)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59511469?v=4",
"events_url": "https://api.github.com/users/takos22/events{/privacy}",
"followers_url": "https://api.github.com/users/takos22/followers",
"following_url": "https://api.github.com/users/takos22/following{/other_user}",
"gists_url": "https://api.github.com/users/takos22/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/takos22",
"id": 59511469,
"login": "takos22",
"node_id": "MDQ6VXNlcjU5NTExNDY5",
"organizations_url": "https://api.github.com/users/takos22/orgs",
"received_events_url": "https://api.github.com/users/takos22/received_events",
"repos_url": "https://api.github.com/users/takos22/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/takos22/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/takos22/subscriptions",
"type": "User",
"url": "https://api.github.com/users/takos22",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-04-16T20:35:39Z
|
2021-08-27T00:08:43Z
|
2021-04-16T20:38:19Z
|
NONE
|
resolved
|
Replace `getattr(body, 'tell', None) is not None` with `hasattr(body, 'tell')`
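Beyond the exception-handling reasoning linked in the review, the two idioms also differ when the attribute exists but holds `None` — a minimal sketch (class name is illustrative, not from the requests codebase):

```python
class Body:
    tell = None  # attribute exists, but its value is None

b = Body()
print(hasattr(b, "tell"))                    # True: the attribute is present
print(getattr(b, "tell", None) is not None)  # False: its value is None
```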
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5795/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5795/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5795.diff",
"html_url": "https://github.com/psf/requests/pull/5795",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5795.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5795"
}
| true |
[
"Hi @takos22,\r\n\r\nThis syntax is intentional. You can read the reasoning [here](https://hynek.me/articles/hasattr/). ",
"Ok my bad, didn't know about that issue with ``hasattr``. Thanks for the info and for the fast response."
] |
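The reasoning in the linked article can be sketched as follows (Python 3; the `Body` class here is a made-up example, not requests code). On Python 3, `hasattr` and `getattr(..., None)` both swallow only `AttributeError`; on Python 2, which requests still supported at the time, `hasattr` swallowed *every* exception, silently masking bugs.

```python
class Body:
    @property
    def tell(self):
        # A property that fails for a reason other than AttributeError.
        raise KeyError("backing store missing")

b = Body()

# On Python 3 this KeyError propagates out of hasattr(), the same as it
# would from getattr(b, "tell", None). On Python 2, hasattr() would have
# returned False and hidden the error entirely.
try:
    hasattr(b, "tell")
except KeyError as exc:
    print("propagated:", exc)
```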
https://api.github.com/repos/psf/requests/issues/5794
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5794/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5794/comments
|
https://api.github.com/repos/psf/requests/issues/5794/events
|
https://github.com/psf/requests/issues/5794
| 859,870,644 |
MDU6SXNzdWU4NTk4NzA2NDQ=
| 5,794 |
response.json() raises inconsistent exception type
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/675558?v=4",
"events_url": "https://api.github.com/users/rowanseymour/events{/privacy}",
"followers_url": "https://api.github.com/users/rowanseymour/followers",
"following_url": "https://api.github.com/users/rowanseymour/following{/other_user}",
"gists_url": "https://api.github.com/users/rowanseymour/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rowanseymour",
"id": 675558,
"login": "rowanseymour",
"node_id": "MDQ6VXNlcjY3NTU1OA==",
"organizations_url": "https://api.github.com/users/rowanseymour/orgs",
"received_events_url": "https://api.github.com/users/rowanseymour/received_events",
"repos_url": "https://api.github.com/users/rowanseymour/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rowanseymour/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rowanseymour/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rowanseymour",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 10 |
2021-04-16T14:32:24Z
|
2021-10-24T16:00:30Z
|
2021-07-26T15:56:44Z
|
NONE
|
resolved
|
There are some comments about this on https://github.com/psf/requests/issues/4842 but I think it warrants its own issue.
If `simplejson` is present in the environment, the library uses it so `.json()` returns `simplejson.errors.JSONDecodeError` rather than `json.decoder.JSONDecodeError`
If I'm writing a library that uses `requests` I don't know what other libraries will be present in an environment. I expect that a method returns a consistent exception type that I can handle. As it stands, anyone writing a library (e.g. https://github.com/conda-forge/conda-smithy/pull/1369, https://github.com/Vonage/vonage-python-sdk/issues/197) has to check themselves if `simplejson` is present to know what exception type will be thrown.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/5794/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5794/timeline
| null |
completed
| null | null | false |
[
"Maybe the simplest solution which wouldn't break existing code is to provide a `requests.JSONDecodeError` which aliases whatever exception type will be thrown, and calling code should always use that.",
"Any update :)?",
"I don't see a easy way to do this without breaking existing code for other developers because in the requests/models.py (around line 881). It looks like the code intentionally raises different exceptions.\r\n\r\n`def json(self, **kwargs):\r\n r\"\"\"Returns the json-encoded content of a response, if any.\r\n :param \\*\\*kwargs: Optional arguments that ``json.loads`` takes.\r\n :raises simplejson.JSONDecodeError: If the response body does not\r\n contain valid json and simplejson is installed.\r\n :raises json.JSONDecodeError: If the response body does not contain\r\n valid json and simplejson is not installed on Python 3.\r\n :raises ValueError: If the response body does not contain valid\r\n json and simplejson is not installed on Python 2. \r\n \"\"\"` \r\n \r\n Maybe it is easier if you catch these errors: simplejson.JSONDecodeError, json.JSONDecodeError, ValueError",
"The more I think about it... \r\nSolution 1:\r\n add a new method so that it returns only 1 error\r\nSolution 2:\r\n change the json method so that it raises only 1 type or Error. I could work on this, but I am not sure which solution is better. ",
"I actually think the long term solution would be to stop using simplejson but maybe until then give callers a way to control which exception they're going to get, e.g. `response.json(std_json=True)`",
"It is probably easier to add a flag (or argument) to the json method that catches all the errors and raises only 1 error like requests.JSONDecodeError. \r\n\r\nIf simplejson is intended to be phased out then adding logic such that if std_json is True then raise only json.JSONDecodeError if any error has occured. If std_json is False then old behaviour happens???\r\n",
"Hi. I've been working on ways to fix this issue, and so far, I've found two ways.\r\n\r\nThe most ideal way seems to be replacing usage of `simplejson` with `json`. I implemented that in one of my commits, and created a `requests.JSONDecodeError` to handle errors in the models (I did not test it at all, though). Of course, anyone relying on catching `simplejson`'s `JSONDecodeError` would have issues if they updated their requests library. But if we wanted to consistently use `json`, and remove all instances of `simplejson`, then perhaps that's not such a bad thing.\r\n\r\n> Maybe the simplest solution which wouldn't break existing code is to provide a `requests.JSONDecodeError` which aliases whatever exception type will be thrown, and calling code should always use that.\r\n\r\nI agree with this as a second option, although, in the long run, it may eventually be good to use option 1. If we did this in `exceptions.py`...\r\n\r\n```python\r\nclass JSONDecodeError(json.JSONDecodeError, simplejson.JSONDecodeError):\r\n \"\"\"Couldn't decode the text into json\"\"\"\r\n```\r\n\r\n...it could get thrown if `json.dumps` results in an exception. That way, almost all instances of `simplejson` can be removed, except for the `simplejson.JSONDecodeError` inherited from in the alias, which users may be catching themselves.",
"@rowanseymour @BarryThrill @wwyl1234 I'm updating you on this since I made branches in a fork for the two best fixes for this issue.\r\n\r\nFix option 1 (not ideal for now) - replace `simplejson` with `json`, and throw a `requests.JSONDecodeError` when there is an issue parsing text into json: https://github.com/steveberdy/requests/commit/14bc512c36e331cf9042edbabe55d026ec94ec7c\r\n\r\nFix option 2 (more ideal for now) - replace `simpejson` functionality with `json` functionality, but make a `requests.JSONDecodeError` that inherits from `json.JSONDecodeError` and `simplejson.JSONDecodeError` in order for it to still be caught by users who may still be relying on catching the old error types: https://github.com/steveberdy/requests/commit/5ff9fb7af2aaeb8dc2a49fc0f80f356b602d0215\r\n\r\n@nateprewitt @sigmavirus24 @sethmlarson I was told to contact you for maintenance requests. Could I have permission to submit a pull request for one of these branches?\r\n\r\nP.S. I ran tests and they passed. Thanks",
"There is more than 2 options and there is only one truly ideal option - one that preserves backwards compatibility.\r\n\r\nThe standard library raises `ValueError` on Python 2, and it's own exception on Python 3. `simplejson` has its own exception class. As a result, we could create our own exception that inherits from all of the above as necessary. Something like:\r\n\r\n\r\n```py\r\ntry:\r\n from json import JSONDecodeError as stdlib_JSONDecodeError\r\nexcept ImportError:\r\n stdlib_JSONDecodeError = ValueError\r\n\r\ntry:\r\n from simplejson import JSONDecodeError as sj_JSONDecodeError\r\nexcept ImportError:\r\n sj_JSONDecodeError = Exception\r\n\r\nclass JSONDecodeError(stdlib_JSONDecodeError, sj_JSONDecodeError):\r\n pass\r\n```\r\n\r\nThis avoids the unnecessary removal of simplejson support for those that still expect it. This also allows folks catching one of:\r\n\r\n```\r\nsimplejson.JSONDecodeError\r\njson.JSONDecodeError\r\nValueError\r\n```\r\n\r\nTo still catch this exception too without having to change code",
"@sigmavirus24 I can make that edit in my second branch. I didn't do any `try`/`except` for importing the `JSONDecodeError` from `simplejson` or `json`, so I'll do that and then I'll submit a PR."
] |
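The compatibility-shim approach outlined in the final comments can be made concrete with a stand-alone sketch. The names `CompatJSONDecodeError` and `loads` are invented here for illustration; they are not the requests API. The key property is that existing callers catching `json.JSONDecodeError`, simplejson's error, or plain `ValueError` all keep working.

```python
import json

try:
    # simplejson is an optional dependency; fall back to ValueError if absent.
    from simplejson import JSONDecodeError as SimplejsonError
except ImportError:
    SimplejsonError = ValueError

class CompatJSONDecodeError(json.JSONDecodeError, SimplejsonError):
    """Catchable as json.JSONDecodeError, simplejson's error, or ValueError."""

def loads(text):
    try:
        return json.loads(text)
    except json.JSONDecodeError as e:
        # Re-raise as the compatibility type, preserving msg/doc/pos.
        raise CompatJSONDecodeError(e.msg, e.doc, e.pos) from e

try:
    loads("{not valid json")
except ValueError as e:  # any pre-existing except clause still matches
    print(type(e).__name__)  # CompatJSONDecodeError
```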
https://api.github.com/repos/psf/requests/issues/5793
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5793/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5793/comments
|
https://api.github.com/repos/psf/requests/issues/5793/events
|
https://github.com/psf/requests/issues/5793
| 856,498,008 |
MDU6SXNzdWU4NTY0OTgwMDg=
| 5,793 |
response.headers is truncated when a response header contains a space
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/26384334?v=4",
"events_url": "https://api.github.com/users/Emptytao/events{/privacy}",
"followers_url": "https://api.github.com/users/Emptytao/followers",
"following_url": "https://api.github.com/users/Emptytao/following{/other_user}",
"gists_url": "https://api.github.com/users/Emptytao/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Emptytao",
"id": 26384334,
"login": "Emptytao",
"node_id": "MDQ6VXNlcjI2Mzg0MzM0",
"organizations_url": "https://api.github.com/users/Emptytao/orgs",
"received_events_url": "https://api.github.com/users/Emptytao/received_events",
"repos_url": "https://api.github.com/users/Emptytao/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Emptytao/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Emptytao/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Emptytao",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-04-13T02:02:35Z
|
2021-08-27T00:08:26Z
|
2021-04-14T00:27:25Z
|
NONE
|
resolved
|
Summary.
## This is the complete response packet

## This is the packet retrieved via response.headers

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5793/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5793/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/5792
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5792/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5792/comments
|
https://api.github.com/repos/psf/requests/issues/5792/events
|
https://github.com/psf/requests/pull/5792
| 855,382,217 |
MDExOlB1bGxSZXF1ZXN0NjEzMTI5MTYz
| 5,792 |
Short-circuit raise_for_status code for statuses that will not raise …
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/383875?v=4",
"events_url": "https://api.github.com/users/ptmcg/events{/privacy}",
"followers_url": "https://api.github.com/users/ptmcg/followers",
"following_url": "https://api.github.com/users/ptmcg/following{/other_user}",
"gists_url": "https://api.github.com/users/ptmcg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ptmcg",
"id": 383875,
"login": "ptmcg",
"node_id": "MDQ6VXNlcjM4Mzg3NQ==",
"organizations_url": "https://api.github.com/users/ptmcg/orgs",
"received_events_url": "https://api.github.com/users/ptmcg/received_events",
"repos_url": "https://api.github.com/users/ptmcg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ptmcg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ptmcg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ptmcg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-04-11T19:00:47Z
|
2021-12-01T16:06:02Z
|
2021-09-02T15:49:30Z
|
NONE
|
resolved
|
…an exception
This short-circuit code will skip the steps for decoding response.reason if the status code is not an exception-raising status code.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5792/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5792/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5792.diff",
"html_url": "https://github.com/psf/requests/pull/5792",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5792.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5792"
}
| true |
[
"Decoding the `reason` certainly isn't that expensive in reality. Why add this complexity here? What does this gain anybody?",
"Early exit from an exception-raising routine, when it can be quickly determined that no exception will be raised, would be considered by some to be less complexity, not more. At run-time it saves several additional if-tests for all successful statuses, and for those reading the code, it clearly shows that only status codes >= 400 are really of any interest when raising an exception.\r\n\r\n(I could understand if there were some possible restructuring of HTTP status codes in the future, but this status code organization is pretty cast in stone now.)",
"Hi @ptmcg, thanks for the initiative here! Unfortunately, I think I'm in agreement with @sigmavirus24. This is an optimization that adds extra code clutter without providing a tangible user benefit.\r\n\r\nI did some quick benchmarking and performance difference is measured in fractions of a _microsecond_. There's never going to be a situation where we accept latency involved with sending a request across a network but are concerned about sub-microsecond precision in post processing."
] |
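The proposed (and ultimately declined) early exit can be illustrated with a simplified, stdlib-only sketch. This is not the library's actual code: it raises `ValueError` in place of `requests.HTTPError`, and the decoding step is reduced to a single line.

```python
def raise_for_status(status_code, reason=b"OK"):
    # Proposed short-circuit: 1xx/2xx/3xx codes can never raise,
    # so skip the reason-decoding work entirely.
    if status_code < 400:
        return

    # Only exception-raising codes pay for decoding the reason phrase.
    text = reason.decode("utf-8", errors="replace") if isinstance(reason, bytes) else reason
    if 400 <= status_code < 500:
        raise ValueError(f"{status_code} Client Error: {text}")
    if 500 <= status_code < 600:
        raise ValueError(f"{status_code} Server Error: {text}")

raise_for_status(200)  # returns immediately, no decoding performed
```

As the benchmark in the last comment notes, the saving is fractions of a microsecond against network latency measured in milliseconds, which is why the optimization was rejected.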
https://api.github.com/repos/psf/requests/issues/5791
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5791/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5791/comments
|
https://api.github.com/repos/psf/requests/issues/5791/events
|
https://github.com/psf/requests/pull/5791
| 854,957,457 |
MDExOlB1bGxSZXF1ZXN0NjEyODE1Nzcy
| 5,791 |
Grammar fix in description
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/20052700?v=4",
"events_url": "https://api.github.com/users/cocobennett/events{/privacy}",
"followers_url": "https://api.github.com/users/cocobennett/followers",
"following_url": "https://api.github.com/users/cocobennett/following{/other_user}",
"gists_url": "https://api.github.com/users/cocobennett/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cocobennett",
"id": 20052700,
"login": "cocobennett",
"node_id": "MDQ6VXNlcjIwMDUyNzAw",
"organizations_url": "https://api.github.com/users/cocobennett/orgs",
"received_events_url": "https://api.github.com/users/cocobennett/received_events",
"repos_url": "https://api.github.com/users/cocobennett/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cocobennett/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cocobennett/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cocobennett",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-04-10T02:01:13Z
|
2021-08-27T00:08:41Z
|
2021-04-11T16:25:49Z
|
CONTRIBUTOR
|
resolved
|
Fixes https://github.com/psf/requests/issues/5785
Could not find the second error as described `This is also a problem on the pypi page under the project description.`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5791/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5791/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5791.diff",
"html_url": "https://github.com/psf/requests/pull/5791",
"merged_at": "2021-04-11T16:25:49Z",
"patch_url": "https://github.com/psf/requests/pull/5791.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5791"
}
| true |
[
"Thank you!",
"@cocobennett,\r\nTo answer this-\r\n> Could not find the second error as described This is also a problem on the pypi page under the project description.\r\n\r\nThis would essentially be fixed by your patch as the `long_description` picked up by PyPI page to populate the description section is set to `readme` in `setup.py` as seen in this line: https://github.com/psf/requests/blob/c45a4dfe6bfc6017d4ea7e9f051d6cc30972b310/setup.py#L70"
] |
https://api.github.com/repos/psf/requests/issues/5790
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5790/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5790/comments
|
https://api.github.com/repos/psf/requests/issues/5790/events
|
https://github.com/psf/requests/issues/5790
| 854,852,352 |
MDU6SXNzdWU4NTQ4NTIzNTI=
| 5,790 |
use new format model
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/60766355?v=4",
"events_url": "https://api.github.com/users/EmadDeve20/events{/privacy}",
"followers_url": "https://api.github.com/users/EmadDeve20/followers",
"following_url": "https://api.github.com/users/EmadDeve20/following{/other_user}",
"gists_url": "https://api.github.com/users/EmadDeve20/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/EmadDeve20",
"id": 60766355,
"login": "EmadDeve20",
"node_id": "MDQ6VXNlcjYwNzY2MzU1",
"organizations_url": "https://api.github.com/users/EmadDeve20/orgs",
"received_events_url": "https://api.github.com/users/EmadDeve20/received_events",
"repos_url": "https://api.github.com/users/EmadDeve20/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/EmadDeve20/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/EmadDeve20/subscriptions",
"type": "User",
"url": "https://api.github.com/users/EmadDeve20",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-04-09T21:02:59Z
|
2021-08-27T00:08:27Z
|
2021-04-10T00:13:52Z
|
NONE
|
resolved
|
Hi.
I think using a format like:
```print(f"{name}")```
is better than:
```print("{}".format(name))```
No?
If I'm right, can I fix it?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5790/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5790/timeline
| null |
completed
| null | null | false |
[
"This library still supports Python 2.7 and so the examples should work across all versions"
] |
https://api.github.com/repos/psf/requests/issues/5789
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5789/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5789/comments
|
https://api.github.com/repos/psf/requests/issues/5789/events
|
https://github.com/psf/requests/issues/5789
| 853,358,939 |
MDU6SXNzdWU4NTMzNTg5Mzk=
| 5,789 |
User guide for how to provide timeouts
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/182122?v=4",
"events_url": "https://api.github.com/users/torotil/events{/privacy}",
"followers_url": "https://api.github.com/users/torotil/followers",
"following_url": "https://api.github.com/users/torotil/following{/other_user}",
"gists_url": "https://api.github.com/users/torotil/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/torotil",
"id": 182122,
"login": "torotil",
"node_id": "MDQ6VXNlcjE4MjEyMg==",
"organizations_url": "https://api.github.com/users/torotil/orgs",
"received_events_url": "https://api.github.com/users/torotil/received_events",
"repos_url": "https://api.github.com/users/torotil/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/torotil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/torotil/subscriptions",
"type": "User",
"url": "https://api.github.com/users/torotil",
"user_view_type": "public"
}
|
[] |
closed
| false | null |
[] | null | 11 |
2021-04-08T11:14:40Z
|
2024-02-15T14:24:58Z
|
2024-02-15T14:24:39Z
|
NONE
| null |
[As](https://github.com/psf/requests/issues/5637) [many](https://github.com/psf/requests/issues/3054) [others](https://github.com/psf/requests/issues/2856) [before](https://github.com/psf/requests/issues/2508) [me](https://github.com/psf/requests/issues/2011) I have the need to set a default timeout for all requests (blocking forever just isn’t a very good default in most cases I guess).
I appreciate that this has been covered exhaustively in this issue queue, that there [might not be a recommendation for all use-cases](https://github.com/psf/requests/issues/2011#issuecomment-64444354), and there are [different](https://github.com/psf/requests/issues/2011#issuecomment-64440818) [ways](https://github.com/psf/requests/issues/3054#issuecomment-197400891) [to](https://github.com/psf/requests/issues/2011#issuecomment-490050252) [do](https://github.com/psf/requests/issues/2011#issuecomment-637156626) this.
That’s exactly why I’m proposing to add documentation at the [appropriate](https://2.python-requests.org/en/master/user/advanced/#timeouts) [places](https://2.python-requests.org/en/master/user/advanced/#session-objects) for this in order to improve the experience for downstream developers and to avoid further mayhem in the issue queue.
So what are recommendable ways to have a default timeout and which disadvantages / advantages do they have?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/182122?v=4",
"events_url": "https://api.github.com/users/torotil/events{/privacy}",
"followers_url": "https://api.github.com/users/torotil/followers",
"following_url": "https://api.github.com/users/torotil/following{/other_user}",
"gists_url": "https://api.github.com/users/torotil/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/torotil",
"id": 182122,
"login": "torotil",
"node_id": "MDQ6VXNlcjE4MjEyMg==",
"organizations_url": "https://api.github.com/users/torotil/orgs",
"received_events_url": "https://api.github.com/users/torotil/received_events",
"repos_url": "https://api.github.com/users/torotil/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/torotil/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/torotil/subscriptions",
"type": "User",
"url": "https://api.github.com/users/torotil",
"user_view_type": "public"
}
|
{
"+1": 4,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/5789/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5789/timeline
| null |
completed
| null | null | false |
[
"All of your issues describe providing one timeout for a session rather than how to provide timeouts for requests.\r\n\r\nIt sounds like you thus want to add an example of a recipe to the documentation that shows people how to create their own session object that accepts a timeout and passes that on to every request the session makes? Is that accurate? If not, can you please explain (perhaps with an example) what you're thinking needs to be added exactly?",
"> All of your issues describe providing one timeout for a session rather than how to provide timeouts for requests.\r\n\r\nIn more general terms it’s mostly about pre-configuring request parameters (eg. authentication, headers, …) so that not every part of the code initiating a request needs to care about all these details. For most of these parameters sessions can be used for exactly that, which is perhaps the reason why people tend to think about putting the timeout there as well. I actually don’t mind what the concept of encapsulating these parameters is called. What matters is that it accomplishes the encapsulation and is easy to use.\r\n\r\n> what you're thinking needs to be added exactly?\r\n\r\nI’d suggest to:\r\n\r\n1. Add a note to the [session documentation](https://2.python-requests.org/en/master/user/advanced/#session-objects) and maybe even the [API-docs](https://2.python-requests.org/en/master/api/#sessionapi) that although it works for most parameters it doesn’t work for `timeout` not by obmission but on purpose (maybe referencing an appropriate issue where this has been discussed).\r\n2. Include 1 or 2 recipies on how to achieve a session-like encapsulation for timeouts so that the Python community doesn’t end up with 1001 solutions to more or less the same problem. I probably would put those in the [timeout section](https://2.python-requests.org/en/master/user/advanced/#timeouts) and link them from the above notes.\r\n\r\nSo the open questions for me are:\r\n\r\n- Does adding this to the docs sound sensible?\r\n- Which recipies are worth putting there?",
"> Include 1 or 2 recipies on how to achieve a session-like encapsulation for timeouts so that the Python community doesn’t end up with 1001 solutions to more or less the same problem. I probably would put those in the [timeout section](https://2.python-requests.org/en/master/user/advanced/#timeouts) and link them from the above notes.\r\n\r\nAbsolutely not.\r\n\r\nIgnoring the mounds of hate mail I've received for a decision I didn't make, but frankly agree with, this would then make it seem like \"the library authors encourage this but won't support it\" which will cause a flood of new issues and pull requests for something that we won't be adding.\r\n\r\nIf there are other improvements around session attributes you can point out that you think are missing, I'm happy to review a PR adding those",
"> Ignoring the mounds of hate mail I've received for a decision I didn't make\r\n\r\nI’m really sorry to hear you are getting hate mail for *any* decision made in an open source project. I definitely like to avoid that. I’m also not suggesting to revise that decision. I’m trying to find ways to implement it.\r\n\r\nDo you also disagree with 1.? (making this decision visible in the docs)\r\n\r\n----\r\n\r\n> this would then make it seem like \"the library authors encourage this but won't support it\"\r\n\r\nI think the “this“ is important in this sentence: Do you think there is absolutely no sensible way to achieve an encapsulation of the timeout that fits the decision & the design of the library? … even for specific use-cases? — For me it seems that at least [this comment](https://github.com/psf/requests/issues/2011#issuecomment-64440818) includes a way to do this that doesn’t bind the timeout to the session at all.\r\n\r\n----\r\n\r\nI imagine most developers who end up bringing this to the issue queue go through a process roughly like this:\r\n\r\n- I need to add a timeout to all of my requests (otherwise they might block forever). Is there a way to configure this at one place [instead of needing to pass it with each request]? → Consulting the docs.\r\n- The docs say the Session can be used to pre-configure most of the request params. → Let’s try if this works for the timeout as well.\r\n- This doesn’t work. It seems like an oversight. → Let’s post an issue.\r\n- Ok, there is a closed issue about this that refers to duplicates. → Let’s see if those contain something useful.\r\n- Wow, there is many possible ways, but the maintainers seem not to be convinced by any of them. → What should I do now?\r\n\r\nI think adding something to the docs would catch them before they are too invested in the idea that the Session is the right place to add this. 
Pointing them to a recipe that solves their problem would avoid them coming to the issue queue altogether — even if it’s a “We don’t recommend this, but some users have found this useful.”",
"Hi there,\r\n\r\nRemember this lib is just a user-friendly, elegant HTTP requests implementation based on urllib3, so it depends on urllib and it depends on the socket lib. If you are looking for a deeper implementation, I would suggest you use adapters and look at the TCP session params in Python; you can set session and connection timeouts, while urllib provides an easy way to set them.\r\n\r\nCheers",
"It seems reasonable that this should be emphasized more in the docs since it won't be supported directly. [Quickstart > Timeouts](https://docs.python-requests.org/en/master/user/quickstart/#timeouts) is the second to last section and contains this gotcha:\r\n\r\n> Nearly all production code should use this parameter in nearly all requests. Failure to do so can cause your program to hang indefinitely\r\n\r\nThe default behavior blocks forever under relatively common and transient network conditions, and almost every example in the documentation outside of the Timeout section itself shows this usage.\r\n\r\nIf a default timeout does not belong to the `Session` object philosophically but instead to transport, couldn't it be added as an example under [Transport Adapters](https://docs.python-requests.org/en/latest/user/advanced/#transport-adapters) where there is already an example of other [non-default behavior](https://docs.python-requests.org/en/latest/user/advanced/#example-specific-ssl-version)?\r\n\r\nOr perhaps the other Quickstart examples should always be passing the `timeout` parameter explicitly so the importance of its use is emphasized.",
"You say `timeout` doesn't belong to the `Session` object conceptually, OK. But then why not add a `timeout` parameter to `HTTPAdapter`? It would be much better than saying \"just add this 10 line boilerplate to every project where you need to use session-level timeouts\". From the amount of duplicate issues about this it should be clear that this is not a niche use case but is quite common. If there is any reason that wouldn't make sense please enlighten me, because I'm struggling to understand.\r\n\r\nIt also seems completely self-contradictory for the documentation to suggest that \"nearly every request should have a timeout\" while the library doesn't provide a convenient way to specify a default timeout rather than repeating it on every request, and neither does the documentation provide a recipe for subclassing `HTTPAdapter` to add this functionality. Please explain the logic behind this, because I would think for the overwhelming majority of use cases a session-level timeout with maybe a couple of manual overrides would work fine and make things much easier. (Whether it's on the session itself or the adapter is just an implementation detail and not my point here.)\r\n\r\nIf adding it to HTTPAdapter itself isn't a workable solution for whatever reason, maybe it could go in `requests-toolbelt` or a separate PyPI package (though the former seems unmaintained, and the latter would encourage Node.js-style dependency creep if there's a package *just* for Requests session timeouts).",
"> Remember this lib , is just a user-friendly or elegant HTTP Requests implementation …\r\n\r\nExactly. And that’s why it’s worth looking for a user-friendly way to support a very common use-case — I daresay code that needs a timeout is more common than code that doesn’t. So perhaps we are talking about nothing less than *the most common* use case for this library and how to make it less error prone.",
"Potentially instead of changing the timeout property / attribute itself, would it make sense to offer the ability to warn when no timeout is set, similar to how more recent Python versions can warn about using `open` without an `encoding` value?",
"Hi, let me revive the discussion. I stumbled upon the \"default timeout\" issue recently. I think that a very simple solution, like global default timeout that is applied to all the requests when no more specific timeout is provided, would address many operational pains. I would imagine that such a timeout could be set with some `requests.set_default_timeout()` or with `pyproject.toml` , to really apply to all the requests that use (awesome) `requests` library. The real-world return from such an investment in the code would be huge.\r\n\r\nI'd happily provide a PR for such a feature, I'd like to know though which direction would be the best one.",
"This is one of the reasons why I would use [httpx](https://www.python-httpx.org) which has an [easy to use solution for timeouts](https://www.python-httpx.org/advanced/#setting-and-disabling-timeouts) for new projects. For my current projects I’d hope to migrate at some point.\r\n\r\nI’m closing this issue now as I don’t have hopes that the maintainers will agree to improvements of either the code or the documentation of the library before I migrate."
] |
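Editor's note: the "10 line boilerplate" referenced in the discussion above is usually an `HTTPAdapter` subclass. A minimal sketch under that assumption follows; `TimeoutHTTPAdapter` and the default value are illustrative names, not part of requests itself.

```python
import requests
from requests.adapters import HTTPAdapter


class TimeoutHTTPAdapter(HTTPAdapter):
    """Hypothetical adapter that supplies a default timeout for every request."""

    def __init__(self, *args, timeout=5, **kwargs):
        self.timeout = timeout
        super().__init__(*args, **kwargs)

    def send(self, request, **kwargs):
        # Respect an explicit per-request timeout; otherwise fall back to
        # the adapter-level default.
        if kwargs.get("timeout") is None:
            kwargs["timeout"] = self.timeout
        return super().send(request, **kwargs)


session = requests.Session()
adapter = TimeoutHTTPAdapter(timeout=3)
session.mount("https://", adapter)
session.mount("http://", adapter)
```

Any request made through `session` now gets a 3-second timeout unless the caller passes `timeout=` explicitly, which keeps the timeout out of the `Session` object itself while still configuring it in one place.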
https://api.github.com/repos/psf/requests/issues/5788
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5788/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5788/comments
|
https://api.github.com/repos/psf/requests/issues/5788/events
|
https://github.com/psf/requests/issues/5788
| 851,569,425 |
MDU6SXNzdWU4NTE1Njk0MjU=
| 5,788 |
SSL requests error in version 2.25.1 (upgraded from version 2.24.0)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/76916440?v=4",
"events_url": "https://api.github.com/users/M-Stenzel/events{/privacy}",
"followers_url": "https://api.github.com/users/M-Stenzel/followers",
"following_url": "https://api.github.com/users/M-Stenzel/following{/other_user}",
"gists_url": "https://api.github.com/users/M-Stenzel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/M-Stenzel",
"id": 76916440,
"login": "M-Stenzel",
"node_id": "MDQ6VXNlcjc2OTE2NDQw",
"organizations_url": "https://api.github.com/users/M-Stenzel/orgs",
"received_events_url": "https://api.github.com/users/M-Stenzel/received_events",
"repos_url": "https://api.github.com/users/M-Stenzel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/M-Stenzel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/M-Stenzel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/M-Stenzel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-04-06T15:57:57Z
|
2021-08-27T00:08:28Z
|
2021-04-08T22:17:16Z
|
NONE
|
resolved
|
Dear team,
I am using requests for the searx search engine (https://github.com/searx/searx)
The script below throws this error and I do not know whether this is related to python-requests or to searx or my specific installation?
System is opensuse, python 3.6.12. latest requests release (2.25.1).
This (original searx) code gives me the error
```python
#!/usr/bin/python3
# EASY-INSTALL-ENTRY-SCRIPT: 'searx==1.0.0','console_scripts','searx-run'
__requires__ = 'searx==1.0.0'
import re
import sys
from pkg_resources import load_entry_point
if __name__ == '__main__':
    sys.argv[0] = re.sub(r'(-script\.pyw?|\.exe)?$', '', sys.argv[0])
    sys.exit(
        load_entry_point('searx==1.0.0', 'console_scripts', 'searx-run')()
    )
```
Output with error message:
```
Serving Flask app "searx.webapp" (lazy loading)
Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
Debug mode: off
INFO:werkzeug: * Running on http://127.0.0.1:8898/ (Press CTRL+C to quit)
ERROR:searx.engines:yggtorrent engine: Fail to initialize
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 672, in urlopen
    chunked=chunked,
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 376, in _make_request
    self._validate_conn(conn)
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 994, in _validate_conn
    conn.connect()
  File "/usr/lib/python3.6/site-packages/urllib3/connection.py", line 360, in connect
    ssl_context=context,
  File "/usr/lib/python3.6/site-packages/urllib3/util/ssl_.py", line 370, in ssl_wrap_socket
    return context.wrap_socket(sock, server_hostname=server_hostname)
  File "/usr/lib64/python3.6/ssl.py", line 407, in wrap_socket
    _context=self, _session=session)
  File "/usr/lib64/python3.6/ssl.py", line 817, in __init__
    self.do_handshake()
  File "/usr/lib64/python3.6/ssl.py", line 1077, in do_handshake
    self._sslobj.do_handshake()
  File "/usr/lib64/python3.6/ssl.py", line 689, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/usr/lib/python3.6/site-packages/urllib3/connectionpool.py", line 720, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "/usr/lib/python3.6/site-packages/urllib3/util/retry.py", line 436, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www4.yggtorrent.li', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:852)'),))
```
Obviously there seem to be problems with SSL/certificate.
Martin.
P. S. This is kind of a crosspost (I know, maybe not the best idea...)
The other post to be found here:
https://github.com/searx/searx/issues/2731
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5788/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5788/timeline
| null |
completed
| null | null | false |
[
"Did you downgrade to the prior working version of requests? If not. Please try that. I doubt you'll see a difference here."
] |
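Editor's note: when debugging a `CERTIFICATE_VERIFY_FAILED` error like the one above, one common first step is to rule out a stale or patched system CA bundle by pointing `verify` explicitly at certifi's bundle (the one requests ships with). A sketch, with no specific host assumed:

```python
import os

import certifi
import requests

# Make the session use certifi's CA bundle explicitly, instead of whatever
# bundle the OS or distribution configured. If this makes verification
# succeed, the system bundle is the likely culprit.
session = requests.Session()
session.verify = certifi.where()

print(session.verify, os.path.exists(session.verify))
```

If verification still fails with certifi's bundle, the server's certificate chain itself (e.g. a missing intermediate) is the more likely cause.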
https://api.github.com/repos/psf/requests/issues/5787
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5787/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5787/comments
|
https://api.github.com/repos/psf/requests/issues/5787/events
|
https://github.com/psf/requests/issues/5787
| 850,792,591 |
MDU6SXNzdWU4NTA3OTI1OTE=
| 5,787 |
SOCKS5 Timeout
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12672984?v=4",
"events_url": "https://api.github.com/users/CollabSensei/events{/privacy}",
"followers_url": "https://api.github.com/users/CollabSensei/followers",
"following_url": "https://api.github.com/users/CollabSensei/following{/other_user}",
"gists_url": "https://api.github.com/users/CollabSensei/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/CollabSensei",
"id": 12672984,
"login": "CollabSensei",
"node_id": "MDQ6VXNlcjEyNjcyOTg0",
"organizations_url": "https://api.github.com/users/CollabSensei/orgs",
"received_events_url": "https://api.github.com/users/CollabSensei/received_events",
"repos_url": "https://api.github.com/users/CollabSensei/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/CollabSensei/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/CollabSensei/subscriptions",
"type": "User",
"url": "https://api.github.com/users/CollabSensei",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-04-05T23:08:21Z
|
2021-08-27T00:08:28Z
|
2021-04-06T11:22:58Z
|
NONE
|
resolved
|
## Expected Result
Successful HTTP post via SOCKS5 Proxy. The web service takes about 2-3 minutes to respond. When connecting directly, without SOCKS5, it works fine. Using a Dante Socks5 proxy. Dante logs indicate that the client closed the connection.
## Actual Result
```
Traceback (most recent call last):
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\site-packages\urllib3\connectionpool.py", line 706, in urlopen
    chunked=chunked,
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\site-packages\urllib3\connectionpool.py", line 445, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\site-packages\urllib3\connectionpool.py", line 440, in _make_request
    httplib_response = conn.getresponse()
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\http\client.py", line 1331, in getresponse
    response.begin()
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\http\client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "C:\Program Files (x86)\Microsoft Visual Studio\Shared\Python36_64\lib\http\client.py", line 266, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response
```
## Reproduction Steps
```python
url = "https://" + hostname + ":" + api_port + "/axl/"
bodyUpdate = """
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/" xmlns:ns="http://www.cisco.com/AXL/API/%version">
<soapenv:Header/>
<soapenv:Body>
<ns:doSmartLicenseRegister>
<token>%token</token>
<force>true</force>
</ns:doSmartLicenseRegister>
</soapenv:Body>
</soapenv:Envelope>"""
bodyUpdate = bodyUpdate.replace("%version",str(version))
bodyUpdate = bodyUpdate.replace("%token", getSmartLicenseToken('myToken'))
sock5Proxy = {
'http': 'socks5h://' + vapp_ip + ':1080',
'https':'socks5h://' + vapp_ip + ':1080'
}
response = requests.post(url, data=bodyUpdate, auth=(username,password), headers={'SOAPAction': 'CUCM:DB ver=14.0'}, proxies=sock5Proxy, verify = False)
```
## System Information
```
{
  "chardet": {
    "version": "4.0.0"
  },
  "cryptography": {
    "version": ""
  },
  "idna": {
    "version": "2.10"
  },
  "implementation": {
    "name": "CPython",
    "version": "3.6.6"
  },
  "platform": {
    "release": "10",
    "system": "Windows"
  },
  "pyOpenSSL": {
    "openssl_version": "",
    "version": null
  },
  "requests": {
    "version": "2.25.1"
  },
  "system_ssl": {
    "version": "100020ff"
  },
  "urllib3": {
    "version": "1.26.3"
  },
  "using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5787/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5787/timeline
| null |
completed
| null | null | false |
[
"We rely on urllib3 for socks proxying and you've already opened https://github.com/urllib3/urllib3/issues/2195 so I'm closing this"
] |
https://api.github.com/repos/psf/requests/issues/5786
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5786/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5786/comments
|
https://api.github.com/repos/psf/requests/issues/5786/events
|
https://github.com/psf/requests/issues/5786
| 850,624,739 |
MDU6SXNzdWU4NTA2MjQ3Mzk=
| 5,786 |
Requests don't work within a celery async task
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16038063?v=4",
"events_url": "https://api.github.com/users/augustosamame/events{/privacy}",
"followers_url": "https://api.github.com/users/augustosamame/followers",
"following_url": "https://api.github.com/users/augustosamame/following{/other_user}",
"gists_url": "https://api.github.com/users/augustosamame/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/augustosamame",
"id": 16038063,
"login": "augustosamame",
"node_id": "MDQ6VXNlcjE2MDM4MDYz",
"organizations_url": "https://api.github.com/users/augustosamame/orgs",
"received_events_url": "https://api.github.com/users/augustosamame/received_events",
"repos_url": "https://api.github.com/users/augustosamame/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/augustosamame/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/augustosamame/subscriptions",
"type": "User",
"url": "https://api.github.com/users/augustosamame",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2021-04-05T20:01:56Z
|
2021-08-27T00:08:29Z
|
2021-04-05T21:04:02Z
|
NONE
|
resolved
|
When running this in Django shell:
```
import requests
response = requests.get('https://auth.myfanpark.com/health')
```
I get a valid response
The same 2 lines when ran as async celery job in the same Django project:
```
import requests
response = requests.get('https://auth.myfanpark.com/health')
```
Exception: too many values to unpack (expected 2)
## Expected Result
For requests call to work correctly in async celery calls just like in regular Django shell.
## Actual Result
Exception: too many values to unpack (expected 2)
## Reproduction Steps
```python
import requests
response = requests.get(url = 'https://auth.myfanpark.com/health')
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.6"
},
"implementation": {
"name": "CPython",
"version": "3.6.7"
},
"platform": {
"release": "19.6.0",
"system": "Darwin"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.21.0"
},
"system_ssl": {
"version": "1010109f"
},
"urllib3": {
"version": "1.24.1"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5786/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5786/timeline
| null |
completed
| null | null | false |
[
"Can we get the full traceback? One line of the traceback isn't useful for figuring out the issue here.",
"Also, I use requests in Celery and it works fine. So I'm pretty close to closing this.",
"I will have a stack trace soon.\n\nOn Mon, 5 Apr 2021 at 15:31, Ian Stapleton Cordasco <\n***@***.***> wrote:\n\n> Also, I use requests in Celery and it works fine. So I'm pretty close to\n> closing this.\n>\n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/psf/requests/issues/5786#issuecomment-813632369>, or\n> unsubscribe\n> <https://github.com/notifications/unsubscribe-auth/AD2LRL2YSM5PLU3ZMNXMPM3THIM2LANCNFSM42NL2TQQ>\n> .\n>\n",
"This is the complete stack trace:\r\n\r\n```\r\nApr 5 15:55:01 production-celery-ec2 worker.log ERROR Task cancel_notification_no_response[053c87fc-ed50-477a-a757-73c35a00c85f] raised unexpected: ValueError('too many values to unpack (expected 2)',)\r\nApr 5 15:55:01 production-celery-ec2 worker.log Traceback (most recent call last):\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/celery/app/trace.py\", line 382, in trace_task\r\n R = retval = fun(*args, **kwargs)\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/celery/app/trace.py\", line 641, in __protected_call__\r\n return self.run(*args, **kwargs)\r\n File \"/home/ubuntu/releases/eb0a086fdd0165da73a3fdea33d0be78a0523ce4/stargramz/tasks.py\", line 186, in cancel_starsona_celebrity_no_response\r\n dummy_response = requests.get('https://auth.myfanpark.com/health')\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 399, in get\r\n clone = self.filter(*args, **kwargs)\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 892, in filter\r\n return self._filter_or_exclude(False, *args, **kwargs)\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 910, in _filter_or_exclude\r\n clone.query.add_q(Q(*args, **kwargs))\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1290, in add_q\r\n clause, _ = self._add_q(q_object, self.used_aliases)\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1318, in _add_q\r\n split_subq=split_subq, simple_col=simple_col,\r\n File \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1187, in build_filter\r\n arg, value = filter_expr\r\nApr 5 15:55:01 production-celery-ec2 worker.log ValueError: too many values to unpack (expected 2)\r\n```",
"Reading your stack trace the important bits are:\r\n\r\n```\r\nFile \"/home/ubuntu/releases/eb0a086fdd0165da73a3fdea33d0be78a0523ce4/stargramz/tasks.py\", line 186, in cancel_starsona_celebrity_no_response\r\ndummy_response = requests.get('https://auth.myfanpark.com/health')\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 399, in get\r\nclone = self.filter(*args, **kwargs)\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 892, in filter\r\nreturn self._filter_or_exclude(False, *args, **kwargs)\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/query.py\", line 910, in _filter_or_exclude\r\nclone.query.add_q(Q(*args, **kwargs))\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1290, in add_q\r\nclause, _ = self._add_q(q_object, self.used_aliases)\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1318, in _add_q\r\nsplit_subq=split_subq, simple_col=simple_col,\r\nFile \"/home/ubuntu/.local/lib/python3.6/site-packages/django/db/models/sql/query.py\", line 1187, in build_filter\r\narg, value = filter_expr\r\nApr 5 15:55:01 production-celery-ec2 worker.log ValueError: too many values to unpack (expected 2)\r\n```\r\n\r\nWhich indicates that `requests` is a Django ORM manager and not the requests library."
] |
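Editor's note: the diagnosis in the last comment (a Django manager named `requests` shadowing the HTTP library) can be reproduced without Django. Everything below is a hypothetical stand-in; `FakeManager` is not a real Django class.

```python
import requests  # the HTTP library


class FakeManager:
    """Stand-in for a Django related manager that happens to be named `requests`."""

    def get(self, *args, **kwargs):
        # Django's QuerySet.get() interprets arguments as field lookups,
        # not as a URL, which is what produced the ValueError in the issue.
        raise ValueError("too many values to unpack (expected 2)")


# Somewhere else in the module the name gets rebound, e.g. by a model
# attribute, a star import, or an assignment like this one:
requests = FakeManager()


def task():
    # This now resolves to the manager, not the HTTP library.
    try:
        requests.get("https://auth.example.invalid/health")
    except ValueError as exc:
        return str(exc)


print(task())
```

Renaming the import (`import requests as http_requests`) or the conflicting attribute avoids the shadowing.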
https://api.github.com/repos/psf/requests/issues/5785
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5785/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5785/comments
|
https://api.github.com/repos/psf/requests/issues/5785/events
|
https://github.com/psf/requests/issues/5785
| 849,979,181 |
MDU6SXNzdWU4NDk5NzkxODE=
| 5,785 |
Grammar error in GitHub description
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/55339220?v=4",
"events_url": "https://api.github.com/users/saltedcoffii/events{/privacy}",
"followers_url": "https://api.github.com/users/saltedcoffii/followers",
"following_url": "https://api.github.com/users/saltedcoffii/following{/other_user}",
"gists_url": "https://api.github.com/users/saltedcoffii/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/saltedcoffii",
"id": 55339220,
"login": "saltedcoffii",
"node_id": "MDQ6VXNlcjU1MzM5MjIw",
"organizations_url": "https://api.github.com/users/saltedcoffii/orgs",
"received_events_url": "https://api.github.com/users/saltedcoffii/received_events",
"repos_url": "https://api.github.com/users/saltedcoffii/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/saltedcoffii/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saltedcoffii/subscriptions",
"type": "User",
"url": "https://api.github.com/users/saltedcoffii",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-04-04T23:01:04Z
|
2021-08-27T00:08:27Z
|
2021-04-11T16:25:49Z
|
NONE
|
resolved
|
The GitHub description reads as follows: "A simple, yet elegant HTTP library."
It should be: "A simple, yet elegant, HTTP library."
This is also a problem on the pypi page under the project description.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5785/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5785/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/5784
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5784/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5784/comments
|
https://api.github.com/repos/psf/requests/issues/5784/events
|
https://github.com/psf/requests/pull/5784
| 846,374,415 |
MDExOlB1bGxSZXF1ZXN0NjA1NDc0ODkw
| 5,784 |
fix typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/50807669?v=4",
"events_url": "https://api.github.com/users/DavidKimDY/events{/privacy}",
"followers_url": "https://api.github.com/users/DavidKimDY/followers",
"following_url": "https://api.github.com/users/DavidKimDY/following{/other_user}",
"gists_url": "https://api.github.com/users/DavidKimDY/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DavidKimDY",
"id": 50807669,
"login": "DavidKimDY",
"node_id": "MDQ6VXNlcjUwODA3NjY5",
"organizations_url": "https://api.github.com/users/DavidKimDY/orgs",
"received_events_url": "https://api.github.com/users/DavidKimDY/received_events",
"repos_url": "https://api.github.com/users/DavidKimDY/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DavidKimDY/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DavidKimDY/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DavidKimDY",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-31T10:25:43Z
|
2021-08-27T00:08:44Z
|
2021-03-31T11:32:39Z
|
NONE
|
resolved
|
fix typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5784/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5784/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5784.diff",
"html_url": "https://github.com/psf/requests/pull/5784",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5784.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5784"
}
| true |
[
"Thank you for your contribution! Unfortunately, we didn't write the content of that comment and don't maintain the comments in that file."
] |
https://api.github.com/repos/psf/requests/issues/5783
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5783/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5783/comments
|
https://api.github.com/repos/psf/requests/issues/5783/events
|
https://github.com/psf/requests/pull/5783
| 843,692,355 |
MDExOlB1bGxSZXF1ZXN0NjAzMDQwNjY5
| 5,783 |
Add support for brotli decoding
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4992947?v=4",
"events_url": "https://api.github.com/users/dilyanpalauzov/events{/privacy}",
"followers_url": "https://api.github.com/users/dilyanpalauzov/followers",
"following_url": "https://api.github.com/users/dilyanpalauzov/following{/other_user}",
"gists_url": "https://api.github.com/users/dilyanpalauzov/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dilyanpalauzov",
"id": 4992947,
"login": "dilyanpalauzov",
"node_id": "MDQ6VXNlcjQ5OTI5NDc=",
"organizations_url": "https://api.github.com/users/dilyanpalauzov/orgs",
"received_events_url": "https://api.github.com/users/dilyanpalauzov/received_events",
"repos_url": "https://api.github.com/users/dilyanpalauzov/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dilyanpalauzov/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dilyanpalauzov/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dilyanpalauzov",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2021-08-25T21:25:16Z",
"closed_issues": 3,
"created_at": "2021-05-20T21:01:54Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/28",
"id": 6778816,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/28/labels",
"node_id": "MDk6TWlsZXN0b25lNjc3ODgxNg==",
"number": 28,
"open_issues": 0,
"state": "closed",
"title": "2.26.0",
"updated_at": "2021-08-25T21:25:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/28"
}
| 8 |
2021-03-29T18:53:59Z
|
2021-10-05T14:19:31Z
|
2021-07-07T13:16:28Z
|
CONTRIBUTOR
|
resolved
|
When the `brotli` or `brotlicffi` package is installed, `urllib3.util.make_headers()` inserts ',br' into the `Accept-Encoding` header and decodes `br`-encoded responses.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5783/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5783/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5783.diff",
"html_url": "https://github.com/psf/requests/pull/5783",
"merged_at": "2021-07-07T13:16:28Z",
"patch_url": "https://github.com/psf/requests/pull/5783.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5783"
}
| true |
[
"Should that not be something more along the lines of...\r\n\r\n```python\r\nmake_headers(accept_encoding=True).get('accept-encoding', ', '.join(('gzip', 'deflate'))),\r\n```\r\n\r\nas a 'fail safe'",
"You can change it as you want, as long Requests sends transparently `Accept-Encoding: br`.\r\n\r\n.join is an overkiller to generate a constant string.",
"My 2c: I like the approach of reusing code from urllib3 more than mine from #5554. :)\r\n\r\nHowever the [minimal supported by requests version of urllib3 is now 1.21.1](https://github.com/psf/requests/blob/master/setup.py#L46), which [does not support brotli yet](https://github.com/urllib3/urllib3/blame/1.21.1/urllib3/util/request.py). The [minimal version that has this support is 1.25.1](https://github.com/urllib3/urllib3/commit/9eecb6bd0fab0a4546dd35c7ee42e8cd9c59274f). So I think that it should be mentioned in docs update here, along with info about required `brotli` or `brotlicffi` packages.",
"Such fail-safe is not needed, @VeNoMouS , as even the oldest supported by requests urllib3 has `make_headers()` that will return a dict with `Accept-Encoding` key, when called with `accept_encoding=True`.\r\n\r\nSee https://github.com/urllib3/urllib3/blame/1.21.1/urllib3/util/request.py#L7-L55 .",
"> However the [minimal supported by requests version of urllib3 is now 1.21.1](https://github.com/psf/requests/blob/master/setup.py#L46), which [does not support brotli yet](https://github.com/urllib3/urllib3/blame/1.21.1/urllib3/util/request.py). The [minimal version that has this support is 1.25.1](https://github.com/urllib3/urllib3/commit/9eecb6bd0fab0a4546dd35c7ee42e8cd9c59274f). So I think that it should be mentioned in docs update here, along with info about required `brotli` or `brotlicffi` packages.\r\n\r\nPlease propose wording and where to update the text. Currently the brotli-texts are updated on three places - faq, history and quickstart. I consider it better, if the documentation does not repeat the same thing on different places.",
"I apologize If this might come off as a bit rude but this PR seems to be the only thing blocking 2.26.0 milestone (the other changes in it have no pending change requested), and it's been a month without any progress, if for more time is needed that's of course understandable but would it be possible to bump this for a release after the next one so as not to have it block other changes from being released?",
"ap\n\nOn Mon, Jun 14, 2021, 1:46 AM Naor Livne ***@***.***> wrote:\nnah..\n\n> I apologize If this might come off as a bit rude but this PR seems to be\n> the only thing blocking 2.26.0 milestone (the other changes in it have no\n> pending change requested), and it's been a month without any progress, if\n> for more time is needed that's of course understandable but would it be\n> possible to bump this for the next release then so as not to have it block\n> other changes from being released?\n>\n> —\n> You are receiving this because you are subscribed to this thread.\n> Reply to this email directly, view it on GitHub\n> <https://github.com/psf/requests/pull/5783#issuecomment-860510546>, or\n> unsubscribe\n> <https://github.com/notifications/unsubscribe-auth/AUCGM53SLAID7PCJ7LV3QULTSW6YBANCNFSM42AFENQQ>\n> .\n>\n",
"Just bump 2.2.1 for the update ... majority needs to redilimbruim it's\ndepth not making it by lying ..add support!\n\nOn Mon, Jun 14, 2021, 10:38 AM Juan E Alcantar <\n***@***.***> wrote:\n\n> ap\n>\n> On Mon, Jun 14, 2021, 1:46 AM Naor Livne ***@***.***>\n> wrote: nah..\n>\n>> I apologize If this might come off as a bit rude but this PR seems to be\n>> the only thing blocking 2.26.0 milestone (the other changes in it have no\n>> pending change requested), and it's been a month without any progress, if\n>> for more time is needed that's of course understandable but would it be\n>> possible to bump this for the next release then so as not to have it block\n>> other changes from being released?\n>>\n>> —\n>> You are receiving this because you are subscribed to this thread.\n>> Reply to this email directly, view it on GitHub\n>> <https://github.com/psf/requests/pull/5783#issuecomment-860510546>, or\n>> unsubscribe\n>> <https://github.com/notifications/unsubscribe-auth/AUCGM53SLAID7PCJ7LV3QULTSW6YBANCNFSM42AFENQQ>\n>> .\n>>\n>\n"
] |
https://api.github.com/repos/psf/requests/issues/5782
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5782/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5782/comments
|
https://api.github.com/repos/psf/requests/issues/5782/events
|
https://github.com/psf/requests/issues/5782
| 841,548,138 |
MDU6SXNzdWU4NDE1NDgxMzg=
| 5,782 |
AttributeError: type object 'Retry' has no attribute 'other'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/23280179?v=4",
"events_url": "https://api.github.com/users/dkbarn/events{/privacy}",
"followers_url": "https://api.github.com/users/dkbarn/followers",
"following_url": "https://api.github.com/users/dkbarn/following{/other_user}",
"gists_url": "https://api.github.com/users/dkbarn/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/dkbarn",
"id": 23280179,
"login": "dkbarn",
"node_id": "MDQ6VXNlcjIzMjgwMTc5",
"organizations_url": "https://api.github.com/users/dkbarn/orgs",
"received_events_url": "https://api.github.com/users/dkbarn/received_events",
"repos_url": "https://api.github.com/users/dkbarn/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/dkbarn/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/dkbarn/subscriptions",
"type": "User",
"url": "https://api.github.com/users/dkbarn",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-26T03:08:14Z
|
2021-08-27T00:08:29Z
|
2021-03-26T18:30:08Z
|
NONE
|
resolved
|
I am using requests 2.25.1 with urllib3 1.26.3. Occasionally I get the following stacktrace. Note that the program is multi-threaded but I have all calls to ```session.get()``` inside a ```with threading.Lock()``` so there should not be concurrent access to a ```requests.Session``` object.
```
Traceback (most recent call last):
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/connectionpool.py", line 445, in _make_request
six.raise_from(e, None)
File "<string>", line 3, in raise_from
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/connectionpool.py", line 440, in _make_request
httplib_response = conn.getresponse()
File "/usr/lib/python3.8/http/client.py", line 1347, in getresponse
response.begin()
File "/usr/lib/python3.8/http/client.py", line 307, in begin
version, status, reason = self._read_status()
File "/usr/lib/python3.8/http/client.py", line 268, in _read_status
line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
File "/usr/lib/python3.8/socket.py", line 669, in readinto
return self._sock.recv_into(b)
File "/usr/lib/python3.8/ssl.py", line 1241, in recv_into
return self.read(nbytes, buffer)
File "/usr/lib/python3.8/ssl.py", line 1099, in read
return self._sslobj.read(len, buffer)
ConnectionResetError: [Errno 104] Connection reset by peer
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/util/retry.py", line 595, in __getattr__
return getattr(super(Retry, self), item)
AttributeError: 'super' object has no attribute 'other'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/mixins/multithreaded.py", line 114, in run
self._run()
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/mixins/multithreaded.py", line 229, in _run
self.worker.update_users(update_requests)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/mixins/account.py", line 101, in update_users
self.__fetch_users(user_requests)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/mixins/account.py", line 66, in __fetch_users
users_by_id, fetch_time = self._fetch_users_by_id(external_ids)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/ti.py", line 270, in _fetch_users_by_id
user = self.client_pool.make_request("get_user", external_id)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/client_pool.py", line 188, in make_request
client, response = self.make_request_get_client(funcname, *args, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/client_pool.py", line 177, in make_request_get_client
response = client.make_request(funcname, *args, **kwargs, retry_on_throttled=retry_on_throttled)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/worker/ti.py", line 128, in make_request
return super(TiClientWrapper, self).make_request(funcname, *args, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/wdb/client_pool.py", line 82, in make_request
response = getattr(self.client, funcname)(*args, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/apiclient/ti/_client.py", line 274, in get_user
payload = self.__make_api_request(USER_URL.format(user_id=user_id), "get")
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/apiclient/ti/_client.py", line 299, in __make_api_request
response = request_with_retry(
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/dkbarn/apiclient/__init__.py", line 48, in request_with_retry
result = session.get(url, params=args, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/requests/sessions.py", line 555, in get
return self.request('GET', url, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/connectionpool.py", line 755, in urlopen
retries = retries.increment(
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/util/retry.py", line 516, in increment
other = self.other
File "/opt/venv/dkbarn-py38/lib/python3.8/site-packages/urllib3/util/retry.py", line 597, in __getattr__
return getattr(Retry, item)
AttributeError: type object 'Retry' has no attribute 'other'
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5782/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5782/timeline
| null |
completed
| null | null | false |
[
"So the fundamental problem here is that `ConnectionResetError: [Errno 104] Connection reset by peer` while handling that exception we're running into a different exception that's unexpected.\r\n\r\nAll of that said, you have discarded the issue template intended to help us help you. I'm closing this. If you can provide that information on this issue, we can re-open it.\r\n\r\nIt does seem, like perhaps urllib3's [`Retry` object](https://github.com/urllib3/urllib3/blob/a8913042b676c510e94fc2b097f6b514ae11a537/src/urllib3/util/retry.py#L74) isn't fully thread-safe which, if I read between the lines, you're using in `request_with_retry` (I'm also assuming `dkbarn/apiclient/ti/_client.py` is your package given it bears your name). In the end, it may be that this is a more suitable bug report for urllib3."
] |
https://api.github.com/repos/psf/requests/issues/5781
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5781/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5781/comments
|
https://api.github.com/repos/psf/requests/issues/5781/events
|
https://github.com/psf/requests/issues/5781
| 837,455,034 |
MDU6SXNzdWU4Mzc0NTUwMzQ=
| 5,781 |
Why not support Selector or BeautifulSoup to the model of Response like scrapy?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/67509508?v=4",
"events_url": "https://api.github.com/users/fango6/events{/privacy}",
"followers_url": "https://api.github.com/users/fango6/followers",
"following_url": "https://api.github.com/users/fango6/following{/other_user}",
"gists_url": "https://api.github.com/users/fango6/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fango6",
"id": 67509508,
"login": "fango6",
"node_id": "MDQ6VXNlcjY3NTA5NTA4",
"organizations_url": "https://api.github.com/users/fango6/orgs",
"received_events_url": "https://api.github.com/users/fango6/received_events",
"repos_url": "https://api.github.com/users/fango6/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fango6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fango6/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fango6",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-03-22T08:23:14Z
|
2021-08-27T00:08:29Z
|
2021-03-22T09:07:53Z
|
NONE
|
resolved
|
Easily parse the response data, e.g.:
```python
response = requests.get("https://github.com", verify=False)
title = response.xpath("//head/title/text()").get()
print(title)
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/67509508?v=4",
"events_url": "https://api.github.com/users/fango6/events{/privacy}",
"followers_url": "https://api.github.com/users/fango6/followers",
"following_url": "https://api.github.com/users/fango6/following{/other_user}",
"gists_url": "https://api.github.com/users/fango6/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/fango6",
"id": 67509508,
"login": "fango6",
"node_id": "MDQ6VXNlcjY3NTA5NTA4",
"organizations_url": "https://api.github.com/users/fango6/orgs",
"received_events_url": "https://api.github.com/users/fango6/received_events",
"repos_url": "https://api.github.com/users/fango6/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/fango6/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/fango6/subscriptions",
"type": "User",
"url": "https://api.github.com/users/fango6",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5781/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5781/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/5780
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5780/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5780/comments
|
https://api.github.com/repos/psf/requests/issues/5780/events
|
https://github.com/psf/requests/pull/5780
| 836,264,546 |
MDExOlB1bGxSZXF1ZXN0NTk2Nzk2NjAy
| 5,780 |
Update setup.py
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/42208098?v=4",
"events_url": "https://api.github.com/users/armangido/events{/privacy}",
"followers_url": "https://api.github.com/users/armangido/followers",
"following_url": "https://api.github.com/users/armangido/following{/other_user}",
"gists_url": "https://api.github.com/users/armangido/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/armangido",
"id": 42208098,
"login": "armangido",
"node_id": "MDQ6VXNlcjQyMjA4MDk4",
"organizations_url": "https://api.github.com/users/armangido/orgs",
"received_events_url": "https://api.github.com/users/armangido/received_events",
"repos_url": "https://api.github.com/users/armangido/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/armangido/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/armangido/subscriptions",
"type": "User",
"url": "https://api.github.com/users/armangido",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-19T18:38:13Z
|
2021-08-27T00:08:44Z
|
2021-03-19T22:09:29Z
|
CONTRIBUTOR
|
resolved
|
Removed `import re` as unnecessary import
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5780/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5780/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5780.diff",
"html_url": "https://github.com/psf/requests/pull/5780",
"merged_at": "2021-03-19T22:09:29Z",
"patch_url": "https://github.com/psf/requests/pull/5780.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5780"
}
| true |
[
"Thank you! 🎉 "
] |
https://api.github.com/repos/psf/requests/issues/5779
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5779/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5779/comments
|
https://api.github.com/repos/psf/requests/issues/5779/events
|
https://github.com/psf/requests/pull/5779
| 836,016,285 |
MDExOlB1bGxSZXF1ZXN0NTk2NTg3NDA4
| 5,779 |
sets a default timeout and resolves #3070
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3444196?v=4",
"events_url": "https://api.github.com/users/grintor/events{/privacy}",
"followers_url": "https://api.github.com/users/grintor/followers",
"following_url": "https://api.github.com/users/grintor/following{/other_user}",
"gists_url": "https://api.github.com/users/grintor/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/grintor",
"id": 3444196,
"login": "grintor",
"node_id": "MDQ6VXNlcjM0NDQxOTY=",
"organizations_url": "https://api.github.com/users/grintor/orgs",
"received_events_url": "https://api.github.com/users/grintor/received_events",
"repos_url": "https://api.github.com/users/grintor/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/grintor/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/grintor/subscriptions",
"type": "User",
"url": "https://api.github.com/users/grintor",
"user_view_type": "public"
}
|
[
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] | null | 5 |
2021-03-19T13:50:35Z
|
2022-01-03T15:05:49Z
| null |
NONE
| null |
This sets a (very high) default timeout which is guaranteed not to create any breaking changes while also fixing the longstanding issue of requests possibly hanging indefinitely in any script where the dev forgot to include a timeout. I would like to urge this commit make it into the next minor release version with the possibility of lowering the timeout to 300 seconds in the next major release version.
Please see the discussion at https://github.com/psf/requests/issues/3070
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 4,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/5779/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5779/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5779.diff",
"html_url": "https://github.com/psf/requests/pull/5779",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5779.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5779"
}
| true |
[
"I'd vote for going straight to whatever timeouts browsers have as the default and skipping the intermediate step your have here, but anything will be an important improvement.",
"There are - unfortunately - people who've been made to feel unsafe contributing on issue #3070 who have emailed me in the past to urge this not be changed in a minor release. This is - regardless of what you think - backwards incompatible and thus not capable of being accepted under our policies for compatibility in minor and patch releases.",
"That's fine. Can we do a major release for this then? ",
"That is unfortunate that those people felt unsafe in the discussion. I know people can be terrible sometimes, and I wish everyone could always remain civil. While I find it far-fetched that there is a request takes over 15 minutes to respond, (and the server application does not time-out the connection itself first) I don't find it impossible, so I can understand why the change is considered to be not backwards compatible. I hope that you will still make the 300ms timeout in the next major release though.\r\n\r\nOn the other hand, there is an even more minor change, which I feel certain is not a breaking one, that I would like to propose. Setting a default value for the *Connect* timeout, (while leaving the read timeout indefinite), and making it's value higher than the TCP connect timeout of any supported operating system, could not possibly cause a breaking change and would prevent most deadlocks.\r\n\r\nHave a look at this image from my Ubuntu server: \r\n\r\n\r\n\r\nAs you can see, it timed out in 130 seconds. Now, I know what you may be thinking, \"that's just the ssh application timing out, and does not reflect some hard boundary set by the OS\". But have a look at this:\r\n\r\n\r\n\r\nIt times out because 130 seconds is the hard-limit on TCP connections in Linux (which is actually [documented here](https://blog.cloudflare.com/when-tcp-sockets-refuse-to-die/#synsent:~:text=the%20whole%20process%20takes%20130%20seconds%2C%20until%20the%20kernel%20gives%20up%20with%20the%20ETIMEDOUT%20errno.))\r\n\r\nIf you perform a similar test on windows, you will find the timeout is 21 seconds (which is actually [documented here](https://docs.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2003/cc739819(v=ws.10)?redirectedfrom=MSDN#tcpmaxconnectretransmissions:~:text=(decimal)-,Default%3A%202,Description%3A%20This%20parameter%20determines%20the%20number%20of%20times%20that%20TCP%20retransmits%20a%20connect%20request%20(a%20SYN%20segment)%20before%20aborting%20the%20attempt.%20The%20retransmission%20time%2Dout%20is%20doubled%20with%20each%20successive%20retransmission%20in%20a%20given%20connect%20attempt.%20The%20initial%20time%2Dout%20is%20controlled%20by%20the%20TcpInitialRtt%20registry%20value))\r\n\r\nAnd for macOS, the value is 60 seconds. I can't find documentation on it, but you can repeat the test from the screenshots and see.\r\n\r\nSo you see that having a connection timeout of greater than 131 seconds serves no purpose because the underlying TCP connection will have been torn down by the OS at that point.\r\n\r\nI think that having the library waiting on a connection that can't possibly exist can be considered a bug and should be fixed in a non-major release. It will prevent deadlocks in most situations and can't possibly break anything.\r\n\r\nI will edit my pull request for these changes.\r\n",
"@grintor and @sigmavirus24, I'm going to move back to #3070 to try to move the conversation forward on this, since I feel like we lack direction on this issue. Maybe we can get consensus around the fix over there, and then implement it here. "
] |
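The per-OS connect caps described in the last comment above (about 130 s on Linux, 21 s on Windows, 60 s on macOS) can be sketched as a small clamping helper. This is an illustrative assumption, not part of requests or urllib3; the function name, table, and fallback value are invented for the example:

```python
# Hypothetical sketch: clamp a user-supplied connect timeout to the OS TCP
# SYN give-up time, past which waiting serves no purpose because the kernel
# has already torn the connection attempt down.
OS_TCP_CONNECT_CAP = {
    "linux": 130.0,    # kernel gives up with ETIMEDOUT after ~130 s
    "windows": 21.0,   # TcpMaxConnectRetransmissions default behaviour
    "macos": 60.0,     # observed empirically; not documented
}

def effective_connect_timeout(requested, platform):
    """Return the connect timeout that can actually take effect."""
    cap = OS_TCP_CONNECT_CAP.get(platform, 130.0)  # assume the largest known cap
    if requested is None:
        return cap  # an "indefinite" wait still ends when the OS gives up
    return min(requested, cap)
```

Under this sketch, a library default above ~131 seconds is indistinguishable from no timeout at all, which is the comment's point.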
https://api.github.com/repos/psf/requests/issues/5778
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5778/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5778/comments
|
https://api.github.com/repos/psf/requests/issues/5778/events
|
https://github.com/psf/requests/pull/5778
| 835,656,016 |
MDExOlB1bGxSZXF1ZXN0NTk2MjgxNzQx
| 5,778 |
Cleanup redundant allow_redirects kwargs in api
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12206611?v=4",
"events_url": "https://api.github.com/users/laggardkernel/events{/privacy}",
"followers_url": "https://api.github.com/users/laggardkernel/followers",
"following_url": "https://api.github.com/users/laggardkernel/following{/other_user}",
"gists_url": "https://api.github.com/users/laggardkernel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/laggardkernel",
"id": 12206611,
"login": "laggardkernel",
"node_id": "MDQ6VXNlcjEyMjA2NjEx",
"organizations_url": "https://api.github.com/users/laggardkernel/orgs",
"received_events_url": "https://api.github.com/users/laggardkernel/received_events",
"repos_url": "https://api.github.com/users/laggardkernel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/laggardkernel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/laggardkernel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/laggardkernel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-03-19T07:38:02Z
|
2021-08-27T00:08:44Z
|
2021-03-19T16:44:20Z
|
CONTRIBUTOR
|
resolved
|
Not a big problem, just removing the redundant param `allow_redirects` from the API functions `get()` and `options()`, because the underlying `Session.request()` call already defaults `allow_redirects` to `True`.
Only leaves
```python
def head(url, **kwargs):
    kwargs.setdefault('allow_redirects', False)
    return request('head', url, **kwargs)
```
in `api.py`, to highlight that the HTTP method `head()` is the only one that disallows redirects by default.
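A minimal sketch of the behaviour this change relies on (simplified stand-ins for illustration, not the real `requests.api` code):

```python
# Simplified stand-ins for requests.api, illustrating why the explicit
# kwargs were redundant: the True default comes from the underlying request().
def request(method, url, **kwargs):
    allow_redirects = kwargs.pop('allow_redirects', True)  # Session.request() default
    return (method, url, allow_redirects)

def get(url, **kwargs):
    # no setdefault needed; request() already defaults to True
    return request('get', url, **kwargs)

def head(url, **kwargs):
    # head() remains the only verb that disables redirects by default
    kwargs.setdefault('allow_redirects', False)
    return request('head', url, **kwargs)
```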
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5778/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5778/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5778.diff",
"html_url": "https://github.com/psf/requests/pull/5778",
"merged_at": "2021-03-19T16:44:20Z",
"patch_url": "https://github.com/psf/requests/pull/5778.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5778"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5777
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5777/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5777/comments
|
https://api.github.com/repos/psf/requests/issues/5777/events
|
https://github.com/psf/requests/pull/5777
| 834,555,224 |
MDExOlB1bGxSZXF1ZXN0NTk1MzQ2OTcz
| 5,777 |
quickstart.rst: r.json() can raise JSONDecodeError on Py3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3709715?v=4",
"events_url": "https://api.github.com/users/cclauss/events{/privacy}",
"followers_url": "https://api.github.com/users/cclauss/followers",
"following_url": "https://api.github.com/users/cclauss/following{/other_user}",
"gists_url": "https://api.github.com/users/cclauss/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cclauss",
"id": 3709715,
"login": "cclauss",
"node_id": "MDQ6VXNlcjM3MDk3MTU=",
"organizations_url": "https://api.github.com/users/cclauss/orgs",
"received_events_url": "https://api.github.com/users/cclauss/received_events",
"repos_url": "https://api.github.com/users/cclauss/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cclauss/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cclauss/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cclauss",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-18T09:06:28Z
|
2021-08-27T00:08:45Z
|
2021-03-18T14:36:35Z
|
CONTRIBUTOR
|
resolved
|
% `python2 -c "import requests ; requests.get('https://github.com').json()"`
% `python3 -c "import requests ; requests.get('https://github.com').json()"`
Python | Simplejson is installed | Simplejson is NOT installed
--- | --- | ---
Python 2 | simplejson.JSONDecodeError | ValueError
Python 3 | simplejson.JSONDecodeError | json.JSONDecodeError
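Since every variant in the table is a `ValueError` subclass, callers can handle all four cases portably; `parse_or_none` below is an illustrative helper, not a requests API:

```python
import json

# On Python 3 without simplejson, r.json() raises json.JSONDecodeError;
# with simplejson installed it raises simplejson.JSONDecodeError; on
# Python 2 without simplejson it is a plain ValueError. All of these
# subclass ValueError, so catching ValueError covers every case.
def parse_or_none(text):
    try:
        return json.loads(text)
    except ValueError:  # also catches json/simplejson JSONDecodeError
        return None
```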
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5777/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5777/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5777.diff",
"html_url": "https://github.com/psf/requests/pull/5777",
"merged_at": "2021-03-18T14:36:35Z",
"patch_url": "https://github.com/psf/requests/pull/5777.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5777"
}
| true |
[
"Thank you! 🎉 "
] |
https://api.github.com/repos/psf/requests/issues/5776
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5776/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5776/comments
|
https://api.github.com/repos/psf/requests/issues/5776/events
|
https://github.com/psf/requests/pull/5776
| 833,712,902 |
MDExOlB1bGxSZXF1ZXN0NTk0NjYyNzU4
| 5,776 |
Added explanation why the timeout is doubled.(#5773)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/80456253?v=4",
"events_url": "https://api.github.com/users/cpyberry/events{/privacy}",
"followers_url": "https://api.github.com/users/cpyberry/followers",
"following_url": "https://api.github.com/users/cpyberry/following{/other_user}",
"gists_url": "https://api.github.com/users/cpyberry/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cpyberry",
"id": 80456253,
"login": "cpyberry",
"node_id": "MDQ6VXNlcjgwNDU2MjUz",
"organizations_url": "https://api.github.com/users/cpyberry/orgs",
"received_events_url": "https://api.github.com/users/cpyberry/received_events",
"repos_url": "https://api.github.com/users/cpyberry/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cpyberry/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cpyberry/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cpyberry",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 8 |
2021-03-17T12:21:12Z
|
2022-01-03T15:22:26Z
| null |
NONE
| null |
## issue
#5773
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5776/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5776/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5776.diff",
"html_url": "https://github.com/psf/requests/pull/5776",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5776.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5776"
}
| true |
[
"@sigmavirus24\n\nI added what the user should do and made the phenomenon stand out.\n\nIf you have time, I'd be happy if you could see it\n",
"Hi,\r\n\r\nI would like to describe what I understand of what you want to publish and, possibly, suggest different wording to be more accurate.\r\n\r\nIn networking we have a reference model set by ISO, called the OSI Reference Model; it is a 7-layer reference model that, per ISO, should make any compliant systems interoperable.\r\n\r\nThat said, HTTP/HTTPS is a high-L4 protocol, while a connection is a high-L3/low-L4 pair. Based on this, a read timeout really means that once all the higher security layers have been passed, a session with a server has been established, and a request has been made, the absence of any protocol response status within the time set as the timeout should be considered a read timeout. A connection timeout means that the endpoint the originating point is requesting, or already has an established connection with, does not respond within the defined timeout.\r\n\r\nHTTP keep-alive is a way to invoke the TCP keep-alive mechanism to take care of that connection, but since the RFC-standard default is impractical, endpoints commonly drop the connection without notification, so what is really a read timeout gets converted into a connection timeout.\r\n\r\nAnd finally, just to be crystal clear: the timeout is not being doubled by the client; it is really determined by the server-side response.\r\n\r\nCheers",
"I will use it as a reference.\n\nI'm really thankful to you",
"I made some changes so that some people could understand",
"The IETF standard defines a 2-hour lapse before TCP keep-alive packets start being sent. That is obviously too long, so server administrators decrease the value to allocate the available resources properly; e.g., Netflix uses a 4-minute timeout instead of 2 hours. So no matter whether you have HTTP keep-alive enabled, the server will drop the connection on its side after 4 minutes, and when the client tries to get data on an open socket it will fail, because that socket no longer exists. (BTW, Netflix has JavaScript code that changes the connection timeout to 2 minutes.)\r\n\r\nSo setting the requests/urllib3 timeout parameter won't make any difference, since the server might drop the socket before the client reaches its threshold",
"Wow!\nI learned for the first time that the initial value of keepalive was 2 hours.",
"@cpyberry I don't understand the documentation you've added here any longer. It's vague, unclear, and roundabout. You've been given clear suggestions that you can simply apply and would get this PR merged. Please either update this or close it, as I will not accept these updates in their current state",
"@sigmavirus24 \nFirst of all, I apologize for leaving this pull request for several months.\n\nI want to commit the changes you have suggested, but my stupid additional commits have made it an \"old suggestion\".\n\nApparently, github can only commit latest suggestions.\n\nIs it possible to make the same proposal again?\n\nNext time I will not do stupid things."
] |
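The TCP keep-alive mechanism discussed in the comments above is enabled per socket; a minimal sketch (the 2-hour idle default lives in OS tunables such as Linux's `tcp_keepalive_time`, not in this option itself):

```python
import socket

def keepalive_socket():
    # Turn on TCP keep-alive probes for an unconnected socket. How long the
    # connection must sit idle before the first probe is an OS-level setting.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1)
    return s
```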
https://api.github.com/repos/psf/requests/issues/5775
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5775/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5775/comments
|
https://api.github.com/repos/psf/requests/issues/5775/events
|
https://github.com/psf/requests/issues/5775
| 833,010,180 |
MDU6SXNzdWU4MzMwMTAxODA=
| 5,775 |
RFC6874 IPv6 Zone Identifiers in urls not parsed correctly
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/80776399?v=4",
"events_url": "https://api.github.com/users/ph1l1p139/events{/privacy}",
"followers_url": "https://api.github.com/users/ph1l1p139/followers",
"following_url": "https://api.github.com/users/ph1l1p139/following{/other_user}",
"gists_url": "https://api.github.com/users/ph1l1p139/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ph1l1p139",
"id": 80776399,
"login": "ph1l1p139",
"node_id": "MDQ6VXNlcjgwNzc2Mzk5",
"organizations_url": "https://api.github.com/users/ph1l1p139/orgs",
"received_events_url": "https://api.github.com/users/ph1l1p139/received_events",
"repos_url": "https://api.github.com/users/ph1l1p139/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ph1l1p139/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ph1l1p139/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ph1l1p139",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-16T16:52:53Z
|
2021-08-27T00:08:30Z
|
2021-03-16T18:18:41Z
|
NONE
|
resolved
|
[RFC6874 ](https://tools.ietf.org/html/rfc6874)updates [RFC3986 ](https://tools.ietf.org/html/rfc3986) URI Syntax to support IPv6 Zone Identifiers. Zone identifiers are required when working with link local IPv6 addresses on a machine with multiple network adapters.
RFC6874 updates the URI ABNF to include:
> IP-literal = "[" ( IPv6address / IPv6addrz / IPvFuture ) "]"
> ZoneID = 1*( unreserved / pct-encoded )
> IPv6addrz = IPv6address "%25" ZoneID
Windows systems use integers for the zone identifiers (Linux systems often use interface names instead).
When performing a GET request to a link-local address with zone ID "37", formatted as per RFC6874 (e.g. http://[fe80::1234%2537]), the percent-encoding seems to be interpreted incorrectly and requests attempts to access fe80::12347, which is an invalid IPv6 address.
## Expected Result
Get request to "fe80::1234%37" performed correctly
## Actual Result
urllib3 LocationParseError and requests.exceptions.InvalidURL exceptions are generated.
```
>>> r = requests.get("http://[fe80::1234%2537]")
Traceback (most recent call last):
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 412, in send
    conn = self.get_connection(request.url, proxies)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 315, in get_connection
    conn = self.poolmanager.connection_from_url(url)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\urllib3\poolmanager.py", line 297, in connection_from_url
    u = parse_url(url)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\urllib3\util\url.py", line 392, in parse_url
    return six.raise_from(LocationParseError(source_url), None)
  File "<string>", line 3, in raise_from
urllib3.exceptions.LocationParseError: Failed to parse: http://[fe80::12347]/

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\username_removed\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 414, in send
    raise InvalidURL(e, request=request)
requests.exceptions.InvalidURL: Failed to parse: http://[fe80::12347]/
```
It seems that part of the zone ID is concatenated onto the IPv6 address and the % removed altogether.
## Reproduction Steps
```python
import requests
r = requests.get("http://[fe80::1234%2537]")
```
Note that zone identifiers vary between PCs/OSs and sometimes over reboots, so it is unlikely that 37 will be a valid Zone ID on most systems. However, the connection should fail with "[WinError 10049] The requested address is not valid in its context" rather than an InvalidURL parse error.
## Other Notes
The correct URL can be accessed by escaping the Zone ID twice, e.g.:
```python
import requests
r = requests.get("http://[fe80::1234%252537]")
```
This will attempt to access the correct address fe80::1234%37. Perhaps percent-encoding is being performed twice?
Exact behaviour depends on the specific zone ID used, for instance some zone IDs work with or without the percent encoded % character:
- r = requests.get("http://[fe80::1234%1]")
- r = requests.get("http://[fe80::1234%251]")
- r = requests.get("http://[fe80::1234%25251]")
All attempt a get request from fe80::1234%1
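For reference, the RFC 6874 rule the report describes can be sketched as a tiny helper (`rfc6874_host` is an illustrative name, not part of requests or urllib3):

```python
from urllib.parse import quote

def rfc6874_host(ipv6_address, zone_id):
    # RFC 6874: the '%' separating the address from the ZoneID must itself
    # be percent-encoded as '%25' when written inside a URI host, and the
    # ZoneID may only contain unreserved or percent-encoded characters.
    return '[{}%25{}]'.format(ipv6_address, quote(zone_id, safe=''))
```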
## urllib3 behaviour
urllib3 seems to perform in accordance with RFC6874:
```
>>> import urllib3
>>> urllib3.util.parse_url("http://[fe80::1234%2537]")
Url(scheme='http', auth=None, host='[fe80::1234%37]', port=None, path=None, query=None, fragment=None)
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.2"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010109f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5775/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5775/timeline
| null |
completed
| null | null | false |
[
"Duplicate of https://github.com/psf/requests/issues/5126"
] |
https://api.github.com/repos/psf/requests/issues/5774
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5774/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5774/comments
|
https://api.github.com/repos/psf/requests/issues/5774/events
|
https://github.com/psf/requests/pull/5774
| 831,906,368 |
MDExOlB1bGxSZXF1ZXN0NTkzMTU0Mjc4
| 5,774 |
#5773: Fixed the number of seconds specified for timeout being doubled
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/80456253?v=4",
"events_url": "https://api.github.com/users/cpyberry/events{/privacy}",
"followers_url": "https://api.github.com/users/cpyberry/followers",
"following_url": "https://api.github.com/users/cpyberry/following{/other_user}",
"gists_url": "https://api.github.com/users/cpyberry/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cpyberry",
"id": 80456253,
"login": "cpyberry",
"node_id": "MDQ6VXNlcjgwNDU2MjUz",
"organizations_url": "https://api.github.com/users/cpyberry/orgs",
"received_events_url": "https://api.github.com/users/cpyberry/received_events",
"repos_url": "https://api.github.com/users/cpyberry/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cpyberry/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cpyberry/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cpyberry",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-03-15T15:07:31Z
|
2021-08-27T00:08:45Z
|
2021-03-15T17:18:19Z
|
NONE
|
resolved
|
## issue
#5773
## changes
Create a subclass that divides the connect argument of TimeoutSauce by 2.
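A hedged sketch of that idea (illustrative only, not the actual patch; note it leaves `None` untouched, which the implementation under review reportedly did not):

```python
# Illustrative sketch of the proposal: halve the connect timeout so that
# two connection attempts (e.g. IPv6 then IPv4) stay within the total the
# caller asked for. Not the actual code from the PR.
class HalvedConnectTimeout:
    def __init__(self, connect=None, read=None):
        self.connect = None if connect is None else connect / 2.0
        self.read = read
```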
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5774/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5774/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5774.diff",
"html_url": "https://github.com/psf/requests/pull/5774",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5774.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5774"
}
| true |
[
"IMHO blindly halving the specified timeout is not a good strategy, especially if the underlying issue is still unknown.\r\n\r\nBesides, there are a few issues with that implementation, from not handling a `None` value to shadowing the imported name",
"Thank you for looking at my code.",
"I forgot about handling None."
] |
https://api.github.com/repos/psf/requests/issues/5773
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5773/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5773/comments
|
https://api.github.com/repos/psf/requests/issues/5773/events
|
https://github.com/psf/requests/issues/5773
| 831,807,572 |
MDU6SXNzdWU4MzE4MDc1NzI=
| 5,773 |
Connection timeout (not Read, Wall or Total) is consistently taking twice as long
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/992317?v=4",
"events_url": "https://api.github.com/users/MestreLion/events{/privacy}",
"followers_url": "https://api.github.com/users/MestreLion/followers",
"following_url": "https://api.github.com/users/MestreLion/following{/other_user}",
"gists_url": "https://api.github.com/users/MestreLion/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/MestreLion",
"id": 992317,
"login": "MestreLion",
"node_id": "MDQ6VXNlcjk5MjMxNw==",
"organizations_url": "https://api.github.com/users/MestreLion/orgs",
"received_events_url": "https://api.github.com/users/MestreLion/received_events",
"repos_url": "https://api.github.com/users/MestreLion/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/MestreLion/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MestreLion/subscriptions",
"type": "User",
"url": "https://api.github.com/users/MestreLion",
"user_view_type": "public"
}
|
[
{
"color": "fad8c7",
"default": false,
"description": null,
"id": 136616769,
"name": "Documentation",
"node_id": "MDU6TGFiZWwxMzY2MTY3Njk=",
"url": "https://api.github.com/repos/psf/requests/labels/Documentation"
}
] |
closed
| false | null |
[] | null | 11 |
2021-03-15T13:23:45Z
|
2023-12-28T02:03:33Z
|
2023-12-16T13:30:21Z
|
CONTRIBUTOR
| null |
I'm aware that several issues related to timeout were opened (and closed) before, so I'm trying to narrow this report down to a very specific scope: **connection** timeout is behaving in a consistent, wrong way: it times out at _precisely_ **twice** the requested time.
Results below are _so_ consistent we must acknowledge there is _something_ going on here! I **beg** you guys not to dismiss this report before taking a look at it!
What this report is **not** about:
- Total/Wall timeout:
That would be a nice feature, but I'm fully aware this is currently outside the scope of Requests. I'm focusing on _connection_ timeout only.
- Read timeout:
All my tests were using http://google.com:81, which fails to even _connect_. There's no read involved, the server exists but never responds, not even to refuse the connection. No data is ever transmitted. No HTTP connection is ever established. This is **not** about `ReadTimeoutError`, this is about `ConnectTimeoutError`.
- Accurate timings / network fluctuations:
Not asking for millisecond precision. I don't even care about _whole seconds_ imprecision. But, surprisingly, `requests` is being incredibly accurate... to _twice_ the time.
## Expected Result
`requests.get('http://google.com:81', timeout=(4, 1))` should take _approximately_ 4 seconds to timeout.
## Actual Result
It _consistently_ takes about 8.0 seconds to raise `requests.ConnectTimeout`. It always takes twice the time, for timeouts ranging from 1 to 100. Exception message clearly says in the end: `Connection to google.com timed out. (connect timeout=4)`, a very distinct message from read timeouts.
## Reproduction Steps
```python
import requests, time, os, sys

# Using a known URL to test connection timeout
def test_timeout(timeout, url='http://google.com:81'):
    start = time.time()
    try:
        requests.get(url, timeout=timeout)
        print("OK!")  # will never reach this...
    except requests.ConnectTimeout:  # any other exception will bubble out
        print('{}: {:.1f}'.format(timeout, time.time()-start))

print("\n1 to 10, simple numeric timeout")
for i in range(1, 11):
    test_timeout(i)

print("\n1 to 10, (x, 1) timeout tuple")
for i in range(1, 11):
    test_timeout((i, 1))

print("\n1 to 10, (x, 10) timeout tuple")
for i in range(1, 11):
    test_timeout((i, 10))

print("\nLarge timeouts")
for i in (20, 30, 50, 100):
    test_timeout((i, 1))
```
Results:
```
Linux desktop 5.4.0-66-generic #74~18.04.2-Ubuntu SMP Fri Feb 5 11:17:31 UTC 2021 x86_64
3.6.9 (default, Jan 26 2021, 15:33:00)
[GCC 8.4.0]
Requests: 2.25.1
Urllib3: 1.26.3
1 to 10, simple numeric timeout
1: 2.0
2: 4.0
3: 6.0
4: 8.0
5: 10.0
6: 12.0
7: 14.0
8: 16.0
9: 18.0
10: 20.0
1 to 10, (x, 1) timeout tuple
(1, 1): 2.0
(2, 1): 4.0
(3, 1): 6.0
(4, 1): 8.0
(5, 1): 10.0
(6, 1): 12.0
(7, 1): 14.0
(8, 1): 16.0
(9, 1): 18.0
(10, 1): 20.0
1 to 10, (x, 10) timeout tuple
(1, 10): 2.0
(2, 10): 4.0
(3, 10): 6.0
(4, 10): 8.0
(5, 10): 10.0
(6, 10): 12.0
(7, 10): 14.0
(8, 10): 16.0
(9, 10): 18.0
(10, 10): 20.0
Large timeouts
(20, 1): 40.0
(30, 1): 60.0
(50, 1): 100.1
(100, 1): 200.2
```
## System Information
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "3.2.1"
},
"idna": {
"version": "2.8"
},
"implementation": {
"name": "CPython",
"version": "3.6.9"
},
"platform": {
"release": "5.4.0-66-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "1010108f",
"version": "17.5.0"
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010100f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": true
}
```
It seems there **is** a single, "hidden", connection retry, performed by either `requests` or `urllib3`, somewhere along the line. It has been reported by other users on other platforms too.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/5773/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5773/timeline
| null |
completed
| null | null | false |
[
"When using `urllib3` directly, it seems to also give a very consistent timeout of **8 times** the requested one:\r\n\r\n```pycon\r\n>>> import urllib3, time\r\n>>> http = urllib3.PoolManager()\r\n>>> try:\r\n... start = time.time()\r\n... http.request('GET', 'http://google.com:81', timeout=1)\r\n... except urllib3.exceptions.MaxRetryError:\r\n... print(time.time() - start)\r\n... \r\n8.030268907546997\r\n>>> try:\r\n... start = time.time()\r\n... http.request('GET', 'http://google.com:81', timeout=2)\r\n... except urllib3.exceptions.MaxRetryError:\r\n... print(time.time() - start)\r\n... \r\n16.021981477737427\r\n>>> try:\r\n... start = time.time()\r\n... http.request('GET', 'http://google.com:81', timeout=3)\r\n... except urllib3.exceptions.MaxRetryError:\r\n... print(time.time() - start)\r\n... \r\n24.041129112243652\r\n```",
"Urllib3's [documentation on retrying requests](https://urllib3.readthedocs.io/en/latest/user-guide.html#retrying-requests) says:\r\n\r\n> urllib3 can automatically retry idempotent requests. This same mechanism also handles redirects. You can control the retries using the retries parameter to request(). By default, urllib3 will retry requests 3 times and follow up to 3 redirects.\r\n\r\n3 retries = 4 connection attempts; taking that into account still leads us to the same issue: _each attempt is taking twice as long to timeout_",
"And, regarding the last comment on duplicate issue #5760 by @sigmavirus24:\r\n\r\n> You're asking for a wall clock timeout and we can not provide that.\r\n\r\nI'm **not** asking for a wall clock timeout, nor a total request timeout. I'm asking for a fix in _Connection_ timeout. That double time is a bug.\r\n\r\n> Further we have clearly documented how timeouts work in our docs\r\n\r\nTrue, and my tests show that connection timeouts are not behaving as the documentation says they should. Where in the documentation is this double-time behavior explained?",
"I think I've found the culprit: **IPv6**! It seems requests/urllib3 is automatically trying to connect using both IPv4 and IPv6, and that accounts for the doubled time.\r\n\r\nI'll do some more tests to properly isolate the problem, as it seems requests is trying IPv6 _even when it's not available_, raising a `ConnectionError` with `Failed to establish a new connection: [Errno 101] Network is unreachable`, which is undesirable.",
"https://github.com/urllib3/urllib3/issues/2030",
"> [urllib3/urllib3#2030](https://github.com/urllib3/urllib3/issues/2030)\r\n\r\nI'm not sure that issue is related to this one. I'm not experiencing connections that are slower/faster depending on IP family, I'm experiencing an exact _double_ timeout time due to (trying to) connect using both IPv4 and IPv6, and failing in both families.\r\n\r\nIs this \"if IPv4 fails, retry using IPv6\" a known/expected behavior? Is this documented anywhere?",
"After more tests, the issue is really the dual IPv4/IPv6 connection attempts. Using the workaround proposed at by a [Stackoverflow answer](https://stackoverflow.com/a/46972341/624066) to force either IPv4 or IPv6 *only*, timeout behaves as expected:\r\n\r\n```python\r\n# Monkey-patch urllib3 to force IPv4-connections only.\r\n# Adapted from https://stackoverflow.com/a/46972341/624066\r\nimport socket\r\nimport urllib3.util.connection\r\ndef allowed_gai_family():\r\n\treturn socket.AF_INET\r\n\r\nimport urllib3\r\nimport requests\r\nimport time, os, sys\r\n\r\n# Using a know URL to test connection timeout\r\nURL='http://google.com:81'\r\n\r\nhttp = urllib3.PoolManager()\r\n\r\ndef test_urllib3_timeout(timeout, url=URL):\r\n start = time.time()\r\n try:\r\n http.request('GET', url, timeout=timeout, retries=0)\r\n print(\"OK!\")\r\n except urllib3.exceptions.MaxRetryError:\r\n print('{}: {:.1f}'.format(timeout, time.time()-start))\r\n\r\ndef test_requests_timeout(timeout, url=URL):\r\n start = time.time()\r\n try:\r\n requests.get(url, timeout=timeout)\r\n print(\"OK!\") # will never reach this...\r\n except requests.ConnectTimeout: # any other exception will bubble out\r\n print('{}: {:.1f}'.format(timeout, time.time()-start))\r\n\r\ndef test_timeouts():\r\n print(\"\\nUrllib3\")\r\n for i in range(1, 6):\r\n test_urllib3_timeout(i)\r\n\r\n print(\"\\nRequests\")\r\n for i in range(1, 6):\r\n test_requests_timeout((i, 1))\r\n\r\n\r\nprint(\"BEFORE PATCH:\")\r\ntest_timeouts()\r\n\r\nurllib3.util.connection.allowed_gai_family = allowed_gai_family\r\n\r\nprint(\"\\nAFTER PATCH:\")\r\ntest_timeouts()\r\n```\r\nResults:\r\n```\r\nBEFORE PATCH:\r\n\r\nUrllib3\r\n1: 2.0\r\n2: 4.0\r\n3: 6.0\r\n4: 8.0\r\n5: 10.0\r\n\r\nRequests\r\n(1, 1): 2.0\r\n(2, 1): 4.0\r\n(3, 1): 6.0\r\n(4, 1): 8.0\r\n(5, 1): 10.0\r\n\r\nAFTER PATCH:\r\n\r\nUrllib3\r\n1: 1.0\r\n2: 2.0\r\n3: 3.0\r\n4: 4.0\r\n5: 5.0\r\n\r\nRequests\r\n(1, 1): 1.0\r\n(2, 1): 2.0\r\n(3, 1): 3.0\r\n(4, 1): 4.0\r\n(5, 1): 
5.0\r\n",
"Still, I believe at least `requests` documentation should mention this behavior. for example, https://requests.readthedocs.io/en/latest/user/advanced/#timeouts could add something like this:\r\n\r\n> When the client has IPv6 support and the server has an IPv6 DNS record (AAAA), if the IPv4 connection fails the underlying `urllib3` will automatically retry using IPv6, which may lead to an effective connection timeout of twice the specified time, so take that into account when setting the connection timeout.",
"@MestreLion I would accept a PR adding an explanation of what can go wrong with timeouts like this",
"Just popping in here to say thanks for digging into this. This would explain my double timeout question on the other issue I opened too.\r\n\r\nI just tested by forcing IPv4 and the timeout is no longer doubled",
"explanation of what can be done to fix the timeouts"
] |
https://api.github.com/repos/psf/requests/issues/5772
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5772/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5772/comments
|
https://api.github.com/repos/psf/requests/issues/5772/events
|
https://github.com/psf/requests/issues/5772
| 830,110,982 |
MDU6SXNzdWU4MzAxMTA5ODI=
| 5,772 |
Session proxies definition raises error on GET method
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/31013072?v=4",
"events_url": "https://api.github.com/users/bbuzens/events{/privacy}",
"followers_url": "https://api.github.com/users/bbuzens/followers",
"following_url": "https://api.github.com/users/bbuzens/following{/other_user}",
"gists_url": "https://api.github.com/users/bbuzens/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bbuzens",
"id": 31013072,
"login": "bbuzens",
"node_id": "MDQ6VXNlcjMxMDEzMDcy",
"organizations_url": "https://api.github.com/users/bbuzens/orgs",
"received_events_url": "https://api.github.com/users/bbuzens/received_events",
"repos_url": "https://api.github.com/users/bbuzens/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bbuzens/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbuzens/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bbuzens",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-12T13:18:14Z
|
2021-11-26T03:00:37Z
|
2021-08-28T02:46:01Z
|
NONE
|
resolved
|
I want to pass a proxies definition through the session's `proxies` attribute.
It is described in [the documentation](https://2.python-requests.org/en/master/user/advanced/#proxies).
In my configuration, it raises a ProxyError.
## Expected Result
This should work
```Python
import requests
proxies = {
'http': 'http://10.10.1.10:3128',
'https': 'http://10.10.1.10:1080',
}
requests.get('http://example.org', proxies=proxies)
```
That too
```Python
import requests
proxies = {
'http': 'http://10.10.1.10:3128',
'https': 'http://10.10.1.10:1080',
}
session = requests.Session()
session.proxies.update(proxies)
session.get('http://example.org')
```
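Both call styles should end up with the same effective proxies: before sending, requests merges the session-level mapping with any per-request one. A stdlib-only sketch of that merge (modeled on the behavior of `requests.sessions.merge_setting`; the function here is illustrative, not the library's actual code):

```python
def merge_setting(request_setting, session_setting):
    """Merge a per-request setting over a session-level one.

    Session values form the base; per-request values override them
    (sketch of requests.sessions.merge_setting for dict settings).
    """
    if session_setting is None:
        return request_setting
    if request_setting is None:
        return session_setting
    merged = dict(session_setting)
    merged.update(request_setting)
    return merged


session_proxies = {'http': 'http://localhost:53128',
                   'https': 'http://localhost:53128'}

# Per-request proxies win over session proxies for the same scheme:
print(merge_setting({'https': 'http://other:8080'}, session_proxies))
# With no per-request proxies, the session mapping is used as-is:
print(merge_setting(None, session_proxies))
```

So a proxy that works when passed per-request should also work when set on the session, which is what makes the ProxyError below surprising.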
So, I wrote something like this:
```Python
import requests
proxies = {
'http': 'http://localhost:53128',
'https': 'http://localhost:53128',
}
session = requests.sessions.Session()
session_response = session.get('https://www.google.com', proxies=proxies).status_code
print(session_response)
session_with_proxies = requests.sessions.Session()
session_with_proxies.proxies.update(proxies)
print(session_with_proxies.proxies)
try:
session_with_proxies_response = session_with_proxies.get('https://www.google.com').status_code
except requests.exceptions.ProxyError:
print("Proxy error catched")
```
The *session_with_proxies_response* call should work, and no *"Proxy error catched"* message should be printed.
## Actual Result
And I got:
```
200
{'http': 'http://localhost:53128', 'https': 'http://localhost:53128', 'ftp': 'ftp://localhost:53128'}
Proxy error catched
Process finished with exit code 0
```
The `session_with_proxies.get('https://www.google.com').status_code` command raises a proxy error,
even though the proxy is defined as recommended by the documentation.
## Reproduction Steps
Install *requests*
```python
import requests
```
Run that script with a local proxy running (Squid or what you like...)
```Python
import requests
proxies = {
'http': 'http://localhost:53128',
'https': 'http://localhost:53128',
}
session = requests.sessions.Session()
session_response = session.get('https://www.google.com', proxies=proxies).status_code
print(session_response)
session_with_proxies = requests.sessions.Session()
session_with_proxies.proxies.update(proxies)
print(session_with_proxies.proxies)
session_with_proxies_response = session_with_proxies.get('https://www.google.com').status_code
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.3"
},
"platform": {
"release": "7",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010106f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
This command is only available on Requests v2.16.4 and greater. Otherwise,
please provide some basic information about your system (Python version,
operating system, &c).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5772/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5772/timeline
| null |
completed
| null | null | false |
[
"Hi @bbuzens, there's no information on the actual error you're receiving. From the information provided, it looks like there's a legitimate issue connecting to your proxy.\r\n\r\nIf you have more information on the exact traceback you're receiving we can take a look. I'm going to close this for now since there isn't anything actionable, feel free to reopen with more info."
] |
https://api.github.com/repos/psf/requests/issues/5771
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5771/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5771/comments
|
https://api.github.com/repos/psf/requests/issues/5771/events
|
https://github.com/psf/requests/issues/5771
| 828,958,525 |
MDU6SXNzdWU4Mjg5NTg1MjU=
| 5,771 |
Dependency on cryptography package breaks upgrades
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/203885?v=4",
"events_url": "https://api.github.com/users/jfinkhaeuser/events{/privacy}",
"followers_url": "https://api.github.com/users/jfinkhaeuser/followers",
"following_url": "https://api.github.com/users/jfinkhaeuser/following{/other_user}",
"gists_url": "https://api.github.com/users/jfinkhaeuser/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jfinkhaeuser",
"id": 203885,
"login": "jfinkhaeuser",
"node_id": "MDQ6VXNlcjIwMzg4NQ==",
"organizations_url": "https://api.github.com/users/jfinkhaeuser/orgs",
"received_events_url": "https://api.github.com/users/jfinkhaeuser/received_events",
"repos_url": "https://api.github.com/users/jfinkhaeuser/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jfinkhaeuser/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jfinkhaeuser/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jfinkhaeuser",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-03-11T09:20:22Z
|
2021-03-11T17:50:59Z
|
2021-03-11T12:42:23Z
|
NONE
|
resolved
|
Unfortunately, the maintainers of the `pyca/cryptography` package have made Rust a hard dependency in newer releases. Since they cut short any discussion about this (not your problem, but a problem for the Python community in general), I was not able to explain to them that a dependency that is required by default and must be manually disabled can be considered a hard dependency.
As a result, updating any package that depends on `cryptography` breaks. Yours happens to be a widely used one.
It's not your fault, for which I am sorry. But you can solve this for your package by keeping your requirement on `cryptography` restricted to versions that do not require Rust.
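One way to keep that restriction in a project's requirements file is a simple version cap (3.4 being, to my knowledge, the first cryptography release that requires a Rust toolchain — worth verifying against the cryptography changelog):

```
requests[security]
cryptography<3.4
```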
## Expected Result
When I `pip install -U requests[security]`, the update works.
## Actual Result
I get errors that newer versions of the `cryptography` require a Rust toolchain.
## Reproduction Steps
see above.
## System Information
various
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 1,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5771/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5771/timeline
| null |
completed
| null | null | false |
[
"Relates to https://github.com/pyca/cryptography/issues/5810",
"No. In fact, I think we should raise the lower limit to the ones requiring the Rust toolchain. It's significantly more secure, which is one of the tenets of this library - maximum security by default for users",
"Ok.\r\n\r\nI disagree. Rust has the potential to introduce more security, but the same was said of Java. We all know how that ended. While Rust is following a different design, it'll take more time to prove its approach.\r\n\r\nWell, I raised the issue to the PSF directly, so maybe we'll come back to that. Maybe not.\r\n\r\nThe problem is that Rust just doesn't support the same range of target platforms that C or C++ do. So this excludes a bunch of e.g. embedded platforms that are currently still viable targets. I don't think that's the right approach for Python as a whole. Hence getting the PSF involved.\r\n\r\nBut it may be sufficient for requests. At least I'm not going to argue about it here. I tried raising this, if it's rejected then so be it."
] |
https://api.github.com/repos/psf/requests/issues/5770
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5770/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5770/comments
|
https://api.github.com/repos/psf/requests/issues/5770/events
|
https://github.com/psf/requests/issues/5770
| 827,911,617 |
MDU6SXNzdWU4Mjc5MTE2MTc=
| 5,770 |
Considerations before posting a new "Issue"
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/45062549?v=4",
"events_url": "https://api.github.com/users/jakermx/events{/privacy}",
"followers_url": "https://api.github.com/users/jakermx/followers",
"following_url": "https://api.github.com/users/jakermx/following{/other_user}",
"gists_url": "https://api.github.com/users/jakermx/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jakermx",
"id": 45062549,
"login": "jakermx",
"node_id": "MDQ6VXNlcjQ1MDYyNTQ5",
"organizations_url": "https://api.github.com/users/jakermx/orgs",
"received_events_url": "https://api.github.com/users/jakermx/received_events",
"repos_url": "https://api.github.com/users/jakermx/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jakermx/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jakermx/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jakermx",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-03-10T15:26:51Z
|
2021-03-11T14:35:50Z
|
2021-03-11T14:35:46Z
|
NONE
|
off-topic
|
I really appreciate your hard work, guys, but I would like to suggest that you tell your lib's consumers that it is a state-of-the-art HTTP implementation that depends on urllib3 and the OS TCP/IP stack. In other words, this library consumes and uses other libraries, so it is a pretty and easy way to grab content from web servers, but it depends on urllib3 and on the OS network features and capabilities.
For everybody: I am not a contributor, developer, or friend of the ones who actually built this lib. I am just saying what I think and what we all might keep in mind before opening an issue.
Respect to you all.
To close out, I will only say: OMFG, thank you all for your great and hard effort... keep it going!!!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5770/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5770/timeline
| null |
completed
| null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/5769
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5769/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5769/comments
|
https://api.github.com/repos/psf/requests/issues/5769/events
|
https://github.com/psf/requests/issues/5769
| 826,029,989 |
MDU6SXNzdWU4MjYwMjk5ODk=
| 5,769 |
Couldn't modify user-agent while sending multipart/form-data request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/53445798?v=4",
"events_url": "https://api.github.com/users/DarcJC/events{/privacy}",
"followers_url": "https://api.github.com/users/DarcJC/followers",
"following_url": "https://api.github.com/users/DarcJC/following{/other_user}",
"gists_url": "https://api.github.com/users/DarcJC/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DarcJC",
"id": 53445798,
"login": "DarcJC",
"node_id": "MDQ6VXNlcjUzNDQ1Nzk4",
"organizations_url": "https://api.github.com/users/DarcJC/orgs",
"received_events_url": "https://api.github.com/users/DarcJC/received_events",
"repos_url": "https://api.github.com/users/DarcJC/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DarcJC/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarcJC/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DarcJC",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-09T14:37:53Z
|
2021-08-27T00:08:30Z
|
2021-03-09T15:02:37Z
|
NONE
|
resolved
|
Here is my code:
```python
with requests.Session() as s:
s.cookies.set("CASTGC", CASTGC)
    s.get(...)  # to get the target website cookie through redirecting to the CAS service.
files = {"Filedata": (
'upload.png',
tmp_file,
'image/png'
)}
    res = s.post('my url', files=files, headers={'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:80.0) Gecko/20100101 Firefox/80.0'})
    assert res.request.body is None  # expected: ======xxxx blabla
```
It is OK after changing the user-agent header to '1111', but I need to pass this user-agent header to bypass the firewall.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/53445798?v=4",
"events_url": "https://api.github.com/users/DarcJC/events{/privacy}",
"followers_url": "https://api.github.com/users/DarcJC/followers",
"following_url": "https://api.github.com/users/DarcJC/following{/other_user}",
"gists_url": "https://api.github.com/users/DarcJC/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DarcJC",
"id": 53445798,
"login": "DarcJC",
"node_id": "MDQ6VXNlcjUzNDQ1Nzk4",
"organizations_url": "https://api.github.com/users/DarcJC/orgs",
"received_events_url": "https://api.github.com/users/DarcJC/received_events",
"repos_url": "https://api.github.com/users/DarcJC/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DarcJC/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DarcJC/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DarcJC",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5769/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5769/timeline
| null |
completed
| null | null | false |
[
"Oops, I found it is caused by the 302 redirecting."
] |
https://api.github.com/repos/psf/requests/issues/5768
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5768/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5768/comments
|
https://api.github.com/repos/psf/requests/issues/5768/events
|
https://github.com/psf/requests/issues/5768
| 824,139,206 |
MDU6SXNzdWU4MjQxMzkyMDY=
| 5,768 |
model.RequestEncodingMixin._encode_files does not work for 3- or 4-element tuples
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/16507006?v=4",
"events_url": "https://api.github.com/users/hefvcjm/events{/privacy}",
"followers_url": "https://api.github.com/users/hefvcjm/followers",
"following_url": "https://api.github.com/users/hefvcjm/following{/other_user}",
"gists_url": "https://api.github.com/users/hefvcjm/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/hefvcjm",
"id": 16507006,
"login": "hefvcjm",
"node_id": "MDQ6VXNlcjE2NTA3MDA2",
"organizations_url": "https://api.github.com/users/hefvcjm/orgs",
"received_events_url": "https://api.github.com/users/hefvcjm/received_events",
"repos_url": "https://api.github.com/users/hefvcjm/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/hefvcjm/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/hefvcjm/subscriptions",
"type": "User",
"url": "https://api.github.com/users/hefvcjm",
"user_view_type": "public"
}
|
[
{
"color": "777777",
"default": false,
"description": null,
"id": 162780722,
"name": "Question/Not a bug",
"node_id": "MDU6TGFiZWwxNjI3ODA3MjI=",
"url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug"
}
] |
closed
| true | null |
[] | null | 1 |
2021-03-08T03:53:53Z
|
2021-08-27T00:08:30Z
|
2021-03-08T16:26:46Z
|
NONE
|
resolved
|
https://github.com/psf/requests/blob/913880c45a3a8c3bf6b298e9c38709cd95a9c97c/requests/models.py#L110-L118
The annotation for this function mentions "The tuples may be 2-tuples (filename, fileobj), 3-tuples (filename, fileobj, contentype) or 4-tuples (filename, fileobj, contentype, custom_headers).", but it does not work for 3- or 4-element tuples.
It throws an exception at the following line:
https://github.com/psf/requests/blob/913880c45a3a8c3bf6b298e9c38709cd95a9c97c/requests/models.py#L141
My test code:
```python
import requests
with open("file.txt", "r") as f:
requests.get("http://www.example.com", files=[("file.txt", f, "multipart/form-data")])
```
And exception:
```
...
...
for (k, v) in files:
ValueError: too many values to unpack
```
Could you confirm which one, the annotation or the code, is right? Thanks~
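For reference, the unpacking difference is easy to demonstrate without requests itself. The 3-tuple described in the docstring is the *value* side of each `(field_name, file_tuple)` pair, not a top-level list item (a plain-Python sketch of the loop in `models.py`):

```python
def unpack_pairs(files):
    # Mirrors the `for (k, v) in files` loop in
    # RequestEncodingMixin._encode_files.
    return [(k, v) for (k, v) in files]


# Top-level 3-tuple: fails, just like in the reported traceback.
try:
    unpack_pairs([("file.txt", "data", "text/plain")])
except ValueError as exc:
    print(exc)  # too many values to unpack ...

# 3-tuple nested as the value of a (field_name, file_tuple) pair: works.
pairs = unpack_pairs([("file", ("file.txt", "data", "text/plain"))])
print(pairs)
```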
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5768/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5768/timeline
| null |
completed
| null | null | false |
[
"That is not the documentation or annotation you should be reading for that parameter. Please read the appropriate documentation"
] |
https://api.github.com/repos/psf/requests/issues/5767
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5767/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5767/comments
|
https://api.github.com/repos/psf/requests/issues/5767/events
|
https://github.com/psf/requests/issues/5767
| 822,998,883 |
MDU6SXNzdWU4MjI5OTg4ODM=
| 5,767 |
Invalid JSON serialization of data with NaNs when using `json` request argument
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/44946?v=4",
"events_url": "https://api.github.com/users/soxofaan/events{/privacy}",
"followers_url": "https://api.github.com/users/soxofaan/followers",
"following_url": "https://api.github.com/users/soxofaan/following{/other_user}",
"gists_url": "https://api.github.com/users/soxofaan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/soxofaan",
"id": 44946,
"login": "soxofaan",
"node_id": "MDQ6VXNlcjQ0OTQ2",
"organizations_url": "https://api.github.com/users/soxofaan/orgs",
"received_events_url": "https://api.github.com/users/soxofaan/received_events",
"repos_url": "https://api.github.com/users/soxofaan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/soxofaan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/soxofaan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/soxofaan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-03-05T11:28:28Z
|
2021-08-27T00:08:23Z
|
2021-05-06T18:18:58Z
|
NONE
|
resolved
|
When passing data to the `json` argument of a `requests.post()` (or comparable) request and the data contains "nan" floats (`float("nan")`), the data payload will be invalid JSON (in the strict sense). Example:
>>> import requests
>>> requests.__version__
'2.25.1'
>>> data = {"foo": float("nan")}
>>> resp = requests.post("https://httpbin.org/post", json=data)
>>> resp.json()["data"]
'{"foo": NaN}'
While the `NaN` in this result (the JSON serialized data payload that is sent) is valid JavaScript, it is not valid JSON.
The reason is that stdlib's `json.dumps` has `allow_nan` option which is `True` by default and consequently generates invalid JSON by default:
>>> json.dumps(data)
'{"foo": NaN}'
## Expected Result
A ValueError should be raised (at least by default). E.g.:
>>> json.dumps(data, allow_nan=False)
...
ValueError: Out of range float values are not JSON compliant
## Actual Result
The library silently sends invalid JSON.
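The behavior is reproducible with the stdlib alone, which is what requests uses under the hood for the `json=` argument (a minimal sketch):

```python
import json

data = {"foo": float("nan")}

# Default: json.dumps emits NaN, which is valid JavaScript but not strict JSON.
print(json.dumps(data))  # {"foo": NaN}

# With allow_nan=False, the stdlib raises instead of producing invalid JSON.
try:
    json.dumps(data, allow_nan=False)
except ValueError as exc:
    print(exc)  # Out of range float values are not JSON compliant
```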
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/5767/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5767/timeline
| null |
completed
| null | null | false |
[
"I am a first-time contributor and I would like to work on this issue and implement it if possible? Do let me know. Thank you!\r\n",
"Hello, I am not sure if I have fixed this with the PR #5810 , can someone check it out ?"
] |
https://api.github.com/repos/psf/requests/issues/5766
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5766/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5766/comments
|
https://api.github.com/repos/psf/requests/issues/5766/events
|
https://github.com/psf/requests/issues/5766
| 821,104,641 |
MDU6SXNzdWU4MjExMDQ2NDE=
| 5,766 |
Randomly returning TypeError from AWS Lambda
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/26573544?v=4",
"events_url": "https://api.github.com/users/mickog/events{/privacy}",
"followers_url": "https://api.github.com/users/mickog/followers",
"following_url": "https://api.github.com/users/mickog/following{/other_user}",
"gists_url": "https://api.github.com/users/mickog/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mickog",
"id": 26573544,
"login": "mickog",
"node_id": "MDQ6VXNlcjI2NTczNTQ0",
"organizations_url": "https://api.github.com/users/mickog/orgs",
"received_events_url": "https://api.github.com/users/mickog/received_events",
"repos_url": "https://api.github.com/users/mickog/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mickog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mickog/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mickog",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-03-03T13:21:46Z
|
2021-08-27T00:08:32Z
|
2021-03-04T13:13:53Z
|
NONE
|
resolved
|
Hi, I'm trying to pull a file from S3 and ingest it into a multipart POST endpoint on another service; the code looks something like this:
```python
def execute(event, context):
    if "emailId" in event:
        props = {
            "fieldOne": "someValue",
            "fieldTwo": "someOtherValue",
            "contentType": "eml"
        }
        props = json.dumps(props)
        file = download_s3_file_to_tmp(event["emailId"])
        multipart_request(props, file, event["emailId"])


def download_s3_file_to_tmp(message_id):
    s3_key = "".join([_email_bucket_folder_name, "/", str(message_id), ".eml"])
    s3_client = boto3.client("s3", _region)
    s3_client.download_file(_my_bucket, s3_key, "/tmp/" + str(message_id) + ".eml")
    downloaded_eml_file = open("/tmp/" + str(message_id) + ".eml", 'rb')
    return downloaded_eml_file


def multipart_request(props, file_content, message_id):
    my_id = decrypt_edm_secrets(_encrypted_id)
    secret = decrypt_edm_secrets(_encrypted_secret)
    url = f"{_url}/...."
    payload = {"props": props}
    files = [{"fileContent", file_content}]
    tmp_file_path = "/tmp/" + message_id + ".eml"
    if os.path.exists(tmp_file_path):
        os.remove(tmp_file_path)
        print("Removed the file %s" % tmp_file_path)
    else:
        print("File %s does not exist." % tmp_file_path)
    LOGGER.info(f"payload: {payload}")
    resp = requests.post(url, data=payload, files=files, auth=requests.auth.HTTPBasicAuth(my_id, secret))
    LOGGER.info(f"Request Headers: {resp.request.headers}")
    return resp.status_code, resp.text, filenote_id
```
The problem is that I intermittently get the error below (when testing with the same file from the S3 bucket, so there should be no inconsistencies there); at other times there are no issues and I receive a 200 response.
```
{
"errorMessage": "expected string or bytes-like object",
"errorType": "TypeError",
"stackTrace": [
" File \"/var/task/my_lambda.py\", line 42, in execute\n multipart_request(props, file, event["emailId"])\n",
" File \"/var/task/mime_retrieval_parser.py\", line 75, in multipart_request\n resp = requests.post(url,data=payload,files=files,auth=requests.auth.HTTPBasicAuth(edm_id, edm_secret))\n",
" File \"/opt/python/requests/api.py\", line 119, in post\n return request('post', url, data=data, json=json, **kwargs)\n",
" File \"/opt/python/requests/api.py\", line 61, in request\n return session.request(method=method, url=url, **kwargs)\n",
" File \"/opt/python/requests/sessions.py\", line 516, in request\n prep = self.prepare_request(req)\n",
" File \"/opt/python/requests/sessions.py\", line 449, in prepare_request\n p.prepare(\n",
" File \"/opt/python/requests/models.py\", line 317, in prepare\n self.prepare_body(data, files, json)\n",
" File \"/opt/python/requests/models.py\", line 505, in prepare_body\n (body, content_type) = self._encode_files(files, data)\n",
" File \"/opt/python/requests/models.py\", line 166, in _encode_files\n rf.make_multipart(content_type=ft)\n",
" File \"/opt/python/urllib3/fields.py\", line 267, in make_multipart\n self._render_parts(\n",
" File \"/opt/python/urllib3/fields.py\", line 225, in _render_parts\n parts.append(self._render_part(name, value))\n",
" File \"/opt/python/urllib3/fields.py\", line 205, in _render_part\n return self.header_formatter(name, value)\n",
" File \"/opt/python/urllib3/fields.py\", line 116, in format_header_param_html5\n value = _replace_multiple(value, _HTML5_REPLACEMENTS)\n",
" File \"/opt/python/urllib3/fields.py\", line 89, in _replace_multiple\n result = pattern.sub(replacer, value)\n"
]
}
```
Is this some sort of bug, or is there another explanation? I'm not sure where to look now.
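A likely culprit worth noting (an editorial observation, not confirmed by the reporter): the `files` argument above is built from a Python *set* literal (a comma instead of a colon). A minimal sketch of the difference, assuming the endpoint expects a single multipart field named `fileContent`:

```python
# Hypothetical stand-in for the file object downloaded from S3.
file_content = b"raw .eml bytes"

# The snippet above builds a list containing a *set* literal.  Set
# iteration order is not guaranteed, so the field name and the file
# object can swap positions between runs (e.g. across Lambda cold
# starts with different hash seeds), which would explain an
# *intermittent* TypeError deep in urllib3's header formatting.
broken_files = [{"fileContent", file_content}]

# Forms that requests' `files` parameter actually accepts:
files_as_dict = {"fileContent": file_content}
files_as_list = [("fileContent", ("message.eml", file_content, "message/rfc822"))]
```

Either accepted form pins the field name to the value, so the multipart body is built deterministically.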
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/26573544?v=4",
"events_url": "https://api.github.com/users/mickog/events{/privacy}",
"followers_url": "https://api.github.com/users/mickog/followers",
"following_url": "https://api.github.com/users/mickog/following{/other_user}",
"gists_url": "https://api.github.com/users/mickog/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mickog",
"id": 26573544,
"login": "mickog",
"node_id": "MDQ6VXNlcjI2NTczNTQ0",
"organizations_url": "https://api.github.com/users/mickog/orgs",
"received_events_url": "https://api.github.com/users/mickog/received_events",
"repos_url": "https://api.github.com/users/mickog/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mickog/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mickog/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mickog",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5766/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5766/timeline
| null |
completed
| null | null | false |
[
"You're passing in something in either `data` or `files` that isn't a string or bytes-like object. This isn't a bug in Requests. It sounds like your external dependencies are perhaps causing this problem for you (i.e., downloading a file from S3 to re-upload elsewhere could be failing).",
"Yes, correct. Thanks for your comment; it made me print out the types of the data and payload objects, and files was printing as a list. I'm not sure why it worked sometimes; it might have been an inconsistency with the S3 client."
] |
https://api.github.com/repos/psf/requests/issues/5765
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5765/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5765/comments
|
https://api.github.com/repos/psf/requests/issues/5765/events
|
https://github.com/psf/requests/issues/5765
| 819,832,321 |
MDU6SXNzdWU4MTk4MzIzMjE=
| 5,765 |
Shortcut to set auth header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2218936?v=4",
"events_url": "https://api.github.com/users/OlafvdSpek/events{/privacy}",
"followers_url": "https://api.github.com/users/OlafvdSpek/followers",
"following_url": "https://api.github.com/users/OlafvdSpek/following{/other_user}",
"gists_url": "https://api.github.com/users/OlafvdSpek/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/OlafvdSpek",
"id": 2218936,
"login": "OlafvdSpek",
"node_id": "MDQ6VXNlcjIyMTg5MzY=",
"organizations_url": "https://api.github.com/users/OlafvdSpek/orgs",
"received_events_url": "https://api.github.com/users/OlafvdSpek/received_events",
"repos_url": "https://api.github.com/users/OlafvdSpek/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/OlafvdSpek/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/OlafvdSpek/subscriptions",
"type": "User",
"url": "https://api.github.com/users/OlafvdSpek",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-02T09:35:08Z
|
2021-08-27T00:08:33Z
|
2021-03-03T19:52:11Z
|
NONE
|
resolved
|
> Requests is not accepting feature requests at this time.
Hmm ;)
AFAIK the simplest way is currently `headers={'Authorization': 'super secret'}`
Could `auth='super secret'` be supported?
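For reference, the supported extension point for custom authorization schemes is a callable `AuthBase` subclass. A sketch (the class name `TokenAuth` is mine, not from requests):

```python
import requests

class TokenAuth(requests.auth.AuthBase):
    """Attach a fixed Authorization header to every request."""
    def __init__(self, value):
        self.value = value

    def __call__(self, r):
        # Called during request preparation with the PreparedRequest.
        r.headers["Authorization"] = self.value
        return r

# Auth is applied at prepare time, so this can be verified offline:
prepared = requests.Request(
    "GET", "https://example.org", auth=TokenAuth("super secret")
).prepare()
```

An instance can then be passed anywhere `auth=` is accepted, including on a `Session` so it applies to every request.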
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5765/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5765/timeline
| null |
completed
| null | null | false |
[
"No"
] |
https://api.github.com/repos/psf/requests/issues/5764
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5764/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5764/comments
|
https://api.github.com/repos/psf/requests/issues/5764/events
|
https://github.com/psf/requests/issues/5764
| 819,549,775 |
MDU6SXNzdWU4MTk1NDk3NzU=
| 5,764 |
Would anyone know why using requests with proxies through repl.it would cause this SSLError?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/67934523?v=4",
"events_url": "https://api.github.com/users/jonathanlin0/events{/privacy}",
"followers_url": "https://api.github.com/users/jonathanlin0/followers",
"following_url": "https://api.github.com/users/jonathanlin0/following{/other_user}",
"gists_url": "https://api.github.com/users/jonathanlin0/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jonathanlin0",
"id": 67934523,
"login": "jonathanlin0",
"node_id": "MDQ6VXNlcjY3OTM0NTIz",
"organizations_url": "https://api.github.com/users/jonathanlin0/orgs",
"received_events_url": "https://api.github.com/users/jonathanlin0/received_events",
"repos_url": "https://api.github.com/users/jonathanlin0/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jonathanlin0/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jonathanlin0/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jonathanlin0",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-02T02:48:15Z
|
2021-08-27T00:08:31Z
|
2021-03-04T14:47:53Z
|
NONE
|
resolved
|
As the title states, I'm attempting to ping Google through an API on repl.it to check the quality of the proxy, but when I try it gives this error:
HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLError(1, '[SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1125)')))

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5764/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5764/timeline
| null |
completed
| null | null | false |
[
"This isn't a Q&A forum and is likely best asked to someone familiar with `repl.it`."
] |
https://api.github.com/repos/psf/requests/issues/5763
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5763/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5763/comments
|
https://api.github.com/repos/psf/requests/issues/5763/events
|
https://github.com/psf/requests/issues/5763
| 819,363,841 |
MDU6SXNzdWU4MTkzNjM4NDE=
| 5,763 |
raise InvalidSchema("No connection adapters were found for {!r}".format(url)) requests.exceptions.InvalidSchema: No connection adapters were found for '/v1/models/flower-classification:predict'
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/20457308?v=4",
"events_url": "https://api.github.com/users/Adesoji1/events{/privacy}",
"followers_url": "https://api.github.com/users/Adesoji1/followers",
"following_url": "https://api.github.com/users/Adesoji1/following{/other_user}",
"gists_url": "https://api.github.com/users/Adesoji1/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Adesoji1",
"id": 20457308,
"login": "Adesoji1",
"node_id": "MDQ6VXNlcjIwNDU3MzA4",
"organizations_url": "https://api.github.com/users/Adesoji1/orgs",
"received_events_url": "https://api.github.com/users/Adesoji1/received_events",
"repos_url": "https://api.github.com/users/Adesoji1/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Adesoji1/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Adesoji1/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Adesoji1",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-03-01T22:35:47Z
|
2021-08-27T00:08:32Z
|
2021-03-04T01:34:51Z
|
NONE
|
resolved
|
I have this issue too; below is the code.
```
import os
import json
import io
import requests
import numpy as np
import tensorflow as tf
from PIL import Image
import matplotlib.pyplot as plt
import PIL

ASSETS_DIR = r"C:\Users\Sortol\Documents\adesoji-fastapi\assets"
TEST_IMAGES_DIR = r"C:\Users\Sortol\Documents\adesoji-fastapi\images"

classes = json.load(open(os.path.join(ASSETS_DIR, 'categories.json')))
classes = {v: k for k, v in classes.items()}

URL = "/v1/models/flower-classification:predict"

def read_img_file(file) -> Image.Image:
    img = Image.open(io.BytesIO(file))
    if img.mode == "RGBA":
        img = img.convert("RGB")
    img = img.resize((224, 224), Image.ANTIALIAS)
    img = tf.expand_dims(np.asarray(img) / 255, 0)
    return img

def load_img(img_path, show: bool = False):
    img_path = random.choice(glob.glob(f"{DATASET_DIR}/test/{cls}/*.jpg"))
    img = PIL.Image.open(img_path)
    img = img.resize((224, 224), PIL.Image.ANTIALIAS)
    if show:
        plt.imshow(img)
        plt.title('cls')
    img = tf.expand_dims(np.asarray(img) / 255, 0)
    return img

def predict(img):
    headers = {
        "content-type": "application/json"
    }
    data = json.dumps({
        "signature_name": "serving_default", "instances": img.numpy().tolist()
    })
    response = requests.post(URL, data=data, headers=headers)
    response_data = json.loads(response.text)['predictions'][0]
    confidence = np.max(response_data)
    predicted_class = classes[np.argmax(response_data)]
    return response_data, confidence, predicted_class
```
Now I get the error below:
C:\Users\Sortol\Documents\adesoji-fastapi\app>uvicorn main:app
2021-03-01 23:18:10.330558: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
INFO: Started server process [6912]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO: 127.0.0.1:54121 - "GET / HTTP/1.1" 404 Not Found
INFO: 127.0.0.1:54129 - "GET /docs HTTP/1.1" 200 OK
INFO: 127.0.0.1:54129 - "GET /openapi.json HTTP/1.1" 200 OK
2021-03-01 23:18:39.587847: I tensorflow/compiler/jit/xla_cpu_device.cc:41] Not creating XLA devices, tf_xla_enable_xla_devices not set
2021-03-01 23:18:39.596337: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library nvcuda.dll
2021-03-01 23:18:39.686352: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1720] Found device 0 with properties:
pciBusID: 0000:03:00.0 name: GeForce 840M computeCapability: 5.0
coreClock: 1.124GHz coreCount: 3 deviceMemorySize: 2.00GiB deviceMemoryBandwidth: 13.41GiB/s
2021-03-01 23:18:39.687791: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-03-01 23:18:39.782907: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-03-01 23:18:39.783942: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-03-01 23:18:39.832809: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cufft64_10.dll
2021-03-01 23:18:39.844951: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library curand64_10.dll
2021-03-01 23:18:39.939685: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusolver64_10.dll
2021-03-01 23:18:39.961540: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusparse64_11.dll
2021-03-01 23:18:39.965590: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-03-01 23:18:39.966611: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1862] Adding visible gpu devices: 0
2021-03-01 23:18:39.981739: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-03-01 23:18:39.985255: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1720] Found device 0 with properties:
pciBusID: 0000:03:00.0 name: GeForce 840M computeCapability: 5.0
coreClock: 1.124GHz coreCount: 3 deviceMemorySize: 2.00GiB deviceMemoryBandwidth: 13.41GiB/s
2021-03-01 23:18:39.987565: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudart64_110.dll
2021-03-01 23:18:39.988770: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublas64_11.dll
2021-03-01 23:18:39.989115: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cublasLt64_11.dll
2021-03-01 23:18:39.989445: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cufft64_10.dll
2021-03-01 23:18:39.989775: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library curand64_10.dll
2021-03-01 23:18:39.991063: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusolver64_10.dll
2021-03-01 23:18:39.991441: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cusparse64_11.dll
2021-03-01 23:18:39.991842: I tensorflow/stream_executor/platform/default/dso_loader.cc:49] Successfully opened dynamic library cudnn64_8.dll
2021-03-01 23:18:39.993129: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1862] Adding visible gpu devices: 0
2021-03-01 23:18:43.086253: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1261] Device interconnect StreamExecutor with strength 1 edge matrix:
2021-03-01 23:18:43.088620: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1267] 0
2021-03-01 23:18:43.089128: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1280] 0: N
2021-03-01 23:18:43.107867: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1406] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with
1369 MB memory) -> physical GPU (device: 0, name: GeForce 840M, pci bus id: 0000:03:00.0, compute capability: 5.0)
2021-03-01 23:18:43.156687: I tensorflow/compiler/jit/xla_gpu_device.cc:99] Not creating XLA devices, tf_xla_enable_xla_devices not set
tf.Tensor(
[[[[0.64705882 0.65490196 0.48627451]
[0.58823529 0.60392157 0.46666667]
[0.54117647 0.56078431 0.42352941]
...
[0.35686275 0.52941176 0.32941176]
[0.20392157 0.34509804 0.23529412]
[0.13333333 0.25882353 0.14117647]]
[[0.6627451 0.69411765 0.51372549]
[0.58431373 0.60784314 0.4627451 ]
[0.50588235 0.50980392 0.4 ]
...
[0.39607843 0.55294118 0.36470588]
[0.15294118 0.29411765 0.18431373]
[0.05098039 0.18823529 0.06666667]]
[[0.61960784 0.64313725 0.50196078]
[0.63921569 0.62745098 0.52156863]
[0.57254902 0.51764706 0.47843137]
...
[0.38039216 0.52156863 0.36862745]
[0.16862745 0.30980392 0.21568627]
[0.0745098 0.21960784 0.09411765]]
...
[[0.32941176 0.48627451 0.33333333]
[0.32941176 0.48627451 0.3254902 ]
[0.31764706 0.50196078 0.32156863]
...
[0.57254902 0.68235294 0.66666667]
[0.23529412 0.34509804 0.36078431]
[0.20784314 0.3254902 0.29411765]]
[[0.36078431 0.48235294 0.3372549 ]
[0.35686275 0.49019608 0.32941176]
[0.34117647 0.52156863 0.3372549 ]
...
[0.53333333 0.67058824 0.63137255]
[0.22745098 0.36078431 0.37254902]
[0.27843137 0.40392157 0.37647059]]
[[0.34509804 0.4745098 0.32941176]
[0.34117647 0.49019608 0.32156863]
[0.33333333 0.52156863 0.3254902 ]
...
[0.51764706 0.67843137 0.61568627]
[0.25098039 0.40392157 0.40784314]
[0.22352941 0.37647059 0.33333333]]]], shape=(1, 224, 224, 3), dtype=float64)
INFO: 127.0.0.1:54141 - "POST /predict HTTP/1.1" 500 Internal Server Error
ERROR: Exception in ASGI application
Traceback (most recent call last):
File "c:\users\sortol\anaconda3\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 396, in run_asgi
result = await app(self.scope, self.receive, self.send)
File "c:\users\sortol\anaconda3\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 45, in call
return await self.app(scope, receive, send)
File "c:\users\sortol\anaconda3\lib\site-packages\fastapi\applications.py", line 199, in call
await super().call(scope, receive, send)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\applications.py", line 111, in call
await self.middleware_stack(scope, receive, send)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\middleware\errors.py", line 181, in call
raise exc from None
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\middleware\errors.py", line 159, in call
await self.app(scope, receive, _send)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\exceptions.py", line 82, in call
raise exc from None
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\exceptions.py", line 71, in call
await self.app(scope, receive, sender)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\routing.py", line 566, in call
await route.handle(scope, receive, send)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\routing.py", line 227, in handle
await self.app(scope, receive, send)
File "c:\users\sortol\anaconda3\lib\site-packages\starlette\routing.py", line 41, in app
response = await func(request)
File "c:\users\sortol\anaconda3\lib\site-packages\fastapi\routing.py", line 201, in app
raw_response = await run_endpoint_function(
File "c:\users\sortol\anaconda3\lib\site-packages\fastapi\routing.py", line 148, in run_endpoint_function
return await dependant.call(**values)
File ".\main.py", line 21, in predict_image
predictions = predict(flower_img)
File ".\helpers.py", line 46, in predict
response = requests.post(URL, data=data, headers=headers)
File "c:\users\sortol\anaconda3\lib\site-packages\requests\api.py", line 119, in post
return request('post', url, data=data, json=json, **kwargs)
File "c:\users\sortol\anaconda3\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "c:\users\sortol\anaconda3\lib\site-packages\requests\sessions.py", line 530, in request
resp = self.send(prep, **send_kwargs)
File "c:\users\sortol\anaconda3\lib\site-packages\requests\sessions.py", line 637, in send
adapter = self.get_adapter(url=request.url)
File "c:\users\sortol\anaconda3\lib\site-packages\requests\sessions.py", line 730, in get_adapter
raise InvalidSchema("No connection adapters were found for {!r}".format(url))
requests.exceptions.InvalidSchema: No connection adapters were found for '/v1/models/flower-classification:predict'
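The `URL` in the snippet is a bare path, so requests has no scheme from which to pick a connection adapter. A sketch of the fix, with a placeholder TensorFlow Serving host (`localhost:8501` is an assumption, not from the report):

```python
from urllib.parse import urljoin, urlparse

path = "/v1/models/flower-classification:predict"
# No scheme means requests cannot match an adapter ("http://", "https://").
assert urlparse(path).scheme == ""

BASE = "http://localhost:8501"  # placeholder: wherever TF Serving listens
URL = urljoin(BASE, path)
```

Posting to the joined absolute `URL` instead of the bare path avoids `InvalidSchema`.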
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5763/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5763/timeline
| null |
completed
| null | null | false |
[
"You're specifying the path without a scheme or host. This is not a bug and even if it were, you did not fill out the full issue template."
] |
https://api.github.com/repos/psf/requests/issues/5762
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5762/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5762/comments
|
https://api.github.com/repos/psf/requests/issues/5762/events
|
https://github.com/psf/requests/issues/5762
| 818,182,524 |
MDU6SXNzdWU4MTgxODI1MjQ=
| 5,762 |
No Error is thrown when proxie url or key is wrong or the proxie is not reachable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/40398336?v=4",
"events_url": "https://api.github.com/users/DanielGermann/events{/privacy}",
"followers_url": "https://api.github.com/users/DanielGermann/followers",
"following_url": "https://api.github.com/users/DanielGermann/following{/other_user}",
"gists_url": "https://api.github.com/users/DanielGermann/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/DanielGermann",
"id": 40398336,
"login": "DanielGermann",
"node_id": "MDQ6VXNlcjQwMzk4MzM2",
"organizations_url": "https://api.github.com/users/DanielGermann/orgs",
"received_events_url": "https://api.github.com/users/DanielGermann/received_events",
"repos_url": "https://api.github.com/users/DanielGermann/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/DanielGermann/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/DanielGermann/subscriptions",
"type": "User",
"url": "https://api.github.com/users/DanielGermann",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-28T11:18:39Z
|
2021-11-25T20:00:35Z
|
2021-08-27T19:41:24Z
|
NONE
|
resolved
|
## Expected Result
When I make a get request with
```proxies = {key: value}
request.get(url, proxies=proxies)
```
I would assume that an error is thrown if the key value pair is not correct or if the proxie connection was not possible.
So if i would write something like this
```{"i am a wonderfull key": "i am a wonderfull value"}```,
i think an error should be the output.
Also i think there should be an output if the proxie connection was not possible, so if the key: value pair is right but the proxie is not reachable.
## Actual Result
Nothing; the request just works fine and you don't get an error message.
## Reproduction Steps
```python
import requests
proxies = {"i am a wonderfull key": "i am a wonderfull value"}
request.get(url=example.url, proxies=proxies)
```
```python
import requests
proxies = {"http": "unreachable proxie"}
request.get(url=example.url, proxies=proxies)
```
## System Information
```
{
"chardet": {
"version": "3.0.2"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.7.7"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010104f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": false
}
```
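For context on why no error is raised for the first case: requests looks up a proxy by the scheme of the request URL and silently ignores keys that match nothing. A simplified sketch of that lookup (the real one, `requests.utils.select_proxy`, also tries `scheme://host` keys):

```python
from urllib.parse import urlparse

def select_proxy_sketch(url, proxies):
    # Match the proxy mapping on the request URL's scheme.  An unknown
    # key simply never matches, so the request proceeds with a direct
    # connection rather than raising an error.
    return proxies.get(urlparse(url).scheme)

ignored = select_proxy_sketch(
    "https://example.org", {"i am a wonderfull key": "i am a wonderfull value"}
)
matched = select_proxy_sketch(
    "https://example.org", {"https": "http://127.0.0.1:3128"}
)
```

When a key does match and the proxy is unreachable, requests raises `requests.exceptions.ProxyError` at request time.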
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5762/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5762/timeline
| null |
completed
| null | null | false |
[
"Hi @Mettfisto,\r\n\r\nIt looks like you may have some confusion about how the proxies dict works in Requests.\r\n\r\n> I would assume that an error is thrown if the key value pair is not correct or if the proxie connection was not possible.\r\nSo if i would write something like this\r\n{\"i am a wonderfull key\": \"i am a wonderfull value\"},\r\ni think an error should be the output.\r\n\r\nThere's no reason this needs to throw an error. We allow users to implement their own Adapters for arbitrary protocols and adding strict validation on \"correct\" schemes makes that workflow more difficult. If a key is provided that matches a parsed scheme, we will attempt to use it.\r\n\r\n> Also i think there should be an output if the proxie connection was not possible, so if the key: value pair is right but the proxie is not reachable.\r\n\r\nWe do provide a `requests.exceptions.ProxyError` when the proxy cannot be reached. The examples you've provided though are not valid Python code. If you use them with correctly specified urls, you'll find they either (correctly) ignore the proxy because the schemes do not match, or throw an appropriate error. "
] |
https://api.github.com/repos/psf/requests/issues/5761
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5761/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5761/comments
|
https://api.github.com/repos/psf/requests/issues/5761/events
|
https://github.com/psf/requests/pull/5761
| 815,678,173 |
MDExOlB1bGxSZXF1ZXN0NTc5NDY4MDg5
| 5,761 |
Fix typo
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1322614?v=4",
"events_url": "https://api.github.com/users/andrewmwhite/events{/privacy}",
"followers_url": "https://api.github.com/users/andrewmwhite/followers",
"following_url": "https://api.github.com/users/andrewmwhite/following{/other_user}",
"gists_url": "https://api.github.com/users/andrewmwhite/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/andrewmwhite",
"id": 1322614,
"login": "andrewmwhite",
"node_id": "MDQ6VXNlcjEzMjI2MTQ=",
"organizations_url": "https://api.github.com/users/andrewmwhite/orgs",
"received_events_url": "https://api.github.com/users/andrewmwhite/received_events",
"repos_url": "https://api.github.com/users/andrewmwhite/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/andrewmwhite/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/andrewmwhite/subscriptions",
"type": "User",
"url": "https://api.github.com/users/andrewmwhite",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-02-24T17:10:51Z
|
2021-08-27T00:08:46Z
|
2021-02-24T17:19:48Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5761/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5761/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5761.diff",
"html_url": "https://github.com/psf/requests/pull/5761",
"merged_at": "2021-02-24T17:19:48Z",
"patch_url": "https://github.com/psf/requests/pull/5761.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5761"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/5760
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5760/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5760/comments
|
https://api.github.com/repos/psf/requests/issues/5760/events
|
https://github.com/psf/requests/issues/5760
| 814,659,995 |
MDU6SXNzdWU4MTQ2NTk5OTU=
| 5,760 |
requests.get() timeout times out after twice the given value
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1658405?v=4",
"events_url": "https://api.github.com/users/LefterisJP/events{/privacy}",
"followers_url": "https://api.github.com/users/LefterisJP/followers",
"following_url": "https://api.github.com/users/LefterisJP/following{/other_user}",
"gists_url": "https://api.github.com/users/LefterisJP/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/LefterisJP",
"id": 1658405,
"login": "LefterisJP",
"node_id": "MDQ6VXNlcjE2NTg0MDU=",
"organizations_url": "https://api.github.com/users/LefterisJP/orgs",
"received_events_url": "https://api.github.com/users/LefterisJP/received_events",
"repos_url": "https://api.github.com/users/LefterisJP/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/LefterisJP/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/LefterisJP/subscriptions",
"type": "User",
"url": "https://api.github.com/users/LefterisJP",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2021-02-23T17:11:57Z
|
2021-03-15T11:56:17Z
|
2021-02-24T15:47:00Z
|
NONE
| null |
I made a simple example script to showcase what I experienced
```python
import requests
import time
print('About to do a request that will timeout')
start_time = int(time.time())
try:
response = requests.get('http://www.google.com:81/', timeout=4)
except requests.exceptions.RequestException as e:
print(f'Exception: {str(e)}')
elapsed_secs = int(time.time()) - start_time
print(f'After request that should timeout. Elapsed seconds: {elapsed_secs}')
```
## Expected Result
I expected this to time out after 4 seconds, per the given timeout.
## Actual Result
What happens instead is that the timeout exception is raised after 2x the given timeout argument.
```
About to do a request that will timeout
Exception: HTTPConnectionPool(host='www.google.com', port=81): Max retries exceeded with url: / (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f204f680550>, 'Connection to www.google.com timed out. (connect timeout=4)'))
After request that should timeout. Elapsed seconds: 8
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.7.9"
},
"platform": {
"release": "5.10.16-arch1-1",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "101010af"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5760/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5760/timeline
| null |
completed
| null | null | false |
[
"In the future, please search **closed and** open issues before creating new ones that are duplicates.",
"Hello @sigmavirus24 thank you for your response.\r\n\r\nI did do that but could not find one. I can of course only judge by the issue's title and what github search can afford to show me.\r\n\r\n Can you please at least link the issue for which you are closing this one as a duplicate? Thanks in advance.",
"One such issue I could find is #5450, the issue was closed but not solved.\r\n\r\nI'm also experiencing this problem, and requests is _consistently_ timing out after approximately *twice* the specified timeout. This is **not** a precision problem, this is an actual bug!\r\n\r\nPlease reopen either this or the other issue!\r\n\r\n```pycon\r\nPython 3.6.9 (default, Jan 26 2021, 15:33:00) \r\n[GCC 8.4.0] on linux\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import time, requests\r\n>>> requests.__version__\r\n'2.25.1'\r\n>>> requests.urllib3.__version__\r\n'1.26.3'\r\n>>> def test_timeout(timeout=1, url='http://google.com:81'):\r\n... start = time.time()\r\n... try:\r\n... requests.get(url, timeout=timeout)\r\n... except requests.ConnectTimeout:\r\n... print(time.time()-start)\r\n... \r\n>>> for i in range(1, 11):\r\n... test_timeout(i)\r\n... \r\n2.0313971042633057\r\n4.0080249309539795\r\n6.010318040847778\r\n8.012223482131958\r\n10.014029264450073\r\n12.012367725372314\r\n14.017669200897217\r\n16.01678204536438\r\n18.01997184753418\r\n20.043574571609497\r\n>>> \r\n```\r\n\r\nThis is a `ConnectTimeout`, no Read Timeout involved. Look at the timings, it can't be more consistent than this!",
"#4276 #3099\r\n\r\nYou're asking for a wall clock timeout and we can not provide that. Further we have clearly documented how timeouts work in our docs"
] |
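The closing comment above (pointing to #4276 and #3099) explains why this is not a bug: requests' `timeout` bounds each individual socket operation (connect, then each read), not the whole call, so with both an IPv4 and an IPv6 address to try, the connect phase alone can take roughly twice the configured value. For callers who do need a wall-clock budget, a minimal sketch of enforcing one themselves (the helper name is hypothetical, not a requests API):

```python
import concurrent.futures


def call_with_deadline(fn, deadline, *args, **kwargs):
    """Run fn(*args, **kwargs), giving up after `deadline` wall-clock seconds.

    requests' own `timeout` bounds each socket operation (connect, then
    each read), so the overall call can legitimately exceed it; this
    wrapper enforces a total elapsed-time budget instead.
    """
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    try:
        future = pool.submit(fn, *args, **kwargs)
        # Raises concurrent.futures.TimeoutError once `deadline` elapses.
        return future.result(timeout=deadline)
    finally:
        # Don't block waiting for an abandoned worker to finish.
        pool.shutdown(wait=False)
```

Used as e.g. `call_with_deadline(requests.get, 4, url, timeout=4)`; note the abandoned worker thread keeps running in the background until its own socket timeouts fire.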
https://api.github.com/repos/psf/requests/issues/5759
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5759/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5759/comments
|
https://api.github.com/repos/psf/requests/issues/5759/events
|
https://github.com/psf/requests/issues/5759
| 812,697,135 |
MDU6SXNzdWU4MTI2OTcxMzU=
| 5,759 |
Upper version limits in dependencies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/233340?v=4",
"events_url": "https://api.github.com/users/oberstet/events{/privacy}",
"followers_url": "https://api.github.com/users/oberstet/followers",
"following_url": "https://api.github.com/users/oberstet/following{/other_user}",
"gists_url": "https://api.github.com/users/oberstet/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/oberstet",
"id": 233340,
"login": "oberstet",
"node_id": "MDQ6VXNlcjIzMzM0MA==",
"organizations_url": "https://api.github.com/users/oberstet/orgs",
"received_events_url": "https://api.github.com/users/oberstet/received_events",
"repos_url": "https://api.github.com/users/oberstet/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/oberstet/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/oberstet/subscriptions",
"type": "User",
"url": "https://api.github.com/users/oberstet",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 15 |
2021-02-20T20:08:34Z
|
2021-11-25T20:00:36Z
|
2021-08-27T19:32:59Z
|
NONE
|
resolved
|
**requests** [currently](https://github.com/psf/requests/blob/bdc00eb0978dd3cdd43f7cd1f95ced7aa75fab39/setup.py#L44) has a bunch of version restrictions:
* `chardet>=3.0.2,<5`
* `idna>=2.5,<3`
* `urllib3>=1.21.1,<1.27`
In particular with **idna** and **urllib3**, which are widely used themselves, whenever another package triggers installation of an open-ended version of these libraries, a subsequent install of **requests** will lead to warnings or errors like
```
requests 2.25.1 requires idna<3,>=2.5, but you'll have idna 3.1 which is incompatible.
```
Are there actual reasons to limit the upper versions? Could these be lifted?
> Putting version upper limits into widely used libraries leads to major headaches for downstream projects. Even when following a dedicated-venv-per-app model, if that app has multiple dependencies that in turn depend on the above, this is tricky to fix (order dependency, finding an intersecting version in the first place, ..). fwiw, I've spent countless hours on fixing fallout from such scenarios ...
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5759/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5759/timeline
| null |
completed
| null | null | false |
[
"Seems related to https://github.com/psf/requests/issues/5710, in that case the reasoning for version limits is that version 3.x of idna doesn't support python2.x anymore which leads to why it's proposed fix in https://github.com/psf/requests/pull/5711 has version markers with the upper limit being different for py3 & py2",
"ah yes, rgd idna that seems like the same issue. your PR would \"fix\" idna .. it bumps the upper limit of idna on py 3 to <4 - which is practically ok. but why limit it at all (on py3)?\r\n\r\nrequests upper limits the version of idna - in an effort to still run on python 2? even if so, that should be left to python version markers in _idna_ IMO, not requests. eg if requests v3 would run on python 2, why would idna want to limit to idna<2? I am assuming the python version being the only reason in idna for the upper limit.\r\n\r\nsimilar holds true for urllib .. why the upper limit?",
"As far as I understand it, it's so as not to make too big a change untested; personally I'm with you that this should be left to unit testing to confirm, but that seems like the spirit of things in the https://github.com/psf/requests/pull/5711 discussion, which is a valid reasoning",
"> Are there actual reasons to limit the upper versions?\r\n\r\nYes there are. I believe they're even documented - both in requests documentation as well as numerous issues complaining about this. There's only so much time the maintainers have these days. Version limits communicate to the users what we are able to support at the time of release as well as roughly what versions are known to work well. (This is the shortest answer.) Other projects play fast and loose with their dependencies because they have people who have more time on their hands and can react quickly to breaking releases in their dependencies. We don't",
"thanks for explaining reasons and project resource limits! I understand.\r\n\r\nguess we'll have to deal with it. for me, with our \"main app\" having a dependency DAG with 130 entries, and requests and idna on different paths in the DAG, that means a bit of pain, but hey, I should contribute not complain;)",
"Thanks for your all of your hard work and I really enjoy requests. But continuing to support Python 2 prolongs this problem of everyone transitioning off of it. The world has had *more* than enough time to migrate, wasting time being compatible with an [unsupported language](https://www.python.org/doc/sunset-python-2/) is a waste of your limited time. ",
"Can't these version limits be handled by package managers, such that incompatible versions won't be installed?\r\nA runtime warning like this seems sub-optimal.",
"Ahem, it's not just a runtime warning. Deployment won't even finish because of this and bail out with an error code, hindering deployments that use this package.\r\n\r\nIs my best course of action really to fork this repo and remove the upper limits? Please say that there is another meaningful way that doesn't involve keeping a repo that pulls sane dependencies.",
"For the time being, until #5711 will get merged, the solution is to use the pull request's code state:\r\n\r\nThat is, putting the following line into the `requirements.txt`:\r\n`git+https://github.com/psf/requests.git@refs/pull/5711/head`",
"> Can't these version limits be handled by package managers, such that incompatible versions won't be installed?\r\n\r\nNo. Most of our users don't use package managers and package managers have all too often shipped incompatible versions.\r\n\r\n> Is my best course of action really to fork this repo and remove the upper limits? Please say that there is another meaningful way that doesn't involve keeping a repo that pulls sane dependencies.\r\n\r\nSanity is the wrong term to use here. These dependencies represent what's supported and tested. Forking to use unsupported configuration is fine if you want, but don't open an issue with us when it breaks. By using a PR branch, you're using a fork - just one further out of your control.",
"> > Can't these version limits be handled by package managers, such that incompatible versions won't be installed?\r\n> \r\n> No. Most of our users don't use package managers and package managers have all too often shipped incompatible versions.\r\n\r\nJust wondering, how do they install this library?",
"> Just wondering, how do they install this library?\r\n\r\npip? from wheels or source archives fetched from PyPI. or plain setuptools from source. fwiw, we're recommending to install our app (crossbar.io) into a separate, dedicated virtualenv. this minimizes version conflicts and allows the app to exactly control the installed versions, if possible even using hash pinned wheels. using distro package managers in contrast for python packages is \"wrong\" (IMO). sure, pip (and PyPI) is imperfect, but it's real, works and is commonly used.",
"> > Just wondering, how do they install this library?\r\n> \r\n> pip?\r\n\r\nI thought pip was a package manager.",
"I thought you were talking about people who distribute requests for linux distributions. Those tend to be called package managers more than pip is called that. Sorry for the misunderstanding",
"We've talked about the reasoning for our upper bound pins a fair amount in this thread, but I'll resummarize before closing. Requests is critical infrastructure for a large portion of the Python community. We've implemented upper bound pins as a reaction to previous large scale outages from new major versions of our dependencies being released. This is a defensive decision which, while preventing from upgrading to the latest and greatest, is not strictly incompatible and prevents these worst case scenarios.\r\n\r\nAs for the idna 3.0 issue, we addressed the Python 2.7 versioning problems in #5711 which was released in 2.26.0. We'll resolve this as complete for now.\r\n\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5758
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5758/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5758/comments
|
https://api.github.com/repos/psf/requests/issues/5758/events
|
https://github.com/psf/requests/pull/5758
| 812,581,232 |
MDExOlB1bGxSZXF1ZXN0NTc2OTE1NzU1
| 5,758 |
Catch urllib3 LocationParseError on send(), raise InvalidURL instead
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/33416700?v=4",
"events_url": "https://api.github.com/users/cknabs/events{/privacy}",
"followers_url": "https://api.github.com/users/cknabs/followers",
"following_url": "https://api.github.com/users/cknabs/following{/other_user}",
"gists_url": "https://api.github.com/users/cknabs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cknabs",
"id": 33416700,
"login": "cknabs",
"node_id": "MDQ6VXNlcjMzNDE2NzAw",
"organizations_url": "https://api.github.com/users/cknabs/orgs",
"received_events_url": "https://api.github.com/users/cknabs/received_events",
"repos_url": "https://api.github.com/users/cknabs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cknabs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cknabs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cknabs",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-02-20T11:03:52Z
|
2024-07-30T00:03:44Z
|
2023-07-30T02:00:07Z
|
NONE
|
resolved
|
Fixes #5744
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5758/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5758/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5758.diff",
"html_url": "https://github.com/psf/requests/pull/5758",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5758.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5758"
}
| true |
[
"@cknabs the tests are failing. Did you struggle to understand why? Can you please fix them?",
"Resolving as this hasn't seen an update in quite some time and appears to have stalled out waiting for review comments to be addressed."
] |
https://api.github.com/repos/psf/requests/issues/5757
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5757/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5757/comments
|
https://api.github.com/repos/psf/requests/issues/5757/events
|
https://github.com/psf/requests/issues/5757
| 811,072,041 |
MDU6SXNzdWU4MTEwNzIwNDE=
| 5,757 |
Add support for HTTP/2
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8332263?v=4",
"events_url": "https://api.github.com/users/Apteryks/events{/privacy}",
"followers_url": "https://api.github.com/users/Apteryks/followers",
"following_url": "https://api.github.com/users/Apteryks/following{/other_user}",
"gists_url": "https://api.github.com/users/Apteryks/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Apteryks",
"id": 8332263,
"login": "Apteryks",
"node_id": "MDQ6VXNlcjgzMzIyNjM=",
"organizations_url": "https://api.github.com/users/Apteryks/orgs",
"received_events_url": "https://api.github.com/users/Apteryks/received_events",
"repos_url": "https://api.github.com/users/Apteryks/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Apteryks/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Apteryks/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Apteryks",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-18T12:52:00Z
|
2021-08-27T00:08:34Z
|
2021-02-20T15:04:55Z
|
NONE
|
resolved
|
It is my understanding that HTTP/2 is not currently supported by requests; hence this is a feature request (no pun intended) to add such support.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5757/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5757/timeline
| null |
completed
| null | null | false |
[
"Duplicate of #4604 and #5506\r\n\r\nRequests is in a feature-freeze and is unlikely to add HTTP/2 in the future"
] |
https://api.github.com/repos/psf/requests/issues/5756
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5756/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5756/comments
|
https://api.github.com/repos/psf/requests/issues/5756/events
|
https://github.com/psf/requests/issues/5756
| 810,699,615 |
MDU6SXNzdWU4MTA2OTk2MTU=
| 5,756 |
Issue while issuing GET requests to Google owned services.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/43145883?v=4",
"events_url": "https://api.github.com/users/Lyfhael/events{/privacy}",
"followers_url": "https://api.github.com/users/Lyfhael/followers",
"following_url": "https://api.github.com/users/Lyfhael/following{/other_user}",
"gists_url": "https://api.github.com/users/Lyfhael/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lyfhael",
"id": 43145883,
"login": "Lyfhael",
"node_id": "MDQ6VXNlcjQzMTQ1ODgz",
"organizations_url": "https://api.github.com/users/Lyfhael/orgs",
"received_events_url": "https://api.github.com/users/Lyfhael/received_events",
"repos_url": "https://api.github.com/users/Lyfhael/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lyfhael/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lyfhael/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lyfhael",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-18T02:30:27Z
|
2021-08-27T00:08:35Z
|
2021-02-18T05:30:31Z
|
NONE
|
resolved
|
GET requests to Google-owned services don't work, or work very slowly (minutes). Examples of URLs having problems: Google.com, Google.fr, Youtube.fr, Youtube.com, Googleapis.com
I have no issue with any other website (I tried github.com, facebook.com, and a bunch of others).
It worked an hour ago; it doesn't work anymore, and I didn't change anything.
I can still access those urls fine on my browser.
## Expected Result
I expect to get a response, but the program seems to be loading(?) indefinitely
## Actual Result
It seems to load indefinitely. Rarely I get a response; in 99% of cases (only 2 worked) it just doesn't do anything, and the program doesn't stop or show anything.
## Reproduction Steps
```python
import requests
a = requests.get("https://www.youtube.fr")
print(a)
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.0"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010104f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/43145883?v=4",
"events_url": "https://api.github.com/users/Lyfhael/events{/privacy}",
"followers_url": "https://api.github.com/users/Lyfhael/followers",
"following_url": "https://api.github.com/users/Lyfhael/following{/other_user}",
"gists_url": "https://api.github.com/users/Lyfhael/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Lyfhael",
"id": 43145883,
"login": "Lyfhael",
"node_id": "MDQ6VXNlcjQzMTQ1ODgz",
"organizations_url": "https://api.github.com/users/Lyfhael/orgs",
"received_events_url": "https://api.github.com/users/Lyfhael/received_events",
"repos_url": "https://api.github.com/users/Lyfhael/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Lyfhael/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Lyfhael/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Lyfhael",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5756/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5756/timeline
| null |
completed
| null | null | false |
[
"It fixed itself... Not sure what it was, since there were no error messages. I haven't restarted my computer or done anything.\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5755
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5755/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5755/comments
|
https://api.github.com/repos/psf/requests/issues/5755/events
|
https://github.com/psf/requests/pull/5755
| 810,524,175 |
MDExOlB1bGxSZXF1ZXN0NTc1MjAxMzQ2
| 5,755 |
Provide directions for cloning repository despite malformed commits
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/31069?v=4",
"events_url": "https://api.github.com/users/pdmccormick/events{/privacy}",
"followers_url": "https://api.github.com/users/pdmccormick/followers",
"following_url": "https://api.github.com/users/pdmccormick/following{/other_user}",
"gists_url": "https://api.github.com/users/pdmccormick/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pdmccormick",
"id": 31069,
"login": "pdmccormick",
"node_id": "MDQ6VXNlcjMxMDY5",
"organizations_url": "https://api.github.com/users/pdmccormick/orgs",
"received_events_url": "https://api.github.com/users/pdmccormick/received_events",
"repos_url": "https://api.github.com/users/pdmccormick/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pdmccormick/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pdmccormick/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pdmccormick",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-02-17T20:44:27Z
|
2021-08-27T00:08:47Z
|
2021-02-17T21:37:02Z
|
CONTRIBUTOR
|
resolved
|
Issues #2690, #3008, #3088, #3805, and #4254 all refer to the issue with commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8 that causes Git to error out when cloning the repository. As originally discussed in #2690, this issue cannot be avoided without rewriting history, so why don't we instead include some helpful directions at the bottom of the README?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5755/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5755/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5755.diff",
"html_url": "https://github.com/psf/requests/pull/5755",
"merged_at": "2021-02-17T21:37:02Z",
"patch_url": "https://github.com/psf/requests/pull/5755.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5755"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5754
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5754/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5754/comments
|
https://api.github.com/repos/psf/requests/issues/5754/events
|
https://github.com/psf/requests/pull/5754
| 810,131,311 |
MDExOlB1bGxSZXF1ZXN0NTc0ODcyMjg2
| 5,754 |
Enhance RequestEncodingMixin._encode_params to support complex dict struct when content-type is x-www-form-urlencoded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/19822609?v=4",
"events_url": "https://api.github.com/users/zgj0607/events{/privacy}",
"followers_url": "https://api.github.com/users/zgj0607/followers",
"following_url": "https://api.github.com/users/zgj0607/following{/other_user}",
"gists_url": "https://api.github.com/users/zgj0607/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zgj0607",
"id": 19822609,
"login": "zgj0607",
"node_id": "MDQ6VXNlcjE5ODIyNjA5",
"organizations_url": "https://api.github.com/users/zgj0607/orgs",
"received_events_url": "https://api.github.com/users/zgj0607/received_events",
"repos_url": "https://api.github.com/users/zgj0607/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zgj0607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zgj0607/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zgj0607",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-17T12:14:50Z
|
2021-08-27T00:08:47Z
|
2021-02-17T15:27:59Z
|
NONE
|
resolved
|
Add a utility method in `utils.py` named `unfold_complex_data_key`, called by `RequestEncodingMixin._encode_params`, to support complex dict structures when the content-type is `x-www-form-urlencoded`.
The function `unfold_complex_data_key` does the following work:
it unfolds a complex dict with nested keys or lists into a simple `list[tuple]`.
When `requests.post()`'s `data` parameter is
`{ "id": "857-37-9333", "label": "User", "count": 0, "properties": { "name": "Rich Hintz", "city": "New Edythstad", "gender": "male", "age": 24, "profile": [ "Zondy", "ZTESoft", "YunWen", "Ci123" ] } }`
it encodes the data into the following format:
`[('id','857-37-9333'), ('label','User'), ('count',0), ('properties[name]','Rich Hintz'), ('properties[city]','New Edythstad'), ('properties[gender]','male'), ('properties[age]',24), ('properties[profile][0]','Zondy'), ('properties[profile][1]','ZTESoft'), ('properties[profile][2]','YunWen'), ('properties[profile][3]','Ci123')]`
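A minimal standalone sketch of this flattening behavior (an illustrative re-implementation, not the actual code from this PR; the helper name `unfold` is hypothetical):

```python
def unfold(data, prefix=""):
    """Recursively flatten nested dicts/lists into bracketed (key, value) pairs."""
    if isinstance(data, dict):
        items = [(f"{prefix}[{k}]" if prefix else str(k), v) for k, v in data.items()]
    elif isinstance(data, list):
        items = [(f"{prefix}[{i}]", v) for i, v in enumerate(data)]
    else:
        # Scalar leaf: emit the accumulated bracketed key with its value.
        return [(prefix, data)]
    pairs = []
    for key, value in items:
        pairs.extend(unfold(value, key))
    return pairs

print(unfold({"id": "857-37-9333", "properties": {"profile": ["Zondy", "ZTESoft"]}}))
# [('id', '857-37-9333'), ('properties[profile][0]', 'Zondy'), ('properties[profile][1]', 'ZTESoft')]
```

The resulting list of tuples can then be passed through the usual `urlencode` path for `x-www-form-urlencoded` bodies.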
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5754/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5754/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5754.diff",
"html_url": "https://github.com/psf/requests/pull/5754",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5754.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5754"
}
| true |
[
"There's already a way to do this in the requests-toolbelt and this feature has been rejected before in this project"
] |
https://api.github.com/repos/psf/requests/issues/5753
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5753/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5753/comments
|
https://api.github.com/repos/psf/requests/issues/5753/events
|
https://github.com/psf/requests/pull/5753
| 810,079,203 |
MDExOlB1bGxSZXF1ZXN0NTc0ODI4Mzcz
| 5,753 |
enhance RequestEncodingMixin._encode_params to support complex dict struct when content-type is x-www-form-urlencoded
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/19822609?v=4",
"events_url": "https://api.github.com/users/zgj0607/events{/privacy}",
"followers_url": "https://api.github.com/users/zgj0607/followers",
"following_url": "https://api.github.com/users/zgj0607/following{/other_user}",
"gists_url": "https://api.github.com/users/zgj0607/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zgj0607",
"id": 19822609,
"login": "zgj0607",
"node_id": "MDQ6VXNlcjE5ODIyNjA5",
"organizations_url": "https://api.github.com/users/zgj0607/orgs",
"received_events_url": "https://api.github.com/users/zgj0607/received_events",
"repos_url": "https://api.github.com/users/zgj0607/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zgj0607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zgj0607/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zgj0607",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-02-17T11:00:41Z
|
2021-08-27T00:08:48Z
|
2021-02-17T12:08:37Z
|
NONE
|
resolved
|
Add a utility method in `utils.py`, named `unfold_complex_data_key`, called by `RequestEncodingMixin._encode_params`, to support complex dict structures when the content-type is `x-www-form-urlencoded`.
The function `unfold_complex_data_key` does the following work:
it unfolds a complex dict with nested keys or lists into a simple `list[tuple]`.
When `requests.post()`'s `data` parameter is
`
{
"id": "857-37-9333",
"label": "User",
"count": 0,
"properties": {
"name": "Rich Hintz",
"city": "New Edythstad",
"gender": "male",
"age": 24,
"profile": [
"Zondy",
"ZTESoft",
"YunWen",
"Ci123"
]
}
}
`
it encodes the data into the following format:
`
[
('id','857-37-9333'),
('label','User'),
('count',0),
('properties[name]','Rich Hintz'),
('properties[city]','New Edythstad'),
('properties[gender]','male'),
('properties[age]',24),
('properties[profile][0]','Zondy'),
('properties[profile][1]','ZTESoft'),
('properties[profile][2]','YunWen'),
('properties[profile][3]','Ci123'),
]
`
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/19822609?v=4",
"events_url": "https://api.github.com/users/zgj0607/events{/privacy}",
"followers_url": "https://api.github.com/users/zgj0607/followers",
"following_url": "https://api.github.com/users/zgj0607/following{/other_user}",
"gists_url": "https://api.github.com/users/zgj0607/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/zgj0607",
"id": 19822609,
"login": "zgj0607",
"node_id": "MDQ6VXNlcjE5ODIyNjA5",
"organizations_url": "https://api.github.com/users/zgj0607/orgs",
"received_events_url": "https://api.github.com/users/zgj0607/received_events",
"repos_url": "https://api.github.com/users/zgj0607/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/zgj0607/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/zgj0607/subscriptions",
"type": "User",
"url": "https://api.github.com/users/zgj0607",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5753/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5753/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5753.diff",
"html_url": "https://github.com/psf/requests/pull/5753",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5753.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5753"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5752
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5752/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5752/comments
|
https://api.github.com/repos/psf/requests/issues/5752/events
|
https://github.com/psf/requests/issues/5752
| 808,811,531 |
MDU6SXNzdWU4MDg4MTE1MzE=
| 5,752 |
Issue with verify options
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3485377?v=4",
"events_url": "https://api.github.com/users/kornpow/events{/privacy}",
"followers_url": "https://api.github.com/users/kornpow/followers",
"following_url": "https://api.github.com/users/kornpow/following{/other_user}",
"gists_url": "https://api.github.com/users/kornpow/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kornpow",
"id": 3485377,
"login": "kornpow",
"node_id": "MDQ6VXNlcjM0ODUzNzc=",
"organizations_url": "https://api.github.com/users/kornpow/orgs",
"received_events_url": "https://api.github.com/users/kornpow/received_events",
"repos_url": "https://api.github.com/users/kornpow/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kornpow/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kornpow/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kornpow",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-02-15T20:51:49Z
|
2021-08-27T00:08:31Z
|
2021-02-15T20:53:59Z
|
NONE
|
resolved
|
Since the verify parameter is designed around certificates stored on the filesystem, I have had to do some workarounds in my application, since I am expecting certificates to be passed in via environment variables. While experimenting with these workarounds, I noticed some strange behavior.
Main Issue:
* The docs claim that path/path-like objects are the intended use case; however, you are able to pass in some things that aren't path-like, and they work.
* You should definitely be able to just pass in a certificate as a variable, so why is this not supported?
Please read the _Reproduction Steps_ below, as that is the best way to understand the issue.
## Expected Result
When using the verify= parameter, I expect to be able to pass in these types of objects:
* string of path
* os.PathLike
Current Error Message, when passing in a wrong Type:
TypeError: stat: path should be string, bytes, os.PathLike or integer, not _io.StringIO
## Actual Result
Things passed in that could work:
* certificate in string form, but only when using the `.read()` function
* path or path-like object pointing to the certificate
## Reproduction Steps
```python
import requests
from io import StringIO
import codecs  # needed for codecs.decode below
import os
import tempfile
## Get certificate stored in env variable
tls = os.getenv("TLS")
## Do some conversions of the certificate
# StringIO Form
f = StringIO(codecs.decode(tls, encoding="hex").decode())
# String Form
y = f.read()
# Tempfile Form
a = bytes.fromhex(tls)
fp = tempfile.NamedTemporaryFile()
fn = fp.name
fp.write(a)
fp.seek(0)
cert_path = fn
## Use the certificate in various ways:
requests.get("https://192.168.1.12:8080/v1/getinfo", headers=headers, verify=cert_path).json()
# Successful, most canonical way to do it
requests.get("https://192.168.1.12:8080/v1/getinfo", headers=headers, verify=f.read()).json()
# Successful, throws WARNING: InsecureRequestWarning: Unverified HTTPS request is being made to host '192.168.1.12'. # Adding certificate verification is strongly advised.
# Since it was successful, it seems the warning isn't valid, since an unverified HTTPS request was not sent.
requests.get("https://192.168.1.12:8080/v1/getinfo", headers=headers, verify=y).json()
# Failure: OSError: Could not find a suitable TLS CA certificate bundle, invalid path: ...
```
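For reference, a sketch of the filesystem-backed pattern that `verify=` does support: persist the in-memory certificate to a named temp file and pass its path. The `TLS_CERT_PEM` variable name and the placeholder PEM body are assumptions for illustration, not part of the original report.

```python
import os
import tempfile

# verify= expects a path, so write the certificate held in memory
# (e.g. from an env var) to a real file first.
cert_pem = os.environ.get(
    "TLS_CERT_PEM",
    "-----BEGIN CERTIFICATE-----\nMIIB...\n-----END CERTIFICATE-----\n",
)
with tempfile.NamedTemporaryFile("w", suffix=".pem", delete=False) as fp:
    fp.write(cert_pem)
    cert_path = fp.name

# requests.get(url, verify=cert_path)  # pass the path, not a file object
# Remember to os.remove(cert_path) when done.
```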
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "3.2"
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.5"
},
"platform": {
"release": "5.8.0-7642-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "1010108f",
"version": "20.0.1"
},
"requests": {
"version": "2.25.0"
},
"system_ssl": {
"version": "1010106f"
},
"urllib3": {
"version": "1.25.10"
},
"using_pyopenssl": true
}
```
This command is only available on Requests v2.16.4 and greater. Otherwise,
please provide some basic information about your system (Python version,
operating system, &c).
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5752/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5752/timeline
| null |
completed
| null | null | false |
[
"In the future, please search **closed and** open issues before creating new ones that are duplicates.",
"While I have been fighting with the issue of \"Unable to pass in a variable to verify\" for years now, and from what I can tell there is no desire to change that behavior, I do believe this might be a new issue.\r\n\r\nThe issue is that I can pass file.open() to verify and for some reason it is totally fine with that even though it isnt a path.",
"Your first solution doesn't actually work on Windows either, since you can't reopen a temporary file without further tweaks [as described here](https://stackoverflow.com/a/15235559/5085211), but those tweaks are not immediately available through the requests API."
] |
https://api.github.com/repos/psf/requests/issues/5751
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5751/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5751/comments
|
https://api.github.com/repos/psf/requests/issues/5751/events
|
https://github.com/psf/requests/issues/5751
| 808,039,895 |
MDU6SXNzdWU4MDgwMzk4OTU=
| 5,751 |
TLS SNI Support (Under Python 2.7.6)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3859005?v=4",
"events_url": "https://api.github.com/users/Tectract/events{/privacy}",
"followers_url": "https://api.github.com/users/Tectract/followers",
"following_url": "https://api.github.com/users/Tectract/following{/other_user}",
"gists_url": "https://api.github.com/users/Tectract/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Tectract",
"id": 3859005,
"login": "Tectract",
"node_id": "MDQ6VXNlcjM4NTkwMDU=",
"organizations_url": "https://api.github.com/users/Tectract/orgs",
"received_events_url": "https://api.github.com/users/Tectract/received_events",
"repos_url": "https://api.github.com/users/Tectract/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Tectract/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Tectract/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Tectract",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2021-02-14T20:25:23Z
|
2021-11-26T14:00:18Z
|
2021-08-28T02:51:19Z
|
NONE
|
resolved
|
Hello.
In reference to https://github.com/psf/requests/issues/749
I'm trying to get SNI support working on a system that is locked to Python2.7.6 / pip20.3.4.
I've tried 'pip install -U requests' and 'pip install -U requests[security]', but the issue is not resolved for me.
I'm getting this error when I try to connect to servers that support SNI:
[Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure
Is there some way I can still backport SNI support into my Python2.7.6 / pip installation here? Could I just copy the SSL libs from a pip3 installation or something?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5751/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5751/timeline
| null |
completed
| null | null | false |
[
"Was the code from PR 749 reverted? I'm looking at the __init__.py from requests v2.25.1 and I don't see this:\r\n\r\n```\r\n# Attempt to enable urllib3's SNI support, if possible\r\ntry:\r\n from requests.packages.urllib3.contrib import pyopenssl\r\n pyopenssl.inject_into_urllib3()\r\nexcept ImportError:\r\n pass\r\n```\r\n\r\n\r\n \r\n ",
"2.24.0 dropped the automatic usage of pyOpenSSL if SNI support is detected. What is the result of `ssl.HAS_SNI` for your distribution (should be False)? In that can pyOpenSSL should be getting injected automatically.",
"This is interesting. Any help is greatly, greatly appreciated here. Thanks!\r\n\r\n```\r\n>>> import ssl\r\n>>> ssl.HAS_SNI\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\nAttributeError: 'module' object has no attribute 'HAS_SNI'\r\n```",
"I tried going down to requests 2.23.0, and now I see code trying to interact with urrlib3 in the init, but sadly still no dice, same error for me. Are there maybe specific versions of urllib3/pyopenssl I can try, or some other way to test or patch the ssl / nmi connections?",
"Hmmmmmm, I tried using a different tool... this is a problem connecting the coinbase websockets so I tried a python coinbase tool. It definitely looks like it might be something with the ssl.py file in my /usr/lib/python2.7/ folder, which is a system folder as I understand. Can I get it to use a different ssl implementation somehow?\r\n\r\nHere's what I see now, this might be helpful:\r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/usr/lib/python2.7/threading.py\", line 810, in __bootstrap_inner\r\n self.run()\r\n File \"/usr/lib/python2.7/threading.py\", line 763, in run\r\n self.__target(*self.__args, **self.__kwargs)\r\n File \"/home/myusername/.local/lib/python2.7/site-packages/cbpro/websocket_client.py\", line 40, in _go\r\n self._connect()\r\n File \"/home/myusername/.local/lib/python2.7/site-packages/cbpro/websocket_client.py\", line 72, in _connect\r\n self.ws = create_connection(self.url)\r\n File \"/usr/local/lib/python2.7/dist-packages/websocket/_core.py\", line 487, in create_connection\r\n websock.connect(url, **options)\r\n File \"/usr/local/lib/python2.7/dist-packages/websocket/_core.py\", line 211, in connect\r\n options.pop('socket', None))\r\n File \"/usr/local/lib/python2.7/dist-packages/websocket/_http.py\", line 77, in connect\r\n sock = _ssl_socket(sock, options.sslopt, hostname)\r\n File \"/usr/local/lib/python2.7/dist-packages/websocket/_http.py\", line 182, in _ssl_socket\r\n sock = ssl.wrap_socket(sock, **sslopt)\r\n File \"/usr/lib/python2.7/ssl.py\", line 487, in wrap_socket\r\n ciphers=ciphers)\r\n File \"/usr/lib/python2.7/ssl.py\", line 243, in __init__\r\n self.do_handshake()\r\n File \"/usr/lib/python2.7/ssl.py\", line 405, in do_handshake\r\n self._sslobj.do_handshake()\r\nSSLError: [Errno 1] _ssl.c:510: error:14077410:SSL routines:SSL23_GET_SERVER_HELLO:sslv3 alert handshake failure\r\n```",
"Hi @Tectract, I don't believe there's an alternative option. If injecting the pyopenssl module from `urllib3.contrib` module manually isn't resolving the issue, I'm not sure there's much we can do. It would appear there's perhaps a distro modified version of ssl.py on your machine, or another environment issue. This issue tracker won't be able to help with that unfortunately.\r\n\r\nI'm going to resolve this for now, let us know if you've have more information on what you think is a defect in Requests.",
"Hello, I did eventually find a way to get TLS SNI support under Python 2.7. \r\n\r\nFrom my notes:\r\n\r\n python requests package 2.24.0 dropped the automatic usage of pyOpenSSL if SNI support is detected\r\n might work with an older urllib3 (?)\r\n\r\n from requests/__init__.py\r\n\r\n # Attempt to enable urllib3's SNI support, if possible\r\n try:\r\n from urllib3.contrib import pyopenssl\r\n pyopenssl.inject_into_urllib3()\r\n \r\n within your python code, where you see this:\r\n\r\n self.sock = ssl.wrap_socket(self.sock)\r\n\r\n change it to this:\r\n\r\n self.ssl_options = { \"server_hostname\" : self.host }\r\n self.sock = ssl.SSLSocket(self.sock, **self.ssl_options)\r\n \r\n Thank you for your support!"
] |
https://api.github.com/repos/psf/requests/issues/5750
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5750/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5750/comments
|
https://api.github.com/repos/psf/requests/issues/5750/events
|
https://github.com/psf/requests/issues/5750
| 807,691,598 |
MDU6SXNzdWU4MDc2OTE1OTg=
| 5,750 |
Post request using https proxies returning encoded text after update to python 3.9.1
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/30209138?v=4",
"events_url": "https://api.github.com/users/bloxploits/events{/privacy}",
"followers_url": "https://api.github.com/users/bloxploits/followers",
"following_url": "https://api.github.com/users/bloxploits/following{/other_user}",
"gists_url": "https://api.github.com/users/bloxploits/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bloxploits",
"id": 30209138,
"login": "bloxploits",
"node_id": "MDQ6VXNlcjMwMjA5MTM4",
"organizations_url": "https://api.github.com/users/bloxploits/orgs",
"received_events_url": "https://api.github.com/users/bloxploits/received_events",
"repos_url": "https://api.github.com/users/bloxploits/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bloxploits/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bloxploits/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bloxploits",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 8 |
2021-02-13T06:45:51Z
|
2021-08-27T00:08:34Z
|
2021-02-19T06:03:35Z
|
NONE
|
resolved
|
Summary.
I am making a request to a URL through a proxy; however, after I updated from Python 3.8.3 to 3.9.1, the response started coming back encoded.
When I request this URL without a proxy, there are no problems, and it worked with a proxy on 3.8.3. I don't know which version of requests I used on 3.8.3, but I've seen a few open and closed issues about proxies in the new requests update.
I also tried using the latest version of urllib3 (1.2.6), and the requests just never got sent; I was stuck on an empty console screen, so I was forced to use an older version of urllib3 that worked.
Expected result:
{
"fingerprint": "810034976510050356.i2OpgwJJalp3uKzcq7sakA10bMY"
}
Actual result:
! c�R������� � ��TX���p8��,��,�cO�L5��Lޙ�R��A�z���'��
Code example:
```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
adapter = HTTPAdapter(max_retries=Retry(connect=15, backoff_factor=0.5))
session.mount("http://", adapter)
session.mount("https://", adapter)
proxy = "example:port"
proxy_dict = {"http": "http://" + proxy, "https": "https://" + proxy}
headers = {
"accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9",
"accept-encoding": "gzip, deflate, br",
"accept-language": "en-US,en;q=0.9",
"cache-control": "no-cache",
"pragma": "no-cache",
"sec-fetch-dest": "document",
"sec-fetch-mode": "navigate",
"sec-fetch-site": "none",
"sec-fetch-user": "?1",
"upgrade-insecure-requests": "1",
"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/88.0.4324.104 Safari/537.36"
}
r = session.post("https://example.com/", headers=headers, proxies=proxy_dict)
print(r.text)
```
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.25.2"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/30209138?v=4",
"events_url": "https://api.github.com/users/bloxploits/events{/privacy}",
"followers_url": "https://api.github.com/users/bloxploits/followers",
"following_url": "https://api.github.com/users/bloxploits/following{/other_user}",
"gists_url": "https://api.github.com/users/bloxploits/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bloxploits",
"id": 30209138,
"login": "bloxploits",
"node_id": "MDQ6VXNlcjMwMjA5MTM4",
"organizations_url": "https://api.github.com/users/bloxploits/orgs",
"received_events_url": "https://api.github.com/users/bloxploits/received_events",
"repos_url": "https://api.github.com/users/bloxploits/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bloxploits/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bloxploits/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bloxploits",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5750/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5750/timeline
| null |
completed
| null | null | false |
[
"What do the response headers look like? I have a suspicion as to what this is related to, but it likely is related to you having different versions of Requests, not different versions of Python.",
"> What do the response headers look like? I have a suspicion as to what this is related to, but it likely is related to you having different versions of Requests, not different versions of Python.\r\n\r\n{'Date': 'Sat, 13 Feb 2021 23:05:03 GMT', 'Content-Type': 'application/json', 'Transfer-Encoding': 'chunked', 'Connection': 'keep-alive', 'Set-Cookie': '__cfduid=d7fc9575f417fc238cfe0aedb530af9a31613257503; expires=Mon, 15-Mar-21 23:05:03 GMT; path=/; domain=.discord.com; HttpOnly; SameSite=Lax; Secure', 'strict-transport-security': 'max-age=31536000; includeSubDomains; preload', 'x-envoy-upstream-service-time': '5', 'Via': '1.1 google', 'Alt-Svc': 'h3-27=\":443\"; ma=86400, h3-28=\":443\"; ma=86400, h3-29=\":443\"; ma=86400', 'CF-Cache-Status': 'DYNAMIC', 'cf-request-id': '083f3d0bc200003e27c6148000000001', 'Expect-CT': 'max-age=604800, report-uri=\"https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct\"', 'Report-To': '{\"group\":\"cf-nel\",\"endpoints\":[{\"url\":\"https:\\\\/\\\\/a.nel.cloudflare.com\\\\/report?s=%2BbaRFUhx35RUeoOIL%2Fpfif9luP3A9yqZSQo7siR7WmD83NLf3AK3P5xJs8QLBxP1Rxfx3I3YZFe%2BRvhXZBTD5cBdEkM2hwjRP7Su7w%3D%3D\"}],\"max_age\":604800}', 'NEL': '{\"report_to\":\"cf-nel\",\"max_age\":604800}', 'X-Content-Type-Options': 'nosniff', 'Server': 'cloudflare', 'CF-RAY': '6212312609273e27-EWR', 'Content-Encoding': 'br'}\r\n\r\nStatus code: 200\r\n\r\nYes, it is because of the version of requests most likely because it was working on an older version yesterday but I removed it when I updated it from 3.8.3 to 3.9.1",
"Try setting `r.encoding = None` before printing `r.text` and let me know what happens",
"> Try setting `r.encoding = None` before printing `r.text` and let me know what happens\r\n\r\nsame thing",
"Well then that rules out my idea",
"> Well then that rules out my idea\n\nWell, i guess ill wait for a new update.\nTested on postman using the same proxy and it worked so I think its a requests problem not necessarily a problem with the website ",
"Some websites change behaviour based on user-agent. It's possible they've something that's causing this problem for you. I'd also compare the raw bytes (`r.content`) to the raw bytes postman gets back.",
"Fixed by removing the headers"
] |
https://api.github.com/repos/psf/requests/issues/5749
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5749/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5749/comments
|
https://api.github.com/repos/psf/requests/issues/5749/events
|
https://github.com/psf/requests/issues/5749
| 805,982,602 |
MDU6SXNzdWU4MDU5ODI2MDI=
| 5,749 |
Params list with one item
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/614232?v=4",
"events_url": "https://api.github.com/users/iBobik/events{/privacy}",
"followers_url": "https://api.github.com/users/iBobik/followers",
"following_url": "https://api.github.com/users/iBobik/following{/other_user}",
"gists_url": "https://api.github.com/users/iBobik/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/iBobik",
"id": 614232,
"login": "iBobik",
"node_id": "MDQ6VXNlcjYxNDIzMg==",
"organizations_url": "https://api.github.com/users/iBobik/orgs",
"received_events_url": "https://api.github.com/users/iBobik/received_events",
"repos_url": "https://api.github.com/users/iBobik/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/iBobik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/iBobik/subscriptions",
"type": "User",
"url": "https://api.github.com/users/iBobik",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-11T00:19:13Z
|
2021-08-27T00:08:36Z
|
2021-02-11T00:51:46Z
|
NONE
|
resolved
|
If a parameter's value is a list with only one item, the list is encoded as if it were a plain string value.
## Expected Result
`params={'fields': ['fieldname']}` should convert to `?fields[]=fieldname`
## Actual Result
`params={'fields': ['fieldname']}` converts to `?fields=fieldname`
## Reproduction Steps
```python
import posixpath

import requests
requests.request(method,
posixpath.join(self.base_url, url),
params={'fields': ['fieldname']},
data=payload,
headers=self.headers)
```
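As background for the report above, repeated query keys are conventionally encoded as one `key=value` pair per list element, with no `[]` suffix on the key; the standard library follows the same convention. This sketch uses only the stdlib (it is not requests' internal code) to show the behaviour for one-element and multi-element lists:

```python
from urllib.parse import urlencode

# doseq=True expands each list into repeated key=value pairs -- note that
# no "[]" suffix is added to the key, matching what the report observes.
one = urlencode({'fields': ['fieldname']}, doseq=True)
many = urlencode({'fields': ['name', 'id']}, doseq=True)
print(one)   # fields=fieldname
print(many)  # fields=name&fields=id
```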
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "4.19.121-linuxkit",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010104f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5749/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5749/timeline
| null |
completed
| null | null | false |
[
"Hi @iBobik,\r\n\r\nThe behavior you're seeing is what's intended. Our documentation shows how we handle parameter encoding for lists and it's identical regardless of the length of the list. We'll generate a key-value pair for every list element.\r\n\r\nThe proposed expectation is non-standard and violates ABNF for the [URI query component](https://tools.ietf.org/html/rfc3986#section-3.4), so we won't be changing anything at this time."
] |
https://api.github.com/repos/psf/requests/issues/5748
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5748/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5748/comments
|
https://api.github.com/repos/psf/requests/issues/5748/events
|
https://github.com/psf/requests/pull/5748
| 805,501,636 |
MDExOlB1bGxSZXF1ZXN0NTcxMDkxNzg3
| 5,748 |
Fix: #4362 - Redirect resolved even though allow_redirects is set to False causing exception for unsupported connection adapter
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/41421345?v=4",
"events_url": "https://api.github.com/users/luckydenis/events{/privacy}",
"followers_url": "https://api.github.com/users/luckydenis/followers",
"following_url": "https://api.github.com/users/luckydenis/following{/other_user}",
"gists_url": "https://api.github.com/users/luckydenis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/luckydenis",
"id": 41421345,
"login": "luckydenis",
"node_id": "MDQ6VXNlcjQxNDIxMzQ1",
"organizations_url": "https://api.github.com/users/luckydenis/orgs",
"received_events_url": "https://api.github.com/users/luckydenis/received_events",
"repos_url": "https://api.github.com/users/luckydenis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/luckydenis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luckydenis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/luckydenis",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 2 |
2021-02-10T13:12:54Z
|
2021-12-29T03:56:43Z
| null |
CONTRIBUTOR
| null |
Good evening
The problem was that the check matched values that merely contained `:` while not starting with `http`, e.g. `0.0.0.0:8080`.
Done:
- [x] Fix #4362
- [x] Add and fix tests
- [x] make test
- [x] flake8 (diff)
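To illustrate the failure mode described above (my own sketch, not the code in this PR): a check that treats any value containing `:` but not starting with `http` as scheme-qualified misclassifies bare host:port strings:

```python
def triggers_old_check(location):
    # The problematic test: ":" present and no leading "http".
    # "0.0.0.0:8080" satisfies it even though "0.0.0.0" is a host, not a scheme.
    return ':' in location and not location.startswith('http')

hit = triggers_old_check('0.0.0.0:8080')          # True  -- misclassified
miss = triggers_old_check('http://0.0.0.0:8080')  # False
```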
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5748/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5748/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5748.diff",
"html_url": "https://github.com/psf/requests/pull/5748",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5748.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5748"
}
| true |
[
"Good evening, @kennethreitz42\r\n\r\nPlease review my edits when you have some free time",
"Hi @LuckyDenis, we're looking at this as a possible candidate for 2.27.0. There was some feedback left in July on the current proposal. Would you have time to rebase this change and address that?"
] |
https://api.github.com/repos/psf/requests/issues/5747
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5747/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5747/comments
|
https://api.github.com/repos/psf/requests/issues/5747/events
|
https://github.com/psf/requests/pull/5747
| 805,405,860 |
MDExOlB1bGxSZXF1ZXN0NTcxMDExNjA3
| 5,747 |
utils.py: default to utf-8 for text/csv (#5746)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/265630?v=4",
"events_url": "https://api.github.com/users/jgehrcke/events{/privacy}",
"followers_url": "https://api.github.com/users/jgehrcke/followers",
"following_url": "https://api.github.com/users/jgehrcke/following{/other_user}",
"gists_url": "https://api.github.com/users/jgehrcke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jgehrcke",
"id": 265630,
"login": "jgehrcke",
"node_id": "MDQ6VXNlcjI2NTYzMA==",
"organizations_url": "https://api.github.com/users/jgehrcke/orgs",
"received_events_url": "https://api.github.com/users/jgehrcke/received_events",
"repos_url": "https://api.github.com/users/jgehrcke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jgehrcke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jgehrcke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jgehrcke",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 7 |
2021-02-10T10:57:41Z
|
2023-05-12T00:03:07Z
|
2022-05-11T01:28:52Z
|
NONE
|
resolved
|
This is a super quick attempt to address https://github.com/psf/requests/issues/5746 -- an invitation for feedback. Would appreciate your opinions. Thanks!
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5747/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5747/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5747.diff",
"html_url": "https://github.com/psf/requests/pull/5747",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5747.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5747"
}
| true |
[
"(happy to add tests, changelog entry, etc -- brief instructions appreciated)",
"I believe the last change we accepted along these lines was contentious and broke things. Plus RFC 7111 is likely not widely implemented by folks distributing `text/csv` files at this point given its recency (and given that 16+ year old RFCs aren't widely implemented either). I'm mildly -1 on accepting this.",
"@sigmavirus24 thanks for the feedback. I share the conservative approach.\r\n\r\nDo you see a different method that fixes the specific example given in #5746 (Google Sheets CSV export) while minimizing risk for breakage?\r\n\r\nJust hypothetically.. what kind of breakage and unintended change in behavior would we be talking about? UTF-8-decoding byte sequences that would otherwise be... how-decoded? :)",
"So this function will override logic if we could otherwise detect it from the `Content-Type` header. This means if someone is using UTF-16, UTF-32 or some other encoding that's not decode-able by UTF-8 then the Response.encoding value which was previously detected correctly will now always be `utf-8` which will fail if they try to access `text`. ",
"> if we could otherwise detect it *from the Content-Type header*.\r\n\r\n(emphasis mine)\r\n\r\nJust to be sure: you agree that the new code path would not trigger when `charset` is _not_ set in the `Content-Type` header, right? This takes precedence:\r\n\r\n```python\r\n if 'charset' in params:\r\n return params['charset'].strip(\"'\\\"\")\r\n```\r\n\r\nHow would the library, right now, automatically detect for example `UTF-16` in the absence of `charset`?\r\n\r\nI have just noticed that the patch as proposed would not change behavior for `text/csv`, because this condition takes precedence:\r\n\r\n```\r\n if 'text' in content_type:\r\n return 'ISO-8859-1'\r\n```\r\n\r\nThat is, when `charset` is _not_ present in the `Content-Type` header, but when the substring `text` is found in the `content_type` string (it is an object of type `str`, and not a different kind of iterable, right?) then ISO-8859-1 is chosen for decoding for decoding the byte sequence.\r\n\r\nThe ` if 'text/csv' in content_type` would need to be added before that so that it can itself take precedence.\r\n\r\nIs it fair to say that currently in all cases where the `Content-Type` header contains the word \"text\" but does _not_ specify a `charset` the library picks ISO-8859-1?\r\n\r\nIf the answer is a definite 'yes' that would be good news, because ISO-8859-1 is a subset of UTF-8.",
"> Is it fair to say that currently in all cases where the `Content-Type` header contains the word \"text\" but does _not_ specify a `charset` the library picks ISO-8859-1?\r\n\r\nSo that's only one part of it. The other half of this coin is that people probably shouldn't be using [`Response.text`](https://github.com/psf/requests/blob/8c211a96cdbe9fe320d63d9e1ae15c5c07e179f8/requests/models.py#L852-L874) because it forces to a string however possible and is likely lossy.\r\n\r\nBut yes, latin-1 would be the default if `charset` isn't specified which is the right approach for most servers to take. ",
"I agree with Sigmavirus24 that this isn't the right approach. The risk being introduced doesn't outweigh the problem it solves. If this does eventually solidify as widely implemented, we can consider it for a major release but it won't be accepted for Requests 2.x."
] |
https://api.github.com/repos/psf/requests/issues/5746
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5746/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5746/comments
|
https://api.github.com/repos/psf/requests/issues/5746/events
|
https://github.com/psf/requests/issues/5746
| 805,400,543 |
MDU6SXNzdWU4MDU0MDA1NDM=
| 5,746 |
content-type: text/csv does not apply UTF-8-decoding by default (RFC 7111 violation?)
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/265630?v=4",
"events_url": "https://api.github.com/users/jgehrcke/events{/privacy}",
"followers_url": "https://api.github.com/users/jgehrcke/followers",
"following_url": "https://api.github.com/users/jgehrcke/following{/other_user}",
"gists_url": "https://api.github.com/users/jgehrcke/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jgehrcke",
"id": 265630,
"login": "jgehrcke",
"node_id": "MDQ6VXNlcjI2NTYzMA==",
"organizations_url": "https://api.github.com/users/jgehrcke/orgs",
"received_events_url": "https://api.github.com/users/jgehrcke/received_events",
"repos_url": "https://api.github.com/users/jgehrcke/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jgehrcke/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jgehrcke/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jgehrcke",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 1 |
2021-02-10T10:50:42Z
|
2021-02-10T11:01:38Z
| null |
NONE
| null |
Created a test/repro/MWE sheet here with a cell containing the letter `ö`: https://docs.google.com/spreadsheets/d/1q02F0AjDfCo_XlgFtT7HHU96emOcyf0PRQGKGoQTXCE/edit?usp=sharing
When you HTTP GET `https://docs.google.com/spreadsheets/d/1q02F0AjDfCo_XlgFtT7HHU96emOcyf0PRQGKGoQTXCE/export?format=csv` Google generates and sends a CSV document with UTF-8-encoded text in the response body.
Fetching this URL with `requests` and accessing the response content with the `text` attribute reveals that `requests` does _not_ UTF-8-decode the response body bytes:
```text
$ python -c 'import requests; r=requests.get("https://docs.google.com/spreadsheets/d/1q02F0AjDfCo_XlgFtT7HHU96emOcyf0PRQGKGoQTXCE/export?format=csv"); print(r.text)'
foo,bar,umlaut: öö
```
Library version:
```
$ pip list | grep requests
requests 2.25.1
```
Response headers (from a `curl -v ...`):
```
> GET /<snip> HTTP/2
> Host: doc-14-2s-sheets.googleusercontent.com
> user-agent: curl/7.69.1
> accept: */*
>
{ [5 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [264 bytes data]
* TLSv1.3 (IN), TLS handshake, Newsession Ticket (4):
{ [264 bytes data]
* old SSL session ID is stale, removing
{ [5 bytes data]
* Connection state changed (MAX_CONCURRENT_STREAMS == 100)!
} [5 bytes data]
< HTTP/2 200
< content-type: text/csv
< x-robots-tag: noindex, nofollow, nosnippet
< cache-control: no-cache, no-store, max-age=0, must-revalidate
< pragma: no-cache
< expires: Mon, 01 Jan 1990 00:00:00 GMT
< date: Wed, 10 Feb 2021 10:12:52 GMT
< content-disposition: attachment; filename="<snip>.csv"; filename*=UTF-8''<snip>.csv
< access-control-allow-origin: *
< access-control-expose-headers: Cache-Control,Content-Disposition,Content-Encoding,Content-Length,Content-Type,Date,Expires,Pragma,Server,Transfer-Encoding
< content-security-policy: base-uri 'self';object-src 'self';report-uri https://docs.google.com/spreadsheets/cspreport;script-src 'nonce-+Qyt<snip>w' 'unsafe-inline' 'strict-dynamic' https: http: 'unsafe-eval';worker-src 'self'
< content-security-policy: frame-ancestors 'self' https://docs.google.com
< x-frame-options: ALLOW-FROM https://docs.google.com
< x-content-type-options: nosniff
< x-xss-protection: 1; mode=block
< server: GSE
< alt-svc: h3-29=":443"; ma=2592000,h3-T051=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"
< accept-ranges: none
< vary: Accept-Encoding
```
So, there is `content-type: text/csv`, i.e. no `'charset'` specification.
I am not sure if [RFC 7111](https://tools.ietf.org/html/rfc7111) is the most recent / authoritative reference, but it says about the `text/csv` media type that
```
The "charset" parameter specifies the charset employed by the CSV
content. In accordance with RFC 6657 [RFC6657], the charset
parameter SHOULD be used, and if it is not present, UTF-8 SHOULD
be assumed as the default (this implies that US-ASCII CSV will
work, even when not specifying the "charset" parameter). Any
charset defined by IANA for the "text" tree may be used in
conjunction with the "charset" parameter.
```
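The fallback logic under discussion can be sketched as follows. This is an approximation of how `requests` picks an encoding from the `Content-Type` header (not its actual source): an explicit `charset` parameter wins, and otherwise any "text" type falls back to ISO-8859-1, which is why `text/csv` is not decoded as UTF-8:

```python
def encoding_from_content_type(content_type):
    """Approximate charset selection: explicit charset parameter wins;
    otherwise any "text" media type falls back to ISO-8859-1."""
    if not content_type:
        return None
    media_type, _, rest = content_type.partition(';')
    for param in rest.split(';'):
        name, _, value = param.strip().partition('=')
        if name.lower() == 'charset':
            return value.strip('\'"')
    if 'text' in media_type:
        return 'ISO-8859-1'
    return None

a = encoding_from_content_type('text/csv')                 # 'ISO-8859-1'
b = encoding_from_content_type('text/csv; charset=utf-8')  # 'utf-8'
c = encoding_from_content_type('application/json')         # None
```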
| null |
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5746/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5746/timeline
| null | null | null | null | false |
[
"Given this a quick stab at https://github.com/psf/requests/pull/5747. Would appreciate an initial review and opinions. Thanks!"
] |
https://api.github.com/repos/psf/requests/issues/5745
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5745/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5745/comments
|
https://api.github.com/repos/psf/requests/issues/5745/events
|
https://github.com/psf/requests/issues/5745
| 804,678,861 |
MDU6SXNzdWU4MDQ2Nzg4NjE=
| 5,745 |
HTTPDigestAuth uses quotes for algorithm and qop tokens of the Digest header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4246686?v=4",
"events_url": "https://api.github.com/users/alex-che/events{/privacy}",
"followers_url": "https://api.github.com/users/alex-che/followers",
"following_url": "https://api.github.com/users/alex-che/following{/other_user}",
"gists_url": "https://api.github.com/users/alex-che/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/alex-che",
"id": 4246686,
"login": "alex-che",
"node_id": "MDQ6VXNlcjQyNDY2ODY=",
"organizations_url": "https://api.github.com/users/alex-che/orgs",
"received_events_url": "https://api.github.com/users/alex-che/received_events",
"repos_url": "https://api.github.com/users/alex-che/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/alex-che/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/alex-che/subscriptions",
"type": "User",
"url": "https://api.github.com/users/alex-che",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 0 |
2021-02-09T15:53:28Z
|
2021-02-09T15:53:28Z
| null |
NONE
| null |
The `HTTPDigestAuth.build_digest_header()` method produces a Digest header which contains quoted strings for the `algorithm` and `qop` tokens. E.g.:
`
Digest username="admin", realm="server", nonce="QScBItGtnPq4Dz3v25Tht4SlctJnsR", uri="/api/v1/info", response="e0d12a4b85789351a847c773e6f4b30e", algorithm="MD5", qop="auth", nc=00000001, cnonce="0f905170a2cafe15"`
While according to [RFC 7616](https://tools.ietf.org/html/rfc7616) these tokens must not be quoted:
`
Digest username="admin", realm="server", nonce="QScBItGtnPq4Dz3v25Tht4SlctJnsR", uri="/api/v1/info", response="e0d12a4b85789351a847c773e6f4b30e", algorithm=MD5, qop=auth, nc=00000001, cnonce="0f905170a2cafe15"`
Below is the [corresponding part of the RFC](https://tools.ietf.org/html/rfc7616#section-3.4):
_For historical reasons, a sender MUST only generate the quoted string syntax for the following parameters: username, realm, nonce, uri, response, cnonce, and opaque._
_For historical reasons, a sender MUST NOT generate the quoted string syntax for the following parameters: algorithm, qop, and nc._
This can also be seen in [requests examples](https://tools.ietf.org/html/rfc7616#section-3.9) in the RFC.
Current behavior may cause problems with some servers. The following subclass can be used as a temporary workaround:
```python
class FixedHTTPDigestAuth(HTTPDigestAuth):
def build_digest_header(self, method, url):
header = super().build_digest_header(method, url)
invalid_parts = ('algorithm', 'qop')
parts = header.split(', ')
for i, part in enumerate(parts):
if any(part.startswith(ip + '=') for ip in invalid_parts):
parts[i] = part.replace('"', '')
header = ', '.join(parts)
return header
```
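The transformation the workaround performs can be shown standalone, using the same split-and-replace logic but runnable without requests (the sample header below is abbreviated):

```python
def unquote_tokens(header, tokens=('algorithm', 'qop')):
    # Per RFC 7616, algorithm, qop and nc must NOT use quoted-string
    # syntax, so strip quotes from just those parameters.
    parts = header.split(', ')
    for i, part in enumerate(parts):
        if any(part.startswith(t + '=') for t in tokens):
            parts[i] = part.replace('"', '')
    return ', '.join(parts)

before = 'Digest username="admin", algorithm="MD5", qop="auth", nc=00000001'
after = unquote_tokens(before)
print(after)  # Digest username="admin", algorithm=MD5, qop=auth, nc=00000001
```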
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5745/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5745/timeline
| null | null | null | null | false |
[] |
https://api.github.com/repos/psf/requests/issues/5744
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5744/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5744/comments
|
https://api.github.com/repos/psf/requests/issues/5744/events
|
https://github.com/psf/requests/issues/5744
| 802,720,321 |
MDU6SXNzdWU4MDI3MjAzMjE=
| 5,744 |
urllib3 LocationParseError (label empty or too long) uncaught by requests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/33416700?v=4",
"events_url": "https://api.github.com/users/cknabs/events{/privacy}",
"followers_url": "https://api.github.com/users/cknabs/followers",
"following_url": "https://api.github.com/users/cknabs/following{/other_user}",
"gists_url": "https://api.github.com/users/cknabs/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/cknabs",
"id": 33416700,
"login": "cknabs",
"node_id": "MDQ6VXNlcjMzNDE2NzAw",
"organizations_url": "https://api.github.com/users/cknabs/orgs",
"received_events_url": "https://api.github.com/users/cknabs/received_events",
"repos_url": "https://api.github.com/users/cknabs/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/cknabs/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/cknabs/subscriptions",
"type": "User",
"url": "https://api.github.com/users/cknabs",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 1 |
2021-02-06T15:06:05Z
|
2021-02-20T09:37:35Z
| null |
NONE
| null |
When accessing a URL with a DNS label of more than 63 characters, an `urllib3.exceptions.LocationParseError` ("label empty or too long") is raised without being caught by requests.
Maybe this is related to #4746?
## Expected Result
No exception, or an exception raised by requests (maybe InvalidURL?).
## Actual Result
Exception is not caught by requests.
## Reproduction Steps
```python
import requests
requests.get('http://1234567890123456789012345678901234567890123456789012345678901234.com')
Traceback (most recent call last):
File "<input>", line 1, in <module>
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/requests/api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/requests/api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/requests/sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/requests/sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/requests/adapters.py", line 439, in send
resp = conn.urlopen(
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
httplib_response = self._make_request(
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/connectionpool.py", line 394, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/connection.py", line 234, in request
super(HTTPConnection, self).request(method, url, body=body, headers=headers)
File "/usr/lib64/python3.8/http/client.py", line 1255, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/lib64/python3.8/http/client.py", line 1301, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/lib64/python3.8/http/client.py", line 1250, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib64/python3.8/http/client.py", line 1010, in _send_output
self.send(msg)
File "/usr/lib64/python3.8/http/client.py", line 950, in send
self.connect()
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/connection.py", line 200, in connect
conn = self._new_conn()
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/connection.py", line 169, in _new_conn
conn = connection.create_connection(
File "/home/christian/Projects/EUvsDisinfo/venv/lib64/python3.8/site-packages/urllib3/util/connection.py", line 69, in create_connection
return six.raise_from(
File "<string>", line 3, in raise_from
urllib3.exceptions.LocationParseError: Failed to parse: '1234567890123456789012345678901234567890123456789012345678901234.com', label empty or too long
```
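The parse error stems from the DNS rule that each dot-separated label fits in at most 63 octets (RFC 1035). A minimal pre-flight check along those lines might look like this (my own sketch, neither requests nor urllib3 code):

```python
def labels_ok(host):
    # RFC 1035: every dot-separated label must be 1-63 octets long.
    return all(0 < len(label.encode()) <= 63 for label in host.split('.'))

ok = labels_ok('example.com')                             # True
too_long = labels_ok('1234567890' * 6 + '1234' + '.com')  # False (64-char label)
```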
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.7"
},
"platform": {
"release": "5.10.11-100.fc32.x86_64",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010109f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": false
}
```
This command is only available on Requests v2.16.4 and greater. Otherwise,
please provide some basic information about your system (Python version,
operating system, &c).
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5744/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5744/timeline
| null | null | null | null | false |
[
"After sifting through similar issues/PR, it seems that this could be solved similarly to #2344, i.e., catch `urllib3.exceptions.LocationParseError` for every call to `urllib3.util.parse_url` and raise an `requests.exceptions.InvalidURL` instead. "
] |
https://api.github.com/repos/psf/requests/issues/5743
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5743/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5743/comments
|
https://api.github.com/repos/psf/requests/issues/5743/events
|
https://github.com/psf/requests/issues/5743
| 802,017,200 |
MDU6SXNzdWU4MDIwMTcyMDA=
| 5,743 |
Loss of the cookie state when using max-age=0.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/41421345?v=4",
"events_url": "https://api.github.com/users/luckydenis/events{/privacy}",
"followers_url": "https://api.github.com/users/luckydenis/followers",
"following_url": "https://api.github.com/users/luckydenis/following{/other_user}",
"gists_url": "https://api.github.com/users/luckydenis/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/luckydenis",
"id": 41421345,
"login": "luckydenis",
"node_id": "MDQ6VXNlcjQxNDIxMzQ1",
"organizations_url": "https://api.github.com/users/luckydenis/orgs",
"received_events_url": "https://api.github.com/users/luckydenis/received_events",
"repos_url": "https://api.github.com/users/luckydenis/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/luckydenis/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/luckydenis/subscriptions",
"type": "User",
"url": "https://api.github.com/users/luckydenis",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 1 |
2021-02-05T09:48:37Z
|
2021-02-06T16:43:22Z
| null |
CONTRIBUTOR
| null |
Summary.
Loss of the `cookie` state when receiving a response from the server containing the lifetime in the format `max-age=0`.
## Expected Result
```python
import requests
r = requests.get('http://127.0.0.1:5000/expire')
>>> r.headers
{'Set-Cookie': 'test=test; Path=/; Max-Age=0', 'Content-Type': 'text/html; charset=utf-8', 'Content-Length': '10', 'Server': 'Werkzeug/1.0.1 Python/3.8.7', 'Date': 'Fri, 05 Feb 2021 09:26:48 GMT'}
>>> r.cookies
<RequestsCookieJar[Cookie(version=0, name='test', value='test', port=None, port_specified=False, domain='127.0.0.1', domain_specified=False, domain_initial_dot=False, path='/', path_specified=True, secure=False, expires=1612520808, discard=False, comment=None, comment_url=None, rest={}, rfc2109=False)]>
```
## Actual Result
```python
>>> import requests
>>> r = requests.get('http://127.0.0.1:5000/expire')
>>> r.headers
{'Set-Cookie': 'test=test; Path=/; Max-Age=0', 'Content-Type': 'text/html; charset=utf-8', 'Content-Length': '14', 'Server': 'Werkzeug/1.0.1 Python/3.8.7', 'Date': 'Fri, 05 Feb 2021 09:16:43 GMT'}
>>> r.cookies
<RequestsCookieJar[]>
```
## Reproduction Steps
app.py
```python
from flask import Flask, make_response
app = Flask(__name__)
@app.route('/expire')
def expire():
return "Expired cookie", {'Set-Cookie': 'test=test; Path=/; Max-Age=0'}
```
Start app: `FLASK_APP=app.py flask run -p 5000`
test.py
```python
import requests
r = requests.get('http://127.0.0.1:5000/expire')
print('headers:', r.headers)
print()
print('cookies:', r.cookies)
```
output
```bash
headers: {'Set-Cookie': 'test=test; Path=/; Max-Age=0', 'Content-Type': 'text/html; charset=utf-8', 'Content-Length': '14', 'Server': 'Werkzeug/1.0.1 Python/3.8.7', 'Date': 'Fri, 05 Feb 2021 09:40:32 GMT'}
cookies: <RequestsCookieJar[]>
```
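The discard behaviour can be reproduced with the standard library alone, since requests delegates cookie handling to `http.cookiejar`. In this sketch the fake response object implements only the `info()` method the jar needs:

```python
import http.cookiejar
import urllib.request
from email.message import Message

class FakeResponse:
    """Just enough of an HTTP response for CookieJar.extract_cookies()."""
    def __init__(self, set_cookie):
        self._msg = Message()
        self._msg['Set-Cookie'] = set_cookie
    def info(self):
        return self._msg

req = urllib.request.Request('http://127.0.0.1:5000/expire')
jar = http.cookiejar.CookieJar()

# Max-Age=0 means "expires now": the jar treats it as a deletion request.
jar.extract_cookies(FakeResponse('test=test; Path=/; Max-Age=0'), req)
expired_count = len(jar)   # cookie discarded immediately

jar.extract_cookies(FakeResponse('test=test; Path=/; Max-Age=60'), req)
live_count = len(jar)      # a positive lifetime is stored
```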
## System Information
$ python -m requests.help
```bash
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.8.7"
},
"platform": {
"release": "5.8.0-41-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010106f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
This command is only available on Requests v2.16.4 and greater. Otherwise,
please provide some basic information about your system (Python version,
operating system, &c).
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5743/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5743/timeline
| null | null | null | null | false |
[
"Good evening.\r\nThe cookie state is deleted at this location.\r\nhttps://github.com/python/cpython/blob/39aeb9ff9064808b08ec629403edbc36a232369b/Lib/http/cookiejar.py#L1548-L1561"
] |
https://api.github.com/repos/psf/requests/issues/5742
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5742/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5742/comments
|
https://api.github.com/repos/psf/requests/issues/5742/events
|
https://github.com/psf/requests/issues/5742
| 800,755,699 |
MDU6SXNzdWU4MDA3NTU2OTk=
| 5,742 |
Request not redirecting when I use a user agent
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59785672?v=4",
"events_url": "https://api.github.com/users/trevtravtrev/events{/privacy}",
"followers_url": "https://api.github.com/users/trevtravtrev/followers",
"following_url": "https://api.github.com/users/trevtravtrev/following{/other_user}",
"gists_url": "https://api.github.com/users/trevtravtrev/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/trevtravtrev",
"id": 59785672,
"login": "trevtravtrev",
"node_id": "MDQ6VXNlcjU5Nzg1Njcy",
"organizations_url": "https://api.github.com/users/trevtravtrev/orgs",
"received_events_url": "https://api.github.com/users/trevtravtrev/received_events",
"repos_url": "https://api.github.com/users/trevtravtrev/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/trevtravtrev/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/trevtravtrev/subscriptions",
"type": "User",
"url": "https://api.github.com/users/trevtravtrev",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 5 |
2021-02-03T22:34:41Z
|
2021-02-03T23:37:43Z
|
2021-02-03T22:53:16Z
|
NONE
|
resolved
|
I am making a request to a Twitter-shortened URL that redirects to Amazon. When I don't use a user agent, it redirects perfectly; when I use a user agent, it no longer redirects.
## Expected Result
A request to the Twitter-shortened URL for an Amazon link, sent with a user agent header, successfully redirects to Amazon.
## Actual Result
When I add a user agent to my request, it no longer redirects to Amazon. (When I remove the user agent again, the request redirects perfectly fine.)
## Reproduction Steps
```python
import requests
header = {'User-Agent': 'Mozilla/5.0 (iPad; CPU OS 8_1_2 like Mac OS X; en-US) AppleWebKit/534.20.2 (KHTML, like Gecko) Version/4.0.5 Mobile/8B116 Safari/6534.20.2'}
session = requests.Session()
session.headers.update(header)
request = session.head("https://t.co/ErjlKsTuZy?amp=1", allow_redirects=True)
amazon_link = request.url
```
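One way to check whether the server actually issued an HTTP redirect (rather than, say, a JavaScript one) is to disable redirect following and inspect the first response. A minimal sketch — the `first_hop` helper is my own, not a requests API:

```python
def first_hop(session, url):
    """Return (status_code, Location header) of the first response,
    without following any redirect."""
    resp = session.head(url, allow_redirects=False)
    return resp.status_code, resp.headers.get('Location')

# Usage (hits the network):
#   import requests
#   session = requests.Session()
#   session.headers.update({'User-Agent': '...'})
#   status, location = first_hop(session, 'https://t.co/ErjlKsTuZy?amp=1')
# A 3xx status plus a Location header means the server really redirected;
# a 200 means it served a page (possibly a JavaScript redirect) instead.
```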
## System Information
$ python3 -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "2.6.1"
},
"idna": {
"version": "2.8"
},
"implementation": {
"name": "CPython",
"version": "3.7.3"
},
"platform": {
"release": "5.4.51-v7l+",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "1010104f",
"version": "19.0.0"
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010104f"
},
"urllib3": {
"version": "1.24.1"
},
"using_pyopenssl": true
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5742/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5742/timeline
| null |
completed
| null | null | false |
[
"Hi @trevtravtrev, there's a handful of closed issues already discussing this behavior and it's pretty quick to self validate. If you shut off redirects and check the `status_code` and `headers` returned from the first request, you'll see Twitter isn't returning a redirect with the custom user-agent. They're doing something server side to circumvent scraping and this is unrelated to Requests.",
"@nateprewitt Don't you find the behavior strange that it redirects without a user agent but not with one?\n\nI just tested it with a different link shortener (amzn.to) and got the exact same behavior. Without a user agent it redirects fine, with a user agent it does not redirect.\n\nIs this a coincidence?",
"No, because it's not the behavior changing because the user agent is included, it's changing because of the *specific* user agent you're supplying. Doing the same thing with a custom user-agent works fine. This is entirely server side, we can't change how they respond.\r\n\r\n```\r\nimport requests\r\n\r\nheader = {'User-Agent': 'MyBot/1.0'}\r\n\r\nsession = requests.Session()\r\nsession.headers.update(header)\r\nrequest = session.head(\"https://t.co/ErjlKsTuZy?amp=1\", allow_redirects=True)\r\namazon_link = request.url\r\n```",
"Has anyone found a workaround for user agents that work? I've tried over 30 user agents that spoof all of the major web browsers and devices but none worked. I have not tried something like you provided \"MyBot/1.0\" and am not in front of a computer to test. Did you mean that specific user agent just worked for you and redirected when you tested?",
"If you look at the response, it's performing a javascript redirect for your user agents rather than a 302 because you're falsely claiming to be a full fledged browser. This isn't the appropriate venue for these kinds of questions though, please open a question on StackOverflow for further help. Thanks!"
] |
https://api.github.com/repos/psf/requests/issues/5741
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5741/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5741/comments
|
https://api.github.com/repos/psf/requests/issues/5741/events
|
https://github.com/psf/requests/pull/5741
| 800,640,233 |
MDExOlB1bGxSZXF1ZXN0NTY3MDkzNzY1
| 5,741 |
Add 2/3 Check So Latest idna Works in 3
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2775739?v=4",
"events_url": "https://api.github.com/users/utkonos/events{/privacy}",
"followers_url": "https://api.github.com/users/utkonos/followers",
"following_url": "https://api.github.com/users/utkonos/following{/other_user}",
"gists_url": "https://api.github.com/users/utkonos/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/utkonos",
"id": 2775739,
"login": "utkonos",
"node_id": "MDQ6VXNlcjI3NzU3Mzk=",
"organizations_url": "https://api.github.com/users/utkonos/orgs",
"received_events_url": "https://api.github.com/users/utkonos/received_events",
"repos_url": "https://api.github.com/users/utkonos/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/utkonos/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/utkonos/subscriptions",
"type": "User",
"url": "https://api.github.com/users/utkonos",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-02-03T19:37:28Z
|
2021-08-27T00:08:48Z
|
2021-02-03T20:08:46Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5741/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5741/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5741.diff",
"html_url": "https://github.com/psf/requests/pull/5741",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5741.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5741"
}
| true |
[
"Resolving as a duplicate of #5711."
] |
|
https://api.github.com/repos/psf/requests/issues/5740
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5740/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5740/comments
|
https://api.github.com/repos/psf/requests/issues/5740/events
|
https://github.com/psf/requests/issues/5740
| 799,623,604 |
MDU6SXNzdWU3OTk2MjM2MDQ=
| 5,740 |
Unable to use system proxy with HTTPS connection on Windows and Python 3.9
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17663689?v=4",
"events_url": "https://api.github.com/users/kotori2/events{/privacy}",
"followers_url": "https://api.github.com/users/kotori2/followers",
"following_url": "https://api.github.com/users/kotori2/following{/other_user}",
"gists_url": "https://api.github.com/users/kotori2/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kotori2",
"id": 17663689,
"login": "kotori2",
"node_id": "MDQ6VXNlcjE3NjYzNjg5",
"organizations_url": "https://api.github.com/users/kotori2/orgs",
"received_events_url": "https://api.github.com/users/kotori2/received_events",
"repos_url": "https://api.github.com/users/kotori2/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kotori2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kotori2/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kotori2",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2021-02-02T19:49:36Z
|
2021-08-27T00:08:33Z
|
2021-02-21T17:41:59Z
|
NONE
|
resolved
|
On Windows you can only set a "host:port" HTTP proxy as the system proxy.
```
PS > Get-ItemProperty -Path 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings' | findstr ProxyServer
ProxyServer : 127.0.0.1:7890
```
But Python's urllib assumes that this proxy speaks both HTTP and HTTPS on the same port:
```
>>> import urllib
>>> urllib.request.getproxies()
{'http': 'http://127.0.0.1:7890', 'https': 'https://127.0.0.1:7890', 'ftp': 'ftp://127.0.0.1:7890'}
```
This will attempt an HTTPS handshake on the HTTP port.
## Expected Result
On Python 3.8.x:
```
> .\python38.exe
Python 3.8.3 (tags/v3.8.3:6f8c832, May 13 2020, 22:37:02) [MSC v.1924 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import urllib, requests
>>> urllib.request.getproxies()
{'http': 'http://127.0.0.1:7890', 'https': 'https://127.0.0.1:7890', 'ftp': 'ftp://127.0.0.1:7890'}
>>> requests.get("http://www.google.com")
<Response [200]>
>>> requests.get("https://www.google.com")
<Response [200]>
```
## Actual Result
```
> python3
Python 3.9.1 (tags/v3.9.1:1e5d33e, Dec 7 2020, 17:08:21) [MSC v.1927 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import urllib, requests
>>> urllib.request.getproxies()
{'http': 'http://127.0.0.1:7890', 'https': 'https://127.0.0.1:7890', 'ftp': 'ftp://127.0.0.1:7890'}
>>> requests.get("http://www.google.com")
<Response [200]>
>>> requests.get("https://www.google.com")
Traceback (most recent call last):
File "C:\Python39\lib\site-packages\urllib3\connectionpool.py", line 696, in urlopen
self._prepare_proxy(conn)
File "C:\Python39\lib\site-packages\urllib3\connectionpool.py", line 964, in _prepare_proxy
conn.connect()
File "C:\Python39\lib\site-packages\urllib3\connection.py", line 359, in connect
conn = self._connect_tls_proxy(hostname, conn)
File "C:\Python39\lib\site-packages\urllib3\connection.py", line 496, in _connect_tls_proxy
return ssl_wrap_socket(
File "C:\Python39\lib\site-packages\urllib3\util\ssl_.py", line 432, in ssl_wrap_socket
ssl_sock = _ssl_wrap_socket_impl(sock, context, tls_in_tls)
File "C:\Python39\lib\site-packages\urllib3\util\ssl_.py", line 474, in _ssl_wrap_socket_impl
return ssl_context.wrap_socket(sock)
File "C:\Python39\lib\ssl.py", line 500, in wrap_socket
return self.sslsocket_class._create(
File "C:\Python39\lib\ssl.py", line 1040, in _create
self.do_handshake()
File "C:\Python39\lib\ssl.py", line 1309, in do_handshake
self._sslobj.do_handshake()
ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:1123)
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\Python39\lib\site-packages\requests\adapters.py", line 439, in send
resp = conn.urlopen(
File "C:\Python39\lib\site-packages\urllib3\connectionpool.py", line 755, in urlopen
retries = retries.increment(
File "C:\Python39\lib\site-packages\urllib3\util\retry.py", line 573, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1123)')))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Python39\lib\site-packages\requests\api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "C:\Python39\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Python39\lib\site-packages\requests\sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "C:\Python39\lib\site-packages\requests\sessions.py", line 655, in send
r = adapter.send(request, **kwargs)
File "C:\Python39\lib\site-packages\requests\adapters.py", line 514, in send
raise SSLError(e, request=request)
requests.exceptions.SSLError: HTTPSConnectionPool(host='www.google.com', port=443): Max retries exceeded with url: / (Caused by SSLError(SSLEOFError(8, 'EOF occurred in violation of protocol (_ssl.c:1123)')))
```
## Reproduction Steps
Set a http Windows system proxy then execute:
```python
import requests
requests.get("https://www.google.com")
```
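A workaround discussed in the comments is to bypass `getproxies()` and pass an explicit mapping in which the `https` scheme also points at the HTTP proxy URL. A sketch, using the proxy address from this report:

```python
# Windows system proxies are plain host:port HTTP proxies, so both schemes
# should point at an http:// URL (address taken from this report).
proxies = {
    'http': 'http://127.0.0.1:7890',
    'https': 'http://127.0.0.1:7890',  # note: http://, not https://
}

# Usage:
#   import requests
#   requests.get('https://www.google.com', proxies=proxies)
```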
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/17663689?v=4",
"events_url": "https://api.github.com/users/kotori2/events{/privacy}",
"followers_url": "https://api.github.com/users/kotori2/followers",
"following_url": "https://api.github.com/users/kotori2/following{/other_user}",
"gists_url": "https://api.github.com/users/kotori2/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/kotori2",
"id": 17663689,
"login": "kotori2",
"node_id": "MDQ6VXNlcjE3NjYzNjg5",
"organizations_url": "https://api.github.com/users/kotori2/orgs",
"received_events_url": "https://api.github.com/users/kotori2/received_events",
"repos_url": "https://api.github.com/users/kotori2/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/kotori2/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/kotori2/subscriptions",
"type": "User",
"url": "https://api.github.com/users/kotori2",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5740/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5740/timeline
| null |
completed
| null | null | false |
[
"see: https://github.com/pypa/pip/issues/9216#issuecomment-741836058\r\n\r\nreinstall `urllib3==1.25.11` or pass `proxies` to requests\r\n\r\n```python\r\nproxies={\r\n'http': 'http://127.0.0.1:7890',\r\n'https': 'http://127.0.0.1:7890' # https -> http\r\n}\r\n\r\nr = requests.get(url, proxies=proxies)\r\n```",
"@davycloud \r\nSo is that means it is a `urllib3` issue? \r\nBut from this line\r\n> Users that have their proxy configured as proxy_url=https://... and connecting to https://pypi.org before were actually using their proxy in an HTTP configuration\r\n\r\nSince Windows only support http proxy as system proxy, I think it's more like a system proxy parsing issue. So I have no idea if the issue happens on `urllib3` side or `requests` side. \r\nAlso of course setting `proxies` in request will work but I smh don't want to detect system proxy manually in every script. ",
"@kotori2 \r\n\r\n> I think it's more like a system proxy parsing issue. \r\n\r\nI think you are right.\r\nSince the proxy server is `http` or `https` is nothing about the target url. \r\nSo in Windows, when we config the proxy `127.0.0.1` and `7890`,it should means:\r\n\r\n```python\r\n# http proxy\r\nproxies={\r\n'http': 'http://127.0.0.1:7890',\r\n'https': 'http://127.0.0.1:7890' \r\n}\r\n```\r\nIf the proxy is https, we should config the address manually: `https://127.0.0.1` ,and the result should be:\r\n\r\n```python\r\n# https proxy\r\nproxies={\r\n'http': 'https://127.0.0.1:7890',\r\n'https': 'https://127.0.0.1:7890' \r\n}\r\n```\r\nBUT,in case 1,the actual result is :\r\n\r\n```python\r\n>>> import urllib\r\n>>> urllib.request.getproxies()\r\n{'http': 'http://127.0.0.1:7890', 'https': 'https://127.0.0.1:7890', 'ftp': 'ftp://127.0.0.1:7890'}\r\n```\r\nwhich cause the error in new `urllib3` .\r\n\r\nThen I manually config the address `http://127.0.0.1`\r\n\r\n```python\r\n>>> urllib.request.getproxies()\r\n{'http': 'http://127.0.0.1:7890'}\r\n```\r\n😂 \r\n\r\nBTW,set Environment variable `https_proxy=http://127.0.0.1:7890` also work,but it will ignore the system setting.\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n",
"https://bugs.python.org/issue42627"
] |
https://api.github.com/repos/psf/requests/issues/5739
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5739/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5739/comments
|
https://api.github.com/repos/psf/requests/issues/5739/events
|
https://github.com/psf/requests/issues/5739
| 797,777,526 |
MDU6SXNzdWU3OTc3Nzc1MjY=
| 5,739 |
Safe Option for the Character Encoding of GET Request Parameters
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/74075657?v=4",
"events_url": "https://api.github.com/users/farshiddel/events{/privacy}",
"followers_url": "https://api.github.com/users/farshiddel/followers",
"following_url": "https://api.github.com/users/farshiddel/following{/other_user}",
"gists_url": "https://api.github.com/users/farshiddel/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/farshiddel",
"id": 74075657,
"login": "farshiddel",
"node_id": "MDQ6VXNlcjc0MDc1NjU3",
"organizations_url": "https://api.github.com/users/farshiddel/orgs",
"received_events_url": "https://api.github.com/users/farshiddel/received_events",
"repos_url": "https://api.github.com/users/farshiddel/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/farshiddel/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/farshiddel/subscriptions",
"type": "User",
"url": "https://api.github.com/users/farshiddel",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-31T18:27:31Z
|
2021-08-27T00:08:37Z
|
2021-01-31T18:47:49Z
|
NONE
|
resolved
|
This issue is related to character encoding of the request parameters when sending a GET request. In my scenario, I need the percent character to not be encoded as %25. The `urllib.parse.urlencode` function has the `safe` parameter for this purpose. However, the `requests.get` function does not have such a parameter. Therefore, I use `urllib.parse.urlencode` with `safe='%'` to get an ampersand-separated string of my request parameters and send that in as the `params` value of `requests.get`. Even with this trick, the response that I get is different from what I get with `urllib.request.Request`. The interesting thing is that when I print `response.url` (where `response` is the response object of `requests.get`), I see that the percent character is not encoded, but the response is still incorrect. It seems like encoding takes place somewhere else. I really hate abandoning the requests package because of this defect. I was wondering if there is any way this can be fixed in a future release.
## Expected Result
```
<result>
<TotalMatches>1135</TotalMatches>
<TotalPages>57</TotalPages>
<PageNumber>1</PageNumber>
<item>
<mid>36145</mid>
<merchantname> ...
```
## Actual Result
```
<result>
<TotalMatches>0</TotalMatches>
<TotalPages>0</TotalPages>
<PageNumber>1</PageNumber>
</result>
```
## Reproduction Steps
This is the URL I am hitting:
`http://productsearch.linksynergy.com/productsearch?token=<token>&mid=36145&cat=Women%27s+Warehouse+Sale+-+Up+To+70%+Off&keyword=dress&pagenumber=1`
Unfortunately, I cannot share the token. The decoded value for the `cat` parameter is "Women's Warehouse Sale - Up To 70% Off". The percent sign after 70 is the one that is causing the issue.
```python
import requests
with requests.Session() as http_session:
response = http_session.get('http://productsearch.linksynergy.com/productsearch', params='token=<token>&mid=36145&cat=Women%27s+Warehouse+Sale+-+Up+To+70%+Off&keyword=dress&pagenumber=1')
```
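One possible workaround is to build the query string yourself with `urlencode(..., safe='%')` and attach it through the prepared-request flow: assigning directly to `PreparedRequest.url` skips requests' own re-encoding step. A sketch (category value taken from this report, token omitted):

```python
from urllib.parse import urlencode

# Category value from this report; the '%' in "70%" must survive unencoded.
params = {'cat': "Women's Warehouse Sale - Up To 70% Off", 'keyword': 'dress'}
query = urlencode(params, safe='%')  # '%' is left as-is; spaces become '+'

# Usage with the prepared-request flow (assigning to .url bypasses re-encoding):
#   import requests
#   session = requests.Session()
#   prepped = session.prepare_request(
#       requests.Request('GET', 'http://productsearch.linksynergy.com/productsearch'))
#   prepped.url = prepped.url + '?' + query
#   response = session.send(prepped)
```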
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.7.2"
},
"platform": {
"release": "5.8.0-40-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010106f"
},
"urllib3": {
"version": "1.26.3"
},
"using_pyopenssl": false
}
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5739/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5739/timeline
| null |
completed
| null | null | false |
[
"In the future, please search **closed and** open issues before creating new ones that are duplicates.\r\n\r\nThere are many issues here that are closed with the same root issue (wanting greater control over query parameter encoding). The short answer is to use the prepared request flow."
] |
https://api.github.com/repos/psf/requests/issues/5738
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5738/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5738/comments
|
https://api.github.com/repos/psf/requests/issues/5738/events
|
https://github.com/psf/requests/issues/5738
| 797,602,068 |
MDU6SXNzdWU3OTc2MDIwNjg=
| 5,738 |
Timeout not working
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/42503383?v=4",
"events_url": "https://api.github.com/users/surenjanath/events{/privacy}",
"followers_url": "https://api.github.com/users/surenjanath/followers",
"following_url": "https://api.github.com/users/surenjanath/following{/other_user}",
"gists_url": "https://api.github.com/users/surenjanath/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/surenjanath",
"id": 42503383,
"login": "surenjanath",
"node_id": "MDQ6VXNlcjQyNTAzMzgz",
"organizations_url": "https://api.github.com/users/surenjanath/orgs",
"received_events_url": "https://api.github.com/users/surenjanath/received_events",
"repos_url": "https://api.github.com/users/surenjanath/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/surenjanath/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/surenjanath/subscriptions",
"type": "User",
"url": "https://api.github.com/users/surenjanath",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-31T02:39:21Z
|
2021-08-28T00:05:54Z
|
2021-01-31T14:29:26Z
|
NONE
|
resolved
|
## Expected Result
div with temp email and not loading after couple seconds of loading
<input class="emailbox-input opentip" data-original-title="Your 10 Minute Mail address" data-placement="bottom" id="mail" onclick="select(this);" readonly="" type="text" value=**""**/>
## Actual Result
blank
## Reproduction Steps
No paused between timeout
```python
import requests
with requests.Session() as s:
post = s.get('https://temp-mail.org/en/10minutemail',headers=headers,timeout=20)
data = bs(post.text,'html.parser')
email = data.find('input',{'id':'mail'})
print(email)
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "3.2.1"
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.0"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "1010108f",
"version": "20.0.1"
},
"requests": {
"version": "2.24.0"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.25.11"
},
"using_pyopenssl": true
}
Google Colaboratory
```
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5738/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5738/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please use [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here."
] |
https://api.github.com/repos/psf/requests/issues/5737
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5737/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5737/comments
|
https://api.github.com/repos/psf/requests/issues/5737/events
|
https://github.com/psf/requests/issues/5737
| 797,496,186 |
MDU6SXNzdWU3OTc0OTYxODY=
| 5,737 |
How to force session pool connections to reestablish?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/28147700?v=4",
"events_url": "https://api.github.com/users/ikhz/events{/privacy}",
"followers_url": "https://api.github.com/users/ikhz/followers",
"following_url": "https://api.github.com/users/ikhz/following{/other_user}",
"gists_url": "https://api.github.com/users/ikhz/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ikhz",
"id": 28147700,
"login": "ikhz",
"node_id": "MDQ6VXNlcjI4MTQ3NzAw",
"organizations_url": "https://api.github.com/users/ikhz/orgs",
"received_events_url": "https://api.github.com/users/ikhz/received_events",
"repos_url": "https://api.github.com/users/ikhz/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ikhz/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ikhz/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ikhz",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-30T18:10:46Z
|
2021-08-28T00:05:55Z
|
2021-01-30T20:29:17Z
|
NONE
|
resolved
|
I have a desktop app that uses sessions for keep-alive connections. When users restart their internet connection or toggle a VPN on/off, the next 2-3 requests made from the session hang until the read timeout, no matter how much time has passed since the switch, even though the connection is already back up (a script running in a parallel thread that makes the same requests without a session is already receiving responses). How can I force the session to reestablish all connections, or is there any other workaround?
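There is no dedicated requests API for this, but one workaround is to close every mounted adapter: `HTTPAdapter.close()` discards its pooled connections, and urllib3 recreates pools lazily on the next request. A sketch — the helper name is my own:

```python
def reset_connections(session):
    """Close every mounted adapter's pooled connections; fresh sockets
    are opened on the next request."""
    for adapter in session.adapters.values():
        adapter.close()

# Usage:
#   import requests
#   session = requests.Session()
#   ...detect that the network or VPN state changed...
#   reset_connections(session)  # next request reestablishes connections
```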
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5737/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5737/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please use [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here."
] |
https://api.github.com/repos/psf/requests/issues/5736
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5736/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5736/comments
|
https://api.github.com/repos/psf/requests/issues/5736/events
|
https://github.com/psf/requests/pull/5736
| 796,770,876 |
MDExOlB1bGxSZXF1ZXN0NTYzODk0MjYw
| 5,736 |
Make the response class in adapter easier to override
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/8541009?v=4",
"events_url": "https://api.github.com/users/sonthonaxrk/events{/privacy}",
"followers_url": "https://api.github.com/users/sonthonaxrk/followers",
"following_url": "https://api.github.com/users/sonthonaxrk/following{/other_user}",
"gists_url": "https://api.github.com/users/sonthonaxrk/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sonthonaxrk",
"id": 8541009,
"login": "sonthonaxrk",
"node_id": "MDQ6VXNlcjg1NDEwMDk=",
"organizations_url": "https://api.github.com/users/sonthonaxrk/orgs",
"received_events_url": "https://api.github.com/users/sonthonaxrk/received_events",
"repos_url": "https://api.github.com/users/sonthonaxrk/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sonthonaxrk/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sonthonaxrk/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sonthonaxrk",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-01-29T10:36:45Z
|
2021-08-27T00:08:48Z
|
2021-01-29T18:27:11Z
|
NONE
|
resolved
|
I find myself pining for this quite a bit.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5736/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5736/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5736.diff",
"html_url": "https://github.com/psf/requests/pull/5736",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5736.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5736"
}
| true |
[
"We're not accepting new features at this time",
"@sigmavirus24 \r\nWhat's the process to add a feature? Do you tender roadmaps? Or is the internal API of requests just utterly frozen?"
] |
https://api.github.com/repos/psf/requests/issues/5735
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5735/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5735/comments
|
https://api.github.com/repos/psf/requests/issues/5735/events
|
https://github.com/psf/requests/pull/5735
| 796,256,743 |
MDExOlB1bGxSZXF1ZXN0NTYzNDY0NTE4
| 5,735 |
5677: Respect variable precedence in session
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1688249?v=4",
"events_url": "https://api.github.com/users/mateusduboli/events{/privacy}",
"followers_url": "https://api.github.com/users/mateusduboli/followers",
"following_url": "https://api.github.com/users/mateusduboli/following{/other_user}",
"gists_url": "https://api.github.com/users/mateusduboli/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mateusduboli",
"id": 1688249,
"login": "mateusduboli",
"node_id": "MDQ6VXNlcjE2ODgyNDk=",
"organizations_url": "https://api.github.com/users/mateusduboli/orgs",
"received_events_url": "https://api.github.com/users/mateusduboli/received_events",
"repos_url": "https://api.github.com/users/mateusduboli/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mateusduboli/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mateusduboli/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mateusduboli",
"user_view_type": "public"
}
|
[
{
"color": "eb6420",
"default": false,
"description": null,
"id": 44501256,
"name": "Breaking API Change",
"node_id": "MDU6TGFiZWw0NDUwMTI1Ng==",
"url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change"
}
] |
open
| false | null |
[] | null | 12 |
2021-01-28T18:44:04Z
|
2022-01-03T15:25:21Z
| null |
CONTRIBUTOR
| null |
- Move `Session#merge_environment_variables` from `Session#request` to `Session#send` to make it consistent
- On `Session#send` change variable precedence to (higher precedence first) `kwargs` -> `session args` -> `environment`.
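The intended ordering can be illustrated with requests' own `merge_setting` helper; a sketch with made-up proxy values:

```python
from requests.sessions import merge_setting

env = {"http": "http://env-proxy:3128"}
session_args = {"http": "http://session-proxy:3128"}
kwargs = {"http": "http://call-proxy:3128"}

# merge_setting(request_setting, session_setting): the first argument wins,
# so chaining it yields kwargs -> session args -> environment precedence.
merged = merge_setting(kwargs, merge_setting(session_args, env))
assert merged == {"http": "http://call-proxy:3128"}

# With no per-call value, the session-level setting beats the environment.
merged = merge_setting(None, merge_setting(session_args, env))
assert merged == {"http": "http://session-proxy:3128"}
```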
| null |
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/5735/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5735/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5735.diff",
"html_url": "https://github.com/psf/requests/pull/5735",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5735.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5735"
}
| true |
[
"I propose the following change at line:\r\nhttps://github.com/psf/requests/blob/913880c45a3a8c3bf6b298e9c38709cd95a9c97c/requests/sessions.py#L530\r\n\r\nReplace it by:\r\n\r\n```python\r\nproxies = proxies or self.proxies\r\n```\r\n\r\nThis change will make this hierarchy `kwargs` -> `session args` -> `environment` to be respected with no need to change anywhere else. Solving problems also with pip as discussed [here](https://github.com/pypa/pip/issues/9691#issuecomment-791608247) \r\n",
"I've added a test case, and changed the strategy.\r\n\r\nNow we are merging the session and request settings before we ever look into the environment, that way the precedence becomes more explicit.\r\n\r\nCan you test this patch with https://github.com/pypa/pip/issues/9691 @junqfisica ?\r\n",
"@mateusduboli Yes, your commit 57ddecdd2af7bf45f85f38cf63cc5d72de529336, also fixes the problem with [ pypa/pip#9691](https://github.com/pypa/pip/issues/9691#issue-823218500). :)\r\n",
"> `proxies = proxies or self.proxies`\r\n\r\nin fact, the solutions of you two are similar: merging `func args` with `session.self` first, then with env. \r\nand `mateusduboli` does more work: move all merging (not only proxies) to `merge_environment_settings()`, adjust the code sequence after merging so that `kwargs` will not be overwritten.",
"@CrazyBoyFeng Do we need any additional review on this, so it can be merged?\r\n",
"> Do we need any additional review on this, so it can be merged?\r\n\r\nI used this fix with `pip`, and it works fine with the `--proxy` parameter. \r\nI think it can be merged.",
"@sigmavirus24, can you take a look at these changes?",
"@nateprewitt, can you maybe take a look at this PR before the next release? We would really appreciate this bugfix included in the next available version of requests. Thank you.",
"@nateprewitt thanks for taking a look at this!\r\n\r\nI'm wondering why this is a tagged as a `Breaking API Change`, I can understand that it should not go through a `patch` revision, but it does keeps the API as intended on the docs.\r\n\r\nI'm guessing if we not consider it as such it would be less hassle to merge it, could you clarify it better?",
"This should be considered as a bug, I think.\r\nWhen there are both environment variables and argument of `requests.session` about the proxy, `requests` eventually takes the system environment variable. This is not the usual practice.\r\nIf this is not considered as a bug, then it needs to be answered: how to use `session` with a custom proxy argument when the environment variable `http_proxy` or `https_proxy` exists?",
"Hi @mateusduboli, the reason we have this marked as a breaking change is we're doing a pretty fundamental change to the Session API here. The last PR (#5888) that was introduced for this had some non-trivial impact that's made Requests 2.26.0 unusable for a number of our dependents. It arguably shouldn't have been merged to begin with.\r\n\r\nI know we've discussed the session precedence issue more than once previously, but I wasn't able to find actual issue numbers. `send` and `request` are not intended to function identically and trying to accomplish that breaks a few different workflows. `send` is intended to do almost nothing to the request and send the PreparedRequest exactly as it was created. This has drifted over the years, so it's no longer entirely true, but this change will make the situation worse.\r\n\r\nI think we need to reevaluate how these interfaces actually operate going into a new major version. Until then though, this has been the behavior of `send` for 7+ years, and there's a lot of infrastructure that will stop behaving correctly because of this.\r\n",
"> If this is not considered as a bug, then it needs to be answered: how to use session with a custom proxy argument when the environment variable http_proxy or https_proxy exists?\r\n\r\n@CrazyBoyFeng I'm reading over the pip issue more and while I agree this far from optimal, as I've stated above, it's kind of where we are. We would _like_ to fix this but can't currently.\r\n\r\nFor the interim, I'm failing to see why `pip` cannot handle this the same way they do timeout. Requests 2.x was specifically designed to ignore Session.timeout because the maintainers at the time didn't agree with it being a session level property. To get around this, Pip created its own `timeout` attribute and sets it on each request before calling into Requests. The same thing can be done with proxies. This doesn't require a patch or change to the vendored code, and produces the desired outcome.\r\n\r\ne.g.\r\n\r\nhttps://github.com/pypa/pip/blob/3ab760aaa17fdc7f00c468a529241164b070b353/src/pip/_internal/network/session.py#L443-L449\r\n\r\nWould become:\r\n```python\r\ndef request(self, method, url, *args, **kwargs):\r\n # type: (str, str, *Any, **Any) -> Response\r\n # Allow setting a default timeout on a session\r\n kwargs.setdefault(\"timeout\", self.timeout)\r\n+ if self.proxies:\r\n+ kwargs.setdefault(\"proxies\", self.proxies)\r\n\r\n # Dispatch the actual request\r\n return super().request(method, url, *args, **kwargs)\r\n```"
] |
https://api.github.com/repos/psf/requests/issues/5734
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5734/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5734/comments
|
https://api.github.com/repos/psf/requests/issues/5734/events
|
https://github.com/psf/requests/issues/5734
| 795,255,014 |
MDU6SXNzdWU3OTUyNTUwMTQ=
| 5,734 |
Cannot stream both request and response at the same time
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"events_url": "https://api.github.com/users/ghost/events{/privacy}",
"followers_url": "https://api.github.com/users/ghost/followers",
"following_url": "https://api.github.com/users/ghost/following{/other_user}",
"gists_url": "https://api.github.com/users/ghost/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ghost",
"id": 10137,
"login": "ghost",
"node_id": "MDQ6VXNlcjEwMTM3",
"organizations_url": "https://api.github.com/users/ghost/orgs",
"received_events_url": "https://api.github.com/users/ghost/received_events",
"repos_url": "https://api.github.com/users/ghost/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ghost/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ghost",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-01-27T16:31:57Z
|
2021-08-28T00:05:55Z
|
2021-01-27T18:22:33Z
|
NONE
|
resolved
|
It appears not to be possible to use a generator for the data with `stream=True` and iterate over `resp.iter_lines()` at the same time.
## Expected Result
I expected to be able to send request data from a generator, and read response data with `resp.iter_lines`, at the same time.
## Actual Result
The library seemingly waits until all `data` is sent before `resp.iter_lines` produces any output.
## Reproduction Steps
With a `server.py` using `flask==1.1.2`:
```python
from flask import Flask, Response, request, stream_with_context

app = Flask('bug')

@app.route('/bug', methods=['POST'])
def bug():
    def resp():
        while True:
            line = request.stream.readline()
            print(line)
            if line:
                yield 'Bye\n'
            else:
                break
    return Response(stream_with_context(resp()))

app.run()
```
And a `client.py` using `requests==2.25.1`:
```python
import requests
import time

def req():
    for _ in range(10):
        yield b'Hi\n'
        time.sleep(0.5)

resp = requests.post("http://localhost:5000/bug", data=req(), stream=True)
for line in resp.iter_lines():
    print(line)
```
By running `server.py` in one console and then `client.py` in another, it can be seen that the server correctly prints the incoming request data every 0.5s, while the client prints all lines immediately after 5s.
I have also tried with a different library, `aiohttp`, and this seems to be capable of streaming both request and response at the same time.
Note that iterating over `resp.iter_lines()` in a different thread doesn't make it work either, although I would expect it to work with a single thread.
<details><summary>Threaded code</summary>
```python
import requests
import time
import threading

def req():
    for _ in range(10):
        yield b'Hi\n'
        time.sleep(0.5)

def handle_resp(resp):
    for line in resp.iter_lines():
        print(line)

threading.Thread(
    target=handle_resp,
    args=(requests.post("http://localhost:5000/bug", data=req(), stream=True),),
).start()
```
</details>
## System Information
<details><summary>$ python -m requests.help</summary>
```json
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": false
}
```
</details>
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5734/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5734/timeline
| null |
completed
| null | null | false |
[
"I’ve noticed this too. I believe this is unfortunately a known issue with status WONTFIX (and can try to find a reference to a previous issue later if no one beats me to it), but please correct me if I’m wrong.\r\n\r\nA separate but related issue that I think has also been reported before and is in the same category: If you use requests to make a request with a body (no matter how large), it waits for the server to read the entire request body before it tries to read the server’s response. But if e.g. the server requires authentication and the request was sent without credentials, production-quality servers will not read arbitrarily large request bodies before sending a 401 response (as a DoS protection), causing an uncaught ConnectionError (broken pipe) in requests (since it expects the server to read the request body in its entirety before it tries to read the response), and making the 401 challenge response negotiation impossible for requests auth extensions like requests_kerberos.",
"@jab you're correct. Anything built on http.client suffers the problem you described",
"Thanks for the feedback, I guessed this issue was already reported, and I tried to [search for \"stream request response\"](https://github.com/psf/requests/issues?q=is%3Aissue+stream+request+response) in the list of issues but with nearly 300 issues it was hard to identify the right one, if it was there at all with my query :P\r\n\r\nAs a workaround, I'll be using `aiohttp` 3.7.3 which seems to work fine, in case someone has this same issue in the future."
] |
https://api.github.com/repos/psf/requests/issues/5733
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5733/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5733/comments
|
https://api.github.com/repos/psf/requests/issues/5733/events
|
https://github.com/psf/requests/issues/5733
| 793,026,725 |
MDU6SXNzdWU3OTMwMjY3MjU=
| 5,733 |
How to send multiple images to Fastapi using requests library
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/38287214?v=4",
"events_url": "https://api.github.com/users/saireddy12/events{/privacy}",
"followers_url": "https://api.github.com/users/saireddy12/followers",
"following_url": "https://api.github.com/users/saireddy12/following{/other_user}",
"gists_url": "https://api.github.com/users/saireddy12/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/saireddy12",
"id": 38287214,
"login": "saireddy12",
"node_id": "MDQ6VXNlcjM4Mjg3MjE0",
"organizations_url": "https://api.github.com/users/saireddy12/orgs",
"received_events_url": "https://api.github.com/users/saireddy12/received_events",
"repos_url": "https://api.github.com/users/saireddy12/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/saireddy12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saireddy12/subscriptions",
"type": "User",
"url": "https://api.github.com/users/saireddy12",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-25T05:02:47Z
|
2021-08-28T00:05:55Z
|
2021-01-28T12:58:00Z
|
NONE
|
resolved
|
I need to hit the API with multiple images.
```
@app.post("/text")
def get_text(files: List[UploadFile] = File(...), frame_timestamps: Optional[List[int]] = Body([]), is_video: bool = False):
```
It's working when I try uploading multiple images using the /docs interface. I tried with one file and it's working fine; here is the code for it:
```
import requests
import json

def get_text(image_path):
    url = 'http://address/text'
    try:
        with open(image_path, "rb") as im:
            image_data = {"files": im}
            response = requests.post(url, files=image_data)
        return json.loads(response.text)
    except Exception as er:
        print("error occurred")
        return "{} error occurred".format(er)
```
When I tried adding one more image to the image_data, I got an error. I tried this:
```
image_data = {"files": []}
for image in image_list:
    with open(image, "rb") as im:
        image_data['files'].append(im)
```
The error I am getting is
<img width="1015" alt="Screenshot 2021-01-25 at 9 27 02 AM" src="https://user-images.githubusercontent.com/38287214/105663586-a2227680-5ef8-11eb-9438-fa3bc09c51de.png">
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/38287214?v=4",
"events_url": "https://api.github.com/users/saireddy12/events{/privacy}",
"followers_url": "https://api.github.com/users/saireddy12/followers",
"following_url": "https://api.github.com/users/saireddy12/following{/other_user}",
"gists_url": "https://api.github.com/users/saireddy12/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/saireddy12",
"id": 38287214,
"login": "saireddy12",
"node_id": "MDQ6VXNlcjM4Mjg3MjE0",
"organizations_url": "https://api.github.com/users/saireddy12/orgs",
"received_events_url": "https://api.github.com/users/saireddy12/received_events",
"repos_url": "https://api.github.com/users/saireddy12/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/saireddy12/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/saireddy12/subscriptions",
"type": "User",
"url": "https://api.github.com/users/saireddy12",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5733/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5733/timeline
| null |
completed
| null | null | false |
[
"I finally found the solution \r\nIncase any one needs the solution here it is \r\n```\r\nfiles = [\r\n ('files', ('image1', open('/Users/ai/image1.jpg','rb'), 'image/png')),\r\n ('files', ('image2', open('/Users/ai/image2.jpeg','rb'), 'image/png'))\r\n ]\r\n```\r\nyou can use the below function for multiple files\r\n```\r\nimport requests\r\nimport json\r\n\r\ndef get_text(image_list,url):\r\n try:\r\n image_data=[]\r\n for image in image_list:\r\n image_data.append(('files',(image.split('/')[-1],open(image,'rb'),'image/png')))#('files',(image_name,open image,type))\r\n response=requests.post(url,files=image_data)\r\n return json.loads(response.text)\r\n except Exception as er:\r\n print(\"error occured\")\r\n return \"{} error occured\".format(er)\r\n```\r\n\r\nYou can check docs [Here][1] \r\n\r\nThanks..!\r\n\r\n\r\n [1]: https://requests.readthedocs.io/en/latest/user/advanced/#post-multiple-multipart-encoded-files"
] |
https://api.github.com/repos/psf/requests/issues/5732
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5732/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5732/comments
|
https://api.github.com/repos/psf/requests/issues/5732/events
|
https://github.com/psf/requests/issues/5732
| 792,692,953 |
MDU6SXNzdWU3OTI2OTI5NTM=
| 5,732 |
endless redirection because of rewriting query containing "[]".
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/507141?v=4",
"events_url": "https://api.github.com/users/darkfader/events{/privacy}",
"followers_url": "https://api.github.com/users/darkfader/followers",
"following_url": "https://api.github.com/users/darkfader/following{/other_user}",
"gists_url": "https://api.github.com/users/darkfader/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/darkfader",
"id": 507141,
"login": "darkfader",
"node_id": "MDQ6VXNlcjUwNzE0MQ==",
"organizations_url": "https://api.github.com/users/darkfader/orgs",
"received_events_url": "https://api.github.com/users/darkfader/received_events",
"repos_url": "https://api.github.com/users/darkfader/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/darkfader/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/darkfader/subscriptions",
"type": "User",
"url": "https://api.github.com/users/darkfader",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-01-24T01:04:21Z
|
2021-08-28T00:05:56Z
|
2021-01-24T13:15:46Z
|
NONE
|
resolved
|
It's probably a combination of Python 3 libraries, since I am not seeing this effect in my web browser or with cURL.
I tried disabling automatic redirection, but my subsequent prepare_request calls end up doing the same rewriting.
## Expected Result
`[]` should not be rewritten to `%5B%5D`, I think, because those are safe characters(?). For a redirection, it should probably just take the redirect from the server as given anyway.
## Actual Result
`[]` in queries gets rewritten to `%5B%5D`. The server rewrites `%5B%5D` back to `[]` in the Location response header of a redirect.
## Reproduction Steps
```python
import requests
session = requests.Session()
session.send(session.prepare_request(requests.Request(method='get', url='https://www.suruga-ya.jp/search?category=&search_word=&restrict[]=brand=ATARI')), allow_redirects=False)
# ( just some example URL with [] )
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "3.3.1"
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "20.3.0",
"system": "Darwin"
},
"pyOpenSSL": {
"openssl_version": "1010109f",
"version": "20.0.1"
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010109f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": true
}
```
(the server runs Drupal 8 with PHP/7.3.11 if that matters)
## Workaround
In urllib3.util.url I replaced
`QUERY_CHARS = FRAGMENT_CHARS = PATH_CHARS | {"?"}`
to
`QUERY_CHARS = FRAGMENT_CHARS = PATH_CHARS | {"?","[","]","(",")","=",":","_"}`
(I think there are more safe characters though)
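A client-side alternative to patching urllib3, sketched here: percent-encode the bracket characters up front with the standard library, so the URL handed to requests is already in the normalized form and nothing is left for the library to rewrite (whether the server then stops redirecting is a server-side question):

```python
from urllib.parse import quote

raw_query = "category=&search_word=&restrict[]=brand=ATARI"
# Keep "=", "&" and "%" as-is; everything else outside the unreserved set
# (including the gen-delims "[" and "]") is percent-encoded up front.
encoded_query = quote(raw_query, safe="=&%")
url = "https://www.suruga-ya.jp/search?" + encoded_query

assert encoded_query == "category=&search_word=&restrict%5B%5D=brand=ATARI"
```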
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5732/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5732/timeline
| null |
completed
| null | null | false |
[
"urllib3 follows RFC 3986 on URLs. If you look at the [ABNF](https://tools.ietf.org/html/rfc3986#appendix-A) for query you'll see that `[` and `]` are not allowed (they are characters in `gen-delims` so not in `pchar`) so I would argue that the server is not handling your request correctly. ",
"Further, not escaping the Location header can lead to a host of security issues, which is frankly unacceptable as a solution",
"Thank you for the elaborate answer. Accepted.\r\nI am a bit shocked however of the many client interpretations. It's probably all about compatibility with old/buggy sites.\r\n\r\nFunny to see how `curl/7.64.1` gives `GET /test?&wow[]=test HTTP/1.1` and `GET /test?&wow=test%5B%5D HTTP/1.1` depending on position relative to the `=`. (Using -G and --data-urlencode) \r\nChrome doesn't care and just passes along the URL.\r\nOf course some characters are still percent-encoded."
] |
https://api.github.com/repos/psf/requests/issues/5731
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5731/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5731/comments
|
https://api.github.com/repos/psf/requests/issues/5731/events
|
https://github.com/psf/requests/issues/5731
| 792,638,349 |
MDU6SXNzdWU3OTI2MzgzNDk=
| 5,731 |
no/no_proxy is not honoured
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2320837?v=4",
"events_url": "https://api.github.com/users/Suika/events{/privacy}",
"followers_url": "https://api.github.com/users/Suika/followers",
"following_url": "https://api.github.com/users/Suika/following{/other_user}",
"gists_url": "https://api.github.com/users/Suika/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Suika",
"id": 2320837,
"login": "Suika",
"node_id": "MDQ6VXNlcjIzMjA4Mzc=",
"organizations_url": "https://api.github.com/users/Suika/orgs",
"received_events_url": "https://api.github.com/users/Suika/received_events",
"repos_url": "https://api.github.com/users/Suika/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Suika/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Suika/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Suika",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 8 |
2021-01-23T19:59:26Z
|
2024-02-06T17:54:56Z
| null |
NONE
| null |
I guess PRs are overlooked without an issue. It's about #5596 and the way requests handles no_proxy. Since urllib handles `no_proxy` properly, it's the logic in requests that messes with the env in a seemingly twisted way.
## Expected Result
The ability to use the no_proxy variable both via the OS environment and via function arguments.
## Actual Result
Only the OS `no_proxy` environment variable is being processed, and only under certain conditions that were described multiple times in the `no_proxy` issues.
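The environment-variable path can be checked directly against the helper requests uses internally; a sketch with made-up proxy and host names:

```python
import os
import requests.utils

os.environ["http_proxy"] = "http://proxy.example.com:3128"
os.environ["no_proxy"] = "internal.example.com"

# The excluded host resolves to "no proxy at all"...
assert requests.utils.get_environ_proxies("http://internal.example.com/") == {}

# ...while other hosts still pick up the environment proxy.
proxies = requests.utils.get_environ_proxies("http://example.org/")
assert proxies.get("http") == "http://proxy.example.com:3128"
```

The reported asymmetry is that this env-driven exclusion works while passing `no_proxy` inside the `proxies=` argument does not.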
## System Information
$ python -m requests.help
```json
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "2.9.2"
},
"idna": {
"version": "2.9"
},
"implementation": {
"name": "CPython",
"version": "3.8.5"
},
"platform": {
"release": "4.15.18-10-pve",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "1010107f",
"version": "19.1.0"
},
"requests": {
"version": "2.23.0"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.25.9"
},
"using_pyopenssl": true
}
```
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5731/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5731/timeline
| null | null | null | null | false |
[
"I have the same issue: `no_proxy` is ignored in a simple `requests.get()` call:\r\n```\r\nimport requests\r\n\r\nproxies = {\r\n 'http': 'proxy.example.com',\r\n 'no_proxy': 'google.com'\r\n}\r\n\r\nrequests.get('http://google.com/', proxies=proxies)\r\n```\r\n\r\nWith 2.28.0 this yields:\r\n\r\n```\r\nrequests.exceptions.ProxyError: HTTPConnectionPool(host='proxy.example.com', port=80): Max retries exceeded with url: http://google.com/ (Caused by ProxyError('Cannot connect to proxy.', NewConnectionError('<urllib3.connection.HTTPConnection object at 0x10418f5b0>: Failed to establish a new connection: [Errno 8] nodename nor servname provided, or not known')))\r\n```\r\n\r\nSet the following in bash:\r\n```\r\nexport http_proxy=\"proxy.example.com\"\r\nexport no_proxy=\"google.com\"\r\n```\r\n\r\nand the `requests.get('http://google.com/')` works just fine.",
"Where is it documented that what you believe is a bug should work? At no point has this library supported what you've made up",
"From the [2.14.0 release history](https://github.com/psf/requests/blob/main/HISTORY.md#2140-2017-05-09) \"Improvements\" section:\r\n```\r\n- It is now possible to pass ``no_proxy`` as a key to the ``proxies`` dictionary to provide handling similar to the ``NO_PROXY`` environment variable.\r\n```",
"So something from ages ago that is likely not documented elsewhere? ",
"It all comes down to feature request https://github.com/psf/requests/issues/2817 and the implementation https://github.com/psf/requests/commit/85400d8d6751071ef78f042d1efa72bdcf76cc0e not actually working. If you think this is not a bug I'd happily create a documentation PR instead.",
"I did some more testing and was able to make an exception using the per-host proxy settings:\r\n```\r\nproxies = {\r\n 'http': 'http://proxy.example.com',\r\n 'http://google.com': '',\r\n}\r\n```\r\nThis is more flexible than the `no_proxy` mechanism so I can live with it not working as described. \r\n\r\nI've made a PR to clarify this in the documentation at #6172 ",
"## `no_proxy` doesn't work\r\n\r\n```bash\r\npython3 -c 'import requests; print(requests.get(\"http://ipinfo.io/ip\", proxies={\"no_proxy\":\"ipinfo.io\",\"http\":\"http://myproxy:3128\"}).text)'\r\n```\r\n\r\n## `no_proxy` works\r\n\r\n```bash\r\nexport no_proxy=ipinfo.io\r\nexport http_proxy=http://myproxy:3128\r\n\r\npython3 -c 'import requests; print(requests.get(\"http://ipinfo.io/ip\").text)'\r\n```\r\n\r\n----\r\n\r\nThe difference is caused by this `if` statement:\r\n\r\n```python\r\n if \"proxies\" not in kwargs:\r\n kwargs[\"proxies\"] = resolve_proxies(request, self.proxies, self.trust_env)\r\n```\r\n\r\n`no_proxy` is handled by `resolve_proxies()`\r\n\r\nhttps://github.com/psf/requests/blob/main/src/requests/sessions.py#L683-L684",
"EDIT: I mistakenly thought that this was working in 2.25.0, but this was because in 2.25.0 the `send` operation was not even using the OS env settings at all (as opposed to the `post` function). I therefore wonder if the following analysis is even relevant ? \r\n\r\nIt still seems from the code that the `no_proxy` variable is not read when a `send` (and not a `post`) is performed. This is because `resolve_proxies` (where the issue is) is only used in `send` and not in `post`. `post` relies on `request`, which as opposed to `send`, uses `merge_environment_settings` for proxy resolution and not `resolve_proxies`. This is not very consistent, maybe something to improve ?\r\n\r\n------\r\n\r\nI ran into this today\r\n\r\n> Only OS no_proxy ENV is being processed.\r\n\r\nActually, in `requests==2.31.0` the OS env variable `no_proxy` is not processed correctly anymore. ~~It was working correctly in `2.25.0` though, I just checked. So this is a regression.~~ It seems to be related to the fact that once `get_environ_proxies()` is called, the `no_proxy` part of its contents is not returned.\r\n\r\nhttps://github.com/psf/requests/blob/4f3f189d6b7d2a19ff73dfb4a3e39713d4abe01a/src/requests/utils.py#L876-L882\r\n\r\nYou can see that `new_proxies` \r\n\r\n - is updated with the entry in `environ_proxies` corresponding to the `scheme`, \r\n - but is not updated with the `no_proxy` contents if any in `environ_proxies`\r\n\r\nShould I open a separate issue ?\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5730
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5730/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5730/comments
|
https://api.github.com/repos/psf/requests/issues/5730/events
|
https://github.com/psf/requests/issues/5730
| 792,612,984 |
MDU6SXNzdWU3OTI2MTI5ODQ=
| 5,730 |
Why does the same request return different things on burp suite and python?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/72662270?v=4",
"events_url": "https://api.github.com/users/jvictore/events{/privacy}",
"followers_url": "https://api.github.com/users/jvictore/followers",
"following_url": "https://api.github.com/users/jvictore/following{/other_user}",
"gists_url": "https://api.github.com/users/jvictore/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jvictore",
"id": 72662270,
"login": "jvictore",
"node_id": "MDQ6VXNlcjcyNjYyMjcw",
"organizations_url": "https://api.github.com/users/jvictore/orgs",
"received_events_url": "https://api.github.com/users/jvictore/received_events",
"repos_url": "https://api.github.com/users/jvictore/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jvictore/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jvictore/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jvictore",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-23T17:38:39Z
|
2021-08-28T00:05:56Z
|
2021-01-23T22:56:45Z
|
NONE
|
resolved
|
When I make a POST with Burp Suite the request is executed correctly, but when I try to do the same POST using python requests I get a 401 error. I am using the same User-Agent and the same headers for everything. Is there a way for the site to recognize that the request isn't from a browser? Do you have any ideas for what I could do?
I looked for the answer everywhere, mainly on Stack Overflow, but I didn't find it.
Thanks for the help
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5730/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5730/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please ask for help on [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here.\r\n\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5729
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5729/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5729/comments
|
https://api.github.com/repos/psf/requests/issues/5729/events
|
https://github.com/psf/requests/issues/5729
| 792,521,358 |
MDU6SXNzdWU3OTI1MjEzNTg=
| 5,729 |
Cookies don't work
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/53025923?v=4",
"events_url": "https://api.github.com/users/UnknownSourceCode/events{/privacy}",
"followers_url": "https://api.github.com/users/UnknownSourceCode/followers",
"following_url": "https://api.github.com/users/UnknownSourceCode/following{/other_user}",
"gists_url": "https://api.github.com/users/UnknownSourceCode/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/UnknownSourceCode",
"id": 53025923,
"login": "UnknownSourceCode",
"node_id": "MDQ6VXNlcjUzMDI1OTIz",
"organizations_url": "https://api.github.com/users/UnknownSourceCode/orgs",
"received_events_url": "https://api.github.com/users/UnknownSourceCode/received_events",
"repos_url": "https://api.github.com/users/UnknownSourceCode/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/UnknownSourceCode/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/UnknownSourceCode/subscriptions",
"type": "User",
"url": "https://api.github.com/users/UnknownSourceCode",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2021-01-23T10:01:56Z
|
2021-08-28T00:05:56Z
|
2021-01-23T13:05:32Z
|
NONE
|
resolved
|
Well, I wanted to log in to my TextNow account to send SMS using cookies
I copied from Firefox. With curl the request worked very well,
but when I use Python:
```python
requests.get(
    "https://www.textnow.com/api/sessions",
    headers={
        "Content-Type": "application/x-www-form-urlencoded; charset=UTF-8"
    },
    cookies={},
)
```
I tried passing cookies via the `cookies` keyword, `requests.cookies.set`, the `Set-Cookie` header, the `Cookie` header, injecting cookies into a cookie jar, and a lot of other things; everything I found on Stack Overflow.
Every new thing I try, the cookies fail.
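For what it's worth, the documented way to attach cookies can be checked offline with a prepared request, without sending anything (the cookie name and value below are hypothetical):

```python
import requests

# Cookies passed via the cookies= keyword end up in the Cookie request header.
req = requests.Request(
    "GET",
    "https://www.textnow.com/api/sessions",
    cookies={"connect.sid": "example-session-id"},  # hypothetical name/value
)
prepared = req.prepare()
print(prepared.headers["Cookie"])  # connect.sid=example-session-id
```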
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5729/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5729/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please use [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here.",
"I TRIED ALl StackOVerflow posts no solution\r\n",
"but none of them did help sir ",
"that why i did come here to demand Solution in CURL , PyCurl Works Perfectly but i prefer Requests but is don't know why if you can help thanks you d'ont send a solution that you didn't try because as i did say no solution found till now"
] |
https://api.github.com/repos/psf/requests/issues/5728
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5728/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5728/comments
|
https://api.github.com/repos/psf/requests/issues/5728/events
|
https://github.com/psf/requests/issues/5728
| 792,013,249 |
MDU6SXNzdWU3OTIwMTMyNDk=
| 5,728 |
GET Method + HTTPS does not append request Body
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/62646829?v=4",
"events_url": "https://api.github.com/users/MimounMen/events{/privacy}",
"followers_url": "https://api.github.com/users/MimounMen/followers",
"following_url": "https://api.github.com/users/MimounMen/following{/other_user}",
"gists_url": "https://api.github.com/users/MimounMen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/MimounMen",
"id": 62646829,
"login": "MimounMen",
"node_id": "MDQ6VXNlcjYyNjQ2ODI5",
"organizations_url": "https://api.github.com/users/MimounMen/orgs",
"received_events_url": "https://api.github.com/users/MimounMen/received_events",
"repos_url": "https://api.github.com/users/MimounMen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/MimounMen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/MimounMen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/MimounMen",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-01-22T13:40:34Z
|
2021-08-28T00:05:57Z
|
2021-01-22T16:04:14Z
|
NONE
|
resolved
|
I'm writing a script to interact with an Elasticsearch instance. The ES server blocks POST requests, therefore I have to use GET methods with a request body.
## Expected Result
An HTTP GET request with Content-Type and Content-Length set, along with the corresponding body
## Actual Result
The headers are always set, but if the protocol is HTTPS the body is not appended.
## Reproduction Steps
```python
import requests, json
proxies={'http':'http://localhost:8080',
'https':'https://localhost:8080'
}
test={'test':'test'}
payload=json.dumps(test)
#Does not append Body (GET,HTTPS)
r=requests.request(method='get',url='https://google.com',json=payload,proxies=proxies)
#Does Append Body (GET, HTTP)
r=requests.request(method='get',url='http://google.com',json=payload,proxies=proxies)
#Does append Body (POST HTTPS)
r=requests.request(method='post',url='https://google.com',json=payload,proxies=proxies)
```
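As the comments note, `json=json.dumps(test)` double-encodes the body, since `json=` expects a Python object. This can be checked offline with a prepared request; nothing is sent, and `example.invalid` is a reserved placeholder hostname:

```python
import json
import requests

payload = {"test": "test"}

# Passing a pre-serialized string to json= encodes it a second time.
double = requests.Request("GET", "http://example.invalid", json=json.dumps(payload)).prepare()
# Passing the dict lets requests serialize it once and set the header.
single = requests.Request("GET", "http://example.invalid", json=payload).prepare()

print(single.headers["Content-Type"])                  # application/json
print(json.loads(single.body) == payload)              # True: body is the object
print(json.loads(double.body) == json.dumps(payload))  # True: body is a quoted string
```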
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5728/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5728/timeline
| null |
completed
| null | null | false |
[
"Hey, I know about Elasticsearch because I'm the Python ES client maintainer. Elasticsearch accepts POST in place of GET for all requests that require a body. The client automatically uses POST for you for this reason.",
"Hi Sethmlarson, thanks for the quick answer.\r\nThere is a Proxy between the client and the ES instance.\r\nThe proxy is blocking any POST Method.",
"A few things:\r\n\r\n1. Semantically according to the RFCs, GET bodies are meaningless and _any intermediary_ (i.e., a proxy) can decide to strip it\r\n2. `json=json.dumps(...)` negates the purpose of the `json=` parameter. Stop that\r\n3. The issue sounds completely isolated to your proxies and not to requests as I know for a fact that Requests adds the body regardless."
] |
https://api.github.com/repos/psf/requests/issues/5727
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5727/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5727/comments
|
https://api.github.com/repos/psf/requests/issues/5727/events
|
https://github.com/psf/requests/pull/5727
| 792,009,179 |
MDExOlB1bGxSZXF1ZXN0NTU5OTY3NTM4
| 5,727 |
Fix typo error
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22741481?v=4",
"events_url": "https://api.github.com/users/xiaojueguan/events{/privacy}",
"followers_url": "https://api.github.com/users/xiaojueguan/followers",
"following_url": "https://api.github.com/users/xiaojueguan/following{/other_user}",
"gists_url": "https://api.github.com/users/xiaojueguan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/xiaojueguan",
"id": 22741481,
"login": "xiaojueguan",
"node_id": "MDQ6VXNlcjIyNzQxNDgx",
"organizations_url": "https://api.github.com/users/xiaojueguan/orgs",
"received_events_url": "https://api.github.com/users/xiaojueguan/received_events",
"repos_url": "https://api.github.com/users/xiaojueguan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/xiaojueguan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/xiaojueguan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/xiaojueguan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-01-22T13:34:35Z
|
2021-08-27T00:08:50Z
|
2021-01-22T13:41:40Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5727/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5727/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5727.diff",
"html_url": "https://github.com/psf/requests/pull/5727",
"merged_at": "2021-01-22T13:41:39Z",
"patch_url": "https://github.com/psf/requests/pull/5727.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5727"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/5726
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5726/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5726/comments
|
https://api.github.com/repos/psf/requests/issues/5726/events
|
https://github.com/psf/requests/issues/5726
| 791,814,139 |
MDU6SXNzdWU3OTE4MTQxMzk=
| 5,726 |
Possible memory leaking when combining session, threading and proxies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/9117697?v=4",
"events_url": "https://api.github.com/users/jaimecoj/events{/privacy}",
"followers_url": "https://api.github.com/users/jaimecoj/followers",
"following_url": "https://api.github.com/users/jaimecoj/following{/other_user}",
"gists_url": "https://api.github.com/users/jaimecoj/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jaimecoj",
"id": 9117697,
"login": "jaimecoj",
"node_id": "MDQ6VXNlcjkxMTc2OTc=",
"organizations_url": "https://api.github.com/users/jaimecoj/orgs",
"received_events_url": "https://api.github.com/users/jaimecoj/received_events",
"repos_url": "https://api.github.com/users/jaimecoj/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jaimecoj/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jaimecoj/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jaimecoj",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 7 |
2021-01-22T08:42:05Z
|
2023-11-03T02:03:03Z
| null |
NONE
| null |
If it helps: I got the error `OSError: [Errno 24] Too many open files` when running the script. Not sure if it is related to the memory leak; I worked around it by raising the limit with `ulimit -n 10000`.
## Expected Result
RAM usage kept under reasonable limits
## Actual Result
RAM usage doesn't stop growing
## Reproduction Steps
I usually wouldn't post the target website or the proxy credentials, but in this case I think they are needed to reproduce the bug.
```python
import requests
from threading import Thread
from time import sleep
session = requests.Session()
from memory_profiler import profile
from random import randrange
finished = False
def get_proxy():
proxy = "http://lum-customer-hl_f53c879b-zone-static-session-" + str(randrange(999999)) + ":[email protected]:22225"
return {
"http": proxy,
"https": proxy
}
def make_request(url):
session.get(url, proxies=get_proxy())
def worker():
while True:
if finished: return
make_request("http://1000imagens.com/")
@profile
def main():
global finished
threads = []
for i in range(2):
t = Thread(target=worker)
t.start()
threads.append(t)
count = 0
while True:
sleep(1)
count += 1
if count == 300:
finished = True
return
main()
```
## System Information
$ python3.9 -m requests.help
```json
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.6"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "4.15.0-134-generic",
"system": "Linux"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010100f"
},
"urllib3": {
"version": "1.22"
},
"using_pyopenssl": false
}
```
```
# lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 18.04.5 LTS
Release: 18.04
Codename: bionic
```
I tried with python versions 3.6, 3.8 and 3.9 and found no difference.
## Output of memory_profiler
```
Line # Mem usage Increment Occurences Line Contents
============================================================
31 23.8 MiB 23.8 MiB 1 @profile
32 def main():
33 global finished
34 23.8 MiB 0.0 MiB 1 threads = []
35 23.8 MiB 0.0 MiB 3 for i in range(2):
36 23.8 MiB 0.0 MiB 2 t = Thread(target=worker)
37 23.8 MiB 0.0 MiB 2 t.start()
38 23.8 MiB 0.0 MiB 2 threads.append(t)
39
40 23.8 MiB 0.0 MiB 1 count = 0
41 while True:
42 547.1 MiB 523.2 MiB 300 sleep(1)
43 547.1 MiB 0.0 MiB 300 count += 1
44 547.1 MiB 0.0 MiB 300 if count == 300:
45 547.1 MiB 0.0 MiB 1 finished = True
46 547.1 MiB 0.0 MiB 1 return
```
After 5 minutes it eats +500MB ram. If I leave it running indefinitely it would consume all available ram and would be killed.
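The growth is consistent with the `proxy_manager` cache that the comments describe: each distinct proxy URL adds a `ProxyManager` to the adapter that is never evicted. A minimal offline sketch (the loopback URLs are placeholders; constructing a `ProxyManager` opens no sockets):

```python
import requests

session = requests.Session()
adapter = session.get_adapter("http://")

# Every distinct proxy URL gets its own cached ProxyManager on the adapter,
# so a random proxy per request grows this dict without bound.
for port in range(20000, 20005):
    adapter.proxy_manager_for(f"http://127.0.0.1:{port}")

cached = len(adapter.proxy_manager)
print(cached)  # 5

# Workaround discussed in the comments: drop the cache periodically.
adapter.proxy_manager.clear()
print(len(adapter.proxy_manager))  # 0
```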
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5726/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5726/timeline
| null | null | null | null | false |
[
"If I add `verify=False` to same script it doesn't leak, so it seems related to SSL verification\r\n\r\n```\r\nLine # Mem usage Increment Occurences Line Contents\r\n============================================================\r\n 31 23.9 MiB 23.9 MiB 1 @profile\r\n 32 def main():\r\n 33 global finished\r\n 34 23.9 MiB 0.0 MiB 1 threads = []\r\n 35 24.2 MiB 0.0 MiB 3 for i in range(2):\r\n 36 24.1 MiB 0.0 MiB 2 t = Thread(target=worker)\r\n 37 24.2 MiB 0.3 MiB 2 t.start()\r\n 38 24.2 MiB 0.0 MiB 2 threads.append(t)\r\n 39\r\n 40 24.2 MiB 0.0 MiB 1 count = 0\r\n 41 while True:\r\n 42 67.5 MiB 43.3 MiB 300 sleep(1)\r\n 43 67.5 MiB 0.0 MiB 300 count += 1\r\n 44 67.5 MiB 0.0 MiB 300 if count == 300:\r\n 45 67.5 MiB 0.0 MiB 1 finished = True\r\n 46 67.5 MiB 0.0 MiB 1 return\r\n```\r\n",
"Yes. Every report of a memory leak we've had has been related to using TLS. We've never been able to track it further than the SSL library",
"when using random proxy, session.get_adapter(\"http://\").proxy_manager dnot remove ProxyManager Object.\r\ntoo many ProxyManger object to memory leaking.\r\nsession = requests.session()\r\nfor x in range(1, 100):\r\n try:\r\n session.get(\"http://test.comaaa\", proxies={\"http\": \"http://{}:{}\".format(x,x)}, timeout=0.1)\r\n except:\r\n continue\r\nprint(session.get_adapter(\"http://\").proxy_manager)\r\n",
"+1 same issue here",
"+1 same issue here",
"> when using random proxy, session.get_adapter(\"http://\").proxy_manager dnot remove ProxyManager Object. too many ProxyManger object to memory leaking. session = requests.session() for x in range(1, 100): try: session.get(\"http://test.comaaa\", proxies={\"http\": \"http://{}:{}\".format(x,x)}, timeout=0.1) except: continue print(session.get_adapter(\"http://\").proxy_manager)\r\n\r\nsure, at this method **requests.adapters.HTTPAdapter.proxy_manager_for()** when using proxy, **manager = self.proxy_manager[proxy] = proxy_from_url(...)**,this is a cache, here every random proxy comes into self.proxy_manager(a dict), when using a session, this proxy_manager won't clear its' values and become bigger to leak memory. To solve this, we need to pop values in it manually?",
"> > when using random proxy, session.get_adapter(\"http://\").proxy_manager dnot remove ProxyManager Object. too many ProxyManger object to memory leaking. session = requests.session() for x in range(1, 100): try: session.get(\"http://test.comaaa\", proxies={\"http\": \"http://{}:{}\".format(x,x)}, timeout=0.1) except: continue print(session.get_adapter(\"http://\").proxy_manager)\r\n> \r\n> sure, at this method **requests.adapters.HTTPAdapter.proxy_manager_for()** when using proxy, **manager = self.proxy_manager[proxy] = proxy_from_url(...)**,this is a cache, here every random proxy comes into self.proxy_manager(a dict), when using a session, this proxy_manager won't clear its' values and become bigger to leak memory. To solve this, we need to pop values in it manually?\r\n\r\nHere's my solution: use **self.session = requests.sessions.Session()** to handle cookies for website's login, use **with self.session.get(url, headers=headers, proxies=self.proxies, ...) as self.response:** to ensure response closed after request, and then at the method that changes the **self.proxies**, use **self.session.get_adapter(\"https://\").proxy_manager.clear()** to clear the proxy_maneger's cache. This works for me."
] |
https://api.github.com/repos/psf/requests/issues/5725
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5725/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5725/comments
|
https://api.github.com/repos/psf/requests/issues/5725/events
|
https://github.com/psf/requests/issues/5725
| 791,526,254 |
MDU6SXNzdWU3OTE1MjYyNTQ=
| 5,725 |
Support for logging is abysmal, and documentation is missing
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4665100?v=4",
"events_url": "https://api.github.com/users/jackjansen/events{/privacy}",
"followers_url": "https://api.github.com/users/jackjansen/followers",
"following_url": "https://api.github.com/users/jackjansen/following{/other_user}",
"gists_url": "https://api.github.com/users/jackjansen/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/jackjansen",
"id": 4665100,
"login": "jackjansen",
"node_id": "MDQ6VXNlcjQ2NjUxMDA=",
"organizations_url": "https://api.github.com/users/jackjansen/orgs",
"received_events_url": "https://api.github.com/users/jackjansen/received_events",
"repos_url": "https://api.github.com/users/jackjansen/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/jackjansen/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/jackjansen/subscriptions",
"type": "User",
"url": "https://api.github.com/users/jackjansen",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 3 |
2021-01-21T22:03:23Z
|
2022-02-21T22:54:10Z
| null |
NONE
| null |
Attempting to change logging for requests is very complicated, can only be done application-wide (I think), and the documentation is missing.
Google finds many incorrect suggestions (mainly pointing to readthedocs documentation of requests version 0.x), the only workable information I found was at this stack overflow thread: https://stackoverflow.com/questions/10588644/how-can-i-see-the-entire-http-request-thats-being-sent-by-my-python-application
As a first step it would be good if the information from this thread (insofar as it is still correct) was included in the documentation somewhere.
But it would be a lot nicer if there was easier support for logging, possibly through a toolbelt module or something. Controlling logging on a fine-grained scale is vital, especially when you are trying to debug things like SSL-related errors (which are rather unintelligible in their own right) in a large scale application.
I understand that part of the problem is with urllib3, but I guess I have to start somewhere.
And I'm willing to help, if personpower is an issue.
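For anyone landing here, the still-working core of that Stack Overflow thread is stdlib-only. It enables very verbose output process-wide, which is exactly the coarse-grained control complained about above:

```python
import logging
from http.client import HTTPConnection

# http.client prints the raw request line and headers once debuglevel > 0;
# requests uses http.client underneath via urllib3.
HTTPConnection.debuglevel = 1

# urllib3 logs connection setup, redirects, and retries through stdlib logging.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("urllib3").setLevel(logging.DEBUG)

# Any requests call made after this point prints headers and connection info.
```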
| null |
{
"+1": 1,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5725/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5725/timeline
| null | null | null | null | false |
[
"I guess nobody would mind if information from https://docs.python-requests.org/en/master/api/#api-changes will get its own Debugging chapter.\r\n\r\nMerging the actual code into helper method might not be accepted as that will lead to many pull requests with different formats that people want. For example, sometimes I want HAR dump (#16) to compare with browser, sometimes to compare to `curl` output, and sometimes colorful thing like `httpie -v`.",
"I think it's important to support logging. There are many code like this where I use requests:\r\n```\r\nres = requets.post(url, data) \r\nif res.status_code >= 400:\r\n logger.error(res.text)\r\nres.raise_for_status()\r\n```\r\n\r\nI think the `raise_for_status` should get the requests logger and log the error response.\r\n\r\nFurther more, the method should add parameter like `check=True` as [subprocess](https://docs.python.org/3/library/subprocess.html) did.",
"@ramwin as the OP of this issue I think your problem is completely different: you want help from requests to make it easier to do logging for your own application.\r\n\r\nThis report is about logging what requests itself is doing internally, to debug internal issues inside requests (or, more likely, in my code using requests or in the server that requests is communicating with).\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5724
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5724/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5724/comments
|
https://api.github.com/repos/psf/requests/issues/5724/events
|
https://github.com/psf/requests/issues/5724
| 791,335,269 |
MDU6SXNzdWU3OTEzMzUyNjk=
| 5,724 |
Support the new Deprecation HTTP header
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1526883?v=4",
"events_url": "https://api.github.com/users/pimterry/events{/privacy}",
"followers_url": "https://api.github.com/users/pimterry/followers",
"following_url": "https://api.github.com/users/pimterry/following{/other_user}",
"gists_url": "https://api.github.com/users/pimterry/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pimterry",
"id": 1526883,
"login": "pimterry",
"node_id": "MDQ6VXNlcjE1MjY4ODM=",
"organizations_url": "https://api.github.com/users/pimterry/orgs",
"received_events_url": "https://api.github.com/users/pimterry/received_events",
"repos_url": "https://api.github.com/users/pimterry/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pimterry/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pimterry/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pimterry",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 9 |
2021-01-21T17:04:06Z
|
2022-02-26T03:00:20Z
|
2021-11-28T02:58:30Z
|
NONE
|
resolved
|
The Deprecation header indicates that an API is deprecated, and shouldn't be used. It's either `true`, or an HTTP datetime value (the time after which the API is deprecated).
This is a draft HTTP standard from the IETF HTTPAPI working group: https://datatracker.ietf.org/doc/draft-ietf-httpapi-deprecation-header/. It's still a draft, but it's been around since 2019 and been through a few revisions already. It'd be great if it was supported by Requests!
I think there's a good argument that Requests should treat this like any of its own internal deprecation warnings (e.g. [a](https://github.com/psf/requests/blob/4f6c0187150af09d085c03096504934eb91c7a9e/requests/auth.py#L38-L55), [b](https://github.com/psf/requests/blob/589c4547338b592b1fb77c65663d8aa6fbb7e38b/requests/utils.py#L446-L450), [c](https://github.com/psf/requests/blob/589c4547338b592b1fb77c65663d8aa6fbb7e38b/requests/utils.py#L551-L555)), i.e. automatically logging it via `warnings` (if true, or if the date in the header has passed) to notify developers that they're depending on explicitly deprecated functionality. This seems like a quick standards-based win to help developers deal with API changes before their projects break, rather than later.
I understand if you'd prefer to wait until the standard is finalized, but in the meantime I'm just interested to know if the Requests team is open to supporting this in theory?
Feedback on the standard is also very welcome, the WG mailing list is here: https://www.ietf.org/mailman/listinfo/httpapi
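For illustration, here is a hedged sketch of how an application could surface this today with a standard Requests response hook. The `HttpDeprecationWarning` class and hook name are inventions for this sketch, not part of Requests or the draft:

```python
import warnings
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime


class HttpDeprecationWarning(DeprecationWarning):
    """Hypothetical warning category for deprecated HTTP endpoints."""


def warn_on_deprecation(response, *args, **kwargs):
    """Requests response hook: warn when the draft Deprecation header applies."""
    value = response.headers.get("Deprecation")
    if value is None:
        return response
    if value.strip().lower() == "true":
        warnings.warn("%s is deprecated" % response.url, HttpDeprecationWarning)
        return response
    try:
        deprecated_at = parsedate_to_datetime(value)
    except (TypeError, ValueError):
        return response  # malformed HTTP-date, stay silent
    if deprecated_at <= datetime.now(timezone.utc):
        warnings.warn("%s deprecated since %s" % (response.url, value),
                      HttpDeprecationWarning)
    return response

# Usage (standard Requests hook mechanism):
#   session = requests.Session()
#   session.hooks["response"].append(warn_on_deprecation)
```

Because `HttpDeprecationWarning` subclasses `DeprecationWarning`, it stays hidden under Python's default warning filter and only surfaces for users who opt in (e.g. `python -Wd`), matching the behaviour discussed below.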
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5724/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5724/timeline
| null |
completed
| null | null | false |
[
"Since this standard appears to not require any functional changes beyond unconditionally emitting warnings, should this instead be supported by applications that use Requests? That's my initial thought here. ",
"Yes, it could be, it's absolutely possible to handle application-side (with a session hook as the best option, I guess?)\r\n\r\nI think there's a few reasons why this might be particularly valuable to have built-in though:\r\n\r\n* There's a clear & useful default behaviour that Requests can very easily provide, which is in keeping with existing behaviour, and which would significantly ease painful API changes (by ensuring developers know about them before their code breaks).\r\n* Deprecations are not rare events. 100% of APIs get deprecated and shutdown, eventually, so this is relevant to every application - it's not a niche feature.\r\n* Having it as a built-in default is unusually valuable, because most of the time it's not clear that you need this when you're initially writing the code. When you first start sending requests to an API, it is presumably not deprecated.\r\n* Supporting this drives a virtuous cycle for the HTTP ecosystem: having more popular HTTP clients warn users for you makes the standard immediately valuable to API developers, which drives adoption there, making more HTTP clients use it, etc, until deprecation notices for APIs become a solved problem everywhere (granted, this may take a while :smiley:).",
"Yeah, that's not this library's responsibility, nor should any HTTP focused library be raising exceptions for things like that breaking code unexpectedly and requiring significantly more work from users of the library to be aware of it to start and handle it explicitly.\r\n\r\nLogging this via warnings (another option) is also a non-starter. We log (and warn) things that are legitimately more concerning and get heaping loads of complaints about that already. If we start warning this on a single request, I can't imagine the level of confusion and anger we'll receive. \r\n\r\n> * Supporting this drives a virtuous cycle for the HTTP ecosystem: having more popular HTTP clients warn users for you makes the standard immediately valuable to API developers, which drives adoption there, making more HTTP clients use it, etc, until deprecation notices for APIs become a solved problem everywhere (granted, this may take a while smiley).\r\n\r\nAlternatively, Requests supports this and other HTTP client developers see the vitriol that's poured out here and decide never to do that because developers tend to treat each other horribly, especially when the other is providing free labour.\r\n",
"> Yeah, that's not this library's responsibility, nor should any HTTP focused library be raising exceptions for things\r\n\r\nYep, absolutely not proposing anybody starts throwing exceptions for this - purely warning about it, to provide the same developer UX for API deprecations as any other deprecation.\r\n\r\n> I can't imagine the level of confusion and anger we'll receive.\r\n\r\n> developers tend to treat each other horribly, especially when the other is providing free labour.\r\n\r\nYeah, I sympathise, open-source users can be terrible people. I totally understand not wanting to merge things that'll increase your maintenance load or create angry users.\r\n\r\nI have just found an interesting point though, which may help: I thought DeprecationWarnings were printed by default, but since Python 3.2 that's only true for code running in `__main__`, according to https://docs.python.org/3/library/warnings.html#default-warning-filter.\r\n\r\nThat means if Requests reported these as deprecations, they _wouldn't_ appear by default anywhere. They'd only show up for users who opt into deprecation warnings globally (for modern Python - we could just disable warning entirely for older Python versions).\r\n\r\nThat feels like a route that should avoid confusion & complaints, but would still be very valuable, since the users who explicitly want to hear about deprecations in general are exactly the users likely to also want to know if they're using deprecated APIs. That would mean a dev would just need to enable deprecation warnings globally to set this up, which is much easier than reconfiguring Requests everywhere they use it. I'd imagine that the warning would fire as a DeprecationWarning subclass (HttpDeprecationWarning, say), so users can also opt into deprecation warnings elsewhere but disable API warnings with standard warning filters, if they prefer.\r\n\r\nDoes that change the balance here at all?",
"Note that there's also a `Sunset` header, this one a published RFC: https://tools.ietf.org/html/rfc8594\r\n\r\nI personally don't care if this is built into requests or a separate hook, though I would appreciate a convenient way to enable those warnings when I run with `python -Wd`.",
"There is indeed. They're intended to work together, and being worked on by overlapping groups in the IETF. Just to clarify the difference:\r\n\r\n* The deprecation header signifies that the endpoint should no longer be used after a certain date (or if set to `true` just 'now'). I would expect that the date will normally be in the past.\r\n* The sunset header signifies a date when the endpoint will completely stop working. It's normally a date in the future: if the date has passed, then you should assume the endpoint could disappear at any moment.\r\n\r\nIt's likely that you'd use them together: start serving deprecation headers when you're officially phasing out the endpoint, add a sunset header once you have a final shutdown date, and then shut it down on that shutdown date. There's some examples here: https://httptoolkit.tech/blog/how-to-turn-off-your-old-apis/. Debatable, but I think the former is the one that's most useful for warnings.",
    "Since it is still a Draft... it is what it is.\r\n\r\nThere are a lot of forks based on drafts that were dropped before an official RFC was released, but any help is well received. So, if you have a solution in mind, please share it, and it could be added as an API pre-release parameter or something similar, if the developer team agrees with it.\r\n\r\nRemember this is a community effort, so any help could be well received, and any support will be appreciated too.\r\n\r\nCheers\r\n",
"> There are a lot of forks based on drafts that have been dropped out before an official RFC was released\r\n\r\nYep, I think there's a very good argument for not implementing this today, or at least not enabling it by default, until the standard is finalized.\r\n\r\nI posted this mainly to start a discussion _before_ that happens, to make it easier to implement this once/if this becomes a fully fledged IETF standard. Note that the sunset RFC is already a published RFC though ([RFC 8549](https://datatracker.ietf.org/doc/rfc8594/)).\r\n\r\nPersonally, I'd find it really useful if `python -Wd` automatically printed warnings when talking to deprecated and/or sunsetted HTTP APIs, since that's a strong sign that your code will stop working soon.",
"Going to close this as other maintainers have all agreed this is applications responsibility."
] |
https://api.github.com/repos/psf/requests/issues/5723
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5723/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5723/comments
|
https://api.github.com/repos/psf/requests/issues/5723/events
|
https://github.com/psf/requests/pull/5723
| 789,537,622 |
MDExOlB1bGxSZXF1ZXN0NTU3ODY5Njg4
| 5,723 |
Format c_rehash
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/1192780?v=4",
"events_url": "https://api.github.com/users/bbodenmiller/events{/privacy}",
"followers_url": "https://api.github.com/users/bbodenmiller/followers",
"following_url": "https://api.github.com/users/bbodenmiller/following{/other_user}",
"gists_url": "https://api.github.com/users/bbodenmiller/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/bbodenmiller",
"id": 1192780,
"login": "bbodenmiller",
"node_id": "MDQ6VXNlcjExOTI3ODA=",
"organizations_url": "https://api.github.com/users/bbodenmiller/orgs",
"received_events_url": "https://api.github.com/users/bbodenmiller/received_events",
"repos_url": "https://api.github.com/users/bbodenmiller/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/bbodenmiller/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/bbodenmiller/subscriptions",
"type": "User",
"url": "https://api.github.com/users/bbodenmiller",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-01-20T01:28:08Z
|
2021-08-27T00:08:50Z
|
2021-01-20T14:58:36Z
|
CONTRIBUTOR
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5723/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5723/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5723.diff",
"html_url": "https://github.com/psf/requests/pull/5723",
"merged_at": "2021-01-20T14:58:36Z",
"patch_url": "https://github.com/psf/requests/pull/5723.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5723"
}
| true |
[] |
|
https://api.github.com/repos/psf/requests/issues/5722
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5722/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5722/comments
|
https://api.github.com/repos/psf/requests/issues/5722/events
|
https://github.com/psf/requests/pull/5722
| 789,335,333 |
MDExOlB1bGxSZXF1ZXN0NTU3NjkyNTg3
| 5,722 |
To work just like normal browsers, default headers in requests.api.get(), requests.api.request(), requests.models.Request.__init__() and requests.sessions.Session.request() are set
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/74560907?v=4",
"events_url": "https://api.github.com/users/shreyanavigyan/events{/privacy}",
"followers_url": "https://api.github.com/users/shreyanavigyan/followers",
"following_url": "https://api.github.com/users/shreyanavigyan/following{/other_user}",
"gists_url": "https://api.github.com/users/shreyanavigyan/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shreyanavigyan",
"id": 74560907,
"login": "shreyanavigyan",
"node_id": "MDQ6VXNlcjc0NTYwOTA3",
"organizations_url": "https://api.github.com/users/shreyanavigyan/orgs",
"received_events_url": "https://api.github.com/users/shreyanavigyan/received_events",
"repos_url": "https://api.github.com/users/shreyanavigyan/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shreyanavigyan/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shreyanavigyan/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shreyanavigyan",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-19T20:20:43Z
|
2021-08-27T00:08:51Z
|
2021-01-19T20:25:49Z
|
NONE
|
resolved
|
The headers argument in `requests.api.get()`, `requests.api.request()`, `requests.models.Request.__init__()` and `requests.sessions.Session.request()` is set to a default dictionary value rather than `None`, to handle errors like a 503 server error even though the URL is correct. Some websites block web scrapers; to avoid being blocked, these default headers are very useful. This works on Mac OS X, Linux and Windows as of January 2020.
For example, https://www.amazon.com blocks Web-scrapers nowadays. Before this change, the returned response object's status code was always 503 though the website existed.

After this change, the returned response object's status code is always 200.

|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5722/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5722/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5722.diff",
"html_url": "https://github.com/psf/requests/pull/5722",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5722.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5722"
}
| true |
[
"Hi @shreyanavigyan,\r\n\r\nI don't think it makes sense to add these user agents to Requests. They aren't actually accurate and the goal of Requests isn't to impersonate a browser. Users are free to add headers as they see fit, but we're not trying to violate protective restrictions services have put in place by default.\r\n\r\nThanks for the suggestion though."
] |
https://api.github.com/repos/psf/requests/issues/5721
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5721/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5721/comments
|
https://api.github.com/repos/psf/requests/issues/5721/events
|
https://github.com/psf/requests/issues/5721
| 789,267,504 |
MDU6SXNzdWU3ODkyNjc1MDQ=
| 5,721 |
requests library code walkthrough
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/77692938?v=4",
"events_url": "https://api.github.com/users/samrohon/events{/privacy}",
"followers_url": "https://api.github.com/users/samrohon/followers",
"following_url": "https://api.github.com/users/samrohon/following{/other_user}",
"gists_url": "https://api.github.com/users/samrohon/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/samrohon",
"id": 77692938,
"login": "samrohon",
"node_id": "MDQ6VXNlcjc3NjkyOTM4",
"organizations_url": "https://api.github.com/users/samrohon/orgs",
"received_events_url": "https://api.github.com/users/samrohon/received_events",
"repos_url": "https://api.github.com/users/samrohon/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/samrohon/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/samrohon/subscriptions",
"type": "User",
"url": "https://api.github.com/users/samrohon",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-01-19T18:44:21Z
|
2022-02-26T04:00:33Z
|
2021-11-28T03:37:59Z
|
NONE
|
resolved
|
Wannabe open-source contributor and intermediate Python developer here. I am wondering if any contributors of this library have done, or could do, a code walkthrough video?
- It is difficult for newbies to understand the code base from a bigger perspective (what patterns are used, what design decisions were made, etc.) merely by going through the code
- Most of the time a lot of context is required of the reader around why code is written the way it is, which cannot be understood by simply reading the code
- There are a lot of tutorials on how to use the repo's functionality, but I could not find any on the implementation details and the design.
- This library has been recommended by many as a good case study of readable Python code. It would highly benefit the community if there were a free-form walkthrough of the code base, which would give a good intro to the internals of the library and a bigger picture of how a successful Python open-source project is structured and run.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5721/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5721/timeline
| null |
completed
| null | null | false |
[
"> * This library has been reccomended by many as a good case study of readable python code.\r\n\r\nAs a maintainer for several years, I've always disagreed with this notion. That said, those folks would be the people who should be doing the walk-through. I would never feel honest giving such a walk through, nor does anyone directly involved in the project have the sufficient free time to write, record, edit, and post such a walk through (or series of such)",
    "I am also a beginner and got overwhelmed by the code, so I would also request someone to give us a full walkthrough of the entire codebase.",
"I also don't recommend this project as somewhere that beginners should get their start. We're unlikely to create any resources for this either as our limited time is better spent maintaining the project itself. Good luck finding a project that speaks to you and finding time to invest in OSS if you'd really like to make an impact those two components are critical."
] |
https://api.github.com/repos/psf/requests/issues/5720
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5720/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5720/comments
|
https://api.github.com/repos/psf/requests/issues/5720/events
|
https://github.com/psf/requests/issues/5720
| 788,116,053 |
MDU6SXNzdWU3ODgxMTYwNTM=
| 5,720 |
Requests is using dependencies with copyleft licenses
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/12371245?v=4",
"events_url": "https://api.github.com/users/farooquiyasir/events{/privacy}",
"followers_url": "https://api.github.com/users/farooquiyasir/followers",
"following_url": "https://api.github.com/users/farooquiyasir/following{/other_user}",
"gists_url": "https://api.github.com/users/farooquiyasir/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/farooquiyasir",
"id": 12371245,
"login": "farooquiyasir",
"node_id": "MDQ6VXNlcjEyMzcxMjQ1",
"organizations_url": "https://api.github.com/users/farooquiyasir/orgs",
"received_events_url": "https://api.github.com/users/farooquiyasir/received_events",
"repos_url": "https://api.github.com/users/farooquiyasir/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/farooquiyasir/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/farooquiyasir/subscriptions",
"type": "User",
"url": "https://api.github.com/users/farooquiyasir",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-18T09:57:31Z
|
2021-08-28T00:05:58Z
|
2021-01-18T14:57:16Z
|
NONE
|
resolved
|
We are using Requests' latest version, but it uses certifi as a dependency, which is under the MPL 2.0 license. This makes certifi a plugin with a copyleft license. Our company does not want to use any copyleft-licensed plugin. I know that requests added certifi as a separate plugin dependency from v2.16 onwards, but I am not sure if it's okay to use v2.15, because a lot of other bugs would also be fixed in later versions. Is there a way we can exclude the certifi dependency, to avoid using a copyleft-licensed plugin?
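One hedged workaround (not an official recommendation, and the bundle path below is a common Debian/Ubuntu location, an assumption that varies by platform): point Requests at the operating system's CA store so certifi's bundle is never consulted at runtime:

```python
import os

import requests

# Assumed Debian/Ubuntu CA bundle path; adjust for your platform.
system_bundle = "/etc/ssl/certs/ca-certificates.crt"

# Per-session: every request on this session verifies against the OS store.
session = requests.Session()
session.verify = system_bundle

# Or process-wide, without touching call sites:
os.environ["REQUESTS_CA_BUNDLE"] = system_bundle
```

Note this only avoids *using* certifi's bundle at runtime; pip will still install certifi as a declared dependency of Requests.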
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5720/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5720/timeline
| null |
completed
| null | null | false |
[
"The MPL-licensed code included in certifi is not modified so doesn't trigger the MPL license terms. Please search for existing closed issues before opening a new one. "
] |
https://api.github.com/repos/psf/requests/issues/5719
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5719/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5719/comments
|
https://api.github.com/repos/psf/requests/issues/5719/events
|
https://github.com/psf/requests/issues/5719
| 787,146,532 |
MDU6SXNzdWU3ODcxNDY1MzI=
| 5,719 |
Old Dependencies
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/22400253?v=4",
"events_url": "https://api.github.com/users/gtoph/events{/privacy}",
"followers_url": "https://api.github.com/users/gtoph/followers",
"following_url": "https://api.github.com/users/gtoph/following{/other_user}",
"gists_url": "https://api.github.com/users/gtoph/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gtoph",
"id": 22400253,
"login": "gtoph",
"node_id": "MDQ6VXNlcjIyNDAwMjUz",
"organizations_url": "https://api.github.com/users/gtoph/orgs",
"received_events_url": "https://api.github.com/users/gtoph/received_events",
"repos_url": "https://api.github.com/users/gtoph/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gtoph/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gtoph/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gtoph",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-01-15T19:37:01Z
|
2021-08-28T00:05:58Z
|
2021-01-15T19:40:39Z
|
NONE
|
resolved
|
Just wondering why the dependencies haven't been updated for a while now. Seems to be requiring a pretty old version of IDNA
**requests 2.25.1 requires idna<3,>=2.5**
Ran it with the latest 3.1 and still seems to work fine. Wonder if the older version is a real requirement?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5719/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5719/timeline
| null |
completed
| null | null | false |
[
"Hi @gtoph,\r\n\r\nidna 3.0 was only released 2 weeks ago. We've restricted version caps at major version bumps to avoid pulling in breaking API changes without testing. There's currently a PR (#5711) open discussing this, so we'll close this in favor of that. Thanks.",
"No worries, it's just my ocd to have the latest versions ;)"
] |
https://api.github.com/repos/psf/requests/issues/5718
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5718/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5718/comments
|
https://api.github.com/repos/psf/requests/issues/5718/events
|
https://github.com/psf/requests/pull/5718
| 783,377,581 |
MDExOlB1bGxSZXF1ZXN0NTUyNzQ3MDU2
| 5,718 |
c2b307d
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/77180439?v=4",
"events_url": "https://api.github.com/users/Aleksey0329/events{/privacy}",
"followers_url": "https://api.github.com/users/Aleksey0329/followers",
"following_url": "https://api.github.com/users/Aleksey0329/following{/other_user}",
"gists_url": "https://api.github.com/users/Aleksey0329/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/Aleksey0329",
"id": 77180439,
"login": "Aleksey0329",
"node_id": "MDQ6VXNlcjc3MTgwNDM5",
"organizations_url": "https://api.github.com/users/Aleksey0329/orgs",
"received_events_url": "https://api.github.com/users/Aleksey0329/received_events",
"repos_url": "https://api.github.com/users/Aleksey0329/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/Aleksey0329/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/Aleksey0329/subscriptions",
"type": "User",
"url": "https://api.github.com/users/Aleksey0329",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-01-11T13:37:03Z
|
2021-01-11T14:06:52Z
|
2021-01-11T14:06:40Z
|
NONE
|
spam
|
Grohs
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5718/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5718/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5718.diff",
"html_url": "https://github.com/psf/requests/pull/5718",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5718.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5718"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5717
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5717/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5717/comments
|
https://api.github.com/repos/psf/requests/issues/5717/events
|
https://github.com/psf/requests/pull/5717
| 782,843,151 |
MDExOlB1bGxSZXF1ZXN0NTUyMzAxODg1
| 5,717 |
Make raise_for_status() chainable
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/24324496?v=4",
"events_url": "https://api.github.com/users/sambragg/events{/privacy}",
"followers_url": "https://api.github.com/users/sambragg/followers",
"following_url": "https://api.github.com/users/sambragg/following{/other_user}",
"gists_url": "https://api.github.com/users/sambragg/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sambragg",
"id": 24324496,
"login": "sambragg",
"node_id": "MDQ6VXNlcjI0MzI0NDk2",
"organizations_url": "https://api.github.com/users/sambragg/orgs",
"received_events_url": "https://api.github.com/users/sambragg/received_events",
"repos_url": "https://api.github.com/users/sambragg/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sambragg/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sambragg/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sambragg",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2021-01-10T15:35:20Z
|
2021-08-27T00:08:52Z
|
2021-01-10T17:16:43Z
|
NONE
|
resolved
|
Frequently, I find myself doing the following:
```python
r = requests.get(url)
r.raise_for_status()
json = r.json()
```
I thought it may be useful to be able to do the following instead, where `raise_for_status()` is chainable:
```python
r = requests.get(url)
json = r.raise_for_status().json()
```
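A chainable variant is easy to layer on in user code without changing Requests itself. A minimal sketch (the `checked` helper and the `FakeResponse` stand-in are illustrative, not part of Requests):

```python
def checked(response):
    """Raise for HTTP errors, then hand the response back for chaining."""
    response.raise_for_status()
    return response


# Stand-in mimicking the relevant bits of requests.Response for the demo.
class FakeResponse:
    def __init__(self, status_code, payload):
        self.status_code = status_code
        self._payload = payload

    def raise_for_status(self):
        if self.status_code >= 400:
            raise RuntimeError(f"HTTP {self.status_code}")

    def json(self):
        return self._payload


data = checked(FakeResponse(200, {"ok": True})).json()
print(data)  # {'ok': True}
```

With real Requests, `json = checked(requests.get(url)).json()` reads the same as the proposed chaining.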
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5717/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5717/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5717.diff",
"html_url": "https://github.com/psf/requests/pull/5717",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5717.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5717"
}
| true |
[
"Unfortunately, Requests is under a feature-freeze. As a result, we won't be expanding the behaviour of `raise_for_status()`"
] |
https://api.github.com/repos/psf/requests/issues/5716
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5716/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5716/comments
|
https://api.github.com/repos/psf/requests/issues/5716/events
|
https://github.com/psf/requests/pull/5716
| 782,713,713 |
MDExOlB1bGxSZXF1ZXN0NTUyMjA2NjAz
| 5,716 |
tests and more tests
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/69920850?v=4",
"events_url": "https://api.github.com/users/rayokofi/events{/privacy}",
"followers_url": "https://api.github.com/users/rayokofi/followers",
"following_url": "https://api.github.com/users/rayokofi/following{/other_user}",
"gists_url": "https://api.github.com/users/rayokofi/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/rayokofi",
"id": 69920850,
"login": "rayokofi",
"node_id": "MDQ6VXNlcjY5OTIwODUw",
"organizations_url": "https://api.github.com/users/rayokofi/orgs",
"received_events_url": "https://api.github.com/users/rayokofi/received_events",
"repos_url": "https://api.github.com/users/rayokofi/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/rayokofi/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/rayokofi/subscriptions",
"type": "User",
"url": "https://api.github.com/users/rayokofi",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2021-01-10T00:32:32Z
|
2021-08-27T00:08:52Z
|
2021-01-10T00:41:48Z
|
NONE
|
resolved
|
só treinando e nada mais
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5716/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5716/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5716.diff",
"html_url": "https://github.com/psf/requests/pull/5716",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5716.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5716"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5715
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5715/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5715/comments
|
https://api.github.com/repos/psf/requests/issues/5715/events
|
https://github.com/psf/requests/pull/5715
| 781,799,770 |
MDExOlB1bGxSZXF1ZXN0NTUxNDY4NTc0
| 5,715 |
Add json arguments to .put and .patch
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/117248?v=4",
"events_url": "https://api.github.com/users/mdelcambre/events{/privacy}",
"followers_url": "https://api.github.com/users/mdelcambre/followers",
"following_url": "https://api.github.com/users/mdelcambre/following{/other_user}",
"gists_url": "https://api.github.com/users/mdelcambre/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/mdelcambre",
"id": 117248,
"login": "mdelcambre",
"node_id": "MDQ6VXNlcjExNzI0OA==",
"organizations_url": "https://api.github.com/users/mdelcambre/orgs",
"received_events_url": "https://api.github.com/users/mdelcambre/received_events",
"repos_url": "https://api.github.com/users/mdelcambre/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/mdelcambre/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/mdelcambre/subscriptions",
"type": "User",
"url": "https://api.github.com/users/mdelcambre",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-01-08T02:47:03Z
|
2021-08-27T00:08:52Z
|
2021-01-08T19:24:55Z
|
NONE
|
resolved
|
This PR adds the `json` argument to both the `put` and `patch` function definitions. This brings them into agreement with the docstring of each function and mirrors the definition of `post`.
This should not change any functionality, as the `json` arg would otherwise be captured by the `**kwargs` expansion.
Obviously, if this difference was intentional, feel free to close without merging. Alternatively, if you prefer, I can quickly put together a PR to update the docstring instead. It was just something that briefly tripped me up, and I figured it was quick and easy to change.
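The equivalence described above — a keyword absent from the signature still reaching the underlying call through `**kwargs` — can be sketched in isolation (the function names below are illustrative stand-ins, not the real Requests internals):

```python
# Minimal sketch of **kwargs forwarding; `request` stands in for the
# real dispatch function and simply echoes what it received.
def request(method, url, **kwargs):
    return kwargs


def put(url, data=None, **kwargs):
    # `json` is not a named parameter here, yet **kwargs forwards it.
    return request("put", url, data=data, **kwargs)


print(put("https://example.invalid", json={"a": 1}))  # {'data': None, 'json': {'a': 1}}
```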
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5715/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5715/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5715.diff",
"html_url": "https://github.com/psf/requests/pull/5715",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5715.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5715"
}
| true |
[
"Duplicate of https://github.com/psf/requests/pull/3664",
"Swear I searched before submitting but must have only looked in issues not closed PRs, sorry about that.\r\n\r\nIronically, it was the PR from that (#3666) that introduced the issue I was trying to solve. Currently, the documentation specifically calls out JSON as a parameter, but it is not actually a parameter in the function definition (obviously it is just grabbed by **kwargs). It creates seemingly broken documentation as pictured below.\r\n\r\n<img width=\"745\" alt=\"Screen Shot 2021-01-08 at 4 37 59 PM\" src=\"https://user-images.githubusercontent.com/117248/104066809-055d9a80-51d0-11eb-9632-bf533ecdd432.png\">\r\n\r\n\r\nHowever, perhaps the current state is the lesser of all evils. Happy to drop it and not take up any more time.\r\n\r\n"
] |
https://api.github.com/repos/psf/requests/issues/5714
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5714/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5714/comments
|
https://api.github.com/repos/psf/requests/issues/5714/events
|
https://github.com/psf/requests/issues/5714
| 781,373,560 |
MDU6SXNzdWU3ODEzNzM1NjA=
| 5,714 |
Inconsistent behavior after redirects when passing cookies directly
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/4580066?v=4",
"events_url": "https://api.github.com/users/JanPokorny/events{/privacy}",
"followers_url": "https://api.github.com/users/JanPokorny/followers",
"following_url": "https://api.github.com/users/JanPokorny/following{/other_user}",
"gists_url": "https://api.github.com/users/JanPokorny/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/JanPokorny",
"id": 4580066,
"login": "JanPokorny",
"node_id": "MDQ6VXNlcjQ1ODAwNjY=",
"organizations_url": "https://api.github.com/users/JanPokorny/orgs",
"received_events_url": "https://api.github.com/users/JanPokorny/received_events",
"repos_url": "https://api.github.com/users/JanPokorny/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/JanPokorny/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/JanPokorny/subscriptions",
"type": "User",
"url": "https://api.github.com/users/JanPokorny",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 5 |
2021-01-07T14:57:10Z
|
2021-01-07T23:40:23Z
| null |
NONE
| null |
I have noticed that cookies passed directly (no session) behave strangely. When passed as the `cookies` argument, cookies are persisted after all redirects (even cross-domain and even when explicitly expired). When passed as the `Cookie` header, the header is dropped after any redirect.
I came across this behavior when discussing the same issue for the `httpx` library (https://github.com/encode/httpx/issues/1404). I do not know whether this behavior is intended. If it is, I would be glad for an explanation behind this design choice.
## Expected Result
I would expect cookies passed as the `cookies` argument to work the same way as if a single-use session were used, and the `Cookie` header to be passed along regardless of redirects.
## Actual Result
Cookies passed via the `cookies` argument are always sent (regardless of being expired or the redirect being cross-domain), while `Cookie` header cookies are always dropped (even on same-domain redirects).
## Reproduction Steps
Server:
```python
import flask

app = flask.Flask(__name__)

@app.route('/r')
def r():
    return "yes" if "test" in flask.request.cookies else "no"

@app.route('/same_domain_redirect')
def same_domain_redirect():
    return flask.redirect("/r", code=302)

@app.route('/cross_domain_redirect')
def cross_domain_redirect():
    return flask.redirect("http://localhost:5000/r", code=302)

@app.route('/same_domain_redirect_expire')
def same_domain_redirect_expire():
    resp = flask.redirect("/r", code=302)
    resp.set_cookie('test', '', expires=0)
    return resp

if __name__ == '__main__':
    app.run()
```
Client:
```python
import requests
print(requests.get("http://127.0.0.1:5000/same_domain_redirect", cookies={ "test": "test" }).content)
print(requests.get("http://127.0.0.1:5000/cross_domain_redirect", cookies={ "test": "test" }).content)
print(requests.get("http://127.0.0.1:5000/same_domain_redirect_expire", cookies={ "test": "test" }).content)
print(requests.get("http://127.0.0.1:5000/same_domain_redirect", headers={ "Cookie": "test=test" }).content)
print(requests.get("http://127.0.0.1:5000/cross_domain_redirect", headers={ "Cookie": "test=test" }).content)
print(requests.get("http://127.0.0.1:5000/same_domain_redirect_expire", headers={ "Cookie": "test=test" }).content)
```
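One way to give cookies the domain information that a plain dict lacks is an explicit stdlib cookie jar, which redirect handling can then reason about per domain. A sketch (no network involved; the `make_cookie` helper is illustrative):

```python
from http.cookiejar import Cookie, CookieJar


def make_cookie(name, value, domain):
    # An explicit domain attribute lets redirect logic decide
    # whether the cookie should survive a cross-domain hop.
    return Cookie(
        version=0, name=name, value=value, port=None, port_specified=False,
        domain=domain, domain_specified=True, domain_initial_dot=False,
        path="/", path_specified=True, secure=False, expires=None,
        discard=True, comment=None, comment_url=None, rest={},
    )


jar = CookieJar()
jar.set_cookie(make_cookie("test", "test", "127.0.0.1"))
print([(c.name, c.domain) for c in jar])  # [('test', '127.0.0.1')]
```

A jar like this can be passed as `cookies=jar` instead of a bare dict.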
## System Information
```
{
"chardet": {
"version": "4.0.0"
},
"cryptography": {
"version": ""
},
"idna": {
"version": "2.10"
},
"implementation": {
"name": "CPython",
"version": "3.9.1"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "",
"version": null
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010107f"
},
"urllib3": {
"version": "1.26.2"
},
"using_pyopenssl": false
}
```
| null |
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5714/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5714/timeline
| null | null | null | null | false |
[
"This is expected behaviour as the `cookies` argument doesn't provide a domain attribute for us to check. I've discouraged this usage in the past and even started building something to make handling cookies (and then passing that to Requests) more sensible (such that it would be harder to do this here) but it's incomplete",
"@sigmavirus24 So why is the Cookie header being cleared? I would expect that cookies passed as the `cookies` argument to have the same behavior as directly passing the `Cookie` header, whatever that behavior is.",
"So it's a semantic difference. If you pass `auth=ClassInheritingFromAuthBase` we also behave differently from passing `headers={\"Authorization\": \"...\"}`. The behaviour for scrubbing the header provided by the user that contains sensitive information is a security precaution. Unfortunately the cookies behaviour has been cemented since 0.x releases and the cookies parameter is just a short-cut to pain and misery. Luckily, in our understanding, exceedingly few people use it.",
"@sigmavirus24 I understand. I ask this because I've been affected by an ugly bug that resulted from a different behavior of `requests` and `httpx`, and there's currently a discussion happening in https://github.com/encode/httpx/issues/1404 about what should be the \"correct\" behavior.",
"\"correct\" would be `cookies={\"foo\": \"bar\"}` is great for testing things but an awful pattern for security in general (`cookies=SomeCookieJarPleaseGodNotTheStdlib(...)`). `Cookie: foo=bar` headers set on a request should be assumed to be for a single domain and redirects that end up on HTTP (I can see the case for http -> http not removing the header, but it's _safer_ to remove it), or domain (even to a subdomain) need to remove the header because you can't be 100% certain it's intended for any other requests like that."
] |
https://api.github.com/repos/psf/requests/issues/5713
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5713/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5713/comments
|
https://api.github.com/repos/psf/requests/issues/5713/events
|
https://github.com/psf/requests/issues/5713
| 778,769,741 |
MDU6SXNzdWU3Nzg3Njk3NDE=
| 5,713 |
PUT call to UniFi does not work, curl call does
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3207629?v=4",
"events_url": "https://api.github.com/users/pverhoye/events{/privacy}",
"followers_url": "https://api.github.com/users/pverhoye/followers",
"following_url": "https://api.github.com/users/pverhoye/following{/other_user}",
"gists_url": "https://api.github.com/users/pverhoye/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pverhoye",
"id": 3207629,
"login": "pverhoye",
"node_id": "MDQ6VXNlcjMyMDc2Mjk=",
"organizations_url": "https://api.github.com/users/pverhoye/orgs",
"received_events_url": "https://api.github.com/users/pverhoye/received_events",
"repos_url": "https://api.github.com/users/pverhoye/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pverhoye/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pverhoye/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pverhoye",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 2 |
2021-01-05T09:00:58Z
|
2021-08-28T00:05:58Z
|
2021-01-05T18:54:25Z
|
NONE
|
resolved
|
Hi all!
Best wishes (first things first!)
I want to enable/disable a PoE port on my UniFi switch. For this I am using Python 3.9.1 (first time) with the following code:
```python
import requests
import json
import sys
import urllib3

urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

gateway = {"ip": "MYSERVER.COM", "port": "8443"}
headers = {"Accept": "application/json", "Content-Type": "application/json"}
login_url = f"https://{gateway['ip']}:{gateway['port']}/api/login"
login_data = {
    "username": "MYUSERNAME",
    "password": "MYPASSWORD"
}

session = requests.Session()
login_response = session.post(login_url, headers=headers, data=json.dumps(login_data), verify=False)

if login_response.status_code == 200:
    poe_url = f"https://{gateway['ip']}:{gateway['port']}/api/s/default/rest/device/MYDEVICEID"
    json_poe_state_off = '{"port_overrides": [{"port_idx": 6, "portconf_id": "MYPROFILE2"}]}'
    post_response = session.put(poe_url, headers=headers, data=json.dumps(json_poe_state_off))
    print('Response HTTP Request {request}'.format(request=post_response.request))
else:
    print("Login failed")
```
The login works (I get the two security cookies and tried them in Paw, a macOS REST API client, to confirm they were OK), but the second call, the PUT, returns OK and yet nothing happens.
Before doing this in Python, I tried all my calls in Paw first, and there it works. I tried everything in bash with curl, and there it works too. So I am a bit at a loss here.
Anyone have an idea?
Thank you for your time!
Best regards!
Peter
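The resolution (per the follow-up comments on this issue) was that `json.dumps` was applied to a string that was already JSON, double-encoding the body. A minimal demonstration of that pitfall:

```python
import json

# The request body was already a JSON string...
payload = '{"port_overrides": [{"port_idx": 6}]}'

# ...so json.dumps() serialized it *again*, escaping the inner quotes:
double_encoded = json.dumps(payload)
print(double_encoded)  # "{\"port_overrides\": [{\"port_idx\": 6}]}"

# The server then receives a JSON *string*, not a JSON object:
print(json.loads(double_encoded) == payload)  # True -- round-trips to the string

# Dump a dict once (or send the ready-made string as-is) instead:
correct = json.dumps({"port_overrides": [{"port_idx": 6}]})
print(json.loads(correct))  # {'port_overrides': [{'port_idx': 6}]}
```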
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/3207629?v=4",
"events_url": "https://api.github.com/users/pverhoye/events{/privacy}",
"followers_url": "https://api.github.com/users/pverhoye/followers",
"following_url": "https://api.github.com/users/pverhoye/following{/other_user}",
"gists_url": "https://api.github.com/users/pverhoye/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/pverhoye",
"id": 3207629,
"login": "pverhoye",
"node_id": "MDQ6VXNlcjMyMDc2Mjk=",
"organizations_url": "https://api.github.com/users/pverhoye/orgs",
"received_events_url": "https://api.github.com/users/pverhoye/received_events",
"repos_url": "https://api.github.com/users/pverhoye/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/pverhoye/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/pverhoye/subscriptions",
"type": "User",
"url": "https://api.github.com/users/pverhoye",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5713/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5713/timeline
| null |
completed
| null | null | false |
[
"It's likely your device is being super picky about how the HTTP request looks.\r\n\r\nTry using Wireshark to capture the exact bytes / headers that are being sent by Requests and curl and compare them. If you aren't able to discern a difference attach the output/.pcap files here and we can maybe lend an eye. Without this information it'll be tough to figure out what's causing this.",
"Hi Seth!\r\nThanks for your hint! I tried with Wireshark and saw indeed that the payload was different. The culprit was the json.dumps function which encoded the string by putting a backslash in front of each double quote.\r\nIt works now!\r\n\r\nThanks!"
] |
https://api.github.com/repos/psf/requests/issues/5712
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5712/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5712/comments
|
https://api.github.com/repos/psf/requests/issues/5712/events
|
https://github.com/psf/requests/issues/5712
| 777,299,266 |
MDU6SXNzdWU3NzcyOTkyNjY=
| 5,712 |
Unable to do a GET request
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/54060657?v=4",
"events_url": "https://api.github.com/users/imzeki/events{/privacy}",
"followers_url": "https://api.github.com/users/imzeki/followers",
"following_url": "https://api.github.com/users/imzeki/following{/other_user}",
"gists_url": "https://api.github.com/users/imzeki/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/imzeki",
"id": 54060657,
"login": "imzeki",
"node_id": "MDQ6VXNlcjU0MDYwNjU3",
"organizations_url": "https://api.github.com/users/imzeki/orgs",
"received_events_url": "https://api.github.com/users/imzeki/received_events",
"repos_url": "https://api.github.com/users/imzeki/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/imzeki/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/imzeki/subscriptions",
"type": "User",
"url": "https://api.github.com/users/imzeki",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 3 |
2021-01-01T15:02:31Z
|
2021-02-10T14:33:45Z
|
2021-01-01T15:10:44Z
|
NONE
|
resolved
|
I was following a tutorial on W3Schools on how to do a GET request. The link is here: https://www.w3schools.com/python/ref_requests_get.asp
I expected a status code of 200 when running the code.
## Actual Result
I got an error saying:
```
  File "api2.py", line 3, in <module>
    x = requests.get('https://w3schools.com')
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\api.py", line 61, in request
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 542, in request
    resp = self.send(prep, **send_kwargs)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\sessions.py", line 655, in send
    r = adapter.send(request, **kwargs)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 412, in send
    conn = self.get_connection(request.url, proxies)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\site-packages\requests\adapters.py", line 315, in get_connection
    conn = self.poolmanager.connection_from_url(url)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\urllib3\poolmanager.py", line 293, in connection_from_url
    return self.connection_from_host(
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\urllib3\poolmanager.py", line 240, in connection_from_host
    return self.connection_from_context(request_context)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\urllib3\poolmanager.py", line 253, in connection_from_context
    pool_key = pool_key_constructor(request_context)
  File "C:\Users\...\AppData\Local\Programs\Python\Python39\lib\urllib3\poolmanager.py", line 119, in _default_key_normalizer
    return key_class(**context)
TypeError: <lambda>() got an unexpected keyword argument 'key_strict'
```
## Reproduction Steps
```
import requests
x = requests.get('https://w3schools.com')
print(x.status_code)
```
My Python version is 3.9, my requests version is 2.25.1, and my operating system is Windows. Any help will be appreciated.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5712/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5712/timeline
| null |
completed
| null | null | false |
[
"This exception is coming from a dependency of requests. Without the information you were asked to provide, we can't provide any guidance as to what might be causing it.",
"Ok understood",
"https://github.com/urllib3/urllib3/commit/b6061f09741082bed3d8b7db10c25bf1ae683a43"
] |
https://api.github.com/repos/psf/requests/issues/5711
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5711/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5711/comments
|
https://api.github.com/repos/psf/requests/issues/5711/events
|
https://github.com/psf/requests/pull/5711
| 777,277,874 |
MDExOlB1bGxSZXF1ZXN0NTQ3NjIyNzEx
| 5,711 |
bump idna as version 3.0 was released
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5310609?v=4",
"events_url": "https://api.github.com/users/naorlivne/events{/privacy}",
"followers_url": "https://api.github.com/users/naorlivne/followers",
"following_url": "https://api.github.com/users/naorlivne/following{/other_user}",
"gists_url": "https://api.github.com/users/naorlivne/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/naorlivne",
"id": 5310609,
"login": "naorlivne",
"node_id": "MDQ6VXNlcjUzMTA2MDk=",
"organizations_url": "https://api.github.com/users/naorlivne/orgs",
"received_events_url": "https://api.github.com/users/naorlivne/received_events",
"repos_url": "https://api.github.com/users/naorlivne/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/naorlivne/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/naorlivne/subscriptions",
"type": "User",
"url": "https://api.github.com/users/naorlivne",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2021-08-25T21:25:16Z",
"closed_issues": 3,
"created_at": "2021-05-20T21:01:54Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/28",
"id": 6778816,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/28/labels",
"node_id": "MDk6TWlsZXN0b25lNjc3ODgxNg==",
"number": 28,
"open_issues": 0,
"state": "closed",
"title": "2.26.0",
"updated_at": "2021-08-25T21:25:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/28"
}
| 18 |
2021-01-01T12:31:47Z
|
2021-07-07T13:25:21Z
|
2021-07-07T13:25:20Z
|
CONTRIBUTOR
|
off-topic
|
Closes https://github.com/psf/requests/issues/5710
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5711/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5711/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5711.diff",
"html_url": "https://github.com/psf/requests/pull/5711",
"merged_at": "2021-07-07T13:25:20Z",
"patch_url": "https://github.com/psf/requests/pull/5711.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5711"
}
| true |
[
"Hi @naorlivne,\r\n\r\nThanks for the PR. I think this looks fine but there's some mild concern about users on Python 2. idna 3.0 drops Python 2 support and has added an appropriate `python_requires` specifier so in most cases this change will work fine. Users still on Python 2 though tend to have more outdated versions of pip, potentially earlier than 9.0 which may make this breaking.\r\n\r\nI'm wondering if we need to add an environment marker splitting out the dependency range for Python 2 and 3.",
"@nateprewitt the real solution here is for @naorlivne to not try to force dependencies to upgrade when they're untested with other projects (as you can see in the original report, they upgraded `idna` without regard for their other dependencies).",
"@sigmavirus24 what do you suggest? the alternative is that `idna` can never be upgraded when using `requests` from this moment on which opens a whole range of issues as no future bugfixes ( including security fixes) in it will be possible to implement when using `requests`\r\n\r\nThis change doesn't force anyone to use the latest version as it simply increase the range of accepted versions of `idna`, it doesn't lock the version to 3.0 but simply allows it to be used.",
"@nateprewitt note that the real thing to ponder upon here is that requests dependencies are dropping Python 2 support. Requests needs to start preparing for a future where it also must drop Python 2 support. On the whole I don't see harm with @naorlivne's change though assuming there's no API incompatibilities.",
"> This change doesn't force anyone to use the latest version as it simply increase the range of accepted versions of `idna`, it doesn't lock the version to 3.0 but simply allows it to be used.\r\n\r\nSemantically speaking, that range isn't correct for all users of Requests though. Which is why @nateprewitt is pondering using `python_version` to manage the dependencies which will be a nightmare for maintenance.\r\n\r\n> note that the real thing to ponder upon here is that requests dependencies are dropping Python 2 support. Requests needs to start preparing for a future where it also must drop Python 2 support.\r\n\r\nThat's called Requests 3.0\r\n\r\n> On the whole I don't see harm with @naorlivne's change though assuming there's no API incompatibilities.\r\n\r\nThe scant notes don't imply any API incompatibilities: https://github.com/kjd/idna/blob/master/HISTORY.rst#30-2021-01-01 which doesn't guarantee there aren't any",
"\r\n> Semantically speaking, that range isn't correct for all users of Requests though. Which is why @nateprewitt is pondering using `python_version` to manage the dependencies which will be a nightmare for maintenance.\r\n\r\nIt will happen more and more, this is just the first but a lot of packages are deprecating support for Python 2.x so this problem will keep pop up with other dependencies as well so the options are: \r\n\r\n1. That maintenance nightmare (which I think we all agree is indeed a nightmare) will happen on more and more packages\r\n2. Requests stops supporting Python 2.x\r\n3. Every package that drops support to Python 2.x gets version locked which will cause serious security issues as no fixes to them will be made\r\n\r\n\r\n\r\n> \r\n> That's called Requests 3.0\r\n> \r\n\r\nIt's possible this will need to be merged to Requests 3.0 if the decision is that's the version where python 2.x support will be dropped, however I noticed in https://github.com/psf/requests/pull/5660 that there's planned to drop support for Python 3.5 as it's EOL and I do find it weird we are keeping version 2.x support and a much newer version support is dropped, I also think that unless Requests 3.0 is right around the corner it might not be a good idea to wait a long time with this as the problem will only worsen as more time pass.\r\n\r\n> > On the whole I don't see harm with @naorlivne's change though assuming there's no API incompatibilities.\r\n> \r\n> The scant notes don't imply any API incompatibilities: https://github.com/kjd/idna/blob/master/HISTORY.rst#30-2021-01-01 which doesn't guarantee there aren't any\r\n\r\nThe only guarantees in life is death and taxes, we can only work on what's known (including known unknowns) and given that there hasn't been any declared API changes & all tests passed I don't think a worry of a \"what if\" possibly caused by an unknown unknown should be factored in.",
"> The only guarantees in life is death and taxes, we can only work on what's known (including known unknowns) and given that there hasn't been any declared API changes & all tests passed I don't think a worry of a \"what if\" possibly caused by an unknown unknown should be factored in.\r\n\r\nAh, yes. What a package trying to provide stability to governments, corporations, and hobbyists alike needs - flippant attitudes towards functionality",
"Regarding request 3, what is the right place to follow the roadmap? ",
"@sigmavirus24 What do you suggest instead of upgrading then? this change has passed all tests and there is no documented API changes in `idna` anywhere official or unofficial... do you suggest we version lock all packages forever as we may never know 100% noting broke at any change?",
"> do you suggest we version lock all packages forever as we may never know 100% noting broke at any change?\r\n\r\nNo I don't. I suggest taking greater care",
"Can you give suggestion to what greater care you think is needed then? Aside from making sure all tests pass and going through the release data of the new package version I'm not sure what else can be done.",
"@BoboTiG didn't know it's possible to have a python version if in the requirements,txt... guess you learn something every day right?\r\n\r\nAnyway the suggested edit been made, is there anything else needed before this can be merged?",
"FTR @naorlivne here are all markers you can use in requirements files: [PEP 496](https://www.python.org/dev/peps/pep-0496/#micro-language) ;)",
"Is there anything else needed for me to do regarding this ticket before this can be approved & merged?",
"It would be most useful if these could be merged for the next release, we're also stuck with issues due to idna dependencies.",
"I'm also looking forward to this ticket being fixed. It's the only requirement blocking maintenance update. I of course totally understand if you're busy! :)",
"@sigmavirus24 Can you provide any guidance for things that people should look out for in the upgrade from `idna` version 2.10 to 3.1?",
"@jschlyter It's likely this PR will land in the next release. Pinging this PR doesn't change anything except generate noise to 1400 people. Locking this discussion to contributors to avoid this happening again.\r\n\r\n@jayaddison Requests maintainers won't provide any guidance here, instead you should [read the release notes for idna](https://github.com/kjd/idna/blob/master/HISTORY.rst). From my reading there are unlikely to be functional changes for most users."
] |
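The environment-marker approach debated in the thread above splits the idna pin by interpreter version, so Python 2 installs keep the old upper bound while Python 3 installs may take idna 3.x. A sketch of what such a dependency specification looks like (PEP 508 markers; the exact version bounds shown here are illustrative, not quoted from the merged change):

```
idna>=2.5,<3 ; python_version < "3"
idna>=2.5,<4 ; python_version >= "3"
```

pip 9.0+ evaluates the `python_version` marker at install time and applies only the matching requirement line.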
https://api.github.com/repos/psf/requests/issues/5710
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5710/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5710/comments
|
https://api.github.com/repos/psf/requests/issues/5710/events
|
https://github.com/psf/requests/issues/5710
| 777,277,633 |
MDU6SXNzdWU3NzcyNzc2MzM=
| 5,710 |
idna 3.0 version package conflict
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5310609?v=4",
"events_url": "https://api.github.com/users/naorlivne/events{/privacy}",
"followers_url": "https://api.github.com/users/naorlivne/followers",
"following_url": "https://api.github.com/users/naorlivne/following{/other_user}",
"gists_url": "https://api.github.com/users/naorlivne/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/naorlivne",
"id": 5310609,
"login": "naorlivne",
"node_id": "MDQ6VXNlcjUzMTA2MDk=",
"organizations_url": "https://api.github.com/users/naorlivne/orgs",
"received_events_url": "https://api.github.com/users/naorlivne/received_events",
"repos_url": "https://api.github.com/users/naorlivne/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/naorlivne/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/naorlivne/subscriptions",
"type": "User",
"url": "https://api.github.com/users/naorlivne",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 20 |
2021-01-01T12:29:46Z
|
2021-10-07T20:00:26Z
|
2021-07-07T13:25:20Z
|
CONTRIBUTOR
|
resolved
|
idna released version 3.0 but requests has a dependency on idna<3, this makes it impossible to keep up to date on both packages.
## Expected Result
I want to be able to install the latest idna package alongside the latest requests package
## Actual Result
```
ERROR: Cannot install -r requirements.txt (line 12) and idna==3.0 because these package versions have conflicting dependencies.
The conflict is caused by:
The user requested idna==3.0
requests 2.25.1 depends on idna<3 and >=2.5
```
## Reproduction Steps
try to run `pip install` on a requirements.txt file with
```
requests==2.25.1
idna==3.0
```
## System Information
multiple Python versions (3.6 up to 3.9) running on Docker containers inside Drone CI/CD
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4",
"events_url": "https://api.github.com/users/sethmlarson/events{/privacy}",
"followers_url": "https://api.github.com/users/sethmlarson/followers",
"following_url": "https://api.github.com/users/sethmlarson/following{/other_user}",
"gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sethmlarson",
"id": 18519037,
"login": "sethmlarson",
"node_id": "MDQ6VXNlcjE4NTE5MDM3",
"organizations_url": "https://api.github.com/users/sethmlarson/orgs",
"received_events_url": "https://api.github.com/users/sethmlarson/received_events",
"repos_url": "https://api.github.com/users/sethmlarson/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sethmlarson",
"user_view_type": "public"
}
|
{
"+1": 88,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 88,
"url": "https://api.github.com/repos/psf/requests/issues/5710/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5710/timeline
| null |
completed
| null | null | false |
[
"As `urllib3[secure]` defines\r\n\r\nhttps://github.com/urllib3/urllib3/blob/3f21165969b838fda29898cbd7218ac9578e319b/setup.py#L119\r\n\r\n```py\r\n\"idna>=2.0.0\",\r\n```\r\n\r\nThis causes `pip` 20.2.1 to show warning:\r\n\r\n```\r\n> pip install urllib3[secure] requests --force-reinstall\r\n...\r\nCollecting idna>=2.0.0; extra == \"secure\"\r\n Using cached idna-3.1-py3-none-any.whl (58 kB)\r\n...\r\nrequests 2.25.1 requires idna<3,>=2.5, but you'll have idna 3.1 which is incompatible.\r\n```",
"This conflict between `urllib3` and `requests` has left my automated builds broken for the last few weeks. We've been coping with manual deployments, but it hurts to not be able to use Travis CI. What's a recommended workaround?",
"@jace `urllib3[secure]` requires `idna>=2`, `requests` requires `idna>=2,<3`, so if you install `idna==2.10` there won't be any problem? Nothing should be breaking as a result of this, dependencies should be getting resolved by pip properly since version constraints are being set.",
"@sethmlarson\r\nWe should ignore this ERROR message from `pip`?\r\n```\r\n$ python3 -m pip install --upgrade idna \r\nRequirement already satisfied: idna in /usr/local/lib/python3.9/site-packages (2.5)\r\nCollecting idna\r\n Using cached idna-3.1-py3-none-any.whl (58 kB)\r\nInstalling collected packages: idna\r\n Attempting uninstall: idna\r\n Found existing installation: idna 2.5\r\n Uninstalling idna-2.5:\r\n Successfully uninstalled idna-2.5\r\nERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\r\nrequests 2.25.1 requires idna<3,>=2.5, but you have idna 3.1 which is incompatible.\r\nSuccessfully installed idna-3.1\r\n```\r\n",
"@sethmlarson how long do we have to keep using older version of `idna`? This is essentially blocking us from using latest version of it. I don't think that should be suggested to anyone.",
"Upgrading idna requires dropping python 2.7 so whenever that happens unless someone adds python_version markers to the dependancies",
"@RhinosF1 There's already a pull request with version markers waiting for approval/merging https://github.com/psf/requests/pull/5711 - my hope is to have it merged into the next release but as the conversation about it been locked for contributors only (which being my first PR for requests I'm not yet are) I can't make sure that will happen.",
"> @RhinosF1 There's already a pull request with version markers waiting for approval/merging #5711 - my hope is to have it merged into the next release but as the conversation about it been locked for contributors only (which being my first PR for requests I'm not yet are) I can't make sure that will happen.\r\n\r\nDidn't see that, thanks!",
"The PR that fixes this and lets people use updated version of both packages has been blocked because this one entry might become a \"maintenance nightmare\" until `requests` drops Py2 support? The alternative being all the `requests` users will have a broken build or have to use a downgraded version of `idna` until `requests` decides to let us update both. That's just great.",
"@iambibhas That's not how I understood it, my understanding was that they are waiting for the next release for it and locked the conversation because they had too many people asking when it's due but I could be wrong",
"@naorlivne apologies. You're right. I missed parts of the thread. Will keep an eye for the release.",
"pip install idna==2.10\r\nLooking in indexes: https://pypi.python.org/pypi/\r\nERROR: Could not find a version that satisfies the requirement idna==2.10\r\nERROR: No matching distribution found for idna==2.10\r\n\r\n\r\ncan't really install 2.10 ",
"~Yes, it appears that the idna project has removed pre 3.x versions of the package from PyPi as a measure to remove support for Python 2 I believe -- they appear to have noticed that this has affected requests users [and are tracking that with an issue on their side](https://github.com/kjd/idna/issues/98).~",
"@almssp @ViktorHaag [The 2.x versions are still present and not yanked](https://pypi.org/project/idna/#history).",
"My apologies; I was looking in the wrong place... I looked at only the download files for the headrev release, and not the page for the older 2.10 release.\r\n",
"Version 2.05 works for me, but it's a bit annoying to get that error during build on Heroku. Any idea how I can ignore it?",
"This has been blocking my update to latest idna for three months now (https://github.com/hartwork/wnpp.debian.net/pull/13 (edit, and now https://github.com/hartwork/wnpp.debian.net/pull/116)).\r\nIf someone at @psf wants to team up on fixing this, please contact me through e-mail, maybe I can help.",
"@hartwork https://github.com/psf/requests/pull/5711 will fix this, @psf are just sitting on it until the next release",
"🙏 Is there an ETA for when #5711's patch goes into a PyPI release?",
"@jace We do not currently have a solidified date but are working on getting things ready. You can follow the status of #5868 for when we perform the release."
] |
https://api.github.com/repos/psf/requests/issues/5709
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5709/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5709/comments
|
https://api.github.com/repos/psf/requests/issues/5709/events
|
https://github.com/psf/requests/issues/5709
| 774,742,813 |
MDU6SXNzdWU3NzQ3NDI4MTM=
| 5,709 |
Cookies are lost on 307 redirects
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/6244078?v=4",
"events_url": "https://api.github.com/users/gil9red/events{/privacy}",
"followers_url": "https://api.github.com/users/gil9red/followers",
"following_url": "https://api.github.com/users/gil9red/following{/other_user}",
"gists_url": "https://api.github.com/users/gil9red/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gil9red",
"id": 6244078,
"login": "gil9red",
"node_id": "MDQ6VXNlcjYyNDQwNzg=",
"organizations_url": "https://api.github.com/users/gil9red/orgs",
"received_events_url": "https://api.github.com/users/gil9red/received_events",
"repos_url": "https://api.github.com/users/gil9red/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gil9red/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gil9red/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gil9red",
"user_view_type": "public"
}
|
[] |
open
| false | null |
[] | null | 1 |
2020-12-25T14:42:22Z
|
2020-12-25T14:58:28Z
| null |
NONE
| null |
When I submit a request, I get a 307 response and a cookie. But when the request is repeated, the cookie is not added.
Added request logging:
```
try:
import http.client as http_client
except ImportError:
# Python 2
import httplib as http_client
http_client.HTTPConnection.debuglevel = 1
# You must initialize logging, otherwise you'll not see debug output.
import logging
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
```
Result (first 2 requests):
```
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): www.dns-shop.ru:443
send: b'GET /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/ HTTP/1.1\r\nHost: www.dns-shop.ru\r\nUser-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
DEBUG:urllib3.connectionpool:https://www.dns-shop.ru:443 "GET /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/ HTTP/1.1" 307 0
reply: 'HTTP/1.1 307 TemporaryRedirect\r\n'
header: Server: Variti/0.9.3a
header: Date: Fry, 25 Dec 2020 14:3753 GMT
header: Set-Cookie: ipp_uid_tst=1608907073940/2LhuRHu-NT--na0AX4hLKg; Expires=; Domain=; Path=/
header: Location: /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/
header: Connection: keep-alive
header: Keep-Alive: timeout=60
header: Content-Length: 0
send: b'GET /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/ HTTP/1.1\r\nHost: www.dns-shop.ru\r\nUser-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\n\r\n'
DEBUG:urllib3.connectionpool:https://www.dns-shop.ru:443 "GET /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/ HTTP/1.1" 307 0
reply: 'HTTP/1.1 307 TemporaryRedirect\r\n'
header: Server: Variti/0.9.3a
header: Date: Fry, 25 Dec 2020 14:3754 GMT
header: Set-Cookie: ipp_uid_tst=1608907074028/oOg1XsGU12ZE8PkUludY5g; Expires=; Domain=; Path=/
header: Location: /product/4d664a0d90d61b80/processor-amd-ryzen-7-3700x-oem/
header: Connection: keep-alive
header: Keep-Alive: timeout=60
header: Content-Length: 0
```
## Expected Result
HTTP 200, OK
## Actual Result
```
Traceback (most recent call last):
File "C:/Users/ipetrash/Projects/SimplePyScripts/html_parsing/www_dns_shop_ru/get_price.py", line 17, in <module>
rs = requests.get('https://www.dns-shop.ru/')
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\api.py", line 76, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\sessions.py", line 542, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\sessions.py", line 677, in send
history = [resp for resp in gen]
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\sessions.py", line 677, in <listcomp>
history = [resp for resp in gen]
File "C:\Users\ipetrash\Anaconda3\lib\site-packages\requests\sessions.py", line 166, in resolve_redirects
raise TooManyRedirects('Exceeded {} redirects.'.format(self.max_redirects), response=resp)
requests.exceptions.TooManyRedirects: Exceeded 30 redirects.
```
## Reproduction Steps
```python
import requests
session = requests.session()
rs = session.get('https://www.dns-shop.ru/')
print(rs)
```
## System Information
$ python -m requests.help
```
{
"chardet": {
"version": "3.0.4"
},
"cryptography": {
"version": "2.7"
},
"idna": {
"version": "2.8"
},
"implementation": {
"name": "CPython",
"version": "3.7.3"
},
"platform": {
"release": "10",
"system": "Windows"
},
"pyOpenSSL": {
"openssl_version": "1010103f",
"version": "17.2.0"
},
"requests": {
"version": "2.25.1"
},
"system_ssl": {
"version": "1010108f"
},
"urllib3": {
"version": "1.24.2"
},
"using_pyopenssl": true
}
```
| null |
{
"+1": 2,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 2,
"url": "https://api.github.com/repos/psf/requests/issues/5709/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5709/timeline
| null | null | null | null | false |
[
"Rewrote the code to manually handle redirects and forwarding cookies.\r\n\r\nPS. after I sent the cookie, there was no 307 on subsequent requests.\r\n\r\nThey have interesting protection ...\r\n\r\n[Code:](https://github.com/gil9red/SimplePyScripts/blob/7c3fc621399d7ec32bfde9174d5d47dfed30282d/html_parsing/www_dns_shop_ru/get_price.py#L31)\r\n```\r\nfrom http.cookies import SimpleCookie\r\nfrom typing import Optional\r\nfrom urllib.parse import urljoin\r\n\r\nimport requests\r\n\r\nsession = requests.session()\r\nsession.headers['User-Agent'] = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:84.0) Gecko/20100101 Firefox/84.0'\r\n\r\ndef get_price(url: str) -> Optional[int]:\r\n cookies = None\r\n attempts = 30\r\n\r\n while True:\r\n attempts -= 1\r\n if attempts <= 0:\r\n raise Exception('Too many redirects!')\r\n\r\n rs = session.get(url, allow_redirects=False, cookies=cookies)\r\n\r\n redirect_url = rs.headers.get('Location')\r\n if redirect_url:\r\n url = urljoin(rs.url, redirect_url)\r\n if rs.cookies:\r\n cookies = rs.cookies\r\n continue\r\n\r\n cookies = rs.headers.get('Set-Cookie')\r\n if cookies:\r\n cookies = {key: value.value for key, value in SimpleCookie(cookies).items()}\r\n\r\n continue\r\n\r\n break\r\n ...\r\n```"
] |
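The workaround above extracts cookie pairs from a raw `Set-Cookie` header by hand; the core of that step can be done with the standard library's `SimpleCookie`. A minimal sketch — the header value below is illustrative, not taken from the site in the report:

```python
from http.cookies import SimpleCookie

def parse_set_cookie(header: str) -> dict:
    """Return just the name -> value pairs from a raw Set-Cookie header."""
    # SimpleCookie handles attribute parsing (Path, Domain, Expires, ...),
    # so only the cookie pairs themselves end up in the dict.
    return {name: morsel.value for name, morsel in SimpleCookie(header).items()}

# Illustrative header, not from the site in the report above.
cookies = parse_set_cookie("sessionid=abc123; Path=/; HttpOnly")
```

The resulting dict can be passed straight to `session.get(url, cookies=...)` on the follow-up request, as the workaround does.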
https://api.github.com/repos/psf/requests/issues/5708
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5708/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5708/comments
|
https://api.github.com/repos/psf/requests/issues/5708/events
|
https://github.com/psf/requests/issues/5708
| 774,694,700 |
MDU6SXNzdWU3NzQ2OTQ3MDA=
| 5,708 |
Can requests work with file:/// urls?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/71920621?v=4",
"events_url": "https://api.github.com/users/amaank404/events{/privacy}",
"followers_url": "https://api.github.com/users/amaank404/followers",
"following_url": "https://api.github.com/users/amaank404/following{/other_user}",
"gists_url": "https://api.github.com/users/amaank404/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/amaank404",
"id": 71920621,
"login": "amaank404",
"node_id": "MDQ6VXNlcjcxOTIwNjIx",
"organizations_url": "https://api.github.com/users/amaank404/orgs",
"received_events_url": "https://api.github.com/users/amaank404/received_events",
"repos_url": "https://api.github.com/users/amaank404/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/amaank404/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amaank404/subscriptions",
"type": "User",
"url": "https://api.github.com/users/amaank404",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-25T09:40:25Z
|
2021-08-28T00:05:59Z
|
2020-12-25T16:10:41Z
|
NONE
|
resolved
|
for example my application uses requests for downloading some special data. i want that to be compatible with `file:///` type urls while also supporting `http://`. does requests have that ability builtin or not?
Also if you know a work around please help me
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5708/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5708/timeline
| null |
completed
| null | null | false |
[
"Hi there! Thanks for opening this issue. Unfortunately, it seems this is a request for help instead of a report of a defect in the project. Please use [StackOverflow](https://stackoverflow.com) for general usage questions instead and only report defects here."
] |
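As the reply above notes, requests has no built-in `file:///` support. The standard library's `urllib` does handle file URLs natively, so one common workaround is to dispatch on the URL scheme. A minimal sketch under that assumption (the `fetch` helper and its dispatch logic are illustrative, not part of requests):

```python
from urllib.parse import urlparse
from urllib.request import urlopen

def fetch(url: str) -> bytes:
    """Read a file:// URL with urllib; other schemes would go through requests."""
    if urlparse(url).scheme == "file":
        # urllib's default opener supports file:// natively,
        # which requests does not do out of the box.
        with urlopen(url) as fh:
            return fh.read()
    raise NotImplementedError("dispatch non-file URLs to requests here")
```

Third-party transport adapters mounted on a `Session` are another route, but the stdlib dispatch keeps the dependency footprint unchanged.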
https://api.github.com/repos/psf/requests/issues/5707
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5707/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5707/comments
|
https://api.github.com/repos/psf/requests/issues/5707/events
|
https://github.com/psf/requests/pull/5707
| 774,541,972 |
MDExOlB1bGxSZXF1ZXN0NTQ1NDQ5Njg1
| 5,707 |
Avoid zip extract racing condition by using read+write instead extract
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/690238?v=4",
"events_url": "https://api.github.com/users/gaborbernat/events{/privacy}",
"followers_url": "https://api.github.com/users/gaborbernat/followers",
"following_url": "https://api.github.com/users/gaborbernat/following{/other_user}",
"gists_url": "https://api.github.com/users/gaborbernat/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/gaborbernat",
"id": 690238,
"login": "gaborbernat",
"node_id": "MDQ6VXNlcjY5MDIzOA==",
"organizations_url": "https://api.github.com/users/gaborbernat/orgs",
"received_events_url": "https://api.github.com/users/gaborbernat/received_events",
"repos_url": "https://api.github.com/users/gaborbernat/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/gaborbernat/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/gaborbernat/subscriptions",
"type": "User",
"url": "https://api.github.com/users/gaborbernat",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] |
{
"closed_at": "2021-08-25T21:25:16Z",
"closed_issues": 3,
"created_at": "2021-05-20T21:01:54Z",
"creator": {
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
},
"description": "",
"due_on": null,
"html_url": "https://github.com/psf/requests/milestone/28",
"id": 6778816,
"labels_url": "https://api.github.com/repos/psf/requests/milestones/28/labels",
"node_id": "MDk6TWlsZXN0b25lNjc3ODgxNg==",
"number": 28,
"open_issues": 0,
"state": "closed",
"title": "2.26.0",
"updated_at": "2021-08-25T21:25:16Z",
"url": "https://api.github.com/repos/psf/requests/milestones/28"
}
| 2 |
2020-12-24T19:20:04Z
|
2021-12-28T19:23:15Z
|
2021-07-07T00:14:52Z
|
CONTRIBUTOR
|
resolved
|
Extract also creates the folder hierarchy, however we do not need that,
the file itself being extracted to a temporary folder is good enough.
Instead we read the content of the zip and then write it. The write is
not locked but it's OK to update the same file multiple times given the
update operation will not alter the content of the file. By not creating
the folder hierarchy (default via extract) we no longer can run into the
problem of two parallel extracts both trying to create the folder
hierarchy without exists ok flag, and one must fail.
Resolves #5223.
Signed-off-by: Bernát Gábor <[email protected]>
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 1,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 1,
"url": "https://api.github.com/repos/psf/requests/issues/5707/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5707/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5707.diff",
"html_url": "https://github.com/psf/requests/pull/5707",
"merged_at": "2021-07-07T00:14:52Z",
"patch_url": "https://github.com/psf/requests/pull/5707.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5707"
}
| true |
[
"@sigmavirus24 @nateprewitt would you be able to accept this, so we can fix a long outstanding pip issue? Thanks!",
"@sigmavirus24 @nateprewitt do you have any plan to have a look at this?"
] |
https://api.github.com/repos/psf/requests/issues/5706
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5706/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5706/comments
|
https://api.github.com/repos/psf/requests/issues/5706/events
|
https://github.com/psf/requests/issues/5706
| 774,303,732 |
MDU6SXNzdWU3NzQzMDM3MzI=
| 5,706 |
Using session object in a multiprocess pool
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/2996964?v=4",
"events_url": "https://api.github.com/users/g1patnaik/events{/privacy}",
"followers_url": "https://api.github.com/users/g1patnaik/followers",
"following_url": "https://api.github.com/users/g1patnaik/following{/other_user}",
"gists_url": "https://api.github.com/users/g1patnaik/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/g1patnaik",
"id": 2996964,
"login": "g1patnaik",
"node_id": "MDQ6VXNlcjI5OTY5NjQ=",
"organizations_url": "https://api.github.com/users/g1patnaik/orgs",
"received_events_url": "https://api.github.com/users/g1patnaik/received_events",
"repos_url": "https://api.github.com/users/g1patnaik/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/g1patnaik/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/g1patnaik/subscriptions",
"type": "User",
"url": "https://api.github.com/users/g1patnaik",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-24T09:54:45Z
|
2021-08-28T00:05:59Z
|
2020-12-24T14:15:57Z
|
NONE
|
resolved
|
Hi,
I am trying to run parallel requests to a server, which as I understand is only possible by creating multiple connections to it. I think the Session object is doing that somehow -> either automatically creating multiple connections to the server from a single session object, or creating a new scope of the session object (and hence a new connection) when a new process is forked.
Now, I want to send a bulk of requests to the process pool such that the requests are streamlined into the existing connections.
I.e., let's say I have 50 requests and a process pool of 20; using the map function, I can submit the 50 requests in bulk to the process pool. Internally, it then splits the 50 requests into 20 sets (based on some hash value, I think) and submits them to each process in the pool.
The issue is that I expected a maximum of 20 connections to be created, which would stay alive and handle all the requests. However, I see the connections being recycled, and I couldn't find a pattern.
I only see the issue when I give the process pool a value higher than 10; with a value of 10 or less, the connections don't recycle.
I have tried disabling the garbage collector (`import gc; gc.disable()`) and the connections are still recycled.
```
#!/usr/bin/env python3
import time

import requests
from multiprocessing.dummy import Pool as ProcessPool, current_process as processpool_process

def get_list_of_ids():
    some_calls()  # placeholder for the real lookup
    return id_list

def check_ids(check_id):  # renamed from `id` so the builtin id() below still works
    url = f"https://myserver.com/check/{check_id}"
    json_op = s.get(url, verify=False).json()
    value = json_op['id']
    print(str(value) + '-' + str(processpool_process()) + str(id(s)))

def main():
    pool = ProcessPool(processes=20)
    while True:
        pool.map(check_ids, get_list_of_ids())
        print("Let's wait for 10 seconds")
        time.sleep(10)

if __name__ == "__main__":
    s = requests.Session()
    s.headers.update({'Accept': 'application/json'})  # update() is a method call, not an attribute assignment
    main()
```
Output Example:
```
4-<DummyProcess(Thread-2, started daemon 140209222559488)>140209446508360
5-<DummyProcess(Thread-5, started daemon 140209123481344)>140209446508360
7-<DummyProcess(Thread-6, started daemon 140209115088640)>140209446508360
2-<DummyProcess(Thread-11, started daemon 140208527894272)>140209446508360
None-<DummyProcess(Thread-1, started daemon 140209230952192)>140209446508360
10-<DummyProcess(Thread-4, started daemon 140209131874048)>140209446508360
12-<DummyProcess(Thread-7, started daemon 140209106695936)>140209446508360
8-<DummyProcess(Thread-3, started daemon 140209140266752)>140209446508360
6-<DummyProcess(Thread-12, started daemon 140208519501568)>140209446508360
3-<DummyProcess(Thread-13, started daemon 140208511108864)>140209446508360
11-<DummyProcess(Thread-10, started daemon 140208536286976)>140209446508360
9-<DummyProcess(Thread-9, started daemon 140209089910528)>140209446508360
1-<DummyProcess(Thread-8, started daemon 140209098303232)>140209446508360
Let's wait for 10 seconds
None-<DummyProcess(Thread-14, started daemon 140208502716160)>140209446508360
3-<DummyProcess(Thread-20, started daemon 140208108455680)>140209446508360
1-<DummyProcess(Thread-19, started daemon 140208116848384)>140209446508360
7-<DummyProcess(Thread-17, started daemon 140208133633792)>140209446508360
6-<DummyProcess(Thread-6, started daemon 140209115088640)>140209446508360
4-<DummyProcess(Thread-4, started daemon 140209131874048)>140209446508360
9-<DummyProcess(Thread-16, started daemon 140208485930752)>140209446508360
5-<DummyProcess(Thread-15, started daemon 140208494323456)>140209446508360
2-<DummyProcess(Thread-2, started daemon 140209222559488)>140209446508360
8-<DummyProcess(Thread-18, started daemon 140208125241088)>140209446508360
11-<DummyProcess(Thread-1, started daemon 140209230952192)>140209446508360
10-<DummyProcess(Thread-11, started daemon 140208527894272)>140209446508360
12-<DummyProcess(Thread-5, started daemon 140209123481344)>140209446508360
Let's wait for 10 seconds
None-<DummyProcess(Thread-3, started daemon 140209140266752)>140209446508360
2-<DummyProcess(Thread-10, started daemon 140208536286976)>140209446508360
1-<DummyProcess(Thread-12, started daemon 140208519501568)>140209446508360
4-<DummyProcess(Thread-9, started daemon 140209089910528)>140209446508360
5-<DummyProcess(Thread-14, started daemon 140208502716160)>140209446508360
9-<DummyProcess(Thread-6, started daemon 140209115088640)>140209446508360
8-<DummyProcess(Thread-16, started daemon 140208485930752)>140209446508360
7-<DummyProcess(Thread-4, started daemon 140209131874048)>140209446508360
3-<DummyProcess(Thread-20, started daemon 140208108455680)>140209446508360
6-<DummyProcess(Thread-8, started daemon 140209098303232)>140209446508360
12-<DummyProcess(Thread-13, started daemon 140208511108864)>140209446508360
10-<DummyProcess(Thread-7, started daemon 140209106695936)>140209446508360
11-<DummyProcess(Thread-19, started daemon 140208116848384)>140209446508360
Let's wait for 10 seconds
.
.
```
Is it the correct way of creating batch requests using session object to a multi process pool and at the same time have optimal use of resources?
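For reference, sizing the session's per-host connection pool to match the worker pool can be sketched like this (the pool sizes and URL below are illustrative assumptions, not the actual configuration):

```python
import requests
from requests.adapters import HTTPAdapter

# Sketch: mount an adapter whose pool size matches the 20-worker pool, so
# up to 20 connections per host can be kept alive and reused. Note the
# server may still close idle connections on its own schedule.
session = requests.Session()
adapter = HTTPAdapter(pool_connections=20, pool_maxsize=20)
session.mount("https://", adapter)
session.mount("http://", adapter)

# Requests to matching URL prefixes are served by the mounted adapter.
assert session.get_adapter("https://myserver.com/check/1") is adapter
```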
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5706/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5706/timeline
| null |
completed
| null | null | false |
[
"> Issue is that I expected a maximum of 20 connections will be created and which will stay alive and handles all the existing requests. However, I see the connections are being recycled and I couldn't find a pattern.\r\n\r\nThere are so many potential problems happening here, not the least of which being a fundamental misunderstanding of how server connections work in real life and how much control a session has over them. Your issue description has barely enough detail to elaborate on these, but I'll try.\r\n\r\nMost reasonable, moderately resource-conscious servers will never let connections sit open for over 10 seconds, let alone 20 such connections from one client. The session cannot prevent the server from closing those connections and can only try its best to reuse connections to the server while able, but if the server closes the connection there's nothing the session can do to stop it.\r\n\r\nOn top of that, the `multiprocessing` library has many different ways of forking different threads. In your example, you're using [threading](https://docs.python.org/3/library/multiprocessing.html#module-multiprocessing.dummy) (via `multiprocessing.dummy`), which is ideal for lots of IO but not for actual computation in parallel (which could be contributing to your issue, but I wouldn't know because there's so much likely missing from your example). So your computation here could be enough to slow you down and let the server close the connection. Further, one shared session will not work with all of the ways `multiprocessing` forks. You cannot share the connections or connection pools for each of those methods and I will not illuminate which of those are which as they're well documented by the `multiprocessing` library.\r\n\r\nFinally, your output here does not actually demonstrate the problem you're describing. 
The session instance seems to be the same across all of the threads, so I don't know how your reproduction case or its output is intended to convey that your connections are being allowed to close and then being re-opened.\r\n\r\nAs this is not a bug in Requests, I'm closing this issue. If you have further support needs, please use [StackOverflow](https://stackoverflow.com)"
] |
https://api.github.com/repos/psf/requests/issues/5705
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5705/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5705/comments
|
https://api.github.com/repos/psf/requests/issues/5705/events
|
https://github.com/psf/requests/pull/5705
| 773,491,771 |
MDExOlB1bGxSZXF1ZXN0NTQ0NTczODA3
| 5,705 |
Update README.md
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/71920621?v=4",
"events_url": "https://api.github.com/users/amaank404/events{/privacy}",
"followers_url": "https://api.github.com/users/amaank404/followers",
"following_url": "https://api.github.com/users/amaank404/following{/other_user}",
"gists_url": "https://api.github.com/users/amaank404/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/amaank404",
"id": 71920621,
"login": "amaank404",
"node_id": "MDQ6VXNlcjcxOTIwNjIx",
"organizations_url": "https://api.github.com/users/amaank404/orgs",
"received_events_url": "https://api.github.com/users/amaank404/received_events",
"repos_url": "https://api.github.com/users/amaank404/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/amaank404/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/amaank404/subscriptions",
"type": "User",
"url": "https://api.github.com/users/amaank404",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-23T05:14:15Z
|
2021-08-27T00:08:53Z
|
2020-12-23T14:19:41Z
|
NONE
|
resolved
|
A little bit of stats update 😄
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5705/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5705/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5705.diff",
"html_url": "https://github.com/psf/requests/pull/5705",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5705.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5705"
}
| true |
[
"Hi @xcodz-dot, thanks for sending us a PR. 500,000+ by default includes 750,000. I don't think keeping these \"up-to-date\" is particularly important. If it were, we should be uploading the downloads per month count and lots of other things here. As it stands, these statistics aren't the most important thing to manage or update"
] |
https://api.github.com/repos/psf/requests/issues/5704
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5704/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5704/comments
|
https://api.github.com/repos/psf/requests/issues/5704/events
|
https://github.com/psf/requests/issues/5704
| 772,965,956 |
MDU6SXNzdWU3NzI5NjU5NTY=
| 5,704 |
Documentation bug - missing cookies documentation
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/73482956?v=4",
"events_url": "https://api.github.com/users/ascopes/events{/privacy}",
"followers_url": "https://api.github.com/users/ascopes/followers",
"following_url": "https://api.github.com/users/ascopes/following{/other_user}",
"gists_url": "https://api.github.com/users/ascopes/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/ascopes",
"id": 73482956,
"login": "ascopes",
"node_id": "MDQ6VXNlcjczNDgyOTU2",
"organizations_url": "https://api.github.com/users/ascopes/orgs",
"received_events_url": "https://api.github.com/users/ascopes/received_events",
"repos_url": "https://api.github.com/users/ascopes/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/ascopes/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/ascopes/subscriptions",
"type": "User",
"url": "https://api.github.com/users/ascopes",
"user_view_type": "public"
}
|
[
{
"color": "f7c6c7",
"default": false,
"description": null,
"id": 167537670,
"name": "Propose Close",
"node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=",
"url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close"
},
{
"color": "fef2c0",
"default": false,
"description": null,
"id": 298537994,
"name": "Needs More Information",
"node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=",
"url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information"
}
] |
closed
| true | null |
[] | null | 6 |
2020-12-22T13:56:59Z
|
2021-08-28T00:05:58Z
|
2021-01-18T04:08:11Z
|
NONE
|
resolved
|
On the current latest documentation, there is a `cookies` link under the `quickstart` option.
We can see that this section should be displayed between `Response Headers` and `Redirection History`.
This section does not currently exist, and the link in the sidebar list is dead - clicking it does nothing.
Is this section missing, or should this link be removed?
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/5271761?v=4",
"events_url": "https://api.github.com/users/nateprewitt/events{/privacy}",
"followers_url": "https://api.github.com/users/nateprewitt/followers",
"following_url": "https://api.github.com/users/nateprewitt/following{/other_user}",
"gists_url": "https://api.github.com/users/nateprewitt/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/nateprewitt",
"id": 5271761,
"login": "nateprewitt",
"node_id": "MDQ6VXNlcjUyNzE3NjE=",
"organizations_url": "https://api.github.com/users/nateprewitt/orgs",
"received_events_url": "https://api.github.com/users/nateprewitt/received_events",
"repos_url": "https://api.github.com/users/nateprewitt/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/nateprewitt/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/nateprewitt/subscriptions",
"type": "User",
"url": "https://api.github.com/users/nateprewitt",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5704/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5704/timeline
| null |
completed
| null | null | false |
[
"This section is missing!\r\n",
"Do we have a template for what information we would want to put in here? I would happily put up a PR if so.",
"The `Cookies` section under `Developer Interface` is missing as well",
"What domain are you accessing the documentation at? Both sections appear at:\r\n\r\nhttps://requests.readthedocs.io/en/master/\r\nhttps://requests.readthedocs.io/en/master/user/quickstart/#cookies\r\nhttps://requests.readthedocs.io/en/master/api/#cookies\r\n\r\nThey are also present in the source files under the docs in this repository. \r\n\r\nOf those affected, how many are using 3rd party browser extensions? Is it possible this content is being removed client side?",
"Looks like the problem I originally reported is fixed now, and I can see it at the links you gave! Perhaps GitHub was serving an outdated cache at the time?\r\n\r\nI am happy to close this, but will leave it for you guys to decide since it sounds like some other people might be having issues too.\r\n\r\nThanks for getting back to me.",
"> Looks like the problem I originally reported is fixed now\r\n\r\nNothing changed"
] |
https://api.github.com/repos/psf/requests/issues/5703
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5703/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5703/comments
|
https://api.github.com/repos/psf/requests/issues/5703/events
|
https://github.com/psf/requests/issues/5703
| 772,654,080 |
MDU6SXNzdWU3NzI2NTQwODA=
| 5,703 |
BUG REPORT: 2.25.1 breaks SSL for proxy connections. Reverting back to 2.24.0 fixes issue.
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/180487?v=4",
"events_url": "https://api.github.com/users/vgoklani/events{/privacy}",
"followers_url": "https://api.github.com/users/vgoklani/followers",
"following_url": "https://api.github.com/users/vgoklani/following{/other_user}",
"gists_url": "https://api.github.com/users/vgoklani/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vgoklani",
"id": 180487,
"login": "vgoklani",
"node_id": "MDQ6VXNlcjE4MDQ4Nw==",
"organizations_url": "https://api.github.com/users/vgoklani/orgs",
"received_events_url": "https://api.github.com/users/vgoklani/received_events",
"repos_url": "https://api.github.com/users/vgoklani/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vgoklani/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vgoklani/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vgoklani",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-22T05:33:21Z
|
2021-08-28T00:06:00Z
|
2020-12-23T03:01:53Z
|
NONE
|
resolved
|
Using requests 2.25.1:
```python
session = requests.session()
session.proxies.update(proxies_dict)
```
should set up connections over our proxy server.
Unfortunately this is what we see:
```
SSLError: HTTPSConnectionPool(host='api.xyz.com', port=443): Max retries exceeded with url: /my/url (Caused by SSLError(SSLError(1, '[SSL: WRONG_VERSION_NUMBER] wrong version number (_ssl.c:1123)')))
```
The error disappears when reverting back to 2.24.0.
Please help!
thanks 👍
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/180487?v=4",
"events_url": "https://api.github.com/users/vgoklani/events{/privacy}",
"followers_url": "https://api.github.com/users/vgoklani/followers",
"following_url": "https://api.github.com/users/vgoklani/following{/other_user}",
"gists_url": "https://api.github.com/users/vgoklani/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/vgoklani",
"id": 180487,
"login": "vgoklani",
"node_id": "MDQ6VXNlcjE4MDQ4Nw==",
"organizations_url": "https://api.github.com/users/vgoklani/orgs",
"received_events_url": "https://api.github.com/users/vgoklani/received_events",
"repos_url": "https://api.github.com/users/vgoklani/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/vgoklani/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/vgoklani/subscriptions",
"type": "User",
"url": "https://api.github.com/users/vgoklani",
"user_view_type": "public"
}
|
{
"+1": 4,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 4,
"url": "https://api.github.com/repos/psf/requests/issues/5703/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5703/timeline
| null |
completed
| null | null | false |
[
"I guess this urllib3 bug is related: https://github.com/urllib3/urllib3/issues/2075\r\nSince requests 2.25.1, urllib3 1.26 is used, which fixed a bug regarding proxies and https connections.\r\n\r\nThe 'solution' posted there fixed the issue for me."
] |
https://api.github.com/repos/psf/requests/issues/5702
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5702/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5702/comments
|
https://api.github.com/repos/psf/requests/issues/5702/events
|
https://github.com/psf/requests/pull/5702
| 771,580,659 |
MDExOlB1bGxSZXF1ZXN0NTQzMDYzMjIy
| 5,702 |
Don't perform authentication if no credential provided
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 4 |
2020-12-20T12:46:32Z
|
2021-08-27T00:08:38Z
|
2020-12-20T13:35:06Z
|
NONE
|
resolved
|
## The old behavior:
```python
>>> requests.get('http://127.0.0.1:8080', auth=(None, None))
```
```http
GET / HTTP/1.1
Host: 127.0.0.1:8080
User-Agent: python-requests/2.25.0
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
Authorization: Basic Tm9uZTpOb25l (None:None)
```
This is weird, right?
## Now:
```python
>>> requests.get('http://127.0.0.1:8080', auth=(None, None))
```
```http
GET / HTTP/1.1
Host: 127.0.0.1:8080
User-Agent: python-requests/2.25.0
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
```
(No authentication)
This will be easier for applications that accept the auth username and password as arguments:
```python
response = self.session.request(
    self.httpmethod,
    url,
    data=self.data,
    proxies=proxy,
    verify=False,
    allow_redirects=self.redirect,
    headers=dict(self.headers),
    timeout=self.timeout,
    auth=(self.user, self.password)  # `pass` is a Python keyword, so `password` is used here
)
```
If the user didn't provide credentials (meaning they don't need to authenticate), both `self.user` and `self.password` will be `None`. In this situation, `requests` with my update won't perform authentication!
But in the old version (without my update), we must do this:
```python
if self.user and self.password:
    response = self.session.request(
        self.httpmethod,
        url,
        data=self.data,
        proxies=proxy,
        verify=False,
        allow_redirects=self.redirect,
        headers=dict(self.headers),
        timeout=self.timeout,
        auth=(self.user, self.password)
    )
else:
    response = self.session.request(
        self.httpmethod,
        url,
        data=self.data,
        proxies=proxy,
        verify=False,
        allow_redirects=self.redirect,
        headers=dict(self.headers),
        timeout=self.timeout
    )
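Under the old behavior, the duplicated call can also be avoided by building the keyword arguments once and only adding `auth` when both credentials are present (a sketch; `build_request_kwargs` is a hypothetical helper, not part of requests):

```python
def build_request_kwargs(user, password, **common):
    """Return request kwargs, adding auth only when both credentials exist."""
    kwargs = dict(common)
    if user and password:
        kwargs["auth"] = (user, password)
    return kwargs

# Usage sketch:
# response = self.session.request(self.httpmethod, url,
#     **build_request_kwargs(user, password, timeout=10, verify=False))
```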
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5702/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5702/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5702.diff",
"html_url": "https://github.com/psf/requests/pull/5702",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5702.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5702"
}
| true |
[
"Also, a `None` username or password will become an empty username/password",
"Requests is in a feature-freeze and this is strictly backwards incompatible",
"Hi, what's the problem? I just want to help!!",
"Totally, changing the `None` object to the username `None` (a string) is really crazy"
] |
https://api.github.com/repos/psf/requests/issues/5701
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5701/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5701/comments
|
https://api.github.com/repos/psf/requests/issues/5701/events
|
https://github.com/psf/requests/pull/5701
| 771,567,638 |
MDExOlB1bGxSZXF1ZXN0NTQzMDU0MDU1
| 5,701 |
Problem fix
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 0 |
2020-12-20T11:25:47Z
|
2021-08-27T00:08:54Z
|
2020-12-20T12:47:22Z
|
NONE
|
resolved
|
You should check the protocol, not the start of the URL. Otherwise, this will be a valid protocol: `httpfuck://google.com`
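The distinction can be illustrated with a small sketch (`has_http_scheme` is a hypothetical helper name, not the actual patch):

```python
from urllib.parse import urlparse

def has_http_scheme(url):
    # Compare the parsed scheme against an allow-list instead of using
    # str.startswith, which would also accept any scheme that merely
    # begins with "http".
    return urlparse(url).scheme in ("http", "https")
```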
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5701/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5701/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5701.diff",
"html_url": "https://github.com/psf/requests/pull/5701",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5701.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5701"
}
| true |
[] |
https://api.github.com/repos/psf/requests/issues/5700
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5700/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5700/comments
|
https://api.github.com/repos/psf/requests/issues/5700/events
|
https://github.com/psf/requests/pull/5700
| 771,561,746 |
MDExOlB1bGxSZXF1ZXN0NTQzMDUwMDQz
| 5,700 |
Auto-add labels to issues
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-20T10:48:01Z
|
2021-08-27T00:08:54Z
|
2020-12-20T13:36:27Z
|
NONE
|
resolved
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4",
"events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}",
"followers_url": "https://api.github.com/users/sigmavirus24/followers",
"following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}",
"gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/sigmavirus24",
"id": 240830,
"login": "sigmavirus24",
"node_id": "MDQ6VXNlcjI0MDgzMA==",
"organizations_url": "https://api.github.com/users/sigmavirus24/orgs",
"received_events_url": "https://api.github.com/users/sigmavirus24/received_events",
"repos_url": "https://api.github.com/users/sigmavirus24/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions",
"type": "User",
"url": "https://api.github.com/users/sigmavirus24",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5700/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5700/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5700.diff",
"html_url": "https://github.com/psf/requests/pull/5700",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5700.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5700"
}
| true |
[
"Many folks open bugs and feature requests that are neither. Auto-labelling would then require more effort to remove those labels on the part of the maintainers"
] |
|
https://api.github.com/repos/psf/requests/issues/5699
|
https://api.github.com/repos/psf/requests
|
https://api.github.com/repos/psf/requests/issues/5699/labels{/name}
|
https://api.github.com/repos/psf/requests/issues/5699/comments
|
https://api.github.com/repos/psf/requests/issues/5699/events
|
https://github.com/psf/requests/pull/5699
| 771,524,935 |
MDExOlB1bGxSZXF1ZXN0NTQzMDI1MTQ0
| 5,699 |
Don't perform authentication if no credential provided
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
[] |
closed
| true | null |
[] | null | 1 |
2020-12-20T06:28:32Z
|
2021-08-27T00:08:54Z
|
2020-12-20T12:40:07Z
|
NONE
|
resolved
|
## The old behavior:
```python
>>> requests.get('http://127.0.0.1:8080', auth=(None, None))
```
```http
GET / HTTP/1.1
Host: 127.0.0.1:8080
User-Agent: python-requests/2.25.0
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
Authorization: Basic Tm9uZTpOb25l (None:None)
```
The same thing will happen with:
```python
>>> requests.get('http://127.0.0.1:8080', auth=('None', 'None'))
```
This is weird, right?
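For reference, the `Tm9uZTpOb25l` token in the header above is just the Base64 encoding of the literal string `None:None`, which is easy to verify with the standard library:

```python
import base64

# auth=(None, None) ends up stringified, so the credentials sent
# are literally the text "None:None", Base64-encoded.
token = base64.b64encode(b"None:None").decode("ascii")
print(token)  # -> Tm9uZTpOb25l
```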
## Now:
```python
>>> requests.get('http://127.0.0.1:8080', auth=(None, None))
```
```http
GET / HTTP/1.1
Host: 127.0.0.1:8080
User-Agent: python-requests/2.25.0
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
```
(No authentication)
This makes things easier for applications that accept the auth username and password as arguments:
```python
response = self.session.request(
self.httpmethod,
url,
data=self.data,
proxies=proxy,
verify=False,
allow_redirects=self.redirect,
headers=dict(self.headers),
timeout=self.timeout,
auth=(self.user, self.pass)
)
```
If the user didn't provide credentials (meaning they don't need to authenticate), then `self.user` and `self.pass` are both `None`, and with my update `requests` simply won't perform authentication.
Without my update, the application code has to look like this instead:
```python
if self.user and self.pass:
response = self.session.request(
self.httpmethod,
url,
data=self.data,
proxies=proxy,
verify=False,
allow_redirects=self.redirect,
headers=dict(self.headers),
timeout=self.timeout,
auth=(self.user, self.pass)
)
else:
response = self.session.request(
self.httpmethod,
url,
data=self.data,
proxies=proxy,
verify=False,
allow_redirects=self.redirect,
headers=dict(self.headers),
timeout=self.timeout
)
```
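The duplicated `session.request(...)` branches above can also be collapsed on the caller's side without the proposed change, by building the keyword arguments conditionally (a hedged sketch with a hypothetical helper name):

```python
def build_request_kwargs(user, password, **kwargs):
    # Only attach an auth tuple when both credentials were provided,
    # avoiding the duplicated session.request(...) branches above.
    if user and password:
        kwargs["auth"] = (user, password)
    return kwargs

print(build_request_kwargs(None, None, timeout=5))
# -> {'timeout': 5}
print(build_request_kwargs("user", "pass", timeout=5))
# -> {'timeout': 5, 'auth': ('user', 'pass')}
```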
|
{
"avatar_url": "https://avatars.githubusercontent.com/u/59408894?v=4",
"events_url": "https://api.github.com/users/shelld3v/events{/privacy}",
"followers_url": "https://api.github.com/users/shelld3v/followers",
"following_url": "https://api.github.com/users/shelld3v/following{/other_user}",
"gists_url": "https://api.github.com/users/shelld3v/gists{/gist_id}",
"gravatar_id": "",
"html_url": "https://github.com/shelld3v",
"id": 59408894,
"login": "shelld3v",
"node_id": "MDQ6VXNlcjU5NDA4ODk0",
"organizations_url": "https://api.github.com/users/shelld3v/orgs",
"received_events_url": "https://api.github.com/users/shelld3v/received_events",
"repos_url": "https://api.github.com/users/shelld3v/repos",
"site_admin": false,
"starred_url": "https://api.github.com/users/shelld3v/starred{/owner}{/repo}",
"subscriptions_url": "https://api.github.com/users/shelld3v/subscriptions",
"type": "User",
"url": "https://api.github.com/users/shelld3v",
"user_view_type": "public"
}
|
{
"+1": 0,
"-1": 0,
"confused": 0,
"eyes": 0,
"heart": 0,
"hooray": 0,
"laugh": 0,
"rocket": 0,
"total_count": 0,
"url": "https://api.github.com/repos/psf/requests/issues/5699/reactions"
}
|
https://api.github.com/repos/psf/requests/issues/5699/timeline
| null | null | false |
{
"diff_url": "https://github.com/psf/requests/pull/5699.diff",
"html_url": "https://github.com/psf/requests/pull/5699",
"merged_at": null,
"patch_url": "https://github.com/psf/requests/pull/5699.patch",
"url": "https://api.github.com/repos/psf/requests/pulls/5699"
}
| true |
[
"Also, `None` username or password will become an empty username/password"
] |