column                    dtype           range / values
url                       stringlengths   50–53
repository_url            stringclasses   1 value
labels_url                stringlengths   64–67
comments_url              stringlengths   59–62
events_url                stringlengths   57–60
html_url                  stringlengths   38–43
id                        int64           597k–2.65B
node_id                   stringlengths   18–32
number                    int64           1–6.83k
title                     stringlengths   1–296
user                      dict
labels                    listlengths     0–5
state                     stringclasses   2 values
locked                    bool            2 classes
assignee                  dict
assignees                 listlengths     0–4
milestone                 dict
comments                  int64           0–211
created_at                stringlengths   20–20
updated_at                stringlengths   20–20
closed_at                 stringlengths   20–20
author_association        stringclasses   3 values
active_lock_reason        stringclasses   4 values
body                      stringlengths   0–65.6k
closed_by                 dict
reactions                 dict
timeline_url              stringlengths   59–62
performed_via_github_app  null
state_reason              stringclasses   3 values
draft                     bool            2 classes
pull_request              dict
is_pull_request           bool            2 classes
issue_comments            listlengths     0–30
https://api.github.com/repos/psf/requests/issues/3031
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3031/labels{/name}
https://api.github.com/repos/psf/requests/issues/3031/comments
https://api.github.com/repos/psf/requests/issues/3031/events
https://github.com/psf/requests/issues/3031
138,483,327
MDU6SXNzdWUxMzg0ODMzMjc=
3,031
[WinError 10048] Only one usage of each socket address ...
{ "avatar_url": "https://avatars.githubusercontent.com/u/16397493?v=4", "events_url": "https://api.github.com/users/iliakarmanov/events{/privacy}", "followers_url": "https://api.github.com/users/iliakarmanov/followers", "following_url": "https://api.github.com/users/iliakarmanov/following{/other_user}", "gists_url": "https://api.github.com/users/iliakarmanov/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/iliakarmanov", "id": 16397493, "login": "iliakarmanov", "node_id": "MDQ6VXNlcjE2Mzk3NDkz", "organizations_url": "https://api.github.com/users/iliakarmanov/orgs", "received_events_url": "https://api.github.com/users/iliakarmanov/received_events", "repos_url": "https://api.github.com/users/iliakarmanov/repos", "site_admin": false, "starred_url": "https://api.github.com/users/iliakarmanov/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/iliakarmanov/subscriptions", "type": "User", "url": "https://api.github.com/users/iliakarmanov", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" }, { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" } ]
closed
true
null
[]
null
8
2016-03-04T14:02:15Z
2021-09-08T14:00:29Z
2016-04-16T04:05:20Z
NONE
resolved
I notice that despite using requests.Session() I still seem to be creating new connections/sockets which eventually exhaust (TIME_WAIT) and I get the following error:

> [WinError 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted

```
s = requests.Session()
data = zip(url_routes, cycle(s))
calc_routes = pool.map(processRequest, data)
```

I posted a bit more [here](http://stackoverflow.com/questions/35793908/python-multiprocessing-associate-a-process-with-a-session), however I'm not sure how to address this.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3031/reactions" }
https://api.github.com/repos/psf/requests/issues/3031/timeline
null
completed
null
null
false
[ "What does `processRequest` do?\n", "FWIW, I should mention that it's _extremely_ unwise to use a Session from multiple processes at the same time: it's not clear to me at all how well connection pooling will behave in that situation, and any number of things could start misbehaving. It's vastly preferable to have one Session _per process_ if possible.\n", "I have a similar question\n\nafter executing this:\n\n```\nwith requests.Session() as s:\n r= s.get('http://192.168.1.4:8080/health')\n r.close()\n```\n\nnetstat shows \n`tcp 0 0 192.168.1.2:44984 192.168.1.4:8080 TIME_WAIT`\n\nlooks like the socket is not shutdowned. of course OS will shutdown it in a minute by default, but how to accomplish it immediately?\n", "The side of the connection that _closes_ a socket is always moved into `TIME_WAIT`. This cannot be avoided: due to the nature of TCP, it is possible that the final ACK will be lost and so the socket needs to be held around for a while to give confidence that it has not been. This is what `TIME_WAIT` is for. It cannot be avoided if you're closing the socket yourself.\n", "this doesn't have that side effect.\nafter run this code socket is destroyed.\nto avoid TIME_WAIT state it's necessarily run shutdown() before close() \n[https://docs.python.org/3/library/socket.html?highlight=socket#socket.socket.close]\n\n```\nimport socket\n\nHOST = 'localhost' # The remote host\nPORT = 8080 # The same port as used by the server\nwith socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:\n s.connect((HOST, PORT))\n s.send(b'GET /health\\n\\n')\n data = s.recv(20)\n```\n", "The reason your code doesn't have that side effect is because the _server_ is closing the connection first. That's because your code is performing a HTTP/0.9 style GET request, which requires the server to close the connection to signal the end of the response. 
That means that the _server_ is almost certainly getting a socket in TIME_WAIT instead, which is _much_ worse than you getting a socket in TIME_WAIT. \n\nTo avoid getting sockets in TIME_WAIT, the best thing to do is to use a single Session object at as high a scope as you can and leave it open for the lifetime of your program. Requests will do its best to re-use the sockets as much as possible, which should prevent them lapsing into TIME_WAIT. But you cannot avoid TIME_WAIT if you want to be closing the sockets yourself: it is a critical part of the TCP state machine that exists for a very good reason.\n\nShutdown does not prevent TIME_WAIT. I don't really know why you think it does.\n", "Related question: If I want to reuse the TCP connection _only_, is using a Session and clearing the Session's cookies and not using session.auth enough? Or is there more I should clear? (I'm simulating very large numbers of users bombarding a service.)", "@tlc Cookies should be enough." ]
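The maintainers' advice above is one Session per process. A minimal sketch of that pattern with `multiprocessing.Pool` follows; the worker and URLs are hypothetical stand-ins for the issue's `processRequest`/`url_routes`, and the worker only reports which Session its process holds so the sketch runs without network access (a real worker would call `_session.get(url)`). It assumes `requests` is installed and uses the POSIX-only `fork` start method.

```python
import os
from multiprocessing import get_context

import requests  # assumed installed; not part of the stdlib

_session = None  # process-local Session, created once per worker


def _init_worker():
    """Pool initializer: runs once in each worker process, so every
    process gets exactly one Session to reuse across its tasks."""
    global _session
    _session = requests.Session()


def fetch(url):
    """A real worker would return `_session.get(url)`; here we only
    report which Session this process is using."""
    return os.getpid(), id(_session)


def run_demo(urls):
    # "fork" avoids re-importing this module in the children;
    # on Windows, move the Pool creation under a __main__ guard.
    with get_context("fork").Pool(2, initializer=_init_worker) as pool:
        return pool.map(fetch, urls)
```

Within each worker process the Session id stays constant, so connection pooling works per process instead of sockets being churned (and lapsing into TIME_WAIT) on every request.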
https://api.github.com/repos/psf/requests/issues/3030
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3030/labels{/name}
https://api.github.com/repos/psf/requests/issues/3030/comments
https://api.github.com/repos/psf/requests/issues/3030/events
https://github.com/psf/requests/issues/3030
138,403,878
MDU6SXNzdWUxMzg0MDM4Nzg=
3,030
Strange behaviour in class dict
{ "avatar_url": "https://avatars.githubusercontent.com/u/7345039?v=4", "events_url": "https://api.github.com/users/coveritytest/events{/privacy}", "followers_url": "https://api.github.com/users/coveritytest/followers", "following_url": "https://api.github.com/users/coveritytest/following{/other_user}", "gists_url": "https://api.github.com/users/coveritytest/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/coveritytest", "id": 7345039, "login": "coveritytest", "node_id": "MDQ6VXNlcjczNDUwMzk=", "organizations_url": "https://api.github.com/users/coveritytest/orgs", "received_events_url": "https://api.github.com/users/coveritytest/received_events", "repos_url": "https://api.github.com/users/coveritytest/repos", "site_admin": false, "starred_url": "https://api.github.com/users/coveritytest/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/coveritytest/subscriptions", "type": "User", "url": "https://api.github.com/users/coveritytest", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-03-04T07:34:22Z
2016-03-04T15:02:07Z
2016-03-04T08:13:41Z
NONE
null
Apparently the class does not like hyphens or percent signs.

```
cookies = dict(hottest-count='working')
SyntaxError: keyword can't be an expression
```

The same problem occurs with cookies like `%FOO%='bar'`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3030/reactions" }
https://api.github.com/repos/psf/requests/issues/3030/timeline
null
completed
null
null
false
[ "@coveritytest That has nothing to do with requests, and everything to do with Python: as the error reports to you, your _syntax_ is wrong.\n\nTry: `cookies = {\"hottest-count\": \"working\"}`.\n" ]
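As the reply notes, this is plain Python syntax rather than anything in requests: `dict()` keyword arguments must be valid identifiers, which rules out hyphens and percent signs. A dict literal (or `dict()` over key/value pairs) has no such restriction:

```python
# dict(hottest-count='working') is a SyntaxError because keyword
# arguments must be valid Python identifiers. A dict literal, or
# dict() over an iterable of pairs, accepts any string as a key:
cookies = {"hottest-count": "working", "%FOO%": "bar"}
same = dict([("hottest-count", "working"), ("%FOO%", "bar")])
```

Either mapping can then be passed along as `requests.get(url, cookies=cookies)`.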
https://api.github.com/repos/psf/requests/issues/3029
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3029/labels{/name}
https://api.github.com/repos/psf/requests/issues/3029/comments
https://api.github.com/repos/psf/requests/issues/3029/events
https://github.com/psf/requests/pull/3029
138,314,104
MDExOlB1bGxSZXF1ZXN0NjE2MzU0MDE=
3,029
[MULTIPART-FILE] Content type should always be set
{ "avatar_url": "https://avatars.githubusercontent.com/u/1840162?v=4", "events_url": "https://api.github.com/users/vaibhavkaul/events{/privacy}", "followers_url": "https://api.github.com/users/vaibhavkaul/followers", "following_url": "https://api.github.com/users/vaibhavkaul/following{/other_user}", "gists_url": "https://api.github.com/users/vaibhavkaul/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vaibhavkaul", "id": 1840162, "login": "vaibhavkaul", "node_id": "MDQ6VXNlcjE4NDAxNjI=", "organizations_url": "https://api.github.com/users/vaibhavkaul/orgs", "received_events_url": "https://api.github.com/users/vaibhavkaul/received_events", "repos_url": "https://api.github.com/users/vaibhavkaul/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vaibhavkaul/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vaibhavkaul/subscriptions", "type": "User", "url": "https://api.github.com/users/vaibhavkaul", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 44501305, "name": "Not Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTMwNQ==", "url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" }, { "color": "f7c6c7", "default": false, "description": null, "id": 167537670, "name": "Propose Close", "node_id": "MDU6TGFiZWwxNjc1Mzc2NzA=", "url": "https://api.github.com/repos/psf/requests/labels/Propose%20Close" } ]
closed
true
null
[]
null
15
2016-03-03T22:20:14Z
2021-09-08T04:01:15Z
2016-03-04T19:57:56Z
NONE
resolved
# Context

In here: https://github.com/kennethreitz/requests/blob/v2.7.0/requests/models.py#L464-L466

The content type is only set if there isn't one already set. I feel this shouldn't be the case. As we can see here: https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/filepost.py#L92

The content-type string has a boundary generated by this method, specific to the body being encoded. Without this boundary and content type, the data being posted is no longer considered `Multipart Form Data`, which is the whole point of specifying `files=` in the post method. Even raising an exception would be better than silently skipping setting the content type.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1840162?v=4", "events_url": "https://api.github.com/users/vaibhavkaul/events{/privacy}", "followers_url": "https://api.github.com/users/vaibhavkaul/followers", "following_url": "https://api.github.com/users/vaibhavkaul/following{/other_user}", "gists_url": "https://api.github.com/users/vaibhavkaul/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vaibhavkaul", "id": 1840162, "login": "vaibhavkaul", "node_id": "MDQ6VXNlcjE4NDAxNjI=", "organizations_url": "https://api.github.com/users/vaibhavkaul/orgs", "received_events_url": "https://api.github.com/users/vaibhavkaul/received_events", "repos_url": "https://api.github.com/users/vaibhavkaul/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vaibhavkaul/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vaibhavkaul/subscriptions", "type": "User", "url": "https://api.github.com/users/vaibhavkaul", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3029/reactions" }
https://api.github.com/repos/psf/requests/issues/3029/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3029.diff", "html_url": "https://github.com/psf/requests/pull/3029", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3029.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3029" }
true
[ "Hi @vaibhavkaul,\n\nThanks for sending a pull request. I'm afraid we likely won't be accepting it. Here's why:\n- This is drastically backwards incompatible behaviour. \n- It takes a great deal of freedom away from the user which we have intentionally placed into their hands.\n- This code does not just change this for the `multipart/form-data` case.\n- This also prevents people from writing API fuzzers which provide incorrect content-type headers for a totally different kind of content and drastically reduces requests flexibility.\n- This affects the cases where people provide something to `data=` or `json=` with a custom content-type header where they expect their explicit decision to specify that header to be honored.\n\nWe could consider this for version 3.0.0 but I think it would not fit there either for several of the reasons listed above and also mostly due to the fact that we hope to not break things too drastically in 3.0.\n\nI won't close this just yet because I hope @Lukasa and @kennethreitz will weigh in as well.\n\nCheers,\nIan\n", "@sigmavirus24 I understand. I figured it might be worth pointing out. \n\n> It takes a great deal of freedom away from the user which we have intentionally placed into their hands.\n\nAs long as this is an intentional choice.\n\n> This code does not just change this for the multipart/form-data case.\n\nYou are right. I missed that, i will at-least make it specific to the files case, even if we end up not merging this.\n\nThe silent nature of the problem can cause a lot of headaches. I don't think a majority of the users using `requests` are going for the over-ridden behavior or API Fuzzers.\n", "> The silent nature of the problem can cause a lot of headaches.\n\nMight I ask why this change is any less of a silent headache though? Let me paint a picture:\n\n``` py\nrequests.post(url, header={'Content-Type': 'multipart/form-data'}, files=files, data=data)\n```\n\nWill make the correct body with an incorrect header. 
With a little research a user can figure out what they did wrong.\n\n``` py\nrequests.post(url, headers={'Content-Type': 'application/vnd.github.drax-preview+json'}, json=json_data)\n```\n\nWould forcibly reset the header to `application/json` which breaks things for the user doing the _right_ thing.\n\nSometimes overriding requests default behaviour is correct for user. I'm also of the opinion that special-casing this to just multipart uploads is the wrong thing. That means that one of the cases above prevents the user from doing exactly what they intend to do and introduces a great deal of inconsistency in the API. If any user needs to do something with an alternate content-type header for multipart form-data request, we now punish them by forcing them to use the prepared request flow and overriding the header after preparing the request... just to change one header.\n\nI understand that this can be and has been a foot-gun for several people. Google eagerly returns the StackOverflow questions and answers (there are a few) that explain what is happening and what needs to be fixed to correct the request.\n\nI don't think introducing inconsistencies in the API or silent behaviour is the right thing.\n", "I'm also disinclined to make this change. I can see why we'd do it, but I think if we're worried about requests silently subverting expectations then changing the behaviour to _differently_ silently subvert expectations isn't a great direction.\n\nOne option might be to instead fire a warning or a log in this case. That would attract user attention while also allowing users who so choose to override this behaviour. \n\nIn general the requests header dictionary lets people do stupid things. If the user wants to change content-length, they can. If they want to change transfer-encoding, they can. There is only so much we can do to protect users from themselves. 
\n", "@kennethreitz thoughts?\n", "@sigmavirus24 I made the change so it doesn't affect other things besides files.\n\n> Sometimes overriding requests default behaviour is correct for user\n\nYes, which which is why `sometimes` should not dictate default behavior.\n\n> Will make the correct body with an incorrect header. \n\nI dont think thats true since for the body to be parseable by any standard multi-part library (cgi etc) it would need to have the correct header. Just having the data in the body does not make it correct. I would argue the body is wrong in this case for a multipart request.\n\nMy motive right now is not to get this merged in at all. Happy to close the PR. But we should be clear about what the issue is here. Specifically I would like to address things like:\n\n> That means that one of the cases above prevents the user from doing exactly what they intend to do\n\nand \n\n> then changing the behaviour to differently silently subvert expectations isn't a great direction\n\nThe specific part of code I am referring to deals [exclusively with Multipart Files](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L445-L447). The whole [idea of adding a `file=` param](http://docs.python-requests.org/en/master/user/quickstart/#post-a-multipart-encoded-file) is to encode the request as a multi-part request. `Most` developers using that code path clearly want the request to be posted as multipart.\n\nMultipart requests must [have a boundary specified in the content-type](https://www.w3.org/Protocols/rfc1341/7_2_Multipart.html), if this is missing, the whole purpose for passing a `files=` param is defeated. This is no longer a multipart request. This is not a `correct body with the wrong header`. Clearly the `requests` framework [does not allow a user to pass their own boundary](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L156) (notice the missing boundary kwarg) and use that. 
It `always` relies on a auto generated boundary string. We should either give people the ability to be in control of the `content-type` (by letting them specify a boundary string) or do the right think in the library.\n\n> In general the requests header dictionary lets people do stupid things. If the user wants to change content-length, they can. If they want to change transfer-encoding, they can\n\nI agree. Devs do stupid things. But then why make [this](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L434-L438) raise an exception. There should at least be a warning somewhere about the request not being encoded as a multi part request if the headers being passed include a `content-type`, even if that content type is `multipart/form-data`.\n\nI completely agree that we should not make this change if you feel this breaks the existing API, so maybe this should be considered for a future release or maybe not. I'll leave it up to @kennethreitz.\n", "FWIW, do you guys think a better solution would be to allow the user to pass a boundary string and use that instead?\n", "> FWIW, do you guys think a better solution would be to allow the user to pass a boundary string and use that instead?\n\nYou already can with the toolbelt's [`MultipartEncoder`](https://toolbelt.readthedocs.org/en/latest/uploading-data.html#streaming-multipart-data-encoder) and since 99% of our users don't need that in requests itself, it makes perfect sense to have something that lives elsewhere to do that.\n\n> But then why make [this](https://github.com/kennethreitz/requests/blob/46184236dc177fb68c7863445609149d0ac243ea/requests/models.py#L434-L438) raise an exception.\n\nBecause one is exceptional. 
We could issue a warning when overriding the content-type header when a user specifies both `data=` and `files=` but I think that will only make a lot of people very angry with us.\n\n> The specific part of code I am referring to deals exclusively with Multipart Files.\n\nRight you're special casing **silent** behaviour for one very narrow case of something that should either be the case for everything or not. APIs cease being for humans when you have to say to yourself \"Wait, will requests overwrite this header for me if I'm doing this or is it only in this other case?\" every time they go back to do something. That is not a human API, that is an awful API design.\n\n> > Sometimes overriding requests default behaviour is correct for user\n> \n> Yes, which which is why sometimes should not dictate default behavior.\n\nThe default behaviour of requests is not wrong. The default behaviour is to trust the user that they know what they're doing.\n\n> Most developers using that code path clearly want the request to be posted as multipart.\n\nAnd they will get a body that is correct for a `multipart/form-data` request. If they're intentionally changing the header and the server they're talking to does not understand the header, they'll figure that out. People using requests are generally rather intelligent. Most have read the documentation and understand what they need to do when creating a valid `multipart/form-data` request. It is often the people who do not read the documentation that run into this, go and read the documentation, and fix it themselves.\n\n> I completely agree that we should not make this change if you feel this breaks the existing API.\n\nIt's not a matter of _feeling_ that this is backwards incompatible. 
It is backwards incompatible.\n\nAnd let me just strongly reinforce a point @Lukasa and I seem to be unable to convey appropriately:\n\nThis is taking documented and expected behaviour that you consider to be _silently_ broken and replacing it with behaviour that will be undocumented, unexpected, and will _silently_ break users' code. Your justification is to fix _silent_ behaviour but you're replacing it with inconsistent _silent_ behaviour.\n", "@sigmavirus24 \n\n> It's not a matter of feeling that this is backwards incompatible. It is backwards incompatible.\n\nYes, that was a bad choice of words. My Bad.\n\n> People using requests are generally rather intelligent. Most have read the documentation and understand what they need to do when creating a valid multipart/form-data request. \n\nThats great!\n\n> It is often the people who do not read the documentation that run into this, go and read the documentation, and fix it themselves.\n\nCan you point to to where in the documentation I would have seen the `content-type` behavior. I was able to find it because I looked at the code.\n", "So to be clear, I think we all agree that this is problematic: the only disagreement is whether we believe the problem is leaving the header alone, or the silence of the behaviour.\n\nI'm in favour of the current behaviour, possibly with a warning or a log.\n", "> I'm in favour of the current behaviour, possibly with a warning or a log.\n\nI think just something in the docs should suffice.\n\nSomething on the same lines as:\n\n![image](https://cloud.githubusercontent.com/assets/1840162/13537247/f6556f3c-e211-11e5-9513-a4ce614aee74.png)\n", "I was prepared to go further, with a runtime warning, but I'm totally happy to make a documentation change _right now_ to add warnings. We may actually want a whole section about headers that users generally shouldn't provide: `Host` is another.\n", "If you explicitly provide something, we send it. You know better than us. 
This is a big design flaw (imo) in other HTTP libraries. \n", "FWIW, this becomes a much bigger problem when you are trying to Proxy requests and want to maintain most of the original headers. \n\nHaving a list of headers that would potentially cause issues would be great for documentation.\n\n@Lukasa big :+1: \n", "@Lukasa likely shouldn't provide _but are certainly able to, if needed_. This is a feature, not a bug. \n" ]
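The boundary behaviour the thread argues over can be observed directly with urllib3's multipart helper (the issue links the copy vendored inside requests; this sketch assumes a standalone `urllib3` install, which ships alongside requests). The boundary is generated per call and must match between the Content-Type header and the body, which is why overriding the header without the generated boundary breaks the upload:

```python
from urllib3.filepost import encode_multipart_formdata  # assumed install

# One plain field plus one file field, given as (filename, contents):
fields = {
    "field": "value",
    "file": ("report.txt", b"file contents"),
}
body, content_type = encode_multipart_formdata(fields)

# The returned header carries the freshly generated boundary; the same
# boundary delimits the parts inside the body. A user-supplied
# Content-Type header that omits it leaves the body unparseable as
# multipart/form-data.
boundary = content_type.split("boundary=")[1]
```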
https://api.github.com/repos/psf/requests/issues/3028
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3028/labels{/name}
https://api.github.com/repos/psf/requests/issues/3028/comments
https://api.github.com/repos/psf/requests/issues/3028/events
https://github.com/psf/requests/issues/3028
138,167,856
MDU6SXNzdWUxMzgxNjc4NTY=
3,028
"in" operation on resp.cookies should not raise CookieConflictError
{ "avatar_url": "https://avatars.githubusercontent.com/u/687131?v=4", "events_url": "https://api.github.com/users/trilogysci/events{/privacy}", "followers_url": "https://api.github.com/users/trilogysci/followers", "following_url": "https://api.github.com/users/trilogysci/following{/other_user}", "gists_url": "https://api.github.com/users/trilogysci/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/trilogysci", "id": 687131, "login": "trilogysci", "node_id": "MDQ6VXNlcjY4NzEzMQ==", "organizations_url": "https://api.github.com/users/trilogysci/orgs", "received_events_url": "https://api.github.com/users/trilogysci/received_events", "repos_url": "https://api.github.com/users/trilogysci/repos", "site_admin": false, "starred_url": "https://api.github.com/users/trilogysci/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/trilogysci/subscriptions", "type": "User", "url": "https://api.github.com/users/trilogysci", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
3
2016-03-03T12:37:56Z
2021-09-08T18:00:55Z
2016-04-16T04:05:38Z
NONE
resolved
Hosts which have the same cookie defined for multiple domains should not raise "CookieConflictError: There are multiple cookies with name" when performing "in". It should short-circuit and return True immediately when found.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3028/reactions" }
https://api.github.com/repos/psf/requests/issues/3028/timeline
null
completed
null
null
false
[ "So the problem here is that the default implementation of `collections.Mapping.__contains__` calls in to `__getitem__`, which for the `RequestsCookieJar` requires that there be no duplicate. In this instance, we probably get the most bang for our buck by just overriding `__contains__` to call the parent implementation, and then catch `CookieConflictError` and return `True` in that instance.\n\nI'm marking this as contributor friendly because the fix is fairly easy, and it'd be a good one for someone to pick up as a first open source change.\n", "I've been using requests for a while and thinking I'd contribute, so I might take a shot at this.\n", "@davidsoncasey Awesome!\n" ]
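The fix described in the first comment — override `__contains__` so a conflict still reports membership — can be sketched with a toy jar. This is a minimal stand-in, not the real `RequestsCookieJar` implementation:

```python
class CookieConflictError(RuntimeError):
    """Raised when a name matches more than one stored cookie."""


class TinyCookieJar:
    """Toy model of RequestsCookieJar's name-lookup semantics."""

    def __init__(self):
        self._cookies = []  # list of (name, domain, value)

    def set(self, name, value, domain):
        self._cookies.append((name, domain, value))

    def __getitem__(self, name):
        matches = [v for (n, d, v) in self._cookies if n == name]
        if not matches:
            raise KeyError(name)
        if len(matches) > 1:
            # Ambiguous lookup: same name under several domains.
            raise CookieConflictError(
                "There are multiple cookies with name %r" % name)
        return matches[0]

    def __contains__(self, name):
        # The suggested fix: a conflict still means the name exists,
        # so swallow CookieConflictError and report True instead of
        # letting the default Mapping.__contains__ propagate it.
        try:
            self[name]
        except KeyError:
            return False
        except CookieConflictError:
            return True
        return True
```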
https://api.github.com/repos/psf/requests/issues/3027
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3027/labels{/name}
https://api.github.com/repos/psf/requests/issues/3027/comments
https://api.github.com/repos/psf/requests/issues/3027/events
https://github.com/psf/requests/issues/3027
137,907,812
MDU6SXNzdWUxMzc5MDc4MTI=
3,027
requests can't make a request to the server it's running on
{ "avatar_url": "https://avatars.githubusercontent.com/u/3768695?v=4", "events_url": "https://api.github.com/users/isc-rsingh/events{/privacy}", "followers_url": "https://api.github.com/users/isc-rsingh/followers", "following_url": "https://api.github.com/users/isc-rsingh/following{/other_user}", "gists_url": "https://api.github.com/users/isc-rsingh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/isc-rsingh", "id": 3768695, "login": "isc-rsingh", "node_id": "MDQ6VXNlcjM3Njg2OTU=", "organizations_url": "https://api.github.com/users/isc-rsingh/orgs", "received_events_url": "https://api.github.com/users/isc-rsingh/received_events", "repos_url": "https://api.github.com/users/isc-rsingh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/isc-rsingh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/isc-rsingh/subscriptions", "type": "User", "url": "https://api.github.com/users/isc-rsingh", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-03-02T15:38:20Z
2016-03-04T15:02:26Z
2016-03-02T15:39:59Z
NONE
null
Here's a test case that never returns a response. Shouldn't this work? ``` python import requests from flask import Flask app = Flask(__name__) @app.route('/') def Welcome(): return "Testing requests module's ability to call the service in which it's running" @app.route('/test') def TestRequest(): resp = requests.get('http://0.0.0.0:5004') print('Response from TestRequest: '+resp.text) if __name__ == "__main__": app.run(host='0.0.0.0', port=5004, debug=True) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3027/reactions" }
https://api.github.com/repos/psf/requests/issues/3027/timeline
null
completed
null
null
false
[ "Flask's debug server cannot handle multiple requests at once. Run Flask behind a proper WSGI server and you'll be fine. =)\n", "Thanks for the tip @Lukasa. I'm a newbie when it comes to Python web serving. Been struggling getting Apache WSGI module + Flask on OS X el capitan all day. First it was SIP (system integrity protection) and now it's getting Apache/OS X python to recognize my anaconda python modules. This whole process is extremely unpleasant. Do you happen to know what \"proper WSGI server\" folks tend to use?\n", "@rajrsingh there are too many to list, but here are a few:\n- Apache+mod_wsgi\n- gunicorn\n- uwsgi\n- twisted\n", "Use gunicorn. :)\n", "Thanks for the info. I'll put some of my research here for future Googlers. \n1. Getting mod_wsgi installed on OS X El Capitan with SIP is annoying. This is the best resource on that: https://github.com/GrahamDumpleton/mod_wsgi/issues/98. I ended up using the technique of disabling SIP, then installing mod_wsgi. \n2. Then my problem was either getting Python modules (like pandas and Flask) installed in the system Python, which Apache insists on using, or figuring out how to get Apache to recognize my Anaconda Python install. I gave up at this stage because...\n3. I found a way to get Flask to be multi-threaded here: http://stackoverflow.com/questions/14672753/handling-multiple-requests-in-flask so I can continue my dev work on the laptop with no \"real\" WSGI server.\n4. Next step is to try out gunicorn and get it all running on my eventual production environment on http://bluemix.net. 
\n", "So the strict answer to the question is nothing is wrong with requests, I just needed to add the `threaded=True` parameter to the test server.\n\n``` python\nimport requests\nfrom flask import Flask\napp = Flask(__name__)\n\n@app.route('/')\ndef Welcome():\n    return \"Testing requests module's ability to call the service in which it's running\"\n\n@app.route('/test')\ndef TestRequest(): \n    resp = requests.get('http://0.0.0.0:5004')\n    print('Response from TestRequest: '+resp.text)\n\nif __name__ == \"__main__\":\n    app.run(host='0.0.0.0', port=5004, threaded=True, debug=True)\n```\n" ]
https://api.github.com/repos/psf/requests/issues/3026
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3026/labels{/name}
https://api.github.com/repos/psf/requests/issues/3026/comments
https://api.github.com/repos/psf/requests/issues/3026/events
https://github.com/psf/requests/issues/3026
137,789,383
MDU6SXNzdWUxMzc3ODkzODM=
3,026
option to enable TCP level keepalive
{ "avatar_url": "https://avatars.githubusercontent.com/u/61531?v=4", "events_url": "https://api.github.com/users/woosley/events{/privacy}", "followers_url": "https://api.github.com/users/woosley/followers", "following_url": "https://api.github.com/users/woosley/following{/other_user}", "gists_url": "https://api.github.com/users/woosley/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/woosley", "id": 61531, "login": "woosley", "node_id": "MDQ6VXNlcjYxNTMx", "organizations_url": "https://api.github.com/users/woosley/orgs", "received_events_url": "https://api.github.com/users/woosley/received_events", "repos_url": "https://api.github.com/users/woosley/repos", "site_admin": false, "starred_url": "https://api.github.com/users/woosley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/woosley/subscriptions", "type": "User", "url": "https://api.github.com/users/woosley", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-03-02T06:46:46Z
2021-09-08T19:00:32Z
2016-03-02T07:14:21Z
NONE
resolved
When there is firewall between client and server, http session will be killed even with keepalive enabled because no TCP level keepalive probe is sent. according to [here](http://tldp.org/HOWTO/TCP-Keepalive-HOWTO/usingkeepalive.html) > Remember that keepalive support, even if configured in the kernel, is not the default behavior in Linux. Programs must request keepalive control for their sockets using the setsockopt interface. There are relatively few programs implementing keepalive that said, is it possible to enable this behavior as an option in the library?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3026/reactions" }
https://api.github.com/repos/psf/requests/issues/3026/timeline
null
completed
null
null
false
[ "It is already possible: the requests tool belt provides a [Socket Options transport adapter](http://toolbelt.readthedocs.org/en/latest/adapters.html#socketoptionsadapter) that allows you to turn on TCP_KEEPALIVE.\n", "There's also a separate keep alive adapter which subclasses the socket options adapter. That will ideally make things simpler for your usage\n", "If we're going to be responding to tickets like this, then we need to make Requests-Toolbelt more prominent in the documentation. \n" ]
https://api.github.com/repos/psf/requests/issues/3025
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3025/labels{/name}
https://api.github.com/repos/psf/requests/issues/3025/comments
https://api.github.com/repos/psf/requests/issues/3025/events
https://github.com/psf/requests/issues/3025
136,877,478
MDU6SXNzdWUxMzY4Nzc0Nzg=
3,025
Proxies not working
{ "avatar_url": "https://avatars.githubusercontent.com/u/15353112?v=4", "events_url": "https://api.github.com/users/Tosible/events{/privacy}", "followers_url": "https://api.github.com/users/Tosible/followers", "following_url": "https://api.github.com/users/Tosible/following{/other_user}", "gists_url": "https://api.github.com/users/Tosible/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Tosible", "id": 15353112, "login": "Tosible", "node_id": "MDQ6VXNlcjE1MzUzMTEy", "organizations_url": "https://api.github.com/users/Tosible/orgs", "received_events_url": "https://api.github.com/users/Tosible/received_events", "repos_url": "https://api.github.com/users/Tosible/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Tosible/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Tosible/subscriptions", "type": "User", "url": "https://api.github.com/users/Tosible", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-02-27T04:28:31Z
2021-09-08T19:00:30Z
2016-02-27T05:36:42Z
NONE
resolved
My script shows that proxies aren't working. Am I doing something wrong? Here's the code if you need: [Script.zip](https://github.com/kennethreitz/requests/files/149201/Script.zip)
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3025/reactions" }
https://api.github.com/repos/psf/requests/issues/3025/timeline
null
completed
null
null
false
[ "Code here:\n\n``` python\n# Important stuff here\nfrom json import load\nimport requests\nfrom random import choice, randint, shuffle\nfrom urllib2 import urlopen\n\nurl = 'http://epiccodes.com' \n\n# Random proxies from - http://proxygaz.com\nproxies = {'https': 'https:131.96.228.236:9064',\n 'https': 'https:54.85.61.208:80',\n 'https': 'https:54.209.36.19:80',}\n\n# Retrieves IP Address\ndef IPAddress():\n IP = load(urlopen('http://httpbin.org/ip'))['origin']\n return(IP)\n\n# Retrieves Cookie\ndef CookieIP():\n r = requests.get(url,\n # Using proxy\n proxies=proxies)\n try:\n Cookie = r.headers['Set-Cookie']\n return(Cookie)\n except KeyError as e:\n return(\"Website might be down\")\n\n# This prints your IP\nprint(\"IP Address: \" + IPAddress())\n\n# The website's cookie has your IP in it. If proxy works, then the IP on the\n# cookie should be different from the other IP.\nprint(\"\\nProxy's IP: \" + CookieIP())\n\n```\n", "A few things:\n1. If you look at [the proxy documentation](http://docs.python-requests.org/en/master/user/advanced/?highlight=proxies#proxies) you'll see that your proxy dictionary should look like this:\n \n ``` py\n proxies = {'https': 'https://131.96.228.236:9064'}\n ```\n \n Note the `//`.\n2. You haven't described what kind of proxies these are. (Hint: If they're SOCKS proxies, those will not work with requests... yet.)\n3. You haven't described what you're seeing versus what you're expecting.\n4. Questions belong on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). Please ask future questions there.\n", "1. If you look at my script (the one @kennethreitz posted), that's what I did. \n proxies = {'https': 'https:131.96.228.236:9064',\n 'https': 'https:54.85.61.208:80',\n 'https': 'https:54.209.36.19:80',}\n2. The proxies I tried to use are HTTPS\n3. 
If you look at the cookies for http://epiccodes.com, you'll see your IP in the cookies, I was expecting to see the proxy's IP in the cookie when I use proxies=proxies in the get request, instead of my actual IP.\n4. My bad, I didn't know. Will do in the future if I have any questions.\n", "> If you look at my script (the one @kennethreitz posted), that's what I did. \n\nIt isn't. Please look closely at what you wrote and what I wrote. Further, which proxy do you expect to be used? Creating a dictionary with a repeated key is undefined behaviour (as to which value is kept) in Python.\n", "@Pr0xy671 you are specifying 3 proxies. Try only doing one. Also, add a `//` to your proxy URL.\n\np.s. you can use `http://httpbin.org/ip` to get ip address easily. \n", "Ok, I tried doing this way and still nothing.\n\nr = requests.get('http://httpbin.org/ip', proxies={'https': 'https://89.163.131.168:3128'})\nprint(r.json()['origin'])\n\nIt then prints out my real IP and not 89.163.131.168\n", "@Pr0xy671 it's because you specified a proxy only for HTTPS (`{'https': …}`) but requested a HTTP page.\n" ]
https://api.github.com/repos/psf/requests/issues/3024
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3024/labels{/name}
https://api.github.com/repos/psf/requests/issues/3024/comments
https://api.github.com/repos/psf/requests/issues/3024/events
https://github.com/psf/requests/pull/3024
134,818,979
MDExOlB1bGxSZXF1ZXN0NTk5MjU1MzE=
3,024
Added unit tests for utils module
{ "avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4", "events_url": "https://api.github.com/users/Stranger6667/events{/privacy}", "followers_url": "https://api.github.com/users/Stranger6667/followers", "following_url": "https://api.github.com/users/Stranger6667/following{/other_user}", "gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Stranger6667", "id": 1236561, "login": "Stranger6667", "node_id": "MDQ6VXNlcjEyMzY1NjE=", "organizations_url": "https://api.github.com/users/Stranger6667/orgs", "received_events_url": "https://api.github.com/users/Stranger6667/received_events", "repos_url": "https://api.github.com/users/Stranger6667/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions", "type": "User", "url": "https://api.github.com/users/Stranger6667", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-02-19T09:44:23Z
2021-09-08T04:01:10Z
2016-04-06T19:05:13Z
CONTRIBUTOR
resolved
More unit tests for `utils` module. Also there is small cleanup for this module - removed unused imports & variable. Lines coverage increased from 46% to 75%, but I'd like to ask about some functions, which are not used in `requests` at all (e.g `dict_to_sequence`, `from_key_val_list`, etc). Is there some plan for them? I wanted to write unit tests for it, but I'm not sure, may be they are staged to be deleted and it is not reasonable to cover them with unit tests. BR
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3024/reactions" }
https://api.github.com/repos/psf/requests/issues/3024/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3024.diff", "html_url": "https://github.com/psf/requests/pull/3024", "merged_at": "2016-04-06T19:05:13Z", "patch_url": "https://github.com/psf/requests/pull/3024.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3024" }
true
[ "Some of the utils module functions are left in place to avoid breaking users that may be using them directly, even though they're not used in requests any more. There's certainly no plan to remove them prior to 3.0.0.\n", "I'm down for removing ones that _don't belong there_ for 3.0. \n" ]
https://api.github.com/repos/psf/requests/issues/3023
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3023/labels{/name}
https://api.github.com/repos/psf/requests/issues/3023/comments
https://api.github.com/repos/psf/requests/issues/3023/events
https://github.com/psf/requests/pull/3023
134,643,102
MDExOlB1bGxSZXF1ZXN0NTk4MzM5NTc=
3,023
Add exceptions into __init__.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/244969?v=4", "events_url": "https://api.github.com/users/jtpereyda/events{/privacy}", "followers_url": "https://api.github.com/users/jtpereyda/followers", "following_url": "https://api.github.com/users/jtpereyda/following{/other_user}", "gists_url": "https://api.github.com/users/jtpereyda/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jtpereyda", "id": 244969, "login": "jtpereyda", "node_id": "MDQ6VXNlcjI0NDk2OQ==", "organizations_url": "https://api.github.com/users/jtpereyda/orgs", "received_events_url": "https://api.github.com/users/jtpereyda/received_events", "repos_url": "https://api.github.com/users/jtpereyda/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jtpereyda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jtpereyda/subscriptions", "type": "User", "url": "https://api.github.com/users/jtpereyda", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2016-02-18T17:28:38Z
2021-09-08T04:01:16Z
2016-02-19T01:49:15Z
NONE
resolved
Addresses Issue #3022. Also added the JetBrains IDE settings folder, `.idea/`, to `.gitignore`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3023/reactions" }
https://api.github.com/repos/psf/requests/issues/3023/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3023.diff", "html_url": "https://github.com/psf/requests/pull/3023", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3023.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3023" }
true
[ "I'm going to leave this up to @kennethreitz. As you stated in #3022, this is totally unnecessary, but neither is it a maintenance burden, so I have no objection to it. Thanks for your contribution! :cake:\n", "We already have all the exceptions available in the public API:\n\n``` python\nfrom .exceptions import (\n RequestException, Timeout, URLRequired,\n TooManyRedirects, HTTPError, ConnectionError,\n FileModeWarning,\n)\n```\n", "I'm with @kennethreitz this isn't necessary. Because of the line @kennethreitz is citing, users can already just do\n\n``` py\nimport requests\n\ntry:\n r = requests.get(url)\nexcept requests.ConnectionError:\n pass\n\n# or\n\ntry:\n requests.get(url)\nexcept requests.exceptions.ConnectionError:\n pass\n```\n\nBoth work perfectly fine. Beyond that, we don't need a gitignore line for PyCharm/idea projects because that should already be in your [global gitignore](https://help.github.com/articles/ignoring-files/#create-a-global-gitignore).\n", "@kennethreitz Thanks, I was wondering about that line. Two questions:\n1. That import makes them available as `requests.RequestException`. It's a subtle difference, but [this doc page](http://docs.python-requests.org/en/master/api/#exceptions) instructs users to include them as `requests.exceptions.RequestException`. Which one is the preferred or supported method?\n - If we want to include both, we could specify both in `__init__.py`.\n - If `requests.RequestException` is preferred (which I like), perhaps we should update the docs.\n2. It looks like `ConnectTimeout` is missing from the list. Should it be added?\n", "@sigmavirus24 Thanks for the global gitignore tip! I hadn't heard about that.\n", "@jtpereyda it's a subtle but unimportant difference. Try this in an interactive session:\n\n``` py\n>>> import requests\n>>> try:\n... requests.get('http://localhost:8080')\n... except requests.ConnectionError:\n... pass\n...\n>>> try:\n... requests.get('http://localhost:8080')\n... 
except requests.exceptions.ConnectionError:\n... pass\n...\n>>>\n```\n\nBoth already work perfectly fine. You can use either one or both interchangeably. If `ConnectTimeout` is missing, that should be added to the list of exceptions imported. But like I said, as it is now, we do not need to change anything for `requests.exceptions.RequestException` to work. It will simply work.\n", "@jtpereyda the \"recommended\" way is, in fact, `requests.RequestException`, as `requests.exceptions` is really just an internal namespacing.\n\nThat section of the docs are auto-generated from the codebase. I'd love to fix that section if we can. \n\nHowever, at this time, there's nothing broken about the docs or the code. We're just discussing room for possible usability improvement (which is the main goal of this library). \n", "@jtpereyda fixed the docs! http://docs.python-requests.org/en/master/api/#exceptions\n", "@kennethreitz Thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/3022
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3022/labels{/name}
https://api.github.com/repos/psf/requests/issues/3022/comments
https://api.github.com/repos/psf/requests/issues/3022/events
https://github.com/psf/requests/issues/3022
134,639,217
MDU6SXNzdWUxMzQ2MzkyMTc=
3,022
exceptions missing from __init__.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/244969?v=4", "events_url": "https://api.github.com/users/jtpereyda/events{/privacy}", "followers_url": "https://api.github.com/users/jtpereyda/followers", "following_url": "https://api.github.com/users/jtpereyda/following{/other_user}", "gists_url": "https://api.github.com/users/jtpereyda/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jtpereyda", "id": 244969, "login": "jtpereyda", "node_id": "MDQ6VXNlcjI0NDk2OQ==", "organizations_url": "https://api.github.com/users/jtpereyda/orgs", "received_events_url": "https://api.github.com/users/jtpereyda/received_events", "repos_url": "https://api.github.com/users/jtpereyda/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jtpereyda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jtpereyda/subscriptions", "type": "User", "url": "https://api.github.com/users/jtpereyda", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-02-18T17:12:37Z
2021-09-08T18:00:55Z
2016-04-16T04:05:57Z
NONE
resolved
It is traditional to include the public API in `__init__.py`. Consider this test code: ``` python import requests import requests.exceptions a = requests.exceptions.ConnectTimeout ``` This will run properly, but PyCharm raises a warning, saying, "Cannot find reference 'exceptions' in `__init__.py`". This could be fixed by changing an `__init__.py` line from: ``` from . import utils ``` to ``` from . import requests, utils ``` While this is technically a [bug](https://youtrack.jetbrains.com/issue/PY-18550) in PyCharm, requests' `__init__.py` already imports utils, apparently for this very purpose. So it would be consistent to import exceptions as well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3022/reactions" }
https://api.github.com/repos/psf/requests/issues/3022/timeline
null
completed
null
null
false
[ "This isn't _technically_ a bug in PyCharm. It is a bug that they should be fixing. If we did something like this for each editor that does nonsensical things like this, I'm not sure we'd have a very coherent project.\n", "@kennethreitz Resolved this by changing the docs (see PR #3023).\n\nLooks like http://docs.python-requests.org/en/master/api/#exceptions is still missing ConnectTimeout and ReadTimeout. Should they be added to `__init__.py`?\n", "@jtpereyda oddly, those two exceptions are being _raised_ during the sphinx build. I haven't dug into it yet. \n\nBut.... anyone can investigate! ;)\n", "Is this still a reproducible bug? Both (python2 and python3) sphinx builds are clean:\n\n```\n(p2_venv) requests $ make docs\ncd docs && make html\nmake[1]: Entering directory '/home/kmaduri/Documents/programming/python/requests/docs'\nsphinx-build -b html -d _build/doctrees . _build/html\nRunning Sphinx v1.3.5\nloading pickled environment... failed: unsupported pickle protocol: 4\nloading intersphinx inventory from http://urllib3.readthedocs.org/en/latest/objects.inv...\nbuilding [mo]: targets for 0 po files that are out of date\nbuilding [html]: targets for 18 source files that are out of date\nupdating environment: 18 added, 0 changed, 0 removed\nreading sources... [100%] user/quickstart \nlooking for now-outdated files... none found\npickling environment... done\nchecking consistency... done\npreparing documents... done\nwriting output... [100%] user/quickstart \ngenerating indices... genindex py-modindex\nhighlighting module code... [100%] requests.auth \nwriting additional pages... search\ncopying static files... done\ncopying extra files... done\ndumping search index in English (code: en) ... done\ndumping object inventory... done\nbuild succeeded.\n\nBuild finished. The HTML pages are in _build/html.\nmake[1]: Leaving directory '/home/kmaduri/Documents/programming/python/requests/docs'\n\\033[95m\\n\\nBuild successful! 
View the docs homepage at docs/_build/html/index.html.\\n\\033[0m\n```\n", "This may have been fixed now; there was a change made around this recently. \n" ]
https://api.github.com/repos/psf/requests/issues/3021
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3021/labels{/name}
https://api.github.com/repos/psf/requests/issues/3021/comments
https://api.github.com/repos/psf/requests/issues/3021/events
https://github.com/psf/requests/pull/3021
134,496,478
MDExOlB1bGxSZXF1ZXN0NTk3NTc5MDE=
3,021
Fix #3017: Whitespace characters surrounding a URL should be ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-02-18T06:51:32Z
2021-09-08T05:00:55Z
2016-02-18T07:01:41Z
NONE
resolved
My attempt to fix [the issue I reported](https://github.com/kennethreitz/requests/issues/3017). Feel free to comment my changes, especially if you don't agree with them.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3021/reactions" }
https://api.github.com/repos/psf/requests/issues/3021/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3021.diff", "html_url": "https://github.com/psf/requests/pull/3021", "merged_at": "2016-02-18T07:01:41Z", "patch_url": "https://github.com/psf/requests/pull/3021.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3021" }
true
[ "@geckon thank you!\n\nFor the record, this is a potentially breaking change for _someone_, hence why it has to wait for inclusion in the 3.x series. \n", ":sparkles: :cake: :sparkles:\n", "Thanks for this @geckon \n" ]
https://api.github.com/repos/psf/requests/issues/3020
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3020/labels{/name}
https://api.github.com/repos/psf/requests/issues/3020/comments
https://api.github.com/repos/psf/requests/issues/3020/events
https://github.com/psf/requests/pull/3020
134,435,798
MDExOlB1bGxSZXF1ZXN0NTk3Mjg2ODY=
3,020
Fix #3017: Whitespace characters surrounding a URL should be ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-02-17T23:45:37Z
2021-09-08T05:00:56Z
2016-02-18T06:54:01Z
NONE
resolved
My attempt to fix [the issue I reported](https://github.com/kennethreitz/requests/issues/3017). Feel free to comment on my changes, especially if you don't agree with them.
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3020/reactions" }
https://api.github.com/repos/psf/requests/issues/3020/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3020.diff", "html_url": "https://github.com/psf/requests/pull/3020", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3020.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3020" }
true
[ "@geckon please rebase this against that branch instead. If you need help, let me know and I can give you some instructions that should work in this case. Thanks for handling this! :cake: \n", "Hopefully I did it right the third time: https://github.com/kennethreitz/requests/pull/3021\nI believe this PR can be closed, right?\n" ]
https://api.github.com/repos/psf/requests/issues/3019
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3019/labels{/name}
https://api.github.com/repos/psf/requests/issues/3019/comments
https://api.github.com/repos/psf/requests/issues/3019/events
https://github.com/psf/requests/pull/3019
134,430,169
MDExOlB1bGxSZXF1ZXN0NTk3MjUzOTc=
3,019
Fix #3017: Whitespace characters surrounding a URL should be ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-02-17T23:14:54Z
2021-09-08T05:00:56Z
2016-02-18T06:54:36Z
NONE
resolved
My attempt to fix [the issue I reported](https://github.com/kennethreitz/requests/issues/3017). Feel free to comment on my changes, especially if you don't agree with them.
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3019/reactions" }
https://api.github.com/repos/psf/requests/issues/3019/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3019.diff", "html_url": "https://github.com/psf/requests/pull/3019", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3019.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3019" }
true
[ "Looks perfect to me!\n", "@geckon although, we're going to be merging this into the `proposed/3.0.0` branch. would you like to make your PR against that branch instead?\n", "No problem: https://github.com/kennethreitz/requests/pull/3020\nIs that alright?\n", "Hopefully I did it right the third time: https://github.com/kennethreitz/requests/pull/3021\nI believe this PR can be closed, right?\n" ]
https://api.github.com/repos/psf/requests/issues/3018
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3018/labels{/name}
https://api.github.com/repos/psf/requests/issues/3018/comments
https://api.github.com/repos/psf/requests/issues/3018/events
https://github.com/psf/requests/issues/3018
134,293,897
MDU6SXNzdWUxMzQyOTM4OTc=
3,018
ConnectionError: ('Connection aborted.', error(13, 'Permission denied'))
{ "avatar_url": "https://avatars.githubusercontent.com/u/16576155?v=4", "events_url": "https://api.github.com/users/OneSecure/events{/privacy}", "followers_url": "https://api.github.com/users/OneSecure/followers", "following_url": "https://api.github.com/users/OneSecure/following{/other_user}", "gists_url": "https://api.github.com/users/OneSecure/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/OneSecure", "id": 16576155, "login": "OneSecure", "node_id": "MDQ6VXNlcjE2NTc2MTU1", "organizations_url": "https://api.github.com/users/OneSecure/orgs", "received_events_url": "https://api.github.com/users/OneSecure/received_events", "repos_url": "https://api.github.com/users/OneSecure/repos", "site_admin": false, "starred_url": "https://api.github.com/users/OneSecure/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/OneSecure/subscriptions", "type": "User", "url": "https://api.github.com/users/OneSecure", "user_view_type": "public" }
[ { "color": "fbca04", "default": false, "description": null, "id": 615414998, "name": "GAE Support", "node_id": "MDU6TGFiZWw2MTU0MTQ5OTg=", "url": "https://api.github.com/repos/psf/requests/labels/GAE%20Support" } ]
closed
true
null
[]
null
5
2016-02-17T14:27:05Z
2021-09-08T01:21:17Z
2016-02-17T14:43:43Z
NONE
resolved
On GAE, it went wrong. WARNING 2016-02-17 14:06:22,023 urlfetch_stub.py:540] Stripped prohibited headers from URLFetch request: ['Host'] INFO 2016-02-17 14:06:22,773 connectionpool.py:758] Starting new HTTPS connection (1): api.weibo.com ERROR 2016-02-17 14:06:23,531 webapp2.py:1528] ('Connection aborted.', error(13, 'Permission denied')) Traceback (most recent call last): File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 1511, in __call__ rv = self.handle_exception(request, response, e) File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 1505, in __call__ rv = self.router.dispatch(request, response) File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 1253, in default_dispatcher return route.handler_adapter(request, response) File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 1077, in __call__ return handler.dispatch() File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 547, in dispatch return self.handle_exception(e, self.app.debug) File "/Users/go_appengine/lib/webapp2-2.3/webapp2.py", line 545, in dispatch return method(*args, **kwargs) File "/Users/Desktop/main.py", line 27, in get postBingImageToWeibo(prefix, userUrl) File "/Users/Desktop/bingimage.py", line 60, in postBingImageToWeibo client.post('statuses/upload', status=info, pic=picFile) File "/Users/Desktop/weibo_tiny.py", line 130, in post files=files).text) File "/Users/Desktop/requests/sessions.py", line 511, in post return self.request('POST', url, data=data, json=json, **kwargs) File "/Users/Desktop/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/Users/Desktop/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/Users/Desktop/requests/adapters.py", line 426, in send raise ConnectionError(err, request=request) ConnectionError: ('Connection aborted.', error(13, 'Permission denied')) INFO 2016-02-17 14:06:23,598 module.py:787] default: "GET /retrievedata HTTP/1.1" 500 114
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3018/reactions" }
https://api.github.com/repos/psf/requests/issues/3018/timeline
null
completed
null
null
false
[ "When I using version 2.3.0, This issue is disappear.\n\nhttps://github.com/kennethreitz/requests/tree/6366d3dd190a9e58ca582955cddf7e2ac5f32dcc\n\nLike the following link\nhttp://stackoverflow.com/questions/27974128/how-do-i-resolve-django-allauth-connection-aborted-error13-permission-d\n", "I'm afraid that Requests does not consider google app engine a supported platform. In this case, the issue you're likely having is that requests attempts to use sockets for its connections, and those are not always allowed on GAE.\n", "For those looking for a workaround, `requests_toolbelt` appears to have a monkey patching solution: https://toolbelt.readthedocs.io/en/latest/adapters.html#appengineadapter (have not tested, use at own risk etc.)\r\n", "shashank@shashank-HP-Notebook:~/Documents/Policy_Gradients_to_beat_Pong-master$ python3 test.py\r\n[2018-03-12 15:42:20,423] Making new env: flashgames.DuskDrive-v0\r\n[2018-03-12 15:42:20,434] Writing logs to file: /tmp/universe-7286.log\r\nTraceback (most recent call last):\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 601, in urlopen\r\n chunked=chunked)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 357, in _make_request\r\n conn.request(method, url, **httplib_request_kw)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/unixconn/unixconn.py\", line 48, in request\r\n super(UnixHTTPConnection, self).request(method, url, **kwargs)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1239, in request\r\n self._send_request(method, url, body, headers, encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1285, in _send_request\r\n self.endheaders(body, encode_chunked=encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1234, in endheaders\r\n self._send_output(message_body, encode_chunked=encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1026, in _send_output\r\n self.send(msg)\r\n File \"/usr/lib/python3.6/http/client.py\", line 964, in send\r\n self.connect()\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/unixconn/unixconn.py\", line 39, in connect\r\n sock.connect(self.base_url.replace(\"http+unix:/\", \"\"))\r\nPermissionError: [Errno 13] Permission denied\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.6/dist-packages/requests-2.18.4-py3.6.egg/requests/adapters.py\", line 445, in send\r\n timeout=timeout\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 639, in urlopen\r\n _stacktrace=sys.exc_info()[2])\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/util/retry.py\", line 357, in increment\r\n raise six.reraise(type(error), error, _stacktrace)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/packages/six.py\", line 685, in reraise\r\n raise value.with_traceback(tb)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 601, in urlopen\r\n chunked=chunked)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/urllib3/connectionpool.py\", line 357, in _make_request\r\n conn.request(method, url, **httplib_request_kw)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/unixconn/unixconn.py\", line 48, in request\r\n super(UnixHTTPConnection, self).request(method, url, **kwargs)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1239, in request\r\n self._send_request(method, url, body, headers, encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1285, in _send_request\r\n self.endheaders(body, encode_chunked=encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1234, in endheaders\r\n self._send_output(message_body, encode_chunked=encode_chunked)\r\n File \"/usr/lib/python3.6/http/client.py\", line 1026, in _send_output\r\n self.send(msg)\r\n File \"/usr/lib/python3.6/http/client.py\", line 964, in send\r\n self.connect()\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/unixconn/unixconn.py\", line 39, in connect\r\n sock.connect(self.base_url.replace(\"http+unix:/\", \"\"))\r\nurllib3.exceptions.ProtocolError: ('Connection aborted.', PermissionError(13, 'Permission denied'))\r\n\r\nDuring handling of the above exception, another exception occurred:\r\n\r\nTraceback (most recent call last):\r\n File \"test.py\", line 5, in <module>\r\n env.configure(remotes=1) # automatically creates a local docker container\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/wrappers/timer.py\", line 14, in configure\r\n self.env.configure(**kwargs)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/wrappers/render.py\", line 21, in configure\r\n self.env.configure(**kwargs)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/wrappers/throttle.py\", line 32, in configure\r\n self.env.configure(**kwargs)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/envs/vnc_env.py\", line 199, in configure\r\n use_recorder_ports=record,\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/remotes/build.py\", line 19, in build\r\n n=n,\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/remotes/docker_remote.py\", line 44, in __init__\r\n self._assigner = PortAssigner(reuse=reuse)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/remotes/docker_remote.py\", line 165, in __init__\r\n self._refresh_ports()\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/universe/universe/universe/remotes/docker_remote.py\", line 169, in _refresh_ports\r\n for container in self.client.containers():\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/client.py\", line 493, in containers\r\n res = self._result(self._get(u, params=params), True)\r\n File \"/home/shashank/.local/lib/python3.6/site-packages/docker/client.py\", line 76, in _get\r\n return self.get(url, **self._set_request_timeout(kwargs))\r\n File \"/usr/local/lib/python3.6/dist-packages/requests-2.18.4-py3.6.egg/requests/sessions.py\", line 526, in get\r\n return self.request('GET', url, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests-2.18.4-py3.6.egg/requests/sessions.py\", line 513, in request\r\n resp = self.send(prep, **send_kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests-2.18.4-py3.6.egg/requests/sessions.py\", line 623, in send\r\n r = adapter.send(request, **kwargs)\r\n File \"/usr/local/lib/python3.6/dist-packages/requests-2.18.4-py3.6.egg/requests/adapters.py\", line 495, in send\r\n raise ConnectionError(err, request=request)\r\nrequests.exceptions.ConnectionError: ('Connection aborted.', PermissionError(13, 'Permission denied'))\r\nI try everything as specified above but none of them work.please help\r\n", "With [requests toolbelt](https://github.com/requests/toolbelt), `requests` now works in both production and development:\r\n\r\n```\r\nfrom requests_toolbelt.adapters import appengine\r\nappengine.monkeypatch()\r\n```\r\n\r\nFor more information, [see this article](https://cloud.google.com/appengine/docs/standard/python/issue-requests)." ]
https://api.github.com/repos/psf/requests/issues/3017
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3017/labels{/name}
https://api.github.com/repos/psf/requests/issues/3017/comments
https://api.github.com/repos/psf/requests/issues/3017/events
https://github.com/psf/requests/issues/3017
133,979,726
MDU6SXNzdWUxMzM5Nzk3MjY=
3,017
Whitespace characters surrounding a URL should be ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/120788?v=4", "events_url": "https://api.github.com/users/geckon/events{/privacy}", "followers_url": "https://api.github.com/users/geckon/followers", "following_url": "https://api.github.com/users/geckon/following{/other_user}", "gists_url": "https://api.github.com/users/geckon/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/geckon", "id": 120788, "login": "geckon", "node_id": "MDQ6VXNlcjEyMDc4OA==", "organizations_url": "https://api.github.com/users/geckon/orgs", "received_events_url": "https://api.github.com/users/geckon/received_events", "repos_url": "https://api.github.com/users/geckon/repos", "site_admin": false, "starred_url": "https://api.github.com/users/geckon/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/geckon/subscriptions", "type": "User", "url": "https://api.github.com/users/geckon", "user_view_type": "public" }
[]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
16
2016-02-16T13:17:02Z
2021-09-08T19:00:32Z
2016-02-18T22:34:54Z
NONE
resolved
It might be beneficial to unite the behavior with other tools (including browsers) and start ignoring whitespaces surrounding a URL. See the examples: ``` $ curl -s "https://github.com/ " -o /dev/null -w "%{http_code}"; echo 200 $ python -c 'import urllib2; print(urllib2.urlopen("https://github.com/ ").getcode())' 200 $ python -c 'import requests; print(requests.get("https://github.com/ ").status_code)' 404 ``` Also, W3C [defines](https://www.w3.org/TR/2012/WD-html5-20121025/links.html#attr-hyperlink-href) the `a` tag's `href` attribute as... > The href attribute on a and area elements must have a value that is a valid URL potentially surrounded by spaces. Where _URL potentially surrounded by spaces_ is [defined](https://www.w3.org/TR/2012/WD-html5-20121025/urls.html#valid-url-potentially-surrounded-by-spaces) as... > A string is a valid non-empty URL potentially surrounded by spaces if, after stripping leading and trailing whitespace from it, it is a valid non-empty URL. I know this is not a killer argument for the requests library but it would be beneficial if it behaved the same way as the other approaches. It would also make HTML processing with requests (and e.g. beautifulsoup4) easier.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3017/reactions" }
https://api.github.com/repos/psf/requests/issues/3017/timeline
null
completed
null
null
false
[ "I have no objection to calling `strip` on a URL that's passed into requests.\n", "Do we currently `%20` these?\n", "Yes.\n", "> I have no objection to calling strip on a URL that's passed into requests.\n\nDo you mean the requests should do it or the caller?\n\nI may be very well mistaken but it might be enough to add a `url.strip()` call to the [`PreparedRequest.prepare_url()` method](https://github.com/kennethreitz/requests/blob/master/requests/models.py#L324), right?\n", "I mean that I have no objection to requests doing that.\n", "I'm prone to saying that calling `strip()` is fine as well, but it does also seem a bit silly to do that — I hardly doubt that this issue comes up very often, and the user is very explicitly specifying that there is a space in the URL. \n\nI suppose, if someone wants a space at the end of a URL, they'll just have to `%20` it themselves.\n\nI am also prone to having us send the URL they provided us with, though (our current behavior). A URL ending in a space is perfectly valid. By stripping, we're assuming a correction for a user mistake. \n", "I can't say how often this comes up but I encountered it while rewriting my code from using `urllib2` to `requests`. What worked before, didn't work with `requests` anymore and that's how I found the issue.\n\nThe `strip()` isn't really supposed to help those who _want to_ have spaces in their URLs but those who need to process such HTML :)\n", "We are an HTTP library, not an HTML library :)\n\nHowever, this may make things easier for some humans. \n", "Of course I know, that's why I mentioned this not being a strong argument for `requests` in the original post. But as you say, it makes things easier.\n\nOf course I can call the `strip()` myself. It only seems easier the other way and since urllib2 (and others) do it, I thought it would be nice to have it in `requests` as well.\n", "I'm fine with us doing it; it won't hurt anything. But, I think it's dumb. \n\nThe whole thing, that is, not your suggestion @geckon :)\n", "Eh, my webbrowser does it. I'm in.\n", "So this could break people who expect us to percent encode the space at the end of a string, can we please not BGB merge this to 2.x and instead put it off until 3.x?\n", "> Eh, my webbrowser does it. I'm in.\n\nYep, I mentioned browsers in my original post as well.\n", "@geckon sorry I missed that!\n", "Generally, my experience with HTTP via web browser is _fantastic_, so I modeled a lot of the design around those experiences. So, in this case, this is definitely something we should be doing. \n", "Closing since this merged to `proposed/3.0.0` and will be released with 3.0\n" ]
https://api.github.com/repos/psf/requests/issues/3016
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3016/labels{/name}
https://api.github.com/repos/psf/requests/issues/3016/comments
https://api.github.com/repos/psf/requests/issues/3016/events
https://github.com/psf/requests/issues/3016
133,752,844
MDU6SXNzdWUxMzM3NTI4NDQ=
3,016
"Request.hooks" and "Session.hooks" merged inappropriately.
{ "avatar_url": "https://avatars.githubusercontent.com/u/8749411?v=4", "events_url": "https://api.github.com/users/elonzh/events{/privacy}", "followers_url": "https://api.github.com/users/elonzh/followers", "following_url": "https://api.github.com/users/elonzh/following{/other_user}", "gists_url": "https://api.github.com/users/elonzh/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/elonzh", "id": 8749411, "login": "elonzh", "node_id": "MDQ6VXNlcjg3NDk0MTE=", "organizations_url": "https://api.github.com/users/elonzh/orgs", "received_events_url": "https://api.github.com/users/elonzh/received_events", "repos_url": "https://api.github.com/users/elonzh/repos", "site_admin": false, "starred_url": "https://api.github.com/users/elonzh/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/elonzh/subscriptions", "type": "User", "url": "https://api.github.com/users/elonzh", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-02-15T16:12:19Z
2021-09-08T19:00:32Z
2016-02-16T09:07:55Z
NONE
resolved
Both `Session` and `Request` have a `hooks` attribute. The hook functions in the list execute sequentially. In this situation, `Session.hooks` should be executed for every request and `Request.hooks` for a specific request. But in fact, `Session.hooks` is replaced with `Request.hooks` when the code runs into `merge_setting`: ``` Python merged_setting = dict_class(to_key_val_list(session_setting)) merged_setting.update(to_key_val_list(request_setting)) ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3016/reactions" }
https://api.github.com/repos/psf/requests/issues/3016/timeline
null
completed
null
null
false
[ "@evilerliang Do the requests docs guarantee that `Session.hooks` will not be replaced by `Request.hooks`? In general, things declared for specific requests _override_ the things declared on the `Session`, rather than _supplementing_ them.\n", "I reviewed the doc, thk.\n\n> Any dictionaries that you pass to a request method will be merged with the session-level values that are set. The method-level parameters override session parameters.\n> \n> Note, however, that method-level parameters will not be persisted across requests, even if using a session. \n", "Hmmm, we should consider appending any additional hooks provided by a method-level request, instead of overwriting. \n\n_e.g._\n\n```\n# session has a 'response' hook defined; additionally provided hooks get *merged*.\nsession.get(..., hooks={'respose': ...}\n\n# session has a 'response' hook defined; all 'response' hooks removed.\nsession.get(..., hooks={'respose': None}\n```\n\nThis is tricky, because in this scenario there is no way to specify nullifying the session hooks while simultaneously providing your own for the request. I believe this must be why I implemented it the way it is today, and justifies the current behavior as _correct_. \n\n---\n\n**Random hooks thoughts:** Not sure if it's worth changing any hooks behavior at this point, since hooks were at one point in time much more powerful (since there were more of them). Now they're really just used for internal stuff, e.g. custom authentication providers. \n\nI don't actually perceive hooks, in their current state, being used much by users, and as such could potentially be removed from the documentation, to further their internal-ness. Or, possibly, they could be moved to a new section of the documentation for people extending requests.\n\nJust random thoughts. \n", "That's fine. I changed the hook behavior in my package. Thanks again, you help me a lot.\n" ]
https://api.github.com/repos/psf/requests/issues/3015
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3015/labels{/name}
https://api.github.com/repos/psf/requests/issues/3015/comments
https://api.github.com/repos/psf/requests/issues/3015/events
https://github.com/psf/requests/issues/3015
133,623,869
MDU6SXNzdWUxMzM2MjM4Njk=
3,015
Ability to set timeout after response
{ "avatar_url": "https://avatars.githubusercontent.com/u/9677399?v=4", "events_url": "https://api.github.com/users/ofek/events{/privacy}", "followers_url": "https://api.github.com/users/ofek/followers", "following_url": "https://api.github.com/users/ofek/following{/other_user}", "gists_url": "https://api.github.com/users/ofek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ofek", "id": 9677399, "login": "ofek", "node_id": "MDQ6VXNlcjk2NzczOTk=", "organizations_url": "https://api.github.com/users/ofek/orgs", "received_events_url": "https://api.github.com/users/ofek/received_events", "repos_url": "https://api.github.com/users/ofek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ofek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ofek/subscriptions", "type": "User", "url": "https://api.github.com/users/ofek", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2016-02-15T05:04:00Z
2021-09-08T19:00:33Z
2016-02-15T23:21:51Z
CONTRIBUTOR
resolved
For devs who use this great library, it would be very beneficial to be able to set the timeout AFTER initial connection. There are a few scenarios where this is useful but one of the main patterns/use cases is this: ``` import requests import socket # May or may not subclass threading.Thread class Getter(object): def __init__(self): self.request = requests.get(url, stream=True) def run(self): with open(path, 'r+b') as file: bytes_consumed = 0 while True: try: chunk = self.request.raw.read(size) if not chunk: break chunk_length = len(chunk) file.write(chunk) bytes_consumed += chunk_length except socket.timeout: # handle incomplete download by using range header next time, etc. ``` Handling incomplete downloads due to connection loss is common and especially important when downloading large or many files (or both). As you can see, this can be achieved in a fairly straightforward way. The issue is there is really no good way to write tests for this. Each method would involve OS specific code which would also be a no-go for CI services. What would be an option is the ability to set the timeout after establishing a connection. This way in a test you could do "r.timeout = (None, 0.00001)" and during reading it would simulate a timeout. To my knowledge this is no way currently to inject a new Timeout class retroactively. Is this correct?
{ "avatar_url": "https://avatars.githubusercontent.com/u/9677399?v=4", "events_url": "https://api.github.com/users/ofek/events{/privacy}", "followers_url": "https://api.github.com/users/ofek/followers", "following_url": "https://api.github.com/users/ofek/following{/other_user}", "gists_url": "https://api.github.com/users/ofek/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ofek", "id": 9677399, "login": "ofek", "node_id": "MDQ6VXNlcjk2NzczOTk=", "organizations_url": "https://api.github.com/users/ofek/orgs", "received_events_url": "https://api.github.com/users/ofek/received_events", "repos_url": "https://api.github.com/users/ofek/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ofek/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ofek/subscriptions", "type": "User", "url": "https://api.github.com/users/ofek", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3015/reactions" }
https://api.github.com/repos/psf/requests/issues/3015/timeline
null
completed
null
null
false
[ "Out of interest, why do you need to set the timeout retroactively? You could just set the timeout in the constructor, could you not?\n", "To test connection loss handling as explained above. Is there already some way to do this? I was looking through the code but couldn't quite wrap my head around how one might do it.\n", "Oh I'm sorry, I see what you're doing. \n\nThere is no way to change the timeout midway through handling, but it's not a widely useful feature. It'll likely be most effective for you just to either reach into the socket object directly to set the timeout, or to mock out the calls in order to test the connection loss. \n", "Thank you for your swift reply. Could you please explain the first approach?\n", "So the socket object lives `response.raw._fp.fp`. You can call [`socket.settimeout()`](https://docs.python.org/2/library/socket.html#socket.socket.settimeout) on it. That's fundamentally how the timeouts work in requests.\n", "Great thanks! So like this? `response.raw._fp.fp.raw._sock.settimeout(0.00001)`\n", "Yup, that should work fine.\n", "Hmm, doing that doesn't seem to affect `response.raw.read` at all even when setting timeout to extremely low. Though based on your help I _think_ I found a solution in `response.raw._fp.fp.raw._timeout_occurred = True`. Is that switch internally how it works? If I do a read after that it does indeed raise:\n\n`requests.packages.urllib3.exceptions.ProtocolError: (\"Connection broken: OSError('cannot read from timed out object',)\", OSError('cannot read from timed out object',))`\n\nSo that's the correct timeout error, right?\n", "That's not the correct timeout error, no: the one to expect is a urllib3 `ReadTimeoutError`.\n\nAt this point I have to ask why you're not just mocking out here to get the expected behaviour.\n", "I guess I'm unsure how to do that in this case..\n", "In this case replacing `self.request.raw` with a mock object, and then add a side-effect that raises the appropriate exception.\n", "Thank you very much!\n" ]
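Editor's note: the closing suggestion in the thread above (replace `self.request.raw` with a mock whose `read` raises via a `side_effect`) can be sketched as a self-contained test. `read_with_timeout_handling` below is a hypothetical stand-in for the `Getter.run` loop from the issue body, and it catches a plain `socket.timeout` rather than urllib3's `ReadTimeoutError` to keep the example dependency-free.

```python
import socket
from unittest import mock

# Hypothetical stand-in for the download loop from the issue body: read
# chunks until the stream is exhausted or a timeout interrupts it, and
# report how many bytes were successfully consumed before the cut-off.
def read_with_timeout_handling(raw, size=8192):
    consumed = 0
    while True:
        try:
            chunk = raw.read(size)
            if not chunk:
                break
            consumed += len(chunk)
        except socket.timeout:
            # handle the incomplete download (e.g. retry with a Range header)
            break
    return consumed

# Mock out the raw stream: two good chunks, then a simulated read timeout.
# An exception instance in a side_effect iterable is raised by the mock.
raw = mock.Mock()
raw.read.side_effect = [b"x" * 100, b"y" * 50, socket.timeout()]
assert read_with_timeout_handling(raw) == 150
```

This is exactly the kind of OS-independent connection-loss test the issue asked for: no real sockets, no platform-specific tricks, and it runs fine on CI services.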
https://api.github.com/repos/psf/requests/issues/3014
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3014/labels{/name}
https://api.github.com/repos/psf/requests/issues/3014/comments
https://api.github.com/repos/psf/requests/issues/3014/events
https://github.com/psf/requests/pull/3014
133,576,032
MDExOlB1bGxSZXF1ZXN0NTkzMDQ4NjA=
3,014
PreparedRequest.dump()
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[]
closed
true
null
[]
null
63
2016-02-14T20:26:41Z
2021-09-06T00:06:51Z
2016-04-06T19:08:02Z
CONTRIBUTOR
resolved
``` >>> r = requests.get('http://httpbin.org/ip') >>> print r.request.dump() REQUESTS/2.9.1 GET http://httpbin.org/ip Connection: keep-alive Accept-Encoding: gzip, deflate Accept: */* User-Agent: python-requests/2.9.1 ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 1, "-1": 1, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/3014/reactions" }
https://api.github.com/repos/psf/requests/issues/3014/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3014.diff", "html_url": "https://github.com/psf/requests/pull/3014", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3014.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3014" }
true
[ "I think `render` is a good name if it was reporting `HTTP/1.1` in the strings. Another name may be more appropriate now. \n", "I find it entertaining how this showcases just how much we completely disregard header order. \n", "Thinking of removing Response's render. \n", "Maybe `dump()`?\n", "@Lukasa hmmm, that works.\n", "This looks exactly like what I was hoping for.\n\nA couple of comments (feel free to disregard and ship anyway, I'm already happy):\n1. I'm not sure about the name **render()**. The method feels like a variant of `__str__` but with all the gory details. I expect it to be used mainly for debugging, so maybe just debug() instead?\n2. I really would like both of them (Response.render() and Request.render()). For debugging purposes I often need both the sent request and the response. I then take both of them and send them in an e-mail to the one with the server, so they can debug why my requests are not working right. Without a way to render responses I still would have to have my own utility for that.\n3. Not that fond of the \"REQUESTS/2.9.1\" string. Maybe we could do PreparedRequest<[string_here]> as a way to mimic `__repr__`?\n", "@EmilStenstrom `render` has been renamed to `dump`.\n", "dump() works for me!\n", "> Not that fond of the \"REQUESTS/2.9.1\" string. Maybe we could do PreparedRequest<[string_here]> as a way to mimic **repr**?\n\nI agree with @EmilStenstrom. I don't like that much either. It's confusing and may make some people think we're doing some weird protocol instead of HTTP.\n", "Concrete suggestion of syntax for dump():\n\n``` python\n>>> import requests\n>>> r = requests.get('http://httpbin.org/ip')\n>>> r.request\n<PreparedRequest [GET]>\n>>> r.request.dump()\n<PreparedRequest [\n GET http://httpbin.org/ip\n Connection: keep-alive\n Accept-Encoding: gzip, deflate\n Accept: */*\n User-Agent: python-requests/2.9.1\n]>\n```\n", "Any rationale for removing Response.dump()? I find it really useful for debugging a remote service in the same way that Request.dump() is useful.\n", "@EmilStenstrom hmm, that's not a bad way to present it! I like it. \n\nFor the record, I'm still not 100% on including this. I'm still tossing the idea back and forth in my mind. This syntax definitely helps, though.\n\nThis syntax also helps support the inclusion of `Response.dump`. It just didn't make sense with the other syntax.\n", "@EmilStenstrom updated the syntax. \n\n```\n<PreparedRequest [\n POST http://httpbin.org/post\n User-Agent: python-requests/2.9.1\n Connection: keep-alive\n Accept-Encoding: gzip, deflate\n Accept: */*\n Content-Length: 3447\n Content-Type: multipart/form-data; boundary=bf96155646964bf082b7a30531aa7c7e\n]>\n```\n", "Using colors to display warning about non string-like content bodies now. I like where this is going. \n\n<img width=\"484\" alt=\"screen shot 2016-02-16 at 1 12 22 am\" src=\"https://cloud.githubusercontent.com/assets/119893/13068556/73c11082-d44a-11e5-9650-315aedf0ca21.png\">\n\nThings to do:\n- [x] disable color output when not a tty\n- [x] potentially color the rest of the output, not sure how i feel about that. \n", "Ooooh, it's so pretty (and cross-platform).\n\n<img width=\"473\" alt=\"screen shot 2016-02-16 at 1 51 13 am\" src=\"https://cloud.githubusercontent.com/assets/119893/13069150/f9b10c92-d44f-11e5-8da1-079c58f8bb5b.png\">\n\nI dig this. \n", "I'm very happy with how this turned out, and by also including Response.dump() all three of my concerns have been delt with. It's ready to ship in my mind.\n", "@EmilStenstrom as soon as I'm done perfecting `PreparedRequest`, I'll move on to `Response` :)\n", "Two methods are now included, `dump()`, and `dumps()`. \n\n_dumps_ is for people that want to grab a string representation of the dump and use it programatically. _dump_ prints the dump directly to the screen, in colorful glory. Colors can optionally be disabled by passing `colored=False`.\n", "Reponse is done! \n\n<img width=\"420\" alt=\"screen shot 2016-02-16 at 3 13 46 am\" src=\"https://cloud.githubusercontent.com/assets/119893/13070557/69a961d8-d45b-11e5-8eb0-1803fd4bebee.png\">\n\nI think this is ready to ship. \n", "@EmilStenstrom Response is now included (see above). :cake:\n\nThanks a lot for your help with this, you've been instrumental ;)\n", "@sigmavirus24 @Lukasa last call for comments. Will merge/release tomorrow. \n", "I have a preference to remove the functionality to dump colours, rather than vendor in two more modules, but otherwise this is fine by me. =)\n", "Yeah. So, I agree with @Lukasa whole-heartedly. I don't think we should be vendoring libraries just for colorizing this output. This feels like a vast step outside the domain of this library - HTTP. That kind of behaviour belongs elsewhere. We also don't even remotely attempt to reflect whether or not the user is attempting to use proxies, which I think will be far more confusing for those users when they expect to see that output here.\n\nTo be clear:\n- I think we're violating the law of constraints of limiting this library to HTTP\n- I think we're doing our users a grave disservice by having this functionality here in such a naive implementation\n- I think we're adding more (vendored) dependencies than we should ever rightfully need\n", "If I hear the word _vendored_ from either one of you again, you're fired :)\n\nBut in all seriousness, thoughts on colorization noted. I quite like it, and am not 100% on it yet myself (but am leaning towards yes). \n\nNow, the purpose of these methods are to provide a convenient way to print out the state of a `PreparedRequest` and a `Response` when building applications by hand (I typically resort to printing `PreparedRequest.__dict__`, which is terrible. Proxies are not involved at this point, and no user should expect any proxy information to be present at this point. The output clearly shows that we are representing the `PreparedRequest` object, and they can look that up in the documentation if they get confused. \n", "> Proxies are not involved at this point, and no user should expect any proxy information to be present at this point. \n\nI disagree. The usage you showed above (reproduced below for clarity)\n\n``` py\n>>> r = requests.get(url)\n>>> r.request.dump()\n...\n```\n\nIs a request that has been sent. Requests has no way to distinguish that, realistically, but the user knows it has been sent and will expect that information. I know it's hard for us to easily distinguish between a PreparedRequest that someone has made using the PreparedRequest flow and one that was already sent, but users tend to not care if we can determine that or not. Introducing this here will open us up to a host of bug reports about not including various pieces of information in the dump.\n\n> If I hear the word vendored from either one of you again, you're fired :)\n\nYeah, I'm just thinking of two groups of people:\n- Developers who closely follow requests development and are constantly looking to pick fights with us over different choices (vendoring, certifi vs system bundles, etc.)\n- Downstream redistributors\n\nThe former I don't particularly care about because they generally include people whose opinion I already deprioritize (for various reasons). The latter I'm not worried too much about in this case because colorama is almost certainly already packaged in their repositories. What I care about in that case is when someone does `{package-manager-install} python-requests` and suddenly sees a few new dependencies that look completely unrelated to requests. I think the inclusion of those libraries should be a large warning sign.\n\nAnd now I'll shut up ;)\n", "> Is a request that has been sent. Requests has no way to distinguish that, realistically, but the user knows it has been sent and will expect that information. \n\nThat information would be expected to be present in the headers, right? If so, then anyone manually printing `r.request.headers` today would be equally confused. I don't see that to be the case. Perhaps this is just surfacing a bug that `r.request` should include that information (I don't necessarily think that it should). \n", "@kennethreitz the proxy information is tricky. We have a hacky way of incorporating it into the information in the toolbelt, but I don't think this implementation would have the same ability to determine if a proxy is in use.\n", "I have no concern for proxy representation here, but if you're concerned for user confusion, we can make a special note in the documentation. `PreparedRequests` and `Response` have no information or knowledge of proxies, and the `dump()` of their contents has no need to represent such information.\n", "If we're going to document as notes everything that might confuse a person with these implementations, those notes will probably outweigh the documentation of the feature itself.\n", "I see nothing here that would be confusing to anyone. \n" ]
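Editor's note: the "disable color output when not a tty" item from the PR checklist quoted above is a small, self-contained technique. The helper below is a minimal sketch of the idea (not the PR's actual implementation): ANSI escape codes are emitted only when the target stream is an interactive terminal, so piped or redirected output stays plain.

```python
import io
import sys

# Minimal sketch of tty-aware coloring: wrap text in ANSI escape codes only
# when the stream reports itself as an interactive terminal. Not the PR's
# real code; the default color code "36" (cyan) is an arbitrary choice here.
def colorize(text, code="36", stream=None):
    stream = stream if stream is not None else sys.stdout
    if hasattr(stream, "isatty") and stream.isatty():
        return "\x1b[{}m{}\x1b[0m".format(code, text)
    return text

# A StringIO is not a tty, so the text comes back uncolored:
assert colorize("GET /ip", stream=io.StringIO()) == "GET /ip"
```

Cross-platform color support (the "so pretty (and cross-platform)" screenshot above) is what a helper like colorama provides on Windows, by translating these escape codes into console API calls, which is also the vendoring cost debated later in the thread.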
https://api.github.com/repos/psf/requests/issues/3013
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3013/labels{/name}
https://api.github.com/repos/psf/requests/issues/3013/comments
https://api.github.com/repos/psf/requests/issues/3013/events
https://github.com/psf/requests/issues/3013
133,534,638
MDU6SXNzdWUxMzM1MzQ2Mzg=
3,013
Easy printing of Requests and Responses, two new utils?
{ "avatar_url": "https://avatars.githubusercontent.com/u/224130?v=4", "events_url": "https://api.github.com/users/EmilStenstrom/events{/privacy}", "followers_url": "https://api.github.com/users/EmilStenstrom/followers", "following_url": "https://api.github.com/users/EmilStenstrom/following{/other_user}", "gists_url": "https://api.github.com/users/EmilStenstrom/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/EmilStenstrom", "id": 224130, "login": "EmilStenstrom", "node_id": "MDQ6VXNlcjIyNDEzMA==", "organizations_url": "https://api.github.com/users/EmilStenstrom/orgs", "received_events_url": "https://api.github.com/users/EmilStenstrom/received_events", "repos_url": "https://api.github.com/users/EmilStenstrom/repos", "site_admin": false, "starred_url": "https://api.github.com/users/EmilStenstrom/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/EmilStenstrom/subscriptions", "type": "User", "url": "https://api.github.com/users/EmilStenstrom", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" } ]
closed
true
null
[]
null
12
2016-02-14T12:25:29Z
2021-09-08T18:00:54Z
2016-04-16T04:06:18Z
NONE
resolved
Several times now I've stumbled over cases where I would like to print the request and response objects as strings to the console. Simply to see when headers and content are sent. Seems more people than I have had this problem: http://stackoverflow.com/questions/20658572/python-requests-print-entire-http-request-raw My suggestion: add two util methods that print requests and responses according to the HTTP spec. They are fully optional to use, and would not break backwards compatibility: To be clear, I'm suggesting something link this be added to requests.utils: ``` python def print_request(req): print('HTTP/1.1 {method} {url}\n{headers}\n\n{body}'.format( method=req.method, url=req.url, headers='\n'.join('{}: {}'.format(k, v) for k, v in req.headers.items()), body=req.body, )) def print_response(res): print('HTTP/1.1 {status_code}\n{headers}\n\n{body}'.format( status_code=res.status_code, headers='\n'.join('{}: {}'.format(k, v) for k, v in res.headers.items()), body=res.content, )) ``` Is this a good idea?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 9, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 9, "url": "https://api.github.com/repos/psf/requests/issues/3013/reactions" }
https://api.github.com/repos/psf/requests/issues/3013/timeline
null
completed
null
null
false
[ "@EmilStenstrom Generally this is a good idea, but we tend to oppose having 'utility' functions be merged into requests itself: we want to restrict the scope of the main library so that it remains really good at doing its specific roles.\n\nFor that reason, you should check out the [requests toolbelt](https://toolbelt.readthedocs.org/en/latest/dumputils.html), which has exactly the utilities you want.\n", "Didn't we just have another issue about this a couple days ago?\n", "I actually wouldn't be against having something like this added directly to the `PreparedRequest` and `Responses` classes. I think that functionality is useful enough to warrant inclusion. \n\nWould have to think hard about the name, though.\n", "`PreparedRequest.render(body=False)` (defaults to True).\n\nI would want it to be a near exact representation of what we're going to send across the wire (minus any post processing that occurs in a transport adapter, of course).\n", "@kennethreitz How \"near exact\" is \"near exact\"?\n\nNote, for example, that the `PreparedRequest` does not have a `Host:` header: this is because httplib attaches it for us. It is also missing some `Accept-Encoding` headers that urllib3 adds for us. The reality is that Requests does not have access to enough information to render this out unless we want to guess at what the rest of the stack will do, and if we do that then we'll start getting bugs raised when the output of `render` does not match the actual transmitted bytes.\n", "That may be okay, as we'd be rendering a representation of the `PreparedRequest` object, not the request itself. It would have to be documented as such.\n\nPart of me thinks this is a great idea, the other part has hesitation.\n\nIt would be nice to be able to `print requests.get(...).request.render()` to get a quick view of what was sent, rather than having to print a bunch of `request` attrs. \n", "@kennethreitz I have no objection to rendering a representation of the preparedrequest object, but I suspect it's _extremely_ unwise to render in a structure that suggests completeness.\n\nIn particular, the structure given here (which renders out to a 'valid' HTTP/1.1 request/response) doesn't work in the following cases:\n1. If using HTTP/2, when the request/response are structured entirely differently and made of binary frames.\n2. If we send a file or send using chunked transfer-encoding, where we cannot render the body before we send it.\n3. If we _receive_ data using chunked transfer-encoding, where we cannot accurately represent it because urllib3 has transparently removed the chunking.\n4. If the request is sent via a proxy there's an extra CONNECT request we're simply not printing.\n\nThere's nothing wrong with wanting to have a printable representation of all the data on the `PreparedRequest`, but we should be very wary before we format that like a HTTP/1.1 request.\n", "Agreed. I think we could make it _look like_ an HTTP/1.1 request/response, but clearly not be one. That would present the data in a friendly and familiar format, but would make it clear this is not a representation of the wire layer.\n\nFor example, instead of opening with `HTTP/1.1`, we'd open with `REQUESTS/2.9.2`.\n", "I'm open to that direction.\n", "Cool. I like this idea.\n", "Thanks for continuing the discussion and making a very incomplete idea into something awesome. \n\nIs this something you think someone inexperienced with contributing to requests could work on? It looks fairly straightforward with my newbie eyes and I think some code could be inspired by requests_toolbelt. Should I start working on a pull request? Do you want to see something more of a spec first?\n", "@EmilStenstrom what do you think of this? https://github.com/kennethreitz/requests/pull/3014\n" ]
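Editor's note: the `<PreparedRequest [...]>` presentation that this thread converged on is easy to prototype without touching requests itself. The helper below is a hypothetical stand-alone sketch of that rendering, not the API that was merged, and, as the thread warns, it represents the object's fields rather than the exact bytes sent on the wire (no `Host:` header, no urllib3-added `Accept-Encoding`, no proxy CONNECT).

```python
# Hypothetical sketch of the "<PreparedRequest [...]>" rendering style
# proposed in this thread: format plain method/url/headers/body values in
# a familiar, HTTP-like (but deliberately not HTTP/1.1) layout.
def dump_prepared(method, url, headers, body=None):
    lines = ["<PreparedRequest [", "    {} {}".format(method, url)]
    # sort headers for stable output; requests itself disregards header order
    lines.extend("    {}: {}".format(k, v) for k, v in sorted(headers.items()))
    if body is not None:
        lines.append("")
        lines.append("    {}".format(body))
    lines.append("]>")
    return "\n".join(lines)

print(dump_prepared("GET", "http://httpbin.org/ip",
                    {"Accept": "*/*", "User-Agent": "python-requests/2.9.1"}))
```

For the real thing, the requests toolbelt's dump utilities (linked in the first comment) serve the same debugging purpose and also capture the response side.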
https://api.github.com/repos/psf/requests/issues/3012
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3012/labels{/name}
https://api.github.com/repos/psf/requests/issues/3012/comments
https://api.github.com/repos/psf/requests/issues/3012/events
https://github.com/psf/requests/issues/3012
133,487,050
MDU6SXNzdWUxMzM0ODcwNTA=
3,012
Memory usage
{ "avatar_url": "https://avatars.githubusercontent.com/u/7945525?v=4", "events_url": "https://api.github.com/users/Leo675/events{/privacy}", "followers_url": "https://api.github.com/users/Leo675/followers", "following_url": "https://api.github.com/users/Leo675/following{/other_user}", "gists_url": "https://api.github.com/users/Leo675/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Leo675", "id": 7945525, "login": "Leo675", "node_id": "MDQ6VXNlcjc5NDU1MjU=", "organizations_url": "https://api.github.com/users/Leo675/orgs", "received_events_url": "https://api.github.com/users/Leo675/received_events", "repos_url": "https://api.github.com/users/Leo675/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Leo675/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Leo675/subscriptions", "type": "User", "url": "https://api.github.com/users/Leo675", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2016-02-14T00:25:05Z
2021-09-08T19:00:33Z
2016-02-14T01:22:20Z
NONE
resolved
Memory usage seems to be abnormally high for requests when using python 2.7. Request version being tested is 2.9.1 installed by pip. "pip show requests" output will be at the end. I'm unsure if this is an "issue", but it seems to be using 25MB just by importing requests. Testing was done using memory_profiler. I've included simple test code and some output. https://pypi.python.org/pypi/memory_profiler There's also a noticeable wait time when importing with python 2.7. ``` from memory_profiler import profile @profile def request_test(): import requests @profile def urllib3_test(): import urllib3 request_test() urllib3_test() ``` ``` $ python test.py Filename: test.py Line # Mem usage Increment Line Contents ================================================ 3 12.2 MiB 0.0 MiB @profile 4 def request_test(): 5 37.8 MiB 25.6 MiB import requests Filename: test.py Line # Mem usage Increment Line Contents ================================================ 8 37.8 MiB 0.0 MiB @profile 9 def urllib3_test(): 10 38.2 MiB 0.3 MiB import urllib3 ``` ``` $ python3 test.py Filename: test.py Line # Mem usage Increment Line Contents ================================================ 3 12.4 MiB 0.0 MiB @profile 4 def request_test(): 5 19.4 MiB 7.0 MiB import requests Filename: test.py Line # Mem usage Increment Line Contents ================================================ 8 19.5 MiB 0.0 MiB @profile 9 def urllib3_test(): 10 20.0 MiB 0.4 MiB import urllib3 ``` ``` $ pip2.7 show requests --- Metadata-Version: 2.0 Name: requests Version: 2.9.1 Summary: Python HTTP for Humans. Home-page: http://python-requests.org Author: Kenneth Reitz Author-email: [email protected] License: Apache 2.0 Location: /usr/local/lib/python2.7/dist-packages ``` ``` $ pip3.4 show requests --- Metadata-Version: 2.0 Name: requests Version: 2.9.1 Summary: Python HTTP for Humans. Home-page: http://python-requests.org Author: Kenneth Reitz Author-email: [email protected] License: Apache 2.0 Location: /usr/local/lib/python3.4/dist-packages ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/7945525?v=4", "events_url": "https://api.github.com/users/Leo675/events{/privacy}", "followers_url": "https://api.github.com/users/Leo675/followers", "following_url": "https://api.github.com/users/Leo675/following{/other_user}", "gists_url": "https://api.github.com/users/Leo675/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Leo675", "id": 7945525, "login": "Leo675", "node_id": "MDQ6VXNlcjc5NDU1MjU=", "organizations_url": "https://api.github.com/users/Leo675/orgs", "received_events_url": "https://api.github.com/users/Leo675/received_events", "repos_url": "https://api.github.com/users/Leo675/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Leo675/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Leo675/subscriptions", "type": "User", "url": "https://api.github.com/users/Leo675", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3012/reactions" }
https://api.github.com/repos/psf/requests/issues/3012/timeline
null
completed
null
null
false
[ "update:\n\nAfter upgrading all OTHER python packages, the memory usage with python 2.7 dropped down to 7.8 MiB\n\nSeems to be an issue with another library that is imported into requests and not actually caused by requests itself.\n", "@IDSninja do you have a list of the packages you upgraded? Requests doesn't import anything external, with a few small exceptions. \n", "I just used this command to update everything.... So I'm not really sure which one fixed it.\n\n```\npip2.7 freeze --local | grep -v '^\\-e' | cut -d = -f 1 | xargs -n1 pip2.7 install -U\n```\n", "@IDSninja can you show us the output of `$ pip2.7 freeze`?\n", "Here you go. This is running on Kali so there's quite a bit installed.\n\n```\n$ pip2.7 freeze\nadns-python==1.2.1\napsw==3.9.2.post1\nargcomplete==1.0.0\nasync==0.6.2\nbackports-abc==0.4\nbackports.ssl-match-hostname==3.5.0.1\nBBQSQL==1.2\nBeautifulSoup==3.2.1\nbeautifulsoup4==4.4.1\nbinwalk==2.1.0\nbitarray==0.8.1\nBlindElephant==1.0\nblinker==1.4\ncapstone==3.0.4\ncatfacts==0.1.0\ncertifi==2015.11.20.1\ncffi==1.5.2\ncharacteristic==14.3.0\nchardet==2.3.0\nCheetah==2.4.4\nchirp==1.2.2\nclamd==1.0.2\ncluster==1.3.1\ncoinbase==2.0.3\ncolorama==0.3.6\nConfigArgParse==0.10.0\nconfigobj==5.0.6\nconfigparser==3.3.0.post2\nconstruct==2.5.2\ncryptography==1.2.2\nctypeslib==0.5.6\ncycler==0.9.0\nd2to1==0.2.12.post1\ndarts.util.lru==0.5\ndecorator==4.0.9\ndefusedxml==0.4.1\ndicttoxml==1.6.6\ndissy==9\ndistorm3==3.3.0\ndnspython==1.12.0\ndocutils==0.12\neasygui==0.97.4\necdsa==0.13\nElixir==0.7.1\nenum34==1.1.2\nesmre==0.3.1\net-xmlfile==1.0.1\nfeedparser==5.2.1\nFlask==0.10.1\nflickrapi==2.1.2\nfuncsigs==0.4\nfunkload==1.16.1\nfuse-python==0.2.1\nfutures==3.0.4\nGeoIP==1.3.2\ngevent==1.0.2\ngitdb==0.6.4\nGitPython==1.0.2\ngreenlet==0.4.9\ngrey-harvest==0.1.3.5\nguess-language==0.2\ngyp==0.1\nhalberd==0.2.4\nhpack==2.0.1\nhtml2text==2015.11.4\nhtml5lib==0.9999999\nhttplib2==0.9.2\nidna==2.0\nimpacket==0.9.14\nintelhex==2.0\nipaddress==1.0.16\nIPy==0.83\nitsdangerous==0.24\njdcal==1.2\nJinja2==2.8\njsonpickle==0.9.2\njsonrpclib==0.1.7\nkeepnote==0.7.8\nkillerbee==1.0\nlivestreamer==1.12.2\nlxml==3.4.0\nM2Crypto==0.21.1\nMagic-file-extensions==0.2\nMako==1.0.3\nMarkdown==2.6.5\nMarkupSafe==0.23\nmatplotlib==1.5.1\nmechanize==0.2.5\nmemory-profiler==0.41\nmercurial==3.7.1\nmetaconfig==0.1.4\nmitmproxy==0.13\nmock==1.3.0\nmsgpack-python==0.4.7\nMySQL-python==1.2.5\nndg-httpsclient==0.4.0\nnetaddr==0.7.18\nnetlib==0.15.1\nnetworkx==1.11\nNfSpy==1.0\nnltk==3.1\nnose==1.3.7\nnumpy==1.10.4\noauth==1.0.1\noauthlib==1.0.3\nopenpyxl==2.3.3\noursql==0.9.3.1\nPAM==0.4.2\npandas==0.17.1\nparamiko==1.16.0\nparse==1.6.6\npasslib==1.6.5\nPaste==2.0.2\nPasteDeploy==1.5.2\nPasteScript==2.0.2\npbr==1.8.1\npcapy==0.10.8\npdfminer==20140328\npefile==1.2.10.post114\npexpect==4.0.1\nphply==0.9.1\nPillow==2.6.1\nply==3.8\nprettytable==0.7.2\npsutil==3.4.2\npsycopg2==2.5.4\nptyprocess==0.5.1\npyasn1==0.1.9\npyasn1-modules==0.0.8\npybloomfiltermmap==0.3.14\nPyBluez==0.22\npycparser==2.14\npycrypto==2.6.1\npycryptopp==0.7.1.869544967005693312591928092448767568728501330214\npycups==1.9.63\npycurl==7.43.0\npydns==2.3.6\nPyGithub==1.26.0\nPygments==2.1\npygobject==3.14.0\npygraphviz==1.3rc2\npyinotify==0.9.6\npylibmc==1.5.0\npymssql==1.0.2\nPyMySQL==0.7.1\nPyOpenGL==3.1.0\npyOpenSSL==0.15.1\npyparsing==2.1.0\npyPdf==1.13\npyperclip==1.5.26\npyqtgraph==0.9.10\npyregfi==99.99.99.277\npyrit==0.4.0\npyrtlsdr==0.2.0\npyscard==1.6.14\npyserial==3.0.1\npysmbc==1.0.15.3\npysmi==0.0.7\npysnmp==4.3.2\npysnmp-apps==0.4.1\npysnmp-mibs==0.1.6\nPySocks==1.5.6\npysqlite==2.6.3\npython-apt==0.9.3.11\npython-dateutil==2.4.2\npython-debian==0.1.27\npython-debianbts==1.11\npython-Levenshtein==0.12.0\npython-ntlm==1.1.0\npytidylib==0.2.4\npytz==2015.7\npyusb==1.0.0b2\nPyX==0.12.1\npyxdg==0.25\nPyYAML==3.11\npyzmq==14.4.0\nreportbug==6.6.3\nrequesocks==0.10.8\nrequests==2.9.1\nrequests-oauthlib==0.6.0\nrequests-toolbelt==0.6.0\nrfidiot==1.0\nroman==2.0.0\nruamel
.ordereddict==0.4.9\nscapy==2.3.2\nscipy==0.14.0\nservice-identity==14.0.0\nshove==0.6.6\nsimplejson==3.8.1\nsingledispatch==3.4.0.3\nsix==1.10.0\nslowaes==0.1a1\nsmmap==0.9.0\nSOAPpy==0.12.22\nSQLAlchemy==1.0.11\nstopit==1.1.1\nstuf==0.9.16\ntblib==1.2.0\ntornado==4.3\ntweepy==3.5.0\ntwilio==5.3.0\nTwisted==15.5.0\nTwisted-Conch==14.0.2\nTwisted-Core==14.0.2\nTwisted-Lore==14.0.2\nTwisted-Mail==14.0.2\nTwisted-Names==14.0.2\nTwisted-News==14.0.2\nTwisted-Runner==14.0.2\nTwisted-Web==14.0.2\nTwisted-Words==14.0.2\nurllib3==1.14\nurwid==1.3.1\nuTidylib==0.2\nvolatility==2.4\nvulndb==0.0.19\nwapiti==2.3.0\nwebunit==1.3.10\nWerkzeug==0.11.3\nwfuzz==0.0.0\nwheel==0.29.0\nwstools==0.4.3\nwxPython==3.0.1.1\nwxPython-common==3.0.1.1\nxdot==0.6\nXlsxWriter==0.8.4\nxsser==1.6\nYapsy==1.11.223\nyara-python==3.1\nzenmap==7.1\nzim==0.62\nzope.interface==4.1.3\n```\n", "Aha, you do have `pyOpenSSL`! I suspect that was the cause of your memory usage. It was definitely what caused the delayed startup time. \n", "Interesting. Removing pyOpenSSL brings memory usage down even more (from 7.9 to 5.4MiB)\n\nIs this recommended and does it impact functionality?\n", "@IDSninja that's a complex question. Basically, the standard library's `ssl` library has some limitations, which `pyOpenSSL` fixes. Using it is best practice if security is a large priority for application (or you need the extra functionality it provides), but there's a slight cost. \n", "ok, thanks for everything.\n", ":sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/3011
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3011/labels{/name}
https://api.github.com/repos/psf/requests/issues/3011/comments
https://api.github.com/repos/psf/requests/issues/3011/events
https://github.com/psf/requests/issues/3011
133,412,794
MDU6SXNzdWUxMzM0MTI3OTQ=
3,011
Requests SSL Handshake Failure
{ "avatar_url": "https://avatars.githubusercontent.com/u/9329553?v=4", "events_url": "https://api.github.com/users/wraithseeker/events{/privacy}", "followers_url": "https://api.github.com/users/wraithseeker/followers", "following_url": "https://api.github.com/users/wraithseeker/following{/other_user}", "gists_url": "https://api.github.com/users/wraithseeker/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wraithseeker", "id": 9329553, "login": "wraithseeker", "node_id": "MDQ6VXNlcjkzMjk1NTM=", "organizations_url": "https://api.github.com/users/wraithseeker/orgs", "received_events_url": "https://api.github.com/users/wraithseeker/received_events", "repos_url": "https://api.github.com/users/wraithseeker/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wraithseeker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wraithseeker/subscriptions", "type": "User", "url": "https://api.github.com/users/wraithseeker", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-02-13T08:00:39Z
2021-09-08T19:00:33Z
2016-02-13T09:04:42Z
NONE
resolved
I'm getting requests.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) using the Requests library to fetch a webpage with Python3. Most other https website work but there are only a couple of websites that causes me to encounter this error. Pip3 list ``` cffi (1.5.0) cryptography (1.2.2) idna (2.0) ndg-httpsclient (0.4.0) pip (8.0.2) pyasn1 (0.1.9) pycparser (2.14) pyOpenSSL (0.15.1) pyperclip (1.5.26) requests (2.9.1) setuptools (18.2) six (1.10.0) ``` ``` import requests res = requests.get('https://visualnoveler.com/') ``` ``` Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen body=body, headers=headers) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request self._validate_conn(conn) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 784, in _validate_conn conn.connect() File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 252, in connect ssl_version=resolved_ssl_version) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py", line 305, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 376, in wrap_socket _context=self) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 747, in __init__ self.do_handshake() File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 983, in do_handshake self._sslobj.do_handshake() File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 628, in do_handshake self._sslobj.do_handshake() ssl.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py", line 376, in send timeout=timeout File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen raise SSLError(e) requests.packages.urllib3.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/Users/wraithseeker/Desktop/Untitled.py", line 3, in <module> res = requests.get('https://visualnoveler.com/') File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py", line 67, in get return request('get', url, params=params, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/api.py", line 53, in request return session.request(method=method, url=url, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/site-packages/requests/adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:645) [Finished in 0.5s with exit code 1] ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/9329553?v=4", "events_url": "https://api.github.com/users/wraithseeker/events{/privacy}", "followers_url": "https://api.github.com/users/wraithseeker/followers", "following_url": "https://api.github.com/users/wraithseeker/following{/other_user}", "gists_url": "https://api.github.com/users/wraithseeker/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/wraithseeker", "id": 9329553, "login": "wraithseeker", "node_id": "MDQ6VXNlcjkzMjk1NTM=", "organizations_url": "https://api.github.com/users/wraithseeker/orgs", "received_events_url": "https://api.github.com/users/wraithseeker/received_events", "repos_url": "https://api.github.com/users/wraithseeker/repos", "site_admin": false, "starred_url": "https://api.github.com/users/wraithseeker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/wraithseeker/subscriptions", "type": "User", "url": "https://api.github.com/users/wraithseeker", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3011/reactions" }
https://api.github.com/repos/psf/requests/issues/3011/timeline
null
completed
null
null
false
[ "This works on my machine (just adding additional information). I do not, however, have pyOpenSSL installed. \n", "PyOpenSSL is not currently used on Python 3.\n\nWhat version of OpenSSL are you using? `python -c 'import ssl; print ssl.OPENSSL_VERSION'`\n", "`OpenSSL 0.9.8zg 14 July 2015`\n\nI tried uninstalling PyOpenSSL and also tried downgrading Python 3.5 to 3.4 but I'm still getting the same error.\n", "That version of OpenSSL is _very_ old, which may be why you're having problems but @kennethreitz is not. Unfortunately, our PyOpenSSL-based workaround does not currently work on Python 3, though we're working on a fix for that. \n\nAn alternative approach would be to install a newer OpenSSL from Homebrew and then get your Python 3 from Homebrew as well. That would likely resolve your problem. \n", "Thanks, I reinstalled it on Homebrew and it worked!\n" ]
https://api.github.com/repos/psf/requests/issues/3010
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3010/labels{/name}
https://api.github.com/repos/psf/requests/issues/3010/comments
https://api.github.com/repos/psf/requests/issues/3010/events
https://github.com/psf/requests/issues/3010
133,394,357
MDU6SXNzdWUxMzMzOTQzNTc=
3,010
get_redirect_location return error hostname
{ "avatar_url": "https://avatars.githubusercontent.com/u/1644359?v=4", "events_url": "https://api.github.com/users/jsfs2019/events{/privacy}", "followers_url": "https://api.github.com/users/jsfs2019/followers", "following_url": "https://api.github.com/users/jsfs2019/following{/other_user}", "gists_url": "https://api.github.com/users/jsfs2019/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jsfs2019", "id": 1644359, "login": "jsfs2019", "node_id": "MDQ6VXNlcjE2NDQzNTk=", "organizations_url": "https://api.github.com/users/jsfs2019/orgs", "received_events_url": "https://api.github.com/users/jsfs2019/received_events", "repos_url": "https://api.github.com/users/jsfs2019/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jsfs2019/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jsfs2019/subscriptions", "type": "User", "url": "https://api.github.com/users/jsfs2019", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-02-13T03:39:26Z
2021-09-08T19:00:34Z
2016-02-13T03:56:03Z
NONE
resolved
when I visited site http://bitsnoop.com/chano-dominguez-hecho-a-mano-200-q18406062.html like this: ``` In [3]: r = get('http://bitsnoop.com/chano-dominguez-hecho-a-mano-200-q18406062.html', allow_redirects=False) In [4]: r.status_code Out[4]: 302 In [5]: r.raw.get_redirect_location() Out[5]: 'http://127.0.0.1/' ``` I got the error redirect address. So I test the website with curl: ``` $ curl -v http://bitsnoop.com/chano-dominguez-hecho-a-mano-200-q18406062.html * Hostname was NOT found in DNS cache * Trying 31.7.59.14... * Trying 2a02:29b8:1925::2... * Immediate connect fail for 2a02:29b8:1925::2: Network is unreachable * Connected to bitsnoop.com (31.7.59.14) port 80 (#0) > GET /chano-dominguez-hecho-a-mano-200-q18406062.html HTTP/1.1 > User-Agent: curl/7.35.0 > Host: bitsnoop.com > Accept: */* > < HTTP/1.1 301 Moved Permanently < Content-Type: text/html; charset=utf-8 < Keep-Alive: timeout=300 < Location: http://bitsnoop.com/chano-dominguez-hecho-a-mano-2002-q18406062.html < Transfer-Encoding: chunked < Date: Sat, 13 Feb 2016 01:29:09 GMT < Connection: keep-alive * Server Hampsterbox/2.0 is not blacklisted < Server: Hampsterbox/2.0 < * Connection #0 to host bitsnoop.com left intact ``` Obviously the right redirect address is http://bitsnoop.com/chano-dominguez-hecho-a-mano-2002-q18406062.html. It's really confusing;)
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3010/reactions" }
https://api.github.com/repos/psf/requests/issues/3010/timeline
null
completed
null
null
false
[ "`r.raw.get_redirect_location()` is not actually part of the Requests API, and is not a supported way to procure that information.\n\nThe proper way (as it stands) is to simply look at the value of the `Location` header:\n\n``` python\n>>> r.headers['Location']\n'http://127.0.0.1/'\n```\n", "@jaron92: It looks like this server is sending a phony redirect based on the `User-Agent` header the request used. \n\nThis works for me:\n\n``` python\nurl = 'http://bitsnoop.com/chano-dominguez-hecho-a-mano-200-q18406062.html'\nheaders = {'User-Agent': 'curl/7.43.0'}\n\nr = requests.get(url, headers=headers, allow_redirects=False)\n```\n\n``` python\n >>> r.headers['location']\n'http://bitsnoop.com/chano-dominguez-hecho-a-mano-2002-q18406062.html'\n```\n\nThis is why you were seeing differing results with cURL. \n", "Thx.\n\nIt helps me a lot;)\n", "@jaron92 sure thing! :cake:\n" ]
https://api.github.com/repos/psf/requests/issues/3009
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3009/labels{/name}
https://api.github.com/repos/psf/requests/issues/3009/comments
https://api.github.com/repos/psf/requests/issues/3009/events
https://github.com/psf/requests/issues/3009
133,365,559
MDU6SXNzdWUxMzMzNjU1NTk=
3,009
Inspecting 'response.ok' with non-ascii HTTP status "reason" content causes UnicodeDecodeError
{ "avatar_url": "https://avatars.githubusercontent.com/u/420896?v=4", "events_url": "https://api.github.com/users/rene-armida/events{/privacy}", "followers_url": "https://api.github.com/users/rene-armida/followers", "following_url": "https://api.github.com/users/rene-armida/following{/other_user}", "gists_url": "https://api.github.com/users/rene-armida/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rene-armida", "id": 420896, "login": "rene-armida", "node_id": "MDQ6VXNlcjQyMDg5Ng==", "organizations_url": "https://api.github.com/users/rene-armida/orgs", "received_events_url": "https://api.github.com/users/rene-armida/received_events", "repos_url": "https://api.github.com/users/rene-armida/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rene-armida/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rene-armida/subscriptions", "type": "User", "url": "https://api.github.com/users/rene-armida", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
12
2016-02-12T23:09:13Z
2021-09-08T16:00:39Z
2016-08-05T00:10:35Z
NONE
resolved
Duplicate bug disclaimer: this is related to other Unicode + header issues, however, those all focus on errors raised while generating an outbound request. This applies only to inbound requests. (See also: #1082, #400, and to a lesser extent, #390, #409, #421, and #424; if this is already covered by another bug, sorry for bugging you guys. Thanks!) When inspecting responses with Unicode-encoded non-latin content trailing the HTTP status header in the response, requests raises UnicodeDecodeError. ``` (requests) marmida@loquat:~$ python Python 2.7.10 (default, Oct 14 2015, 16:09:02) [GCC 5.2.1 20151010] on linux2 Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> resp = requests.get('http://jp.apps.gree.net/ja/1604') >>> resp.ok Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/home/marmida/venvs/requests/local/lib/python2.7/site-packages/requests/models.py", line 623, in ok self.raise_for_status() File "/home/marmida/venvs/requests/local/lib/python2.7/site-packages/requests/models.py", line 837, in raise_for_status http_error_msg = '%s Server Error: %s for url: %s' % (self.status_code, self.reason, self.url) UnicodeDecodeError: 'ascii' codec can't decode byte 0xe3 in position 23: ordinal not in range(128) ``` The content causing the problem, fetched via curl: ``` $ curl -v 'http://jp.apps.gree.net/ja/1604' * Connected to jp.apps.gree.net (116.93.155.51) port 80 (#0) > GET /ja/1604 HTTP/1.1 > Host: jp.apps.gree.net > User-Agent: curl/7.43.0 > Accept: */* > < HTTP/1.1 503 AKB48ステージファイター < Server: nginx ``` In the stack trace above, formatting `http_error_msg` as a `string` won't work, because `self.reason` contains non-ascii `unicode`. Requests version: 2.9.1 Run from a fresh virtualenv with only `requests` installed.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3009/reactions" }
https://api.github.com/repos/psf/requests/issues/3009/timeline
null
completed
null
null
false
[ "I think the best way for this to be solved is for us to switch the format specifier for the reason phrase from `%s` to `%r`. Given that RFC 7230 tells us that a client should basically ignore the reason phrase anyway, we should avoid trying to interpret it. \n", ":+1: \n", "Extra datapoint, I've been bitten by this too while interacting with the API of a Spanish-language server. The correct fixes here are, I believe:\n1. Properly decode the HTTP Reason Phrase field as described by @denis-ryzhkov in [#1181](https://github.com/kennethreitz/requests/pull/1181#issuecomment-13423623) for header values.\n - According to [RFC2616](https://tools.ietf.org/html/rfc2616#section-6.1.1), Reason Phrase is `TEXT` which defaults to `latin1` and supports other characters under RFC2047.\n2. Store the `.reason` attribute as unicode if it isn't already.\n3. In `.raise_for_status` use `%r` as @Lukasa suggests (since exception messages are supposed to be bytestrings).\n", "@erydo Changing the decoding of the reason phrase is beyond the scope of requests: that is handled entirely by httplib, and we can't change it. Generally speaking that works in a way that is acceptable: it misbehaves only in a few situations on Python 2.\n", "(Please note also that RFC 2616 is superseded by RFC 7230 in this case, which defines the reason phrase as HTAB/SP/VCHAR/obs-text, which basically allows the printable US-ASCII characters and octets from 0x80 to 0xFF. The easiest way to handle such a field is just to let it stay as bytes.)\n", "@Lukasa I would expect that RFC obsolescence to apply more to developers of new servers rather than consumers of them.\n\nApache Tomcat, for example, sends down ~~unicode~~ `latin1` reason phrases for non-en locales, and that's not a rarely-encountered server.\n", "The RFC obsolescence applies to all developers always. RFC 7230-7233 _are_ the specification of HTTP/1.1. RFC 7230 codifies the current best practice for deploying HTTP/1.1 and deviating from it is unwise.\n\nIn this case, as long as Tomcat is sending UTF-8 encoded data it is _not_ deviating from RFC 7230 (though using any ABNF labelled \"obsolete\" is rarely a good sign). However, if Tomcat issues data using something like UTF-16 that is _clearly_ unwise. So I think we can agree that Tomcat's behaviour could be bad depending on encoding choices.\n\nUsing bytes outside of printable US-ASCII in the HTTP header block is exceedingly unwise. Tomcat can do whatever they feel like, but I don't think that represents a sensible choice, and I'd be happy to express that opinion to any Tomcat developer who cares to ask. \n", "Sorry, I updated my comment which I misstated: Tomcat sends the phrase down in `latin1` encoding, not `utf8` or `utf16`. It conforms correctly to RFC2616. `latin1` includes common characters like `é` which are not US-ASCII and which do trigger this bug.\n\nRFC7230 has this to say:\n\n> Historically, HTTP has allowed field content with text in the ISO-8859-1 charset [ISO-8859-1], supporting other charsets only through use of [RFC2047] encoding. In practice, most HTTP header field values use only a subset of the US-ASCII charset [USASCII]. Newly defined header fields SHOULD limit their field values to US-ASCII octets. A recipient SHOULD treat other octets in field content (obs-text) as opaque data.\n\nWhich agrees with your preference to keep it as bytes.\n\nHowever, the \"header values\" it refers to are not for typically for human consumption. The Reason Phrase is _explicitly_ for the user; so you have to be able to decode it somehow. Given these factors:\n1. The explicit purpose of the Reason Phrase is to be shown to the user.\n2. RFC2047 offers no useful guidance on how to interpret the Reason Phrase for non-ASCII characters.\n3. RFC2616 _does_ offer useful guidance for how to do that for non-ASCII characters, and in a way that is not incompatible with RFC2047.\n4. Common, in-production servers send responses conforming with RFC2616.\n\nI still believe the correct action here is decode it.\n", "#3034 proposes a solution to this with more detail on my reasoning behind it. Specifically, I believe RFC2047's recommendation (which only discusses headers) about the treatment of `obs-text` without excluding Reason Phrase is a bug in the specification. To treat Reason Phrase as \"opaque data\" while simultaneously recommending that servers localize it and clients display it to the user is contradictory, especially given the existence of a previous specification that addresses those needs.\n\nI'll go hands off on this for now though, if anyone on the team would like to take a look. If rejected I can run an internally patched version, it isn't the end of the world for us of course.\n", "> It conforms correctly to RFC2616\n\nPlease stop trying to justify the wrong behaviour by saying it conforms to an obsoleted specification. You're not convincing us of anything as we've been working towards conforming with the new set of HTTP/1.1 specifications as best we can while dealing with `httplib`.\n\nI also don't understand why you're referring to sections in the RFC when the ABNF clearly states (as @Lukasa has already shown) that the reason phrase should be `HTAB`, `SP`, `VCHAR`, and `obs-text`.\n\n> I believe RFC2047's recommendation ... is a bug in the specification.\n\nThen please file errata against the RFC to address that. Please also note that 2047 is not the latest RFC on the topic.\n\n---\n\nRegardless, I have to agree with @Lukasa that the reason string should be treated as opaque bytes. I don't understand why you think your decision to run an internally patched would sway us, since you can decode the raw reason bytes before using the reason string.\n", "This seems to have been resolved by https://github.com/kennethreitz/requests/pull/3385\n", "You're right. Thanks @olivierlefloch \n" ]
https://api.github.com/repos/psf/requests/issues/3008
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3008/labels{/name}
https://api.github.com/repos/psf/requests/issues/3008/comments
https://api.github.com/repos/psf/requests/issues/3008/events
https://github.com/psf/requests/issues/3008
133,362,834
MDU6SXNzdWUxMzMzNjI4MzQ=
3,008
invalid git history
{ "avatar_url": "https://avatars.githubusercontent.com/u/1505226?v=4", "events_url": "https://api.github.com/users/thestinger/events{/privacy}", "followers_url": "https://api.github.com/users/thestinger/followers", "following_url": "https://api.github.com/users/thestinger/following{/other_user}", "gists_url": "https://api.github.com/users/thestinger/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/thestinger", "id": 1505226, "login": "thestinger", "node_id": "MDQ6VXNlcjE1MDUyMjY=", "organizations_url": "https://api.github.com/users/thestinger/orgs", "received_events_url": "https://api.github.com/users/thestinger/received_events", "repos_url": "https://api.github.com/users/thestinger/repos", "site_admin": false, "starred_url": "https://api.github.com/users/thestinger/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/thestinger/subscriptions", "type": "User", "url": "https://api.github.com/users/thestinger", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2016-02-12T22:51:09Z
2016-03-09T21:27:22Z
2016-02-13T00:11:32Z
NONE
null
The Git history for this repository is invalid. You'll run into this if you enable verification of the history in your configuration (which should really be the default): ``` [transfer] fsckobjects = true [fetch] fsckobjects = true [receive] fsckobjects = true ``` ``` % git clone https://github.com/kennethreitz/requests.git Cloning into 'requests'... remote: Counting objects: 16580, done. error: object 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8: badTimezone: invalid author/committer line - bad time zone fatal: Error in object ``` If you look at the commit with verification disabled, you can see the problem: ``` commit 5e6ecdad9f69b1ff789a17733b8edc6fd7091bd8 Author: Shrikant Sharat Kandula <[email protected]> Date: Thu Sep 8 02:38:50 2011 +51800 Typo in documentation The kwarg is named `headers`, not `header`. Also, its a dict, not a set. ``` I don't have a suggestion on how to approach this. It's quite unfortunate. The history would probably have to be rewritten, and that would be awful. I do think it's worth noting though... verification by default may become the default on various distributions or even upstream.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3008/reactions" }
https://api.github.com/repos/psf/requests/issues/3008/timeline
null
completed
null
null
false
[ "> verification by default may become the default on various distributions\n\nI'd assume there are many more large/popular repositories that contain at least broken timestamps. IMO it'd be impudent if a distro patches git in a way that would break cloning repos like this. Given requests' history with downstream patches, I guess others would agree.\n\n> or even upstream.\n\nUpstream should decide, yes.\n\n---\n\nEDIT: From the larger Python repos, it seems that [Werkzeug](https://github.com/mitsuhiko/werkzeug) is also affected by this.\n", "If verification is off, there aren't even checks to make sure that the hashes are correct. It would make sense if there was a loose mode accepting some invalid history, like invalid timestamps.\n", "Well, one solution is to `$ git config fsck.badTimezone ignore`\n", "OK, here's my position on this.\n- This is nothing for us to be concerned with. The only people effected by this are people who signed up to be effected by this by automatically running `git fsck` when performing normal Git operations. I can definitely see why this could be considered a good idea, but since almost no one does this, this is not something Requests needs be concerned with.\n- I say this mainly because the only solution is to rewrite our Git history. Honestly, I don't think anyone would even notice that we've done so (I've done it a few times in the past), but that's definitely not something I want to do to correct this.\n- If a time comes with this behavior becomes default to git, I'd be happy to reconsider. \n", "Also of note is the fact that this commit has existed since 2011 and no one has noticed for 5 years. \n", "Also of note is that this is a duplicate issue: https://github.com/kennethreitz/requests/issues/2690\n", "FYI, this is breaking YouCompleteMe install for those who turned full fsck on.\n", "@junkblocker I'm sorry about that but we're not going to rewrite history.\n", "@junkblocker perhaps you can do a shallow clone instead?\n", "I am not directly cloning YCM, I am doing it via [NeoBundle](https://github.com/Shougo/neobundle.vim) (a vim plugin manager). I think I have shallow clone set up for that but it probably doesn't do shallow checkout of submodules (if there is a way to do that via git).\n\nAdditionally, I wonder if there is a way to ignore fsck per repository via global .gitconfig.\n", "> I am not directly cloning YCM, I am doing it via NeoBundle (a vim plugin manager). I think I have shallow clone set up for that but it probably doesn't do shallow checkout of submodules (if there is a way to do that via git).\n\nSee http://duncansungwkim.blogspot.ca/2014/04/workaround-of-git-shalow-clone-with.html. There is a workaround but it may involve forking your plugin manager. (Ran into the same with [vim-plug](https://github.com/junegunn/vim-plug/issues/217#issuecomment-96365992).)\n", "Because further conversation on this bug will not be productive and there are 844 people subscribed to notifications for this repository, I'm locking this conversation. If you have problems with other projects, please continue the discussion there.\n" ]
https://api.github.com/repos/psf/requests/issues/3007
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3007/labels{/name}
https://api.github.com/repos/psf/requests/issues/3007/comments
https://api.github.com/repos/psf/requests/issues/3007/events
https://github.com/psf/requests/issues/3007
133,202,073
MDU6SXNzdWUxMzMyMDIwNzM=
3,007
Add ability to pass CA certificates by fileobj
{ "avatar_url": "https://avatars.githubusercontent.com/u/88809?v=4", "events_url": "https://api.github.com/users/Kentzo/events{/privacy}", "followers_url": "https://api.github.com/users/Kentzo/followers", "following_url": "https://api.github.com/users/Kentzo/following{/other_user}", "gists_url": "https://api.github.com/users/Kentzo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Kentzo", "id": 88809, "login": "Kentzo", "node_id": "MDQ6VXNlcjg4ODA5", "organizations_url": "https://api.github.com/users/Kentzo/orgs", "received_events_url": "https://api.github.com/users/Kentzo/received_events", "repos_url": "https://api.github.com/users/Kentzo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Kentzo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Kentzo/subscriptions", "type": "User", "url": "https://api.github.com/users/Kentzo", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2016-02-12T10:16:35Z
2021-09-08T13:05:30Z
2016-02-12T13:27:52Z
NONE
resolved
Sometimes requests has no ability to open file with CA because is no such file, e.g. when CA is embedded in the app. requests should allow to specify fileobj (which can be a file or io wrapper on top of string and bytes) for that.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3007/reactions" }
https://api.github.com/repos/psf/requests/issues/3007/timeline
null
completed
null
null
false
[ "Unfortunately that's extremely difficult for requests to do. \n\nThe 'path to a file' approach is actually the API used by OpenSSL. It's also possible to pass an in-memory serialized representation (as read from a file object), but that API is not exposed by the standard library so we can't rely on that being present. \n", "We could create a tmpfile automatically when passed a file-like-object, but we would likely want to ensure its deletion after execution is complete.\n\nSo, definitely possible, but probably not the best idea.\n", "Creating intermediate file can be handled by user of requests.\n\nAfter checking Python's code I agree that it's, at least, a limitation in Python.\nSad that OpenSSL relies on file being present in FS.\n", "@Lukasa is there a way to query presense of this API in runtime? Could you give me some links to read more about it?\n", "Is it possible to verify by using system's CA without shipping cacert.pem nor certifi?\n", "@Kentzo: for a discussion about the system cert bundle, see #2966. It is possible to query for this API at runtime because it's only exposed by PyOpenSSL.\n", "Python 2.7.9+ and 3.5's implementation of `load_verify_locations()` in `Modules/_ssl.c` support passing of a `cadata` argument, which contains the serialized trust store. This is also exposed in the `ssl.create_default_context()` function.", "@Lukasa I think the original issue is orthogonal to #2966 and this issue should be re-opened because of the discovery made by @dsully." ]
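The `cadata` route mentioned in the last comments of issue 3007 can be sketched with only the stdlib `ssl` module (per the discussion above, available on Python 2.7.9+ and 3.4+). This is a hypothetical helper, not anything requests exposes; `ca_pem` stands in for a PEM bundle you already hold in memory, e.g. embedded in the application:

```python
import ssl


def context_from_memory(ca_pem):
    """Build an SSLContext trusting an in-memory PEM bundle.

    `ca_pem` is a PEM-encoded string (an assumption for this sketch);
    create_default_context() forwards `cadata` to load_verify_locations(),
    so no trust-store file needs to exist on the filesystem.
    """
    return ssl.create_default_context(cadata=ca_pem)
```

Requests itself has no parameter that accepts such a context, which is what this issue asks for; newer urllib3 releases do accept a pre-built context, so the sketch is usable one layer down.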
https://api.github.com/repos/psf/requests/issues/3006
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3006/labels{/name}
https://api.github.com/repos/psf/requests/issues/3006/comments
https://api.github.com/repos/psf/requests/issues/3006/events
https://github.com/psf/requests/issues/3006
133,149,897
MDU6SXNzdWUxMzMxNDk4OTc=
3,006
requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)
{ "avatar_url": "https://avatars.githubusercontent.com/u/2891235?v=4", "events_url": "https://api.github.com/users/caizixian/events{/privacy}", "followers_url": "https://api.github.com/users/caizixian/followers", "following_url": "https://api.github.com/users/caizixian/following{/other_user}", "gists_url": "https://api.github.com/users/caizixian/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/caizixian", "id": 2891235, "login": "caizixian", "node_id": "MDQ6VXNlcjI4OTEyMzU=", "organizations_url": "https://api.github.com/users/caizixian/orgs", "received_events_url": "https://api.github.com/users/caizixian/received_events", "repos_url": "https://api.github.com/users/caizixian/repos", "site_admin": false, "starred_url": "https://api.github.com/users/caizixian/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/caizixian/subscriptions", "type": "User", "url": "https://api.github.com/users/caizixian", "user_view_type": "public" }
[]
closed
true
null
[]
null
77
2016-02-12T03:30:56Z
2021-09-04T00:06:18Z
2016-08-05T07:52:35Z
NONE
resolved
Here is the first issue. https://github.com/kennethreitz/requests/issues/2906 Python 3.5.1 (https://www.python.org/downloads/) Virtualenv 14.0.5 Mac OS X 10.11.3 First, I created a virtualenv and `pip install requests[security]` Then I got ``` >>> from cryptography.hazmat.backends.openssl.backend import backend >>> print(backend.openssl_version_text()) OpenSSL 1.0.2f 28 Jan 2016 ``` which was what I expected. Everything worked great for about an hour. Then, some of my own scripts crashed which was normal. After that, when I try to run my script again, I got following exceptions ``` ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645) requests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645) requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645) ``` So I opened another Python console and ``` >>> requests.get("https://www.google.com") <Response [200]> >>> requests.get("https://www.telegram.org") Traceback (most recent call last): File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen body=body, headers=headers) File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request self._validate_conn(conn) File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 784, in _validate_conn conn.connect() File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/connection.py", line 252, in connect ssl_version=resolved_ssl_version) File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py", line 305, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 376, in wrap_socket _context=self) File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 747, in 
__init__ self.do_handshake() File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 983, in do_handshake self._sslobj.do_handshake() File "/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py", line 628, in do_handshake self._sslobj.do_handshake() ssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "VirtualenvPath/lib/python3.5/site-packages/requests/adapters.py", line 376, in send timeout=timeout File "VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen raise SSLError(e) requests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "<stdin>", line 1, in <module> File "VirtualenvPath/lib/python3.5/site-packages/requests/api.py", line 67, in get return request('get', url, params=params, **kwargs) File "VirtualenvPath/lib/python3.5/site-packages/requests/api.py", line 53, in request return session.request(method=method, url=url, **kwargs) File "VirtualenvPath/lib/python3.5/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "VirtualenvPath/lib/python3.5/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "VirtualenvPath/lib/python3.5/site-packages/requests/adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645) >>> ``` So I rebooted, uninstalled all these libs and pip install them again. Everything worked again. But after 1 hour or so, same exception again.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3006/reactions" }
https://api.github.com/repos/psf/requests/issues/3006/timeline
null
completed
null
null
false
[ "> Then, some of my own scripts crashed which was normal.\n\nWhat does a crash like that look like?\n\n> After that, when I try to run my script again, I got following exceptions\n\nAre you completely unable to run them at all after that?\n", "@sigmavirus24 It's just some bug in my own code without anything to do with networking.\nI'm not sure whether this is the real cause or just coincidence.\n\nBut one thing is for sure, after some point, I'm completely unable to make request to https://www.telegram.org, which I can do right after install request.\n\nJust FYI: #2906\n", "So I'd like to point out that you're installing `requests[security]` which means we should be using pyOpenSSL but your stacktrace shows that we aren't. That's intriguing.\n", "@sigmavirus24 So this there anything I can do to help you?\n", "Any idea? @Lukasa \n", "So what matters most here is: why does your code stop using PyOpenSSL? When you encounter your crash, can you open a python console in your virtual environment and then run `import urllib3.contrib.pyopenssl`, to see if that works?\n", "Couldn't reproduce it now. I will close this first, and if I encounter the problem again, I will paste the result and reopen it.\n", "@Lukasa I think since requests is shipped with its own urllib3, I could not import urllib3 alone. And the following result confirms it.\n\n```\nPython 3.5.1 (v3.5.1:37a07cee5969, Dec 5 2015, 21:12:44) \n[GCC 4.2.1 (Apple Inc. 
build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import urllib3.contrib.pyopenssl\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\nImportError: No module named 'urllib3'\n>>> ^D\n\n\n\n\npip list\ncffi (1.5.0)\ncryptography (1.2.2)\nidna (2.0)\nndg-httpsclient (0.4.0)\npip (8.0.2)\npyasn1 (0.1.9)\npycparser (2.14)\npyOpenSSL (0.15.1)\nrequests (2.9.1)\nsetuptools (20.0)\nsix (1.10.0)\nwheel (0.26.0)\n```\n", "I'm sorry, try importing `requests.packages.urllib3.contrib.pyopenssl`\n", "@Lukasa \n\n```\nPython 3.5.1 (v3.5.1:37a07cee5969, Dec 5 2015, 21:12:44) \n[GCC 4.2.1 (Apple Inc. build 5666) (dot 3)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import requests.packages.urllib3.contrib.pyopenssl\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"VirtualenvPath/lib/python3.5/site-packages/requests/packages/urllib3/contrib/pyopenssl.py\", line 57, in <module>\n from socket import _fileobject, timeout, error as SocketError\nImportError: cannot import name '_fileobject'\n>>> \n```\n", "I bet that is a bug on 3.5 with pyOpenSSL that we didn't know about @lukasa (with respect to `_fileobject` not existing. \n", "Nope, I know about it, and have proposed a fix upstream in urllib3 when we started testing PyOpenSSL. The reality is that `requests[security]` does not (and has never) worked on Python 3. That'll be fixed in an upcoming version of Requests.\n\nSo that doesn't solve our puzzle: why did this work for a bit and then stop?\n", "tangent comment based on original report, the built in ssl.py in python has an option to suppress ragged EOFs as there are plenty of sites that will uncleanly shutdown SSL connections. sometimes unreliably as if it were a matter of timing or coincidence. 
the following is an extract from ssl.py\n\n``` python\nclass SSLSocket(socket):\n [...]\n def read(self, len=0, buffer=None):\n \"\"\"Read up to LEN bytes and return them.\n Return zero-length string on EOF.\"\"\"\n\n self._checkClosed()\n if not self._sslobj:\n raise ValueError(\"Read on closed or unwrapped SSL socket.\")\n try:\n return self._sslobj.read(len, buffer)\n except SSLError as x:\n if x.args[0] == SSL_ERROR_EOF and self.suppress_ragged_eofs:\n if buffer is not None:\n return 0\n else:\n return b''\n else:\n raise\n```\n", "https://github.com/shazow/urllib3/pull/795\nhttps://github.com/shazow/urllib3/issues/791\n", "Having the same exception on Python 2.7.11/OSX when pounding a server with `requests` under `gevent`. Could this be related?\n", "@the-efi could you be more specific about which exception you're seeing?\n", "```\n r = requests.post(self.MY_URL, data=parameters)\n File \"/Users/me/Envs/my_env/lib/python2.7/site-packages/requests/api.py\", line 109, in post\n return request('post', url, data=data, json=json, **kwargs)\n File \"/Users/me/Envs/my_env/lib/python2.7/site-packages/requests/api.py\", line 50, in request\n response = session.request(method=method, url=url, **kwargs)\n File \"/Users/me/Envs/my_env/lib/python2.7/site-packages/requests/sessions.py\", line 465, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/me/Envs/my_env/lib/python2.7/site-packages/requests/sessions.py\", line 573, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/me/Envs/my_env/lib/python2.7/site-packages/requests/adapters.py\", line 431, in send\n raise SSLError(e, request=request)\nSSLError: EOF occurred in violation of protocol (_ssl.c:590)\n```\n", "@the-efi Do you have `pyopenssl`, `pyasn1`, and `ndg-httpsclient` installed?\n", "pyopenssl: negative\npyasn1: pyasn1==0.1.8 (transient, I guess)\nndg-httpsclient: negative\n", "Ok. 
You're also running Python 2.7, so you and the poster seem to be having different problems.\n\nDo you know if you hit your problem during connection setup, or on a long-running connection?\n", "I would imagine this is during connection setup, but if `requests` reuses previously opened connections with `Connection: keep-alive` I might be wrong.\n", "Requests does indeed re-use previously opened connections where possible, which is why I asked the question. ;)\n\nIt would be very useful if we could get a packet capture of this problem in your case, though that may be tricky given that it occurs under heavy load.\n", "No problem, I will see what I can do about it after the weekend. Would you like me to submit that as a new issue?\n", "Yes please. =)\n", "@Lukasa Any progress with urllib3?\n", "@caizixian We're getting there, but we have some problems with our CI testing because Travis CI has a fairly old PyPy image that doesn't behave well with PyOpenSSL at the moment. I'll see if I can get this to work sometime this weekend.\n", "@Lukasa @shazow urllib3 always has a home at http://ci.kennethreitz.org, if desired!\n", "Any update on this one? It happens quite a lot when working with aws.\n", "@mindw As best as I know we don't have a good understanding of exactly where and when it's happening. 
In the backtrace above it's happening during the handshake, which is usually a problem with negotitation: when are you encountering this error?\n", "OS X, Python 3.5.1, requests 2.10.0 + security.\nI will try and provide any additional info upom request :)\n\n```\nTraceback (most recent call last):\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 578, in urlopen\n chunked=chunked)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 351, in _make_request\n self._validate_conn(conn)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 814, in _validate_conn\n conn.connect()\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/connection.py\", line 289, in connect\n ssl_version=resolved_ssl_version)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py\", line 308, in ssl_wrap_socket\n return context.wrap_socket(sock, server_hostname=server_hostname)\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 376, in wrap_socket\n _context=self)\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 747, in __init__\n self.do_handshake()\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 983, in do_handshake\n self._sslobj.do_handshake()\n File \"/Library/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 628, in do_handshake\n self._sslobj.do_handshake()\nssl.SSLEOFError: EOF occurred in violation of protocol (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/adapters.py\", line 403, 
in send\n timeout=timeout\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 604, in urlopen\n raise SSLError(e)\nrequests.packages.urllib3.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/Users/gabdav01/.virtualenvs/splatt/bin/splatt\", line 9, in <module>\n load_entry_point('splatt', 'console_scripts', 'splatt')()\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/core.py\", line 716, in __call__\n return self.main(*args, **kwargs)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/core.py\", line 696, in main\n rv = self.invoke(ctx)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/core.py\", line 1060, in invoke\n return _process_result(sub_ctx.command.invoke(sub_ctx))\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/core.py\", line 889, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/core.py\", line 534, in invoke\n return callback(*args, **kwargs)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/click/decorators.py\", line 17, in new_func\n return f(get_current_context(), *args, **kwargs)\n File \"/Users/gabdav01/work/paas/splatt/splatt/cmd.py\", line 108, in create\n url, json=json, headers=headers)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/sessions.py\", line 518, in post\n return self.request('POST', url, data=data, json=json, **kwargs)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File 
\"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/gabdav01/.virtualenvs/splatt/lib/python3.5/site-packages/requests/adapters.py\", line 477, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:645)\n```\n" ]
https://api.github.com/repos/psf/requests/issues/3005
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3005/labels{/name}
https://api.github.com/repos/psf/requests/issues/3005/comments
https://api.github.com/repos/psf/requests/issues/3005/events
https://github.com/psf/requests/pull/3005
133,051,769
MDExOlB1bGxSZXF1ZXN0NTkwNzMzMzk=
3,005
Use time.monotonic() for tracking elapsed time where available
{ "avatar_url": "https://avatars.githubusercontent.com/u/109152?v=4", "events_url": "https://api.github.com/users/scop/events{/privacy}", "followers_url": "https://api.github.com/users/scop/followers", "following_url": "https://api.github.com/users/scop/following{/other_user}", "gists_url": "https://api.github.com/users/scop/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/scop", "id": 109152, "login": "scop", "node_id": "MDQ6VXNlcjEwOTE1Mg==", "organizations_url": "https://api.github.com/users/scop/orgs", "received_events_url": "https://api.github.com/users/scop/received_events", "repos_url": "https://api.github.com/users/scop/repos", "site_admin": false, "starred_url": "https://api.github.com/users/scop/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/scop/subscriptions", "type": "User", "url": "https://api.github.com/users/scop", "user_view_type": "public" }
[ { "color": "e11d21", "default": false, "description": null, "id": 44501305, "name": "Not Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTMwNQ==", "url": "https://api.github.com/repos/psf/requests/labels/Not%20Ready%20To%20Merge" }, { "color": "e11d21", "default": false, "description": null, "id": 136589914, "name": "Needs Info", "node_id": "MDU6TGFiZWwxMzY1ODk5MTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20Info" } ]
closed
true
null
[]
null
5
2016-02-11T18:30:48Z
2021-09-08T04:01:11Z
2016-04-06T18:56:17Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3005/reactions" }
https://api.github.com/repos/psf/requests/issues/3005/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3005.diff", "html_url": "https://github.com/psf/requests/pull/3005", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3005.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3005" }
true
[ "Hey @scop I was wondering if you could provide detail about why this is desirable and why we should change the existing behaviour as well as a description of when/where monotonic will be available and where/when it won't.\n", "Most of the requested details should be answered by the `time.monotonic` docs: https://docs.python.org/3/library/time.html#time.monotonic\n\nIn a nutshell, this change makes `r.elapsed` unaffected by system clock changes where `time.monotonic` is available. There are no changes to behavior on setups where it isn't. It was introduced in Python 3.3. In 3.3 and 3.4 it wasn't available on all platforms, but in 3.5 it is.\n", "Have you run into issues with the elapsed attribute that this would help with?\n", "No, but I have elsewhere with similar things, just thought it'd be pretty much a no-brainer to fix in requests as well.\n", "I don't think this is anything we need to be concerned about, and it adds unnecessary complexity to one of our simplest (and least used) features. I understand the thought process behind this PR, and if we were all using 3.5, then I'd probably accept it (with a few improvements). \n" ]
https://api.github.com/repos/psf/requests/issues/3004
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3004/labels{/name}
https://api.github.com/repos/psf/requests/issues/3004/comments
https://api.github.com/repos/psf/requests/issues/3004/events
https://github.com/psf/requests/issues/3004
132,781,456
MDU6SXNzdWUxMzI3ODE0NTY=
3,004
Durable way to obtain SSLError errno
{ "avatar_url": "https://avatars.githubusercontent.com/u/1385596?v=4", "events_url": "https://api.github.com/users/AlanCoding/events{/privacy}", "followers_url": "https://api.github.com/users/AlanCoding/followers", "following_url": "https://api.github.com/users/AlanCoding/following{/other_user}", "gists_url": "https://api.github.com/users/AlanCoding/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/AlanCoding", "id": 1385596, "login": "AlanCoding", "node_id": "MDQ6VXNlcjEzODU1OTY=", "organizations_url": "https://api.github.com/users/AlanCoding/orgs", "received_events_url": "https://api.github.com/users/AlanCoding/received_events", "repos_url": "https://api.github.com/users/AlanCoding/repos", "site_admin": false, "starred_url": "https://api.github.com/users/AlanCoding/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/AlanCoding/subscriptions", "type": "User", "url": "https://api.github.com/users/AlanCoding", "user_view_type": "public" }
[]
closed
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2024-05-19T18:29:04Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/34", "id": 11073254, "labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels", "node_id": "MI_kwDOABTKOs4AqPbm", "number": 34, "open_issues": 0, "state": "open", "title": "Bankruptcy", "updated_at": "2024-05-20T14:37:16Z", "url": "https://api.github.com/repos/psf/requests/milestones/34" }
6
2016-02-10T18:33:23Z
2024-05-20T14:35:28Z
2024-05-20T14:35:28Z
NONE
null
## Issue

I want to get the errno property of an SSLError object the right way.

## Description

I have a Session object, for which I call the request method inside a try-except block. I would like to distinguish between different situations, all of which have the same traceback and all of which produce an SSLError. Here is the specific code I'm working with, starting with the except block for SSLError: https://github.com/ansible/tower-cli/blob/master/lib/tower_cli/api.py#L103

Different situations will return (at least) two different types of SSLErrors:

1. requests.exceptions.SSLError: EOF occurred in violation of protocol (_ssl.c:590)
2. requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)

## Work-around Attempts

Here is an example of someone else who tried to distinguish between these scenarios: https://github.com/kisom/woofs/blob/ca589bf57390069f61981fdae1b216d8c657f75c/woofs.py#L237

``` python
if 'EOF occurred in violation of protocol' in str(e):
    continue
```

(note: My exception is named `ex`, but it is named `e` in the above code)

I'm not a fan of this code. If I do pdb.set_trace() and inspect the error in my case, I find that `ex.errno` is `None`.

### CA error

``` python
(Pdb) print ex.args[0].args[0].args
(1, u'[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)')
```

### EOF error

``` python
(Pdb) print ex.args[0].args[0].args
(8, u'EOF occurred in violation of protocol (_ssl.c:590)')
```

You can see that the first error has error number 1, and the second error has error number 8. This is what I want. So I could obtain the SSLError number from `ex.args[0].args[0].args[0]`, and branch different cases based on that. But if the requests library changes in the future, another layer of try-except might be added, so realistically I would need to write a loop that recursively reads `args` as long as it's a tuple, until I get to an integer.

Is this the best available method to get the error number? It seems strange that `ex.message` will automatically look into the args chain and obtain the error message string, but `ex.errno` will not do the same for the error number. I'm using requests 2.9.1.
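The recursive walk described above can be sketched as follows. This is a sketch only; `deepest_errno` is a hypothetical helper, not part of requests:

``` python
def deepest_errno(exc):
    """Walk nested exception args until an integer error number is found.

    requests wraps the underlying ssl error several layers deep, so follow
    args[0] while it is itself an exception, and return the first integer
    argument (OpenSSL's error number) when one appears.
    """
    current = exc
    while isinstance(current, BaseException):
        args = getattr(current, "args", ())
        if args and isinstance(args[0], int):
            return args[0]
        if args and isinstance(args[0], BaseException):
            current = args[0]
            continue
        return getattr(current, "errno", None)
    return None
```

Because the loop follows `args[0]` regardless of how many wrapping layers exist, it keeps working if requests adds or removes a layer of exception wrapping.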
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3004/reactions" }
https://api.github.com/repos/psf/requests/issues/3004/timeline
null
completed
null
null
false
[ "This would be helpful as i get `SSLError: (\"bad handshake: SysCallError(-1, 'Unexpected EOF')\",)` when i run my scripts and i am not sure what error is causing this issue exactly. The traceback does lead to the Requests `adapter.py` library.\n", "@girishsortur This would not help you at all. You can already see the errno (it's -1), and the error message (\"unexpected EOF\"). That means that the remote server terminated the connection without cleanly shutting down the TLS connection, likely midway through the handshake.\n", "@Lukasa thanks. I wasn't sure if `-1` was the error code exactly. Is this error being caused from server end or from client machine? Any help or link that you can share which explains what this error is? Thanks\n", "Unfortunately, OpenSSL is very bad at reporting errors. Likely this is caused by the server and client having an incompatible set of settings, and the server choosing to terminate the handshake. You can try looking at a [blog post I wrote](https://lukasa.co.uk/2016/01/Debugging_With_Wireshark_TLS/) and see if that helps you debug the problem.\n", "@AlanCoding I think you're on the right track with that code. At the least, it seems like using `args[0]` as the original exception isn't going away anytime soon. The code that currently raises an `SSLError` [passes the original argument as the first parameter](https://github.com/kennethreitz/requests/blob/3f8b1fb617cfdd3911602b5c3e668ad30602c64d/requests/adapters.py#L447). 
If you look in that same file, there are lots of places where the original exception is passed as the first param.\n\nThat said, if you're lucky enough to use python3 you can use the `__context__` property to reliably get at the root cause of an exception and (I don't think) it can change despite whatever changes might happen in requests.\n\nI've put together what I think is the best way to get what you're looking for.\n\n``` py\nimport requests\n\n\ndef get_errno_py2k(ex):\n error_number = None\n current_error = ex\n while isinstance(current_error, Exception) and error_number is None:\n error_number = getattr(current_error, 'errno', None)\n current_error = current_error.args[0]\n\n return error_number\n\n\ndef get_errno_py3k(ex):\n error_number = None\n current_error = ex\n while current_error and error_number is None:\n error_number = getattr(current_error, 'errno', None)\n current_error = current_error.__context__\n\n return error_number\n\ntry:\n requests.get('https://expired.badssl.com/')\nexcept requests.exceptions.SSLError as ex:\n get_errno = get_errno_py3k if hasattr(ex, '__context__') else get_errno_py2k\n errno = get_errno(ex)\n print(ex)\n print('SSL error number: {}'.format(errno))\n```\n\nAnd the results:\n\n```\npython2 test.py\n[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\nSSL error number: 1\n\npython3 test.py\n[SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)\nSSL error number: 1\n```\n\nThoughts?\n", "In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the 
change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated." ]
https://api.github.com/repos/psf/requests/issues/3003
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3003/labels{/name}
https://api.github.com/repos/psf/requests/issues/3003/comments
https://api.github.com/repos/psf/requests/issues/3003/events
https://github.com/psf/requests/pull/3003
132,563,476
MDExOlB1bGxSZXF1ZXN0NTg4MjM1NTY=
3,003
Added pprint() function to Response and PreparedRequest objects for human-reading goodness
{ "avatar_url": "https://avatars.githubusercontent.com/u/5515484?v=4", "events_url": "https://api.github.com/users/mtj42/events{/privacy}", "followers_url": "https://api.github.com/users/mtj42/followers", "following_url": "https://api.github.com/users/mtj42/following{/other_user}", "gists_url": "https://api.github.com/users/mtj42/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mtj42", "id": 5515484, "login": "mtj42", "node_id": "MDQ6VXNlcjU1MTU0ODQ=", "organizations_url": "https://api.github.com/users/mtj42/orgs", "received_events_url": "https://api.github.com/users/mtj42/received_events", "repos_url": "https://api.github.com/users/mtj42/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mtj42/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mtj42/subscriptions", "type": "User", "url": "https://api.github.com/users/mtj42", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-02-09T23:05:14Z
2021-09-08T05:00:57Z
2016-02-10T20:26:21Z
NONE
resolved
This feature is aimed at making debugging/following HTTP traffic easier on a developer. Many times in the past, I would send Requests' traffic through Burp proxy in order to have a visual representation of my HTTP traffic. With this pull request, a developer can simply call pprint() on their HTTP requests to see a Burp-like view of the traffic.

Example usage:

``` python
import requests

r = requests.get("http://www.praetorian.com")
r.request.pprint()
r.pprint(body=False)
```
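The kind of raw, Burp-style view described here can be sketched with a small standalone formatter. The name `format_request` and its signature are illustrative, not the PR's actual implementation:

``` python
def format_request(method, url, headers, body=None):
    """Render a request in a raw, Burp-style view, one header per line."""
    lines = ["{} {} HTTP/1.1".format(method, url)]
    lines.extend("{}: {}".format(name, value) for name, value in headers.items())
    lines.append("")  # blank line separates headers from the body
    if body is not None:
        lines.append(body)
    return "\n".join(lines)

print(format_request("GET", "/", {"Host": "www.praetorian.com"}))
```

A `body=False`-style flag, as in the PR's example usage, would simply skip the last append so large response bodies don't flood the screen.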
{ "avatar_url": "https://avatars.githubusercontent.com/u/5515484?v=4", "events_url": "https://api.github.com/users/mtj42/events{/privacy}", "followers_url": "https://api.github.com/users/mtj42/followers", "following_url": "https://api.github.com/users/mtj42/following{/other_user}", "gists_url": "https://api.github.com/users/mtj42/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mtj42", "id": 5515484, "login": "mtj42", "node_id": "MDQ6VXNlcjU1MTU0ODQ=", "organizations_url": "https://api.github.com/users/mtj42/orgs", "received_events_url": "https://api.github.com/users/mtj42/received_events", "repos_url": "https://api.github.com/users/mtj42/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mtj42/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mtj42/subscriptions", "type": "User", "url": "https://api.github.com/users/mtj42", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3003/reactions" }
https://api.github.com/repos/psf/requests/issues/3003/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/3003.diff", "html_url": "https://github.com/psf/requests/pull/3003", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/3003.patch", "url": "https://api.github.com/repos/psf/requests/pulls/3003" }
true
[ "This already exists (to a degree) in the requests-toolbelt: https://toolbelt.readthedocs.org/en/latest/dumputils.html\n", "My major problem with the dump utility is that it is not as granular as pprint(), which gives a developer the ability to dump a single request or a response without the body attached (good for when the response is a large static page). The dump util returns a lot more data than is usually needed while monitoring Requests traffic.\n", "@7uice Presumably we could add flags to `dump` to resolve that problem though?\n\nGenerally speaking I'm in favour of keeping this kind of code out of requests: it becomes one more thing we have to maintain.\n", "@Lukasa It looks like this could be accomplished using utils.dump by adding a `print_body=True` flag and wrapping the following lines in a conditional statement:\n\n`line 78: bytearr.extend(prefix + _coerce_to_bytes(request.body))`\n\n`line 106: bytearr.extend(response.content)`\n\nThis would allow a user to print Request & Response while not flooding their screen with data returned in the response. If it's preferred to keep this out of Requests, I can move the code over to the utils.dump code.\n", "I think that would be better for now. =) At some time that may make its way over here some time in the future. =)\n", "Sounds good :) Closing this for now and moving the feature into utils.dump.\n" ]
https://api.github.com/repos/psf/requests/issues/3002
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3002/labels{/name}
https://api.github.com/repos/psf/requests/issues/3002/comments
https://api.github.com/repos/psf/requests/issues/3002/events
https://github.com/psf/requests/issues/3002
132,525,007
MDU6SXNzdWUxMzI1MjUwMDc=
3,002
Request via IPv6 ip result in a broken Host header
{ "avatar_url": "https://avatars.githubusercontent.com/u/569440?v=4", "events_url": "https://api.github.com/users/pneumaticdeath/events{/privacy}", "followers_url": "https://api.github.com/users/pneumaticdeath/followers", "following_url": "https://api.github.com/users/pneumaticdeath/following{/other_user}", "gists_url": "https://api.github.com/users/pneumaticdeath/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/pneumaticdeath", "id": 569440, "login": "pneumaticdeath", "node_id": "MDQ6VXNlcjU2OTQ0MA==", "organizations_url": "https://api.github.com/users/pneumaticdeath/orgs", "received_events_url": "https://api.github.com/users/pneumaticdeath/received_events", "repos_url": "https://api.github.com/users/pneumaticdeath/repos", "site_admin": false, "starred_url": "https://api.github.com/users/pneumaticdeath/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/pneumaticdeath/subscriptions", "type": "User", "url": "https://api.github.com/users/pneumaticdeath", "user_view_type": "public" }
[]
closed
true
null
[]
null
27
2016-02-09T20:30:32Z
2021-09-08T16:00:35Z
2016-08-05T07:51:40Z
NONE
resolved
This is a regression between 2.7.0 and 2.9.1. When using a URL with an IPv6 address in the host part, e.g. http://[2620:10d:c021:11::75]/foo, 2.7.0 produces the correct Host header:

Host: [2620:10d:c021:11::75]

But 2.9.1 munges it:

Host: [[2620:10d:c021:11::75]]

This causes Django to barf.
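The bracket handling underlying the double-bracket bug can be seen with the standard library alone: `urlsplit` strips the brackets from `hostname` but keeps them in `netloc`, so whichever layer builds the Host header must add exactly one pair back, and adding them to an already-bracketed value doubles them:

``` python
from urllib.parse import urlsplit

parts = urlsplit("http://[2620:10d:c021:11::75]/foo")
print(parts.hostname)  # brackets stripped: 2620:10d:c021:11::75
print(parts.netloc)    # brackets kept: [2620:10d:c021:11::75]

# A correct Host header re-adds one pair of brackets around a bare IPv6 host.
host = parts.hostname
host_header = "[{}]".format(host) if ":" in host else host
print(host_header)     # [2620:10d:c021:11::75]
```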
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3002/reactions" }
https://api.github.com/repos/psf/requests/issues/3002/timeline
null
completed
null
null
false
[ "Just to clarify, where did you install requests from?\n", "So this was probably introduced by shazow/urllib3#708. Looks like we'll have to revert that @shazow.\n\n@pneumaticdeath Why Python version are you using?\n", "I got it from https://pypi.python.org/packages/source/r/requests/requests-2.9.1.tar.gz#md5=0b7f480d19012ec52bab78292efd976d\n", "lol if only we had a better comment on that line, maybe this time...\n", "I've tried in both python 2.7.9, and in python 3.3.x (don't remember right now which version)\n", "Yup, so that's why we need an explanation for the logic. I'll fix it up tomorrow if no one beats me to it. \n", "Ugh, typically while I found that [this bug](https://bugs.python.org/issue5111) 'fixed' the issue I didn't look any closer than that. It does fix the issue, but in the most naive way possible, meaning that of course it does not fix it at all.\n\nGoddamn I hate httplib sometimes.\n", "And in fact my previous example was not good enough. This is what I've found so far:\n\n``` python\nPython 3.4.3 (default, Sep 15 2015, 12:08:21) \n[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)] on darwin\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\n>>> import http.client\n>>> c = http.client.HTTPConnection('[2001:0:53aa:64c:104c:2c10:2bef:4f7b]')\n>>> c = http.client.HTTPConnection('2001:0:53aa:64c:104c:2c10:2bef:4f7b')\nTraceback (most recent call last):\n File \"/Users/cory/.pyenv/versions/3.4.3/lib/python3.4/http/client.py\", line 786, in _get_hostport\n port = int(host[i+1:])\nValueError: invalid literal for int() with base 10: '4f7b'\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/cory/.pyenv/versions/3.4.3/lib/python3.4/http/client.py\", line 750, in __init__\n (self.host, self.port) = self._get_hostport(host, port)\n File \"/Users/cory/.pyenv/versions/3.4.3/lib/python3.4/http/client.py\", line 
791, in _get_hostport\n raise InvalidURL(\"nonnumeric port: '%s'\" % host[i+1:])\nhttp.client.InvalidURL: nonnumeric port: '4f7b'\n```\n\nBut also\n\n```\n>>> c = http.client.HTTPConnection('::1', 8000)\n>>> c.request('GET', '/')\n>>> c = http.client.HTTPConnection('[::1]', 8000)\n>>> c.request('GET', '/')\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 1083, in request\n self._send_request(method, url, body, headers)\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 1128, in _send_request\n self.endheaders(body)\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 1079, in endheaders\n self._send_output(message_body)\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 911, in _send_output\n self.send(msg)\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 854, in send\n self.connect()\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/http/client.py\", line 826, in connect\n (self.host,self.port), self.timeout, self.source_address)\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/socket.py\", line 693, in create_connection\n for res in getaddrinfo(host, port, 0, SOCK_STREAM):\n File \"/Users/cory/.pyenv/versions/3.5.1/lib/python3.5/socket.py\", line 732, in getaddrinfo\n for res in _socket.getaddrinfo(host, port, family, type, proto, flags):\nsocket.gaierror: [Errno 8] nodename nor servname provided, or not known\n```\n\nWhat this appears to mean is that in some cases the square brackets are mandatory, and in some cases they're optional. I'm so far not capable of getting the Host header to be wrong though, so I don't fully understand how that's happening\n", "httplib/http.client seems to just be making a total hash of this. I don't understand at all. 
And that causes us a problem, because we can't revert the change unconditionally without breaking shazow/urllib3#707. So now everything is sad and terrible.\n", "AHA! This isn't a urllib3 bug per se, it's a requests bug. That is, the example used in urllib3 in the original bug was fixed, we just do something different in requests.\n\nThank goodness, I thought I was losing my mind.\n", "Ah, no, it's all terrible again: urllib3 gets this wrong if you use a poolmanager instead of `connection_from_url`. So we're back to urllib3 again.\n", "Ok, so the summary here is that the original fix is bad. Rather than removing that strip, we needed to fix `connection_from_url`.\n\nOk, good, well now we have a plan of action at least.\n", "So @shazow: the problem we have here is that the connection_from_url would not pass the port explicitly. That's avoided with the pool manager (which in `connection_from_host` defaults the port appropriately) and it's avoided in requests because it uses pool manager's `connection_from_url` which _also_ defaults the port.\n\nI think this suggests a good fix.\n", "Ok, a proposed fix is in shazow/urllib3#801.\n", "Silly github auto-closed this issue, re-opened.\n", "I'm pretty impressed it did that :)\n", "I'm a little disappointed it did that.\n", "I'm assuming it was because all collaborators were involved. I've never seen that happen before. \n", "Is there any plan to release a new version of the request library that contains the fixed urllib3?\n\nThe urllib3 1.15 has been released on 6 April 2016: https://pypi.python.org/pypi/urllib3\n\nIf need be, I can perhaps help with a PR to make this happen.\n", "There is, yes. =) The next release of Requests will contain that update.\n", "Ah, thank you. Is there any possible timeframe? 
This is so I can plan my work around this.\n\nAlso would the next release be backward compatible?\n\nIt seems there's this PR #2431 marked for next release that would break backward compatibility.\n", "The next release will be backward compatible because it won't be 3.0.0, it'll be 2.X.\n", "I know I might be stretching it - any time frame? This is so I can plan my work around this.\n", "We have no explicit time frame, but Kenneth has been wanting to release for a while so I'd expect it to be soon.\n", "Ok, thank you @Lukasa !\n", "Hopefully this week :)\n", "Thanks @kennethreitz!\n" ]
https://api.github.com/repos/psf/requests/issues/3001
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3001/labels{/name}
https://api.github.com/repos/psf/requests/issues/3001/comments
https://api.github.com/repos/psf/requests/issues/3001/events
https://github.com/psf/requests/issues/3001
132,501,201
MDU6SXNzdWUxMzI1MDEyMDE=
3,001
Multipart mixed responses generate warnings
{ "avatar_url": "https://avatars.githubusercontent.com/u/36793?v=4", "events_url": "https://api.github.com/users/ndw/events{/privacy}", "followers_url": "https://api.github.com/users/ndw/followers", "following_url": "https://api.github.com/users/ndw/following{/other_user}", "gists_url": "https://api.github.com/users/ndw/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ndw", "id": 36793, "login": "ndw", "node_id": "MDQ6VXNlcjM2Nzkz", "organizations_url": "https://api.github.com/users/ndw/orgs", "received_events_url": "https://api.github.com/users/ndw/received_events", "repos_url": "https://api.github.com/users/ndw/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ndw/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ndw/subscriptions", "type": "User", "url": "https://api.github.com/users/ndw", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-02-09T18:57:40Z
2021-09-08T19:00:35Z
2016-02-09T19:17:12Z
NONE
resolved
If you read a multipart/mixed response, the connectionpool issues a couple of warnings about defects in the message. I'm not sure what the expected, correct behavior is, but these warnings seem spurious.

Stick this perl script somewhere:

```
#!/usr/bin/perl
print "Server: Some Server Name\r\n";
print "Content-Type: multipart/mixed; boundary=36eeb8c4e26d842a\r\n";
print "Content-Length: 178\r\n";
print "\r\n\r\n";
print "--36eeb8c4e26d842a\r\n";
print "Content-Type: text/plain\r\n";
print "\r\n";
print "7\r\n";
print "--36eeb8c4e26d842a\r\n";
print "Content-Type: text/plain\r\n";
print "\r\n";
print "9\r\n";
print "--36eeb8c4e26d842a\r\n";
print "Content-Type: text/plain\r\n";
print "\r\n";
print "11\r\n";
print "--36eeb8c4e26d842a--\r\n";
```

Read it with requests (naturally, you'll have to change the URI to wherever you put the script):

``` python
import requests, logging

logging.basicConfig(level=logging.WARNING)
logging.getLogger("requests").setLevel(logging.DEBUG)

headers = {'accept': "multipart/mixed"}
r = requests.get("http://localhost:8124/cgi-bin/mpm.pl", headers=headers)
print(r)
```

The following errors are displayed:

```
DEBUG:requests.packages.urllib3.connectionpool:"GET http://localhost:8124/cgi-bin/mpm.pl HTTP/1.1" 200 178
WARNING:requests.packages.urllib3.connectionpool:Failed to parse headers (url=http://localhost:8888/http://localhost:8124/cgi-bin/mpm.pl): [StartBoundaryNotFoundDefect(), MultipartInvariantViolationDefect()], unparsed data: ''
Traceback (most recent call last):
  File "/home/ndw/.virtualenvs/pyapi/lib/python3.4/site-packages/requests-2.8.0-py3.4.egg/requests/packages/urllib3/connectionpool.py", line 390, in _make_request
    assert_header_parsing(httplib_response.msg)
  File "/home/ndw/.virtualenvs/pyapi/lib/python3.4/site-packages/requests-2.8.0-py3.4.egg/requests/packages/urllib3/util/response.py", line 58, in assert_header_parsing
    raise HeaderParsingError(defects=defects, unparsed_data=unparsed_data)
requests.packages.urllib3.exceptions.HeaderParsingError: [StartBoundaryNotFoundDefect(), MultipartInvariantViolationDefect()], unparsed data: ''
```

It took me quite a while to work out that they were spurious (because in real life, the server side that is generating the multipart/mixed is more complicated!)
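The defects come from Python's email parser, which urllib3 uses to validate response headers. A header block that declares a multipart content type but (being only headers) carries no body triggers these defects; a minimal reproduction with the standard library, under that assumption:

``` python
from email.parser import Parser

# Parse only what a header block contains: the declared multipart type with
# an empty body, which is roughly how urllib3 sees httplib's response headers.
msg = Parser().parsestr(
    "Content-Type: multipart/mixed; boundary=36eeb8c4e26d842a\r\n\r\n"
)
for defect in msg.defects:
    print(type(defect).__name__)
```

No start boundary can ever appear in a headers-only parse, which is why the warnings are spurious for a well-formed multipart response.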
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3001/reactions" }
https://api.github.com/repos/psf/requests/issues/3001/timeline
null
completed
null
null
false
[ "Ah, can you raise this issue on the urllib3 repo! We'd love to merge a fix at the project! It's [here](https://github.com/shazow/urllib3).\n" ]
https://api.github.com/repos/psf/requests/issues/3000
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/3000/labels{/name}
https://api.github.com/repos/psf/requests/issues/3000/comments
https://api.github.com/repos/psf/requests/issues/3000/events
https://github.com/psf/requests/issues/3000
132,283,591
MDU6SXNzdWUxMzIyODM1OTE=
3,000
Problem with Content-Lenght is missing
{ "avatar_url": "https://avatars.githubusercontent.com/u/17131854?v=4", "events_url": "https://api.github.com/users/victor1969/events{/privacy}", "followers_url": "https://api.github.com/users/victor1969/followers", "following_url": "https://api.github.com/users/victor1969/following{/other_user}", "gists_url": "https://api.github.com/users/victor1969/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/victor1969", "id": 17131854, "login": "victor1969", "node_id": "MDQ6VXNlcjE3MTMxODU0", "organizations_url": "https://api.github.com/users/victor1969/orgs", "received_events_url": "https://api.github.com/users/victor1969/received_events", "repos_url": "https://api.github.com/users/victor1969/repos", "site_admin": false, "starred_url": "https://api.github.com/users/victor1969/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/victor1969/subscriptions", "type": "User", "url": "https://api.github.com/users/victor1969", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-02-08T23:11:45Z
2021-09-08T19:00:35Z
2016-02-08T23:37:19Z
NONE
resolved
I'm playing Clonk Rage but I can't play online because this happened: "Content-Lenght is missing". Please help me. PS: I don't speak English very well.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/3000/reactions" }
https://api.github.com/repos/psf/requests/issues/3000/timeline
null
completed
null
null
false
[ "@victor1969 can you please share the code you are using so we can understand what you're asking?\n", "where i get the code?\n", "@victor1969 is this an error you saw while playing a game? Why did you decide to post about it here?\n", "i'm just trying to play online Clonk rage but says on the server list\nInternet server on league.clonkspot.org\nInvalid server response: Content-Length is missing!\n", "@victor1969 unfortunately, that has nothing to do with project. There's probably somewhere else online for you to report this, though!\n", "ok \n:(\n" ]
https://api.github.com/repos/psf/requests/issues/2999
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2999/labels{/name}
https://api.github.com/repos/psf/requests/issues/2999/comments
https://api.github.com/repos/psf/requests/issues/2999/events
https://github.com/psf/requests/issues/2999
131,995,844
MDU6SXNzdWUxMzE5OTU4NDQ=
2,999
URL-encoded boolean parameter passed as ?foo=True instead of ?foo=1
{ "avatar_url": "https://avatars.githubusercontent.com/u/6293202?v=4", "events_url": "https://api.github.com/users/FlyingLotus1983/events{/privacy}", "followers_url": "https://api.github.com/users/FlyingLotus1983/followers", "following_url": "https://api.github.com/users/FlyingLotus1983/following{/other_user}", "gists_url": "https://api.github.com/users/FlyingLotus1983/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FlyingLotus1983", "id": 6293202, "login": "FlyingLotus1983", "node_id": "MDQ6VXNlcjYyOTMyMDI=", "organizations_url": "https://api.github.com/users/FlyingLotus1983/orgs", "received_events_url": "https://api.github.com/users/FlyingLotus1983/received_events", "repos_url": "https://api.github.com/users/FlyingLotus1983/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FlyingLotus1983/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FlyingLotus1983/subscriptions", "type": "User", "url": "https://api.github.com/users/FlyingLotus1983", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-02-07T18:34:30Z
2021-09-08T19:00:35Z
2016-02-07T19:48:24Z
NONE
resolved
When encoding a boolean into a URL encoded parameter:

``` python
payload = {'foo': True}
resp = requests.post(path, params=payload)
```

Requests currently generates `?foo=True`. I would expect the boolean to be encoded as a 1 or 0, e.g. `?foo=1`, as some clients may not understand `True` as a boolean.

I did some google searching, and I am not even sure if there are rules for how a boolean should be encoded. I didn't find any specific guidelines, just that a lot of URL encoding libraries seem to always encode booleans as 1's and 0's.

Apologies if this is known and/or expected behavior. I'm not a web guru by any means, but at the least, Requests does seem to be handling it differently than other libraries I've used in the past.
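The coercion in question can be demonstrated with the standard library's `urlencode`, which stringifies values the same way; converting the value yourself yields the 1/0 form:

``` python
from urllib.parse import urlencode

# A Python bool is coerced to its str() form before percent-encoding:
print(urlencode({"foo": True}))       # foo=True

# Converting the value explicitly yields the 1/0 form many APIs expect:
print(urlencode({"foo": int(True)}))  # foo=1
```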
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2999/reactions" }
https://api.github.com/repos/psf/requests/issues/2999/timeline
null
completed
null
null
false
[ "There are no rules for how to encode Boolean types, because urlencoding is performed on strings, not on typed objects. For that reason, Reuests coerces all objects passed to the urlencoding logic to strings (except for `None`, which it handles specially).\n\nThis is not considered a bug by the requests team: while it can be surprising if you come from a non-requests background, I'd argue that in aggregate our behaviour is far less surprising than converting objects into entirely different representations. \n", "@FlyingLotus1983 if you want a `1` there, you should provide a `1` :)\n" ]
https://api.github.com/repos/psf/requests/issues/2998
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2998/labels{/name}
https://api.github.com/repos/psf/requests/issues/2998/comments
https://api.github.com/repos/psf/requests/issues/2998/events
https://github.com/psf/requests/issues/2998
131,786,503
MDU6SXNzdWUxMzE3ODY1MDM=
2,998
Requests default to environment CA even when there's a session-set CA
{ "avatar_url": "https://avatars.githubusercontent.com/u/1117819?v=4", "events_url": "https://api.github.com/users/jcmcken/events{/privacy}", "followers_url": "https://api.github.com/users/jcmcken/followers", "following_url": "https://api.github.com/users/jcmcken/following{/other_user}", "gists_url": "https://api.github.com/users/jcmcken/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jcmcken", "id": 1117819, "login": "jcmcken", "node_id": "MDQ6VXNlcjExMTc4MTk=", "organizations_url": "https://api.github.com/users/jcmcken/orgs", "received_events_url": "https://api.github.com/users/jcmcken/received_events", "repos_url": "https://api.github.com/users/jcmcken/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jcmcken/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jcmcken/subscriptions", "type": "User", "url": "https://api.github.com/users/jcmcken", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
4
2016-02-05T23:16:51Z
2021-09-08T19:00:34Z
2016-02-10T08:46:49Z
NONE
resolved
If I set a CA via an environment variable, such as: ``` console $ export REQUESTS_CA_BUNDLE=/etc/pki/tls/certs/ca-bundle.crt ``` (Where the CA bundle actually does exist) And then I prepare a session like this: ``` python import requests session = requests.Session() session.verify = '/path/to/custom/ca.pem' ``` (Again, assume this 2nd CA path exists) Executing a `session.get` against an HTTPS endpoint signed by the 2nd CA but not the first will result in an SSL verification error. This is because the request is defaulting to using the environment CA rather than the session CA. This sort of contradicts my expectations, where the session is much closer to the request in the stack than the process's environment. The source of this issue is the merging happening here: https://github.com/kennethreitz/requests/blob/3a77b59df222920391359f052cab9319dda31ac7/requests/sessions.py#L625 ...where the conditional validates whether the verification is set at the request level (either `True` or `None`) before choosing to set the env CA, but doesn't look at the session (`self.verify`) (which likely has, and in my case does have, a more accurate CA setting).
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2998/reactions" }
https://api.github.com/repos/psf/requests/issues/2998/timeline
null
completed
null
null
false
[ "I believe this is related to https://github.com/kennethreitz/requests/issues/2018 which is scheduled for 3.0.0\n", "As an aside, you can disable using the environment by setting `trust_env=False`.\n", "Closing to centralise on #2018.\n", "@sigmavirus24 That's the workaround I intend to use, although it's a little troublesome since I'm using a library which is itself using `requests`, and it doesn't expose `trust_env`, so I'm having to patch it in. It's also troublesome because it disables proxy configuration via the environment.\n\nThanks though, I didn't see the other issue!\n" ]
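The precedence the reporter expected can be sketched in plain Python. This is an illustrative model of the merge order (request-level over session-level over environment), with a hypothetical helper name, not the actual `requests.sessions` code:

```python
def resolve_verify(request_verify, session_verify, env):
    # Hypothetical helper: prefer the setting closest to the request.
    if request_verify is not None:
        return request_verify
    if session_verify is not None:
        return session_verify
    return env.get("REQUESTS_CA_BUNDLE", True)

env = {"REQUESTS_CA_BUNDLE": "/etc/pki/tls/certs/ca-bundle.crt"}
# The session-level CA should win over the environment bundle:
print(resolve_verify(None, "/path/to/custom/ca.pem", env))
# Only with no session setting should the environment apply:
print(resolve_verify(None, None, env))
```

As the comments note, the available workaround in 2.x is `session.trust_env = False`, at the cost of also disabling environment proxy configuration.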
https://api.github.com/repos/psf/requests/issues/2997
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2997/labels{/name}
https://api.github.com/repos/psf/requests/issues/2997/comments
https://api.github.com/repos/psf/requests/issues/2997/events
https://github.com/psf/requests/issues/2997
131,736,140
MDU6SXNzdWUxMzE3MzYxNDA=
2,997
allow_redirects=True for one get affects the next one
{ "avatar_url": "https://avatars.githubusercontent.com/u/39889?v=4", "events_url": "https://api.github.com/users/yarikoptic/events{/privacy}", "followers_url": "https://api.github.com/users/yarikoptic/followers", "following_url": "https://api.github.com/users/yarikoptic/following{/other_user}", "gists_url": "https://api.github.com/users/yarikoptic/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yarikoptic", "id": 39889, "login": "yarikoptic", "node_id": "MDQ6VXNlcjM5ODg5", "organizations_url": "https://api.github.com/users/yarikoptic/orgs", "received_events_url": "https://api.github.com/users/yarikoptic/received_events", "repos_url": "https://api.github.com/users/yarikoptic/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yarikoptic/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yarikoptic/subscriptions", "type": "User", "url": "https://api.github.com/users/yarikoptic", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
7
2016-02-05T19:41:55Z
2021-09-08T19:00:36Z
2016-02-05T22:53:55Z
NONE
resolved
Here is an example with 3 similar `get` calls with all allow_redirects=False: ``` $> PYTHONPATH=$PWD python -c "import requests; url = 'http://127.0.0.1:45508/ds666'; sess = requests.Session(); print sess.get(url, allow_redirects=False); print sess.get(url, allow_redirects=False); print sess.get(url, allow_redirects=False)" <Response [301]> <Response [301]> <Response [301]> ``` all of them logically 301. If 2nd one says to allow redirects, then 3rd would also succeed even if I disallow redirects for that invocation ``` $> PYTHONPATH=$PWD python -c "import requests; url = 'http://127.0.0.1:45508/ds666'; sess = requests.Session(); print sess.get(url, allow_redirects=False); print sess.get(url, allow_redirects=True); print sess.get(url, allow_redirects=False)" <Response [301]> <Response [200]> <Response [200]> ``` Demonstrated on ``` v2.9.1-63-g6143b7e ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2997/reactions" }
https://api.github.com/repos/psf/requests/issues/2997/timeline
null
completed
null
null
false
[ "Good spot! I suspect this might be the result of the 301 cache misbehaving somewhat. @sigmavirus24, does that sound right to you?\n", "Yikes, this is a nasty bug.\n", "I actually can't reproduce this, using the code available in our `master` branch.\n", "copy paste after me! ;)\n\n```\nhopa:/tmp\n$> git clone https://github.com/kennethreitz/requests\nCloning into 'requests'...\nremote: Counting objects: 16549, done.\nremote: Compressing objects: 100% (18/18), done.\nremote: Total 16549 (delta 7), reused 0 (delta 0), pack-reused 16530\nReceiving objects: 100% (16549/16549), 6.78 MiB | 10.12 MiB/s, done.\nResolving deltas: 100% (9464/9464), done.\nChecking connectivity... done.\n2 31038.....................................:Fri 05 Feb 2016 04:00:55 PM EST:.\nhopa:/tmp\n\n$> cd requests \nAUTHORS.rst HISTORY.rst MANIFEST.in NOTICE docs/ requests/ setup.cfg tests/\nCONTRIBUTING.md LICENSE Makefile README.rst ext/ requirements.txt setup.py*\n2 31039.....................................:Fri 05 Feb 2016 04:00:57 PM EST:.\n(git)hopa:/tmp/requests[master]\n\n$> git describe\nv2.6.2-327-gbbadf47\n2 31040.....................................:Fri 05 Feb 2016 04:01:03 PM EST:.\n(git)hopa:/tmp/requests[master]\n\n$> PYTHONPATH=$PWD python -c \"import requests; print requests.__version__\" \n2.9.1 \n2 31041.....................................:Fri 05 Feb 2016 04:01:14 PM EST:.\n(git)hopa:/tmp/requests[master]\n\n$> git describe --tags \nv2.9.1-65-gbbadf47\n2 31042.....................................:Fri 05 Feb 2016 04:01:17 PM EST:.\n(git)hopa:/tmp/requests[master]git\n\n$> PYTHONPATH=$PWD python -c \"import requests; print requests.__version__\"\n2 31042 ->1.....................................:Fri 05 Feb 2016 04:01:22 PM EST:.\n(git)hopa:/tmp/requests[master]git\n\n$> python -m SimpleHTTPServer & \n[1] 25321 \n2 31043 [1].....................................:Fri 05 Feb 2016 04:01:26 PM EST:.\n(git)hopa:/tmp/requests[master]git\n\n$> Serving HTTP on 0.0.0.0 port 8000 ...\n\n2 31043 [1].....................................:Fri 05 Feb 2016 04:01:28 PM EST:.\n(git)hopa:/tmp/requests[master]git\n\n$> PYTHONPATH=$PWD python -c \"import requests; url = 'http://127.0.0.1:8000/docs'; sess = requests.Session(); print sess.get(url, allow_redirects=False); print sess.get(url, allow_redirects=True); print sess.get(url, allow_redirects=False)\" \n127.0.0.1 - - [05/Feb/2016 16:01:56] \"GET /docs HTTP/1.1\" 301 -\n<Response [301]>\n127.0.0.1 - - [05/Feb/2016 16:01:56] \"GET /docs HTTP/1.1\" 301 -\n127.0.0.1 - - [05/Feb/2016 16:01:56] \"GET /docs/ HTTP/1.1\" 200 -\n<Response [200]>\n127.0.0.1 - - [05/Feb/2016 16:01:56] \"GET /docs/ HTTP/1.1\" 200 -\n<Response [200]>\n```\n", "Ah, it looks like this is occurring with 301 redirects, not 302s.\n", "@yarikoptic thanks for the help, this should now be fixed.\n", "cool. Thanks!\n" ]
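The behaviour in the transcript is consistent with a permanent-redirect cache being consulted regardless of `allow_redirects`. A toy model (illustrative names, not requests' actual `redirect_cache` implementation) reproduces the reported sequence:

```python
redirect_cache = {}

def fetch(url, allow_redirects):
    """Toy client against a fixed server: /docs -> 301 -> /docs/ -> 200."""
    target = redirect_cache.get(url, url)  # bug: consulted unconditionally
    if target == "/docs":
        if allow_redirects:
            redirect_cache[url] = "/docs/"  # remember the 301 permanently
            return fetch(url, allow_redirects)
        return 301
    return 200

print(fetch("/docs", False))  # 301
print(fetch("/docs", True))   # 200, and the cache now remembers the 301
print(fetch("/docs", False))  # 200 -- the cached redirect leaks through
```

Gating the cache lookup on `allow_redirects` (or bypassing it when redirects are disabled) restores the expected `301` for the third call.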
https://api.github.com/repos/psf/requests/issues/2996
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2996/labels{/name}
https://api.github.com/repos/psf/requests/issues/2996/comments
https://api.github.com/repos/psf/requests/issues/2996/events
https://github.com/psf/requests/pull/2996
131,638,696
MDExOlB1bGxSZXF1ZXN0NTg0MjU5MzA=
2,996
Added tests module
{ "avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4", "events_url": "https://api.github.com/users/Stranger6667/events{/privacy}", "followers_url": "https://api.github.com/users/Stranger6667/followers", "following_url": "https://api.github.com/users/Stranger6667/following{/other_user}", "gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Stranger6667", "id": 1236561, "login": "Stranger6667", "node_id": "MDQ6VXNlcjEyMzY1NjE=", "organizations_url": "https://api.github.com/users/Stranger6667/orgs", "received_events_url": "https://api.github.com/users/Stranger6667/received_events", "repos_url": "https://api.github.com/users/Stranger6667/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions", "type": "User", "url": "https://api.github.com/users/Stranger6667", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-02-05T13:01:41Z
2021-09-08T05:00:57Z
2016-02-05T20:54:25Z
CONTRIBUTOR
resolved
Hello, I want to propose some tests changes again :) What is done: 1. All tests are moved to `tests` module. 2. All tests related to `requests.utils` are separated in `tests.test_utils` module. 3. Fixtures `httpbin` and `httpbin_secure` were slightly refactored and moved to `tests.conftest` module. 4. `ThreadPool` was removed, because it was not used. 5. All compatibility code was moved to `tests.compat`. 6. More parametrization was added. I want to ask you if I'm going in the right way or not. Also I noticed that `requests.utils` has ~45% coverage by unit tests, so would you mind if I add more unit tests in this PR? If everything is ok
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2996/reactions" }
https://api.github.com/repos/psf/requests/issues/2996/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2996.diff", "html_url": "https://github.com/psf/requests/pull/2996", "merged_at": "2016-02-05T20:54:25Z", "patch_url": "https://github.com/psf/requests/pull/2996.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2996" }
true
[ "Pinging @kennethreitz again for review on test code.\n\nThanks for this @Stranger6667!\n", "@Stranger6667 more tests for `utils` are more than welcome!\n", ":sparkles: :cake: :sparkles:\n", "Thank you, @Lukasa @kennethreitz ! :)\n" ]
https://api.github.com/repos/psf/requests/issues/2995
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2995/labels{/name}
https://api.github.com/repos/psf/requests/issues/2995/comments
https://api.github.com/repos/psf/requests/issues/2995/events
https://github.com/psf/requests/issues/2995
131,486,326
MDU6SXNzdWUxMzE0ODYzMjY=
2,995
SSL Get Server Certificate Error
{ "avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4", "events_url": "https://api.github.com/users/ghost/events{/privacy}", "followers_url": "https://api.github.com/users/ghost/followers", "following_url": "https://api.github.com/users/ghost/following{/other_user}", "gists_url": "https://api.github.com/users/ghost/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ghost", "id": 10137, "login": "ghost", "node_id": "MDQ6VXNlcjEwMTM3", "organizations_url": "https://api.github.com/users/ghost/orgs", "received_events_url": "https://api.github.com/users/ghost/received_events", "repos_url": "https://api.github.com/users/ghost/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ghost/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ghost/subscriptions", "type": "User", "url": "https://api.github.com/users/ghost", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-02-04T22:02:03Z
2021-09-08T19:00:36Z
2016-02-04T22:05:37Z
NONE
resolved
I'm getting certificate verify failed errors for a few websites including Google. I've just tried with a new Digital Ocean droplet and still having the same issues so it's not just my server. I did the following, create a new Ubuntu 14.04 droplet on Digital Ocean. Installed python-pip (python and OpenSSL were already installed). Upgraded requests to the latest version (pip install requests --upgrade) and installed security (pip install requests[security]). Just to confirm, the droplet originally came with requests version 2.2.1, hence the reason for upgrading to 2.9.1. I also installed the pip modules: certifi and urllib3. Before upgrading requests and installing the other modules, I ran `import requests` and `requests.get("https://www.google.co.uk")`. This worked fine and returned 200. After upgrading and installing the other packages, I got the following: `Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 67, in get return request('get', url, params=params, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/api.py", line 53, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/usr/local/lib/python2.7/dist-packages/requests/adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [Errno 1] _ssl.c:510: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed ` I tried it on https://github.com and it worked perfectly. I tried it on https://google.com and https://themoviedb.org and both has certificate verify failed errors even though they both have valid certificates.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2995/reactions" }
https://api.github.com/repos/psf/requests/issues/2995/timeline
null
completed
null
null
false
[ "You'll need to read certifi/python-certifi#26.\n" ]
https://api.github.com/repos/psf/requests/issues/2994
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2994/labels{/name}
https://api.github.com/repos/psf/requests/issues/2994/comments
https://api.github.com/repos/psf/requests/issues/2994/events
https://github.com/psf/requests/pull/2994
131,140,637
MDExOlB1bGxSZXF1ZXN0NTgxOTQ1MzE=
2,994
Fixed test execution.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4", "events_url": "https://api.github.com/users/Stranger6667/events{/privacy}", "followers_url": "https://api.github.com/users/Stranger6667/followers", "following_url": "https://api.github.com/users/Stranger6667/following{/other_user}", "gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Stranger6667", "id": 1236561, "login": "Stranger6667", "node_id": "MDQ6VXNlcjEyMzY1NjE=", "organizations_url": "https://api.github.com/users/Stranger6667/orgs", "received_events_url": "https://api.github.com/users/Stranger6667/received_events", "repos_url": "https://api.github.com/users/Stranger6667/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions", "type": "User", "url": "https://api.github.com/users/Stranger6667", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-02-03T20:20:08Z
2021-09-08T05:00:58Z
2016-02-03T20:53:08Z
CONTRIBUTOR
resolved
Now hook executes as expected.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2994/reactions" }
https://api.github.com/repos/psf/requests/issues/2994/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2994.diff", "html_url": "https://github.com/psf/requests/pull/2994", "merged_at": "2016-02-03T20:53:08Z", "patch_url": "https://github.com/psf/requests/pull/2994.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2994" }
true
[ "This looks good to me, thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2993
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2993/labels{/name}
https://api.github.com/repos/psf/requests/issues/2993/comments
https://api.github.com/repos/psf/requests/issues/2993/events
https://github.com/psf/requests/pull/2993
131,107,460
MDExOlB1bGxSZXF1ZXN0NTgxNzY5NzI=
2,993
Changes to support sending a complete response object when 407 errors…
{ "avatar_url": "https://avatars.githubusercontent.com/u/17050642?v=4", "events_url": "https://api.github.com/users/angelcaido19/events{/privacy}", "followers_url": "https://api.github.com/users/angelcaido19/followers", "following_url": "https://api.github.com/users/angelcaido19/following{/other_user}", "gists_url": "https://api.github.com/users/angelcaido19/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/angelcaido19", "id": 17050642, "login": "angelcaido19", "node_id": "MDQ6VXNlcjE3MDUwNjQy", "organizations_url": "https://api.github.com/users/angelcaido19/orgs", "received_events_url": "https://api.github.com/users/angelcaido19/received_events", "repos_url": "https://api.github.com/users/angelcaido19/repos", "site_admin": false, "starred_url": "https://api.github.com/users/angelcaido19/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/angelcaido19/subscriptions", "type": "User", "url": "https://api.github.com/users/angelcaido19", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-02-03T18:08:43Z
2021-09-08T05:00:58Z
2016-02-03T18:10:24Z
NONE
resolved
… is found while doing the CONNECT to proxy. Added a way to establish a connection using CONNECT over an already connected socket. Connecting to an "https" web page through a proxy uses the "HTTP CONNECT" verb to tell the proxy that we want to connect to an https destination. With "basic authentication" the "CONNECT" is sent with the appropriate headers so the proxy can authenticate the request and forward it to the https web server. (Those headers are set up by the requests library.) With "NTLM" we use a plugin (requests-ntlm) that is set as a hook and is called on every request response. For example: with HTTP, the requests library sends a "GET" to a web page; as a response we get an HTTP 407 that is handled by the plugin hook to do the NTLM negotiation and authentication. In an HTTPS scenario, the first connection to the proxy is made with the "CONNECT". At that time, requests does not set any authentication header (because it is handled by a plugin) and we get a 407 authentication error. This patch tries to avoid raising an exception on "Proxy Authentication" errors and sends back a complete HTTP response so it can be handled by the requests-ntlm plugin, reusing an already connected and authenticated socket to do the connection.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2993/reactions" }
https://api.github.com/repos/psf/requests/issues/2993/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2993.diff", "html_url": "https://github.com/psf/requests/pull/2993", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2993.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2993" }
true
[ "@angelcaido19 Sorry, but this includes changes in urllib3. You'll need to propose this patch over there first, I'm afraid. =(\n", "@Lukasa Hey thanks. Help me with this: Is this \"https://github.com/shazow/urllib3\" the official urllib3 repo?\nThanks again!!\n", "Ah, sorry I didn't link it: yeah it is!\n" ]
https://api.github.com/repos/psf/requests/issues/2992
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2992/labels{/name}
https://api.github.com/repos/psf/requests/issues/2992/comments
https://api.github.com/repos/psf/requests/issues/2992/events
https://github.com/psf/requests/pull/2992
131,049,763
MDExOlB1bGxSZXF1ZXN0NTgxNDg0Mjc=
2,992
Use TTLCache to cache proxy_bypass.
{ "avatar_url": "https://avatars.githubusercontent.com/u/2840571?v=4", "events_url": "https://api.github.com/users/lazywei/events{/privacy}", "followers_url": "https://api.github.com/users/lazywei/followers", "following_url": "https://api.github.com/users/lazywei/following{/other_user}", "gists_url": "https://api.github.com/users/lazywei/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lazywei", "id": 2840571, "login": "lazywei", "node_id": "MDQ6VXNlcjI4NDA1NzE=", "organizations_url": "https://api.github.com/users/lazywei/orgs", "received_events_url": "https://api.github.com/users/lazywei/received_events", "repos_url": "https://api.github.com/users/lazywei/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lazywei/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lazywei/subscriptions", "type": "User", "url": "https://api.github.com/users/lazywei", "user_view_type": "public" }
[]
closed
true
null
[]
null
18
2016-02-03T14:59:15Z
2021-09-07T00:06:36Z
2017-02-10T17:19:47Z
NONE
resolved
As mentioned in https://github.com/kennethreitz/requests/issues/2988, we might leverage the LRU cache to speed up the `proxy_bypass`, which might cause unexpected delay.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2992/reactions" }
https://api.github.com/repos/psf/requests/issues/2992/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2992.diff", "html_url": "https://github.com/psf/requests/pull/2992", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2992.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2992" }
true
[ "The bigger problem here is that `functools.lru_cache` is not available in Python 2. It should be fairly trivial to lift the code into a backport module and vendor it though.\n", "For general code review I'm not best placed to do it because I suggested the approach to begin with, so let's tag in @sigmavirus24 and @kennethreitz as well.\n", "OK. I'll add the doc latter. As for the backport for LRU cache, do you think it should reside in `compat.py` or in a separate module?\n", "I think it should go under `packages`.\n", "> The bigger problem here is that functools.lru_cache is not available in Python 2. It should be fairly trivial to lift the code into a backport module and vendor it though.\n\nIMO you should invalidate single items in the cache based on the timestamp added, so you would need a custom cache function anyway...\n", "@Lukasa @schlamar I agree I'd need to make a custom cache function. Do you think it is appropriate to make it a standalone package and vendor it under `packages`?\n", "I do. =)\n", "I just found there is a 3rd party package that can do the cache for us: https://github.com/tkem/cachetools\nSpecifically, we can use TTL cache, instead of making a timestamp invalid_indicator ourself.\nHow do you think?\n", "Requests doesn't like bringing in third-party modules as dependencies if we can avoid it.\n", "I thought if I make a standalone package it would also be considered third party, right? Then would it make any difference for using another caching tool?\nThanks\n", "> I thought if I make a standalone package it would also be considered third party, right? Then would it make any difference for using another caching tool?\n\nSorry, I'm not sure that I understand what you're proposing here.\n", "I think a simple TTL cache can be implemented in 50-80 LOC or even less so there is no need to vendor a new standalone package nor another third-party module. I would just put it into `requests.utils`.\n", "sure, no problem. will do it in utils.\nMarc Schlaich [email protected]於 2016年3月21日 週一,下午7:24寫道:\n\n> I think a simple TTL cache can be implemented in 50-80 LOC or even less so\n> there is no need to vendor a new standalone package nor another third-party\n> module. I would just put it into requests.utils.\n> \n> —\n> You are receiving this because you authored the thread.\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/pull/2992#issuecomment-199232165\n> \n> ## \n> \n> Chih-Wei (Bert)\n", "This pull request has been open for over two months now. Let's do what we can to get it merged or closed :)\n", "To remember which functions have been cached, I'd need to store it somewhere.\nWhich approach, using a scope (under utils.py) variable or using a Class's instance variable, would you prefer?\n", "@kennethreitz @Lukasa Sorry for taking so long. I implemented a naive TTLCache for caching the `proxy_bypass` function call. Please have a look at it and let me know your thoughts.\nIf there is no big problem with the implementation, I'll add some documentation and testing if necessary later.\n", "I reviewed it. A second opinion would be nice.\n", "Closing due to inactivity (not sure if that's our fault or not, sorry!). Please re-submit if you're still interested in contributing this code :)" ]
https://api.github.com/repos/psf/requests/issues/2991
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2991/labels{/name}
https://api.github.com/repos/psf/requests/issues/2991/comments
https://api.github.com/repos/psf/requests/issues/2991/events
https://github.com/psf/requests/pull/2991
130,677,837
MDExOlB1bGxSZXF1ZXN0NTc5ODM3MTk=
2,991
[WIP] Test suite refactoring
{ "avatar_url": "https://avatars.githubusercontent.com/u/1236561?v=4", "events_url": "https://api.github.com/users/Stranger6667/events{/privacy}", "followers_url": "https://api.github.com/users/Stranger6667/followers", "following_url": "https://api.github.com/users/Stranger6667/following{/other_user}", "gists_url": "https://api.github.com/users/Stranger6667/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Stranger6667", "id": 1236561, "login": "Stranger6667", "node_id": "MDQ6VXNlcjEyMzY1NjE=", "organizations_url": "https://api.github.com/users/Stranger6667/orgs", "received_events_url": "https://api.github.com/users/Stranger6667/received_events", "repos_url": "https://api.github.com/users/Stranger6667/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Stranger6667/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Stranger6667/subscriptions", "type": "User", "url": "https://api.github.com/users/Stranger6667", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-02-02T13:04:14Z
2021-09-08T05:00:58Z
2016-02-03T08:26:27Z
CONTRIBUTOR
resolved
Hello, I'd like to propose some refactorings for current test suite. First of all it is more about testing convenience, to make development process easier. Current changes: - Changed test running to setup.py. It allows to use some tuning on dependencies during test runs - Updated classifiers list. All tests run fine on Python 2.6 and PyPy. And almost all on Jython. I think it will not be a problem to deal with 2 failing tests, but will see. - Added config file for coverage. But I'm not sure about omitting "_/packages/_". - Added tox.ini to run tests against different Python versions locally. - Removed `unittest` from tests. It is not required for `py.test`. Also I've removed `_multiprocess_can_split_` because it is only related to `nose`. - Added some of py.test parametrization. - Replaced `assert False` with `pytest.fail`. - Fixed typo and added some minor improvements. Some thoughts and questions: - Coverage report shows, that code [here](https://github.com/kennethreitz/requests/compare/master...Stranger6667:tests-refactoring?expand=1#diff-56c2d754173a4a158ce8f445834c8fe8R616) is not executed. So, is there problem in hooks execution? - Split test module to some logically separated files. Current file is kind of huge. May be it will be better to have test suite in multiply files? - Python 3.2. Basically all tests pass on Python 3.2 / PyPy3 (based on Python 3.2). But `httpbin` dependency is not compatible with this Python version. But `requests` seems to be compatible. Any plans to keep compatibility with Python 3.2? May be rewrite some tests for Python 3.2 or something? - Jython. Most things are OK on Jython. Only 2 tests are failing - `TestRequests.test_pyopenssl_redirect` and `TestMorselToCookieExpires.test_expires_valid_str`. I don't know detail yet, but I hope, that it won't be a problem to fix it. Is it reasonable to run test suite against Jython? - Travis CI. I saw some CI-related lines in `Makefile`, but I don't see any info about in docs. Can you help me here? Also I can add some Travis CI config to run all configurations on it (including Jython) - Public coverage report. May I add some `codecov/coveralls` integration? Will it be helpful? It is not a complete lists and I'll be glad to hear what do you think about it. Also I'll be happy to write more tests here :) Thanks!
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2991/reactions" }
https://api.github.com/repos/psf/requests/issues/2991/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2991.diff", "html_url": "https://github.com/psf/requests/pull/2991", "merged_at": "2016-02-03T08:26:27Z", "patch_url": "https://github.com/psf/requests/pull/2991.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2991" }
true
[ "Thanks for this! I'll handle answering some questions, but I know that @kennethreitz has got opinions about some of the changes you're making here so I'll let him weigh in to do the actual review work.\n\nTo your first question, I suspect that test is just poorly written.\n\nTo Python 3.2 support, requests explicitly considers Python 3.2 a best-effort platform. There's no reason to continue to support it formally, because Python 3.2 is no longer supported by the PSF: the last release was in 2014. The only reason we appear to be compatible with Python 3.2 is because pip needed to support it, and even pip is planning to drop support.\n\nTo Jython: we'd like to work on Jython, but it's moderately painful for us to get a CI suite for it. It's not unreasonable to run the tests against Jython though.\n\nRe: Travis. We moved away from Travis because our test suite used to run against the real httpbin instance, and Travis' network access was sufficiently flaky that our tests failed a lot. We've therefore been using Jenkins for a while. However, now that our tests are entirely offline I'm interested in moving back to Travis. Of course, that's @kennethreitz's call.\n\nFor coverage, I see no reason to use anything but coverage.py, but again, this is @kennethreitz's area.\n", "- The rewrite of `test_requests.py` looks great to me. I originally wanted our test suite to be fully compatible with unittest's test runner (and it originally was), but over time this slowly became less and less the case. \n- Jython is a very rarely used Python implementation. If someone fixes a Jython-specific bug in a way that doesn't complicate the Requests codebase, I'm happy for us to accept the patch. But, our level of support should (and does) stop there. \n- I don't think having `tox.ini` is very useful, as we won't be using it in any official capacity. Use of `tox.ini` requires that every specified Python installation be available in order to be used effectively, which is quite difficult to do properly on most systems. This project shipped `tox.ini` long ago, but it was removed for a number of reasons. Tox is a wonderful local testing tool, and anyone is free to use it with Requests — that doesn't mean we need to provide the file, though. It adds (yet) another file to the root of the repository, which I would like to keep as clean as possible.\n- I have little opinion regarding coverage, except to echo @Lukasa's `coverage.py` statement. If we can get rid of the `.coveragerc` file, that would be great. Again, I'd like to keep those files to a minimum.\n", "Regarding Travis: we used to utilize Travis, but I quickly grew frustrated with it reporting false negatives for our test suite on a regular basis. I was also frustrated by them randomly removing a few older versions of Python one day, as well. I was also frustrated by all the email notifications it sent me (many regarding forks of kennethreitz/requests). I'm also not a fan of its UI. :)\n\nThat was a long time ago, and I'm sure it's improved quite a bit since then. \n\nOur current CI integration with my Jenkins server seems to be suiting our needs perfectly fine. I personally quite enjoy having a private Jenkins server. However, I wouldn't be against considering a switch back to Travis if there was an obvious benefit to myself, @Lukasa, or @sigmavirus24. I'm not aware of any. Are there any?\n", "Merged. Made some changes. \n- Removed `.coveragerc` (this may be useful to add later).\n- Removed `tox.ini`, see above.\n- Changed the recommended way to run tests (`$ make tests`) back to using `$ py.test test_requests.py` (instead of `$ python setup.py test`). Why? Because this method outputs many lines of noise before running the suite, and creates/requires a `requests.egg-info` directory to be present in my development environment, which I do not need nor want. \n- Removed references to Jython in `setup.py`.\n\nHope none of that sounded critical! I just wanted to document my perspective behind the decisions. Thanks so much for this patch! The improvements to the test suite are great. \n\n:sparkles: :cake: :sparkles:\n", "Thank you for the review and merge! I'm happy to contribute :)\nI still have some questions about tests, but I'll create new issues for that.\n" ]
https://api.github.com/repos/psf/requests/issues/2990
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2990/labels{/name}
https://api.github.com/repos/psf/requests/issues/2990/comments
https://api.github.com/repos/psf/requests/issues/2990/events
https://github.com/psf/requests/issues/2990
130,479,500
MDU6SXNzdWUxMzA0Nzk1MDA=
2,990
Consider simplifying Session.resolve_redirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
3
2016-02-01T20:52:02Z
2021-09-08T19:00:36Z
2016-02-02T06:44:36Z
CONTRIBUTOR
resolved
Currently, the `Session.resolve_redirects` method requires a `Response` to be passed in, as well as a `Request` (the response's originating request). When using this method directly, this is a bit confusing, especially considering that the desired/required `Request` is an attribute of the obviously required `Response`. There must have been some reason behind why I made this design decision, but I'm not seeing it currently.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2990/reactions" }
https://api.github.com/repos/psf/requests/issues/2990/timeline
null
completed
null
null
false
[ "Adding this to 3.0.0 because it would be a fairly significant breaking change. I was also never quite certain why that was required, but I'm not certain it is actively harmful either :/\n", "Yeah, this would be a 3.0.0 change. \n\n@sigmavirus24 I was just trying to use it directly while debugging an infinitely redirecting URL, and I had to look at the source code to figure out how to do so. I just really need to spend some time figuring out why I wrote it that way, there must have been a reason.\n", "Done! https://github.com/kennethreitz/requests/blob/proposed/3.0.0/requests/sessions.py#L88\n" ]
https://api.github.com/repos/psf/requests/issues/2989
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2989/labels{/name}
https://api.github.com/repos/psf/requests/issues/2989/comments
https://api.github.com/repos/psf/requests/issues/2989/events
https://github.com/psf/requests/issues/2989
130,462,974
MDU6SXNzdWUxMzA0NjI5NzQ=
2,989
Any idea if it would be possible to use something like oscrypto instead of the bundled SSL lib for HTTPS?
{ "avatar_url": "https://avatars.githubusercontent.com/u/10779010?v=4", "events_url": "https://api.github.com/users/danechitoaie/events{/privacy}", "followers_url": "https://api.github.com/users/danechitoaie/followers", "following_url": "https://api.github.com/users/danechitoaie/following{/other_user}", "gists_url": "https://api.github.com/users/danechitoaie/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/danechitoaie", "id": 10779010, "login": "danechitoaie", "node_id": "MDQ6VXNlcjEwNzc5MDEw", "organizations_url": "https://api.github.com/users/danechitoaie/orgs", "received_events_url": "https://api.github.com/users/danechitoaie/received_events", "repos_url": "https://api.github.com/users/danechitoaie/repos", "site_admin": false, "starred_url": "https://api.github.com/users/danechitoaie/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/danechitoaie/subscriptions", "type": "User", "url": "https://api.github.com/users/danechitoaie", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-02-01T19:46:29Z
2021-09-08T19:00:34Z
2016-02-10T08:47:09Z
NONE
resolved
Any idea if it would be possible to use something like [oscrypto](https://github.com/wbond/oscrypto) instead of the bundled SSL lib for HTTPS? I need to use this in a Sublime Text 3 plugin where the bundled version of SSL is pretty old and I can't connect to some hosts supporting only TLS1.2. Using the above should fix this issue. Any idea how this can be used for SSL?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2989/reactions" }
https://api.github.com/repos/psf/requests/issues/2989/timeline
null
completed
null
null
false
[ "@danechitoaie Let's split this into two questions.\n\nFirst, the problem with OpenSSL. If you can use a third-party library, you should be able to install `PyOpenSSL`, `ndg-httpsclient`, and `pyasn1`. If the `cryptography` library that backs PyOpenSSL is linked against a newer version of OpenSSL than the bundled one (or if you're on Windows or OS X, where `cryptography` is statically linked against a modern OpenSSL).\n\nUsing oscrypto instead is extremely tricky. Requests cannot add support for this in the mainline. You could in principle add support for this in urllib3, but that won't happen quickly. PyOpenSSL is probably your best bet here.\n", "I can't use PyOpenSSL as that requires compilation and it would not work as a SublimeText3 plugin as they have their own version of python bundled that executes the plugins code. So you can't use pip, easy_install, etc... you can only use \"pure\" python code that doesn't have external dependencies.\nOscrypto would work as it's just Python code (nothing to compile) that uses the native OS SSL lib trough CFFI. \n\nI understand that it can't be added to requests, I'm looking at maybe some pointers or ideas on how/if this could be implemented, even if it means I have to change some code in requests or urllib3 (private fork). I know it doesn't sound good but it's kind of the only option.\n", "@danechitoaie PyOpenSSL and oscrypto both use the exact same backing library. PyOpenSSL has not required compilation for more than a year now.\n", "Hmm, then I need to look more into PyOpenSSL then. Maybe I can just use that. Thanks for the input.\n" ]
https://api.github.com/repos/psf/requests/issues/2988
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2988/labels{/name}
https://api.github.com/repos/psf/requests/issues/2988/comments
https://api.github.com/repos/psf/requests/issues/2988/events
https://github.com/psf/requests/issues/2988
130,289,799
MDU6SXNzdWUxMzAyODk3OTk=
2,988
Proxy bypass detection could add delay of ~4.5 seconds to each request
{ "avatar_url": "https://avatars.githubusercontent.com/u/238652?v=4", "events_url": "https://api.github.com/users/schlamar/events{/privacy}", "followers_url": "https://api.github.com/users/schlamar/followers", "following_url": "https://api.github.com/users/schlamar/following{/other_user}", "gists_url": "https://api.github.com/users/schlamar/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/schlamar", "id": 238652, "login": "schlamar", "node_id": "MDQ6VXNlcjIzODY1Mg==", "organizations_url": "https://api.github.com/users/schlamar/orgs", "received_events_url": "https://api.github.com/users/schlamar/received_events", "repos_url": "https://api.github.com/users/schlamar/repos", "site_admin": false, "starred_url": "https://api.github.com/users/schlamar/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/schlamar/subscriptions", "type": "User", "url": "https://api.github.com/users/schlamar", "user_view_type": "public" }
[ { "color": "02e10c", "default": false, "description": null, "id": 76800, "name": "Feature Request", "node_id": "MDU6TGFiZWw3NjgwMA==", "url": "https://api.github.com/repos/psf/requests/labels/Feature%20Request" }, { "color": "0b02e1", "default": false, "description": null, "id": 191274, "name": "Contributor Friendly", "node_id": "MDU6TGFiZWwxOTEyNzQ=", "url": "https://api.github.com/repos/psf/requests/labels/Contributor%20Friendly" } ]
closed
true
null
[]
null
27
2016-02-01T08:16:17Z
2021-09-08T11:00:35Z
2017-03-04T14:53:00Z
CONTRIBUTOR
resolved
Requests uses `urlib.proxy_bypass` to check if the proxy should be bypassed. This uses `socket.gethostbyaddr` to check if the provided IP matches a hostname entered in the proxy bypass list. In some cases, `gethostbyaddr` takes up to 4.5 seconds on Windows (e.g. when the IP is found but the primary DNS doesn't recognize it). Currently, the proxy bypass detection is done for _every_ request, making it pretty unusable when the conditions are met where `gethostbyaddr` is this slow. A reasonable solution might be caching the results from `urlib.proxy_bypass`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2988/reactions" }
https://api.github.com/repos/psf/requests/issues/2988/timeline
null
completed
null
null
false
[ "Yeah, 4.5 seconds is probably too long.\n\nThe concern with caching the result is that the resolver doesn't give enough information for us to invalidate that cache appropriately, but we will from time to time want to invalidate it (so that long running processes don't end up so broken that they need a restart).\n\nA simple approach here might be to use an `@memoize` decorator with a cache that invalidates entries after a certain amount of time. Given that [`functools.lru_cache`](https://docs.python.org/3/library/functools.html#functools.lru_cache) provides a `cache_clear` function, we may be able to provide a very simple wrapper function that handles this logic.\n", "May I try this one :-)?\n\nMy plan for implementation:\n1. wrap the `proxy_bypass` by our function like `cached_proxy_bypass`\n2. cache our wrapper function\n\n``` python\n @functools.lru_cache(maxsize=32)\n def cached_proxy_bypass(netloc):\n return proxy_bypass(netloc)\n```\n\nThis can cache all the function call to `proxy_bypass` with the same `netloc` as long as they are in the latest 32 calls. However, if we want to invalidate the same `netloc` from time to time, we might maintain a counter, such that we define the wrapper function like this\n\n``` python\n@functools.lru_cache(maxsize=32)\ndef cached_proxy_bypass(netloc, _):\n return proxy_bypass(netloc)\n```\n\nand call it like\n\n``` python\ncached_proxy_bypass(netloc, int(counter / 5)); counter += 1;\n```\n\n5 is a magic number here, which can be any other number, such that lru will invalidate the result every 5 calls with the same `netloc` (because for every 5 calls, the passing arguments are the same, and thus lru will cache them).\n\nHow do you think?\n\nThanks a lot.\n", "@lazywei That could definitely work. The issue there is that the cache gets busted after every _n_ calls regardless of how fast they happen or how broadly distributed they are. That seems problematic.\n\nAnother option would be to try getting a unix timestamp, dividing that by some interval (e.g. 60 seconds), and then using that as the 'counter'.\n", "@Lukasa Oh, thanks for your advice! Using timestamp is brilliant!\nI have made the first attempt and opened the PR!\n\nThanks!\n", "Has this been solved?\n", "@TetraEtc The pull-request is still open, I'm afraid.\n", "Sorry I was busy and distracted last few weeks. I'll work on it this weekend. Hope I can deliver some progress by that time.\n", "How is the progress on this issue?\n\nWhich workarounds (besides turning off proxy settings in Windows) are known? \n", "Right now there is a proposed patch in #2992 that is currently outstanding. Work would be needed to evaluate whether the patch there is suitable. Otherwise, the only work arounds involve monkeypatching the standard library to affect the results of those function calls.\n", "Isn´t there a way to tell 'requests' to completly ignore proxy settings instead of checking which URLs/IPs can bypass it? \n", "Yes, you can do this:\n\n``` python\nimport requests\ns = requests.Session()\ns.trust_env = False\n```\n\nThis will prevent requests getting any information from its environment: specifically, it'll disable environment searches for proxies and for certificate bundles.\n", "@Lukasa : Thanks! That´s exactly what I thought of. I tried to implement your suggestion into [robotframework-requests](https://github.com/bulkan/robotframework-requests) (see [PR 117](https://github.com/bulkan/robotframework-requests/pull/117)) but was not successful - still have a delay of approx. 5 seconds. But I am just starting out with Python and programming at all, so I am very sure that I missed something :(\n", "Your patch there won't work.\n\nYou have this line: `s.trust_env = trust_env if trust_env else s.trust_env`.\n\nThis means that if you try to set `trust_env` to `False`, `trust_env` is not `True` and so you set `s.trust_env` to `s.trust_env`. In essence:\n\n``` python\n# If trust_env == True\ns.trust_env = trust_env\n\n# If trust_env == False\ns.trust_env = s.trust_env\n```\n\nThis means that there's no way to actually _set_ `trust_env` to `False`. Change line 82 to `s.trust_env = trust_env if trust_env is not None else s.trust_env`, and then change line 134 to default `trust_env` to `None`, not `True`.\n", "@Lukasa : I knew I missed somethink :). Thanks again! I fixed it now but still see a delay :( (more details [here](https://github.com/bulkan/robotframework-requests/issues/116#issuecomment-224866699))\n", "Just to be sure: you did _actually_ end up calling with trust_env=False in your own code, yes?\n", "Yes, at least Robotframework log tells so\n\n![image](https://cloud.githubusercontent.com/assets/5635462/15931681/0d907b86-2e59-11e6-9846-94a3503b6c22.png)\n", "@Tset-Noitamotua And you're entirely confident that the session is being used?\n", "Hi @Lukasa, \n\nWe are using the [requests.Session](https://github.com/bulkan/robotframework-requests/blob/master/src/RequestsLibrary/RequestsKeywords.py#L78)\n\n[Get request](https://github.com/bulkan/robotframework-requests/blob/master/src/RequestsLibrary/RequestsKeywords.py#L840)\n", "@bulkan Sorry, my question wasn't clear. I meant: are we really sure that the request that is being delayed is going through the `Session` constructed with `trust_env=False`?\n", "Hi @Lukasa, @bulkan \nhow can I help so that we make progress on this issue. Please give me some noob friendly instructions how to verify whether the used session is the expected one :-)\n", "@Tset-Noitamotua You'll need some way to evaluate the value of `trust_env` while the request is hanging, or about to hang. \n", "@Lukasa Thank you! Did as you said and it came out value of `trust_env` was string represantation of `False` ... thus it was `'False'` (If you interessted [here](https://github.com/bulkan/robotframework-requests/issues/116#issuecomment-230270721) are more details).\n", "@Lukasa Is this issue still open? Would be willing to take a shot at it using the `@functools.lru_cache` and time-stamping approach.", "As far as I know, yes.", "@schlamar would you be able to review?", "Looks like this was fixed in #3885.", "@sigmavirus24 Actually, an expiration time of 60 seconds doesn't help at all for my original use case. I have a long running GUI process which (might) does HTTP requests every few minutes. So while the expiration time is this low and not adjustable, this issue is not fixed for me.\r\n\r\nHowever, thinking again about this issue, I would say that `urllib` is doing it wrong. Forward and reverse DNS lookups to check if a proxy should be bypassed doesn't feel right." ]
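The issue 2988 thread above notes that "a simple TTL cache can be implemented in 50-80 LOC or even less". A minimal dict-based sketch of such a decorator is shown below. This is illustrative only: the names, the default TTL, and the injectable `clock` parameter are all hypothetical, and thread safety and eviction of expired entries are deliberately ignored for brevity.

```python
import time


def ttl_cache(ttl=60.0, clock=time.monotonic):
    """Minimal time-to-live memoization decorator (illustrative sketch).

    A cached result is reused until `ttl` seconds have passed, after
    which the wrapped function is called again -- so a slow check such
    as urllib's proxy_bypass is paid at most once per TTL window.
    """
    def decorator(func):
        cache = {}  # args tuple -> (expires_at, result)

        def wrapper(*args):
            now = clock()
            entry = cache.get(args)
            if entry is not None and entry[0] > now:
                return entry[1]  # still fresh: reuse the stored result
            result = func(*args)
            cache[args] = (now + ttl, result)
            return result

        wrapper.cache = cache  # exposed so callers can inspect or clear it
        return wrapper

    return decorator
```

With something like this in place, the bypass check could be wrapped as, e.g., `cached_proxy_bypass = ttl_cache(ttl=600)(urllib.request.proxy_bypass)`. Making the TTL a parameter matters here: as schlamar's closing comment points out, a fixed 60-second expiry does not help a long-running process that only issues a request every few minutes.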
https://api.github.com/repos/psf/requests/issues/2987
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2987/labels{/name}
https://api.github.com/repos/psf/requests/issues/2987/comments
https://api.github.com/repos/psf/requests/issues/2987/events
https://github.com/psf/requests/issues/2987
130,101,833
MDU6SXNzdWUxMzAxMDE4MzM=
2,987
Enhanced form-data serialization
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
[]
closed
false
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
{ "closed_at": null, "closed_issues": 29, "created_at": "2024-05-19T18:29:04Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/34", "id": 11073254, "labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels", "node_id": "MI_kwDOABTKOs4AqPbm", "number": 34, "open_issues": 0, "state": "open", "title": "Bankruptcy", "updated_at": "2024-05-20T14:37:16Z", "url": "https://api.github.com/repos/psf/requests/milestones/34" }
8
2016-01-31T08:41:47Z
2024-05-20T14:36:28Z
2024-05-20T14:36:27Z
CONTRIBUTOR
null
Would be great to include the functionality of `requests_toolbelt.utils.formdata.urlencode` in Requests. http://toolbelt.readthedocs.org/en/latest/formdata.html Thoughts? /cc @Lukasa @sigmavirus24
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2987/reactions" }
https://api.github.com/repos/psf/requests/issues/2987/timeline
null
completed
null
null
false
[ "No objection from me, but the tool belt is @sigmavirus24's baby so he may have a stronger position,\n", "To be clear, I mean making that the default behavior. \n", "I think I'm the only contributing author of that code and both requests and the toolbelt are Apache-2.0.\n\nIf we want to move that into requests by default, we absolutely can and I'm happy to do that. We'll leave it in the toolbelt for people who are using requests prior to whatever version we ship that in.\n", "> I think I'm the only contributing author of that code and both requests and the toolbelt are Apache-2.0.\n\nI only bring this up because copyright is a huge PITA ;)\n", "That's true, but I think that definitionally requests can use that code by reproducing the license: any other contributors have by definition made that code available under Apache 2.0.\n\nOtherwise, I'm happy to do this. I'll happily do the (probably cursory) code review. \n\nThis is another great use of the toolbelt by the way: letting ideas marinate and get tested outside the core codebase. \n", "> This is another great use of the toolbelt by the way: letting ideas marinate and get tested outside the core codebase. \n\nMy thoughts exactly.\n\n> Otherwise, I'm happy to do this. I'll happily do the (probably cursory) code review. \n\nI think you did the original CR too ;)\n", "> I think you did the original CR too ;)\n\nOh god in that case we should do a really thorough code review!\n", "In an effort to clean up the issue tracker to only have issues that are still relevant to the project we've done a quick pass and decided this issue may no longer be relevant for a variety of potential reasons, including:\r\n\r\n* Applies to a much older version, unclear whether the issue still applies.\r\n* Change requires a backwards incompatible release and it's unclear if the benefits are worth the migration effort from the community.\r\n* There isn't a clear demand from the community on the change landing in Requests.\r\n\r\nIf you think the issue should remain open, please comment so below or open a new issue and link back to the original issue. Again, thank you for opening the issue and for the discussion, it's much appreciated." ]
https://api.github.com/repos/psf/requests/issues/2986
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2986/labels{/name}
https://api.github.com/repos/psf/requests/issues/2986/comments
https://api.github.com/repos/psf/requests/issues/2986/events
https://github.com/psf/requests/pull/2986
130,024,126
MDExOlB1bGxSZXF1ZXN0NTc3NDczMDM=
2,986
Fix syntax error
{ "avatar_url": "https://avatars.githubusercontent.com/u/837573?v=4", "events_url": "https://api.github.com/users/untitaker/events{/privacy}", "followers_url": "https://api.github.com/users/untitaker/followers", "following_url": "https://api.github.com/users/untitaker/following{/other_user}", "gists_url": "https://api.github.com/users/untitaker/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/untitaker", "id": 837573, "login": "untitaker", "node_id": "MDQ6VXNlcjgzNzU3Mw==", "organizations_url": "https://api.github.com/users/untitaker/orgs", "received_events_url": "https://api.github.com/users/untitaker/received_events", "repos_url": "https://api.github.com/users/untitaker/repos", "site_admin": false, "starred_url": "https://api.github.com/users/untitaker/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/untitaker/subscriptions", "type": "User", "url": "https://api.github.com/users/untitaker", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-01-30T18:56:10Z
2021-09-08T05:00:59Z
2016-01-30T19:14:20Z
CONTRIBUTOR
resolved
null
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2986/reactions" }
https://api.github.com/repos/psf/requests/issues/2986/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2986.diff", "html_url": "https://github.com/psf/requests/pull/2986", "merged_at": "2016-01-30T19:14:20Z", "patch_url": "https://github.com/psf/requests/pull/2986.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2986" }
true
[ "Thanks @untitaker!\n", "oops\n" ]
https://api.github.com/repos/psf/requests/issues/2985
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2985/labels{/name}
https://api.github.com/repos/psf/requests/issues/2985/comments
https://api.github.com/repos/psf/requests/issues/2985/events
https://github.com/psf/requests/pull/2985
129,987,754
MDExOlB1bGxSZXF1ZXN0NTc3NDAwNjg=
2,985
Fixed markup for/rendering of list in section "Custom Headers".
{ "avatar_url": "https://avatars.githubusercontent.com/u/95277?v=4", "events_url": "https://api.github.com/users/homeworkprod/events{/privacy}", "followers_url": "https://api.github.com/users/homeworkprod/followers", "following_url": "https://api.github.com/users/homeworkprod/following{/other_user}", "gists_url": "https://api.github.com/users/homeworkprod/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/homeworkprod", "id": 95277, "login": "homeworkprod", "node_id": "MDQ6VXNlcjk1Mjc3", "organizations_url": "https://api.github.com/users/homeworkprod/orgs", "received_events_url": "https://api.github.com/users/homeworkprod/received_events", "repos_url": "https://api.github.com/users/homeworkprod/repos", "site_admin": false, "starred_url": "https://api.github.com/users/homeworkprod/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/homeworkprod/subscriptions", "type": "User", "url": "https://api.github.com/users/homeworkprod", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-01-30T13:45:23Z
2021-09-08T05:01:01Z
2016-01-30T14:00:14Z
CONTRIBUTOR
resolved
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2985/reactions" }
https://api.github.com/repos/psf/requests/issues/2985/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2985.diff", "html_url": "https://github.com/psf/requests/pull/2985", "merged_at": "2016-01-30T14:00:14Z", "patch_url": "https://github.com/psf/requests/pull/2985.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2985" }
true
[ "Thanks! :sparkles: :cake: :sparkles:\n", "You're welcome. Glad I could help :)\n" ]
https://api.github.com/repos/psf/requests/issues/2984
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2984/labels{/name}
https://api.github.com/repos/psf/requests/issues/2984/comments
https://api.github.com/repos/psf/requests/issues/2984/events
https://github.com/psf/requests/pull/2984
129,912,566
MDExOlB1bGxSZXF1ZXN0NTc3MTg0MjE=
2,984
Remove int type on HTTPAdapter's max_retries argument doc
{ "avatar_url": "https://avatars.githubusercontent.com/u/46059?v=4", "events_url": "https://api.github.com/users/carsonyl/events{/privacy}", "followers_url": "https://api.github.com/users/carsonyl/followers", "following_url": "https://api.github.com/users/carsonyl/following{/other_user}", "gists_url": "https://api.github.com/users/carsonyl/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/carsonyl", "id": 46059, "login": "carsonyl", "node_id": "MDQ6VXNlcjQ2MDU5", "organizations_url": "https://api.github.com/users/carsonyl/orgs", "received_events_url": "https://api.github.com/users/carsonyl/received_events", "repos_url": "https://api.github.com/users/carsonyl/repos", "site_admin": false, "starred_url": "https://api.github.com/users/carsonyl/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/carsonyl/subscriptions", "type": "User", "url": "https://api.github.com/users/carsonyl", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-29T23:41:23Z
2021-09-08T05:00:59Z
2016-01-30T02:59:04Z
CONTRIBUTOR
resolved
The argument description says `max_retries` can be an int, or a `urllib3.util.Retry` object. However, the type declared in the doc restricts it to int only. Passing a Retry object causes a warning in PyCharm code inspection. The behaviour described for `max_retries` is accurate: HTTPAdapter passes `max_retries` to `Retry.from_int()`, which just returns the input if `max_retries` is a Retry object.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2984/reactions" }
https://api.github.com/repos/psf/requests/issues/2984/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2984.diff", "html_url": "https://github.com/psf/requests/pull/2984", "merged_at": "2016-01-30T02:59:04Z", "patch_url": "https://github.com/psf/requests/pull/2984.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2984" }
true
[ "Thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2983
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2983/labels{/name}
https://api.github.com/repos/psf/requests/issues/2983/comments
https://api.github.com/repos/psf/requests/issues/2983/events
https://github.com/psf/requests/issues/2983
129,640,029
MDU6SXNzdWUxMjk2NDAwMjk=
2,983
First line of uploading files disappears.
{ "avatar_url": "https://avatars.githubusercontent.com/u/11981676?v=4", "events_url": "https://api.github.com/users/amdei/events{/privacy}", "followers_url": "https://api.github.com/users/amdei/followers", "following_url": "https://api.github.com/users/amdei/following{/other_user}", "gists_url": "https://api.github.com/users/amdei/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/amdei", "id": 11981676, "login": "amdei", "node_id": "MDQ6VXNlcjExOTgxNjc2", "organizations_url": "https://api.github.com/users/amdei/orgs", "received_events_url": "https://api.github.com/users/amdei/received_events", "repos_url": "https://api.github.com/users/amdei/repos", "site_admin": false, "starred_url": "https://api.github.com/users/amdei/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/amdei/subscriptions", "type": "User", "url": "https://api.github.com/users/amdei", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-01-29T01:56:38Z
2021-09-08T19:00:37Z
2016-01-29T17:24:35Z
NONE
resolved
requests 2.9.1 Executing of this code (from samples in documentation): `import requests url = 'http://192.168.88.164:8000/' files = {'file': ('conf.txt', 'some, data,to send\r\nnanother,row,to,send\r\nyet another,row,to,send\r\n')} r = requests.post(url, files=files) print r.text` Results in correctly uploaded file, but without first line. I.e. part 'some, data,to send\r\n' is missed. Same with actoual file upload: `import requests url = 'http://192.168.88.164:8000/' files = {'file': ('conf.txt', open('conf.txt', 'rb'))} r = requests.post(url, files=files) print r.text` First line disappears from uploaded content. The same file uploaded to the same server with curl: `curl -v -X POST -F [email protected] http://192.168.88.164:8000` do not such an issue. Issue seems to be in incorrect encoding - extra \r\n where missed somewhere.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2983/reactions" }
https://api.github.com/repos/psf/requests/issues/2983/timeline
null
completed
null
null
false
[ "Sorry, but I can't reproduce this using a publicly available website. If I run this code:\n\n``` python\nimport requests\nurl = 'http://http2bin.org/post'\nfiles = {'file': ('conf.txt', 'some, data,to send\\r\\nnanother,row,to,send\\r\\nyet another,row,to,send\\r\\n')}\nr = requests.post(url, files=files)\nprint r.content\n```\n\nthen the terminal prints:\n\n```\n{\n \"args\": {}, \n \"data\": \"\", \n \"files\": {\n \"file\": \"some, data,to send\\r\\nnanother,row,to,send\\r\\nyet another,row,to,send\\r\\n\"\n }, \n \"form\": {}, \n \"headers\": {\n \"Accept\": \"*/*\", \n \"Accept-Encoding\": \"gzip, deflate\", \n \"Connection\": \"keep-alive\", \n \"Content-Length\": \"211\", \n \"Content-Type\": \"multipart/form-data; boundary=83f1175700aa4a2b8a49111a87fce425\", \n \"Host\": \"http2bin.org\", \n \"User-Agent\": \"python-requests/2.9.1\", \n \"Via\": \"1.1 http2bin.org\"\n }, \n \"json\": null, \n \"origin\": \"87.115.225.178\", \n \"url\": \"http://http2bin.org/post\"\n}\n```\n\nAs you can see, all the data is present. The actual HTTP request on the wire looked like this:\n\n```\nPOST /post HTTP/1.1\nHost: http2bin.org\nContent-Length: 211\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: python-requests/2.9.1\nConnection: keep-alive\nContent-Type: multipart/form-data; boundary=83f1175700aa4a2b8a49111a87fce425\n\n--83f1175700aa4a2b8a49111a87fce425\nContent-Disposition: form-data; name=\"file\"; filename=\"conf.txt\"\n\nsome, data,to send\nnanother,row,to,send\nyet another,row,to,send\n\n--83f1175700aa4a2b8a49111a87fce425--\n```\n\nThis all seems correct to me. Are you sure your server is handling this appropriately?\n", "Indeed, issue was on server-side.\nWhen requests send string, there is no 'Content-Type' field, and it confused server so it ate first line of uploaded file.\n\nThanks for great library!\nSorry for hesitate.\n", "No worries @amdei \n" ]
https://api.github.com/repos/psf/requests/issues/2982
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2982/labels{/name}
https://api.github.com/repos/psf/requests/issues/2982/comments
https://api.github.com/repos/psf/requests/issues/2982/events
https://github.com/psf/requests/issues/2982
129,265,819
MDU6SXNzdWUxMjkyNjU4MTk=
2,982
Error 404 for url, that contains relative path parts
{ "avatar_url": "https://avatars.githubusercontent.com/u/3865300?v=4", "events_url": "https://api.github.com/users/Lol4t0/events{/privacy}", "followers_url": "https://api.github.com/users/Lol4t0/followers", "following_url": "https://api.github.com/users/Lol4t0/following{/other_user}", "gists_url": "https://api.github.com/users/Lol4t0/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lol4t0", "id": 3865300, "login": "Lol4t0", "node_id": "MDQ6VXNlcjM4NjUzMDA=", "organizations_url": "https://api.github.com/users/Lol4t0/orgs", "received_events_url": "https://api.github.com/users/Lol4t0/received_events", "repos_url": "https://api.github.com/users/Lol4t0/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lol4t0/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lol4t0/subscriptions", "type": "User", "url": "https://api.github.com/users/Lol4t0", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" } ]
closed
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2024-05-19T18:29:04Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }, "description": "", "due_on": null, "html_url": "https://github.com/psf/requests/milestone/34", "id": 11073254, "labels_url": "https://api.github.com/repos/psf/requests/milestones/34/labels", "node_id": "MI_kwDOABTKOs4AqPbm", "number": 34, "open_issues": 0, "state": "open", "title": "Bankruptcy", "updated_at": "2024-05-20T14:37:16Z", "url": "https://api.github.com/repos/psf/requests/milestones/34" }
39
2016-01-27T20:55:05Z
2024-05-20T14:34:59Z
2024-05-20T14:34:59Z
NONE
null
Browser & other tools seem to ignore trailing dot in the URL. Unfortunately `requests` does not. Compare ``` $ curl -s https://github.com/. -o /dev/null -w "%{http_code}"; echo 200 ``` With ``` $ python -c 'import requests; print(requests.get("https://github.com/.").status_code)' 404 ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2982/reactions" }
https://api.github.com/repos/psf/requests/issues/2982/timeline
null
completed
null
null
false
[ "requests/api.py - request()\n\n```\nwith sessions.Session() as session:\n return session.request(method=method, url=url.rstrip(\".\"), **kwargs)\n```\n\nNot sure if this matches any style guides, but should help solve this issue\n", "@TetraEtc that's overly general. While curl and browsers detect the `.` as the sole element of the path, they don't do anything fancy for `https://github.com/foo.` which yields a 404.\n\nFurther Requests is an HTTP client, not an oracle or a browser. We can't know what URLs to meaningfully trim/munge/etc. unless a specification tells us to do so. Even then I'd air on the side of not doing this behind the scenes for the user.\n", "You are not exactly right.\n\nhttps://github.com/kennethreitz/requests/. works.\nEven https://github.com/kennethreitz/blahblah/../requests/ works.\n\nThat is called relative URIs\n\nSee [some RFC](http://tools.ietf.org/html/rfc2396#section-5)\n", "@Lol4t0 \nFrom the RFC you referenced (which is btw deprecated, the new one says the [same](http://tools.ietf.org/html/rfc3986#section-3.3)):\n\n```\n Although this is\n very similar to their use within Unix-based filesystems to indicate\n directory levels, these path components are only considered special\n when resolving a relative-path reference to its absolute form\n```\n\n`only considered special when resolving a relative-path...`. requests does not resolve a path, it uses is as is.\n", "It have to resolve at least to perform redirects\n28 янв. 2016 г. 
2:51 пользователь \"Thomas Weißschuh\" <\[email protected]> написал:\n\n> @Lol4t0 https://github.com/Lol4t0\n> From the RFC you referenced (which is btw deprecated, the new one says the\n> same http://tools.ietf.org/html/rfc3986#section-3.3):\n> \n> Although this is\n> very similar to their use within Unix-based filesystems to indicate\n> directory levels, these path components are only considered special\n> when resolving a relative-path reference to its absolute form\n> \n> only considered special when resolving a relative-path.... requests does\n> not resolve a path, it uses is as is.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2982#issuecomment-175914604\n> .\n", "@Lol4t0 as @t-8ch points out, relative-references are not what requests accepts (we only accept absolute references which is what you passed us).\n\n> It have to resolve at least to perform redirects\n\nWhat?\n", "@Lol4t0 those links work in your browser (reminder, requests is not a browser)\n\n```\n❯❯❯ curl -i -vv -s 'https://github.com/kennethreitz/requests.'\n* Trying 192.30.252.128...\n* Connected to github.com (192.30.252.128) port 443 (#0)\n* TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256\n* Server certificate: github.com\n* Server certificate: DigiCert SHA2 Extended Validation Server CA\n* Server certificate: DigiCert High Assurance EV Root CA\n> GET /kennethreitz/requests. 
HTTP/1.1\n> Host: github.com\n> User-Agent: curl/7.43.0\n> Accept: */*\n>\n< HTTP/1.1 404 Not Found\nHTTP/1.1 404 Not Found\n< Server: GitHub.com\nServer: GitHub.com\n< Date: Wed, 27 Jan 2016 23:55:32 GMT\nDate: Wed, 27 Jan 2016 23:55:32 GMT\n< Content-Type: application/json; charset=utf-8\nContent-Type: application/json; charset=utf-8\n< Transfer-Encoding: chunked\nTransfer-Encoding: chunked\n< Status: 404 Not Found\nStatus: 404 Not Found\n< Cache-Control: no-cache\nCache-Control: no-cache\n< Vary: X-PJAX\nVary: X-PJAX\n< X-UA-Compatible: IE=Edge,chrome=1\nX-UA-Compatible: IE=Edge,chrome=1\n< X-Request-Id: 312fc05db0965f607e692a21c1217b43\nX-Request-Id: 312fc05db0965f607e692a21c1217b43\n< X-Runtime: 0.007642\nX-Runtime: 0.007642\n< Content-Security-Policy: default-src 'none'; base-uri 'self'; connect-src 'self'; form-action 'self'; img-src data:; script-src 'self'; style-src 'unsafe-inline'\nContent-Security-Policy: default-src 'none'; base-uri 'self'; connect-src 'self'; form-action 'self'; img-src data:; script-src 'self'; style-src 'unsafe-inline'\n< Strict-Transport-Security: max-age=31536000; includeSubdomains; preload\nStrict-Transport-Security: max-age=31536000; includeSubdomains; preload\n< Public-Key-Pins: max-age=300; pin-sha256=\"WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18=\"; pin-sha256=\"JbQbUG5JMJUoI6brnx0x3vZF6jilxsapbXGVfjhN8Fg=\"; includeSubDomains\nPublic-Key-Pins: max-age=300; pin-sha256=\"WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18=\"; pin-sha256=\"JbQbUG5JMJUoI6brnx0x3vZF6jilxsapbXGVfjhN8Fg=\"; includeSubDomains\n< X-Content-Type-Options: nosniff\nX-Content-Type-Options: nosniff\n< X-Frame-Options: deny\nX-Frame-Options: deny\n< X-XSS-Protection: 1; mode=block\nX-XSS-Protection: 1; mode=block\n< X-GitHub-Request-Id: 48A0CA7F:4757:60D2E69:56A958F4\nX-GitHub-Request-Id: 48A0CA7F:4757:60D2E69:56A958F4\n\n<\n* Connection #0 to host github.com left intact\n{\"error\":\"Not Found\"}\n```\n\nThey don't work with curl though.\n", "Server can return 
you a relative URL as redirect target with 302 for\nexample. How are you going to handle that?\nL\n28 Jan 2016, 2:55, user \"Ian Cordasco\" [email protected]\nwrote:\n\n> @Lol4t0 https://github.com/Lol4t0 as @t-8ch https://github.com/t-8ch\n> points out, relative-references are not what requests accepts (we only\n> accept absolute references which is what you passed us).\n> \n> It have to resolve at least to perform redirects\n> \n> What?\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2982#issuecomment-175915919\n> .\n", "Note even if we did start to use [rfc3986](/sigmavirus24/rfc3986) this wouldn't be handled because (as discussed above) these are not relative references. These are absolute references. \n", "@Lol4t0 relative Location headers are different. We already handle relative Location headers although I don't think we've tried handling any with a `.` or `..` in them. You're changing the bug report though. User-input will not be modified. We'll investigate our handling of relative Location headers with `.` or `..` in them.\n", "1. May be you copy-paste correctly in curl and try again? (`requests. != requests/.`, first is not relative in any sense)\n2. Those are valid URLs. User input should not be changed, but URL should be simplified before sending request. That is not any different from URL encoding\n\nI believe requests should work consistently with other tools anyway. Note that curl is the same level to requests, and it simplifies urls. More, handling Location and direct input differently makes requests inconsistent with itself.\n", "> More, handling Location and direct input differently makes requests inconsistent with itself.\n\nI completely disagree. One (the Location header provided by the server) is an _actual_ relative reference. The one you're providing is an absolute reference. 
There's nothing inconsistent with resolving relative references provided by the server and not munging an absolute reference provided by the user.\n", "> absolute reference provided by the user\n\nIf you look into 5.1, 5.2.2, you'll see, that to resolve an URL it should first be converted to absolute form, and that relative URL can contain scheme, host, etc. Strict rules of conversion also provided, so you didn't have to be oracle to make conversion.\n\nIf you tell me that you are not going to resolve URLs, I'm asking _who_ should do it? Doesn't it mean that _every_ URL that has some sort of foreign input (i.e. user input, configuration file, obtained as a result of another request, so, just _most_ of) should be resolved _additionally_ before using requests, that is sort of _super verbose and should not be in python_?\n", "This is a remarkably tricky issue, but in general I'm sympathetic to the idea that requests should normalise the URL as much as possible when provided by the user or when received in the Location header.\n\nI appreciate that generally speaking we try not to manipulate the URLs provided by users _too_ extensively, but we do still play with them: we try to urlencode partially encoded URLs and generally just try to make sense of what the remote party has sent us. For that reason, I also think we should take a similar attitude to partial paths.\n\nHowever, there are some reasons to be tentative here. The first is that this amounts to a major change, which would mean it has to go in to 3.0.0. The second is that if we're determined to start handling URLs \"properly\", we should aim to handle them _really_ properly. 
I believe @glyph is working on a URL handling library, but generally speaking I think we should be aiming to bring in something that lets us work with URLs more effectively so that we can do something that amounts to appropriate behaviour.\n\nThe upshot means that, if @sigmavirus24 agrees with me, we'll need to put this on the backburner until we get a library that can let us work with URLs more effectively.\n", "Could you please refer to the @glyph 's repository so I could use it meanwhile it is not integrated into requests or participate in its development may be\n", "Currently I think @glyph is developing it as part of Twisted, so I'm honestly not sure where it lives. However, I'm sure @glyph would be delighted to weigh into this thread at some time today. =)\n", "It's been part of a Twisted release, in fact, so you can `pip install twisted` and play with it.\n\nYou can see the documentation here:\n\nhttps://twistedmatrix.com/documents/15.5.0/api/twisted.python.url.URL.html\n\nThis module is part of Twisted currently, but:\n1. the plan is to spin it out into a separate library, and release it on PyPI (probably as `pip install iri`, since it is concerned largely with proper representation and transcoding of internationalized resource identifiers)\n2. the module is already totally self-contained; since `requests` already vendors absolutely everything, feel free to vendor this code out of a Twisted source release; it will work fine. It even goes so far as to avoid importing Twisted's own `TestCase` subclass, so that the unit tests will be portable.\n3. I wanted to leave it in Twisted to let some people experiment with it before putting the API in stone, since I assume if we actually do a separate PyPI release it would firm up the interface. However, thus far I've been very happy with the interface, and so if requests wanted to make use of it as-is, I'd be happy to harden its compatibility guarantees so you can pass them on to your users.\n", "Thanks @glyph. 
I'll try to play around with this at some point and see if it's useful for requests.\n", "> I appreciate that generally speaking we try not to manipulate the URLs provided by users too extensively, but we do still play with them: we try to urlencode partially encoded URLs and generally just try to make sense of what the remote party has sent us. For that reason, I also think we should take a similar attitude to partial paths.\n\nThe way that `URL` deals with this tension is as follows:\n\nWhen you call `URL.fromText`, the parser attempts to retain as much information as possible from the text that you have passed to it. `URL` intentionally does _not_ have a `fromBytes` method, because at a bare minimum, URLs always need to be representable as text that can be displayed to a human, because you have to put these things into URL bars or logs or somesuch, and those are textual data.\n\nIf you want to \"promote\" your URI to be more textually comprehensible, at the expense of losing some information about the input bytes, you can do `URI.asIRI()`, which will, to the extent possible, decode all percent-encodings and de-UTF-8 all the resulting octets. The resulting URI is also textual, but follows the rules for encoding of non-ASCII characters.\n\nSimilarly, if you have some text a user typed and you need to ensure that it is ASCII-only for network serialization, you can call `.asURI()` on it.\n\n`.asIRI()` has the subtle nuance of decoding any %-encoded ASCII-only characters as well, including the _extra_ subtle nuance of changing the _internal_ representation of `%2F` to `/`, so that applications can access completely decoded text in each segment. 
However, since `/` cannot be serialized, `.asText()` will still %-encode slashes on the fly (without modifying the internal state of the `URL` being serialized; `URL` objects are completely immutable).\n\nThis is actually a restatement and expansion of the final part of the class docstring, and the rule is very simple, but its implications can be slightly confusing. See for example this (ultimately invalid) bug report, which goes into some more detail: https://twistedmatrix.com/trac/ticket/8178\n", "> Thanks @glyph. I'll try to play around with this at some point and see if it's useful for requests.\n\nThanks for considering this code to solve this problem, @Lukasa. I'm optimistic you'll like it; we've been screwing up URLs real bad for a real long time in Twisted, and the design on this one really clicked together nicely. The one part that might seem a little odd to Python programmers at first is the totally functional interface, but being able to treat URLs as immutable like the strings they came from actually ends up being really handy.\n", "Oh and of course be sure to check out the [`click`](https://twistedmatrix.com/documents/15.5.0/api/twisted.python.url.URL.html#click) method, which handles relative reference resolution. `URL` objects can represent both absolute and relative URLs, as well as non-HTTP URLs. One possible feature that would be necessary would be relative reference resolution on already-parsed `URL`s, as well as just strings, which would be a minor change.\n", "Looking back earlier in the discussion we may also be missing one other thing, which is an explicit relative-dot-resolution phase. Currently this is just handled in `.click`, and there's an internal utility function for it. It seems like we will need a `.normalizePath()` method or somesuch to accommodate that use-case fully.\n", "Yeah, `.normalizePath()` is pretty important to this use-case. =)\n", "@Lukasa `URL.normalizePath() → URL with no \".\" or \"..\" segments` work as a signature? 
Any other options which might be required? I could probably have that landed on trunk by Friday.\n", "I think strictly we want to follow section 5 of RFC 3986. Note also that @sigmavirus24 has a library with exactly that name that may also end up a good fit here, so you may not want to leap immediately into writing that code: we need to work out exactly what we need here!\n", "Looks like path normalization should follow [RFC algorithm](http://tools.ietf.org/html/rfc3986#section-5.2.4):\n\n> 5.2.4. Remove Dot Segments\n> \n> The pseudocode also refers to a \"remove_dot_segments\" routine for\n> interpreting and removing the special \".\" and \"..\" complete path\n> segments from a referenced path. This is done after the path is\n> extracted from a reference, whether or not the path was relative, in\n> order to remove any invalid or extraneous dot-segments prior to\n> forming the target URI. Although there are many ways to accomplish\n> this removal process, we describe a simple method using two string\n> buffers.\n> 1. The input buffer is initialized with the now-appended path\n> components and the output buffer is initialized to the empty\n> string.\n> 2. While the input buffer is not empty, loop as follows:\n> \n> A. If the input buffer begins with a prefix of \"../\" or \"./\",\n> then remove that prefix from the input buffer; otherwise,\n> \n> B. if the input buffer begins with a prefix of \"/./\" or \"/.\",\n> where \".\" is a complete path segment, then replace that\n> prefix with \"/\" in the input buffer; otherwise,\n> \n> C. if the input buffer begins with a prefix of \"/../\" or \"/..\",\n> where \"..\" is a complete path segment, then replace that\n> prefix with \"/\" in the input buffer and remove the last\n> segment and its preceding \"/\" (if any) from the output\n> buffer; otherwise,\n> \n> D. if the input buffer consists only of \".\" or \"..\", then remove\n> that from the input buffer; otherwise,\n> \n> E. 
move the first path segment in the input buffer to the end of\n> the output buffer, including the initial \"/\" character (if\n> any) and any subsequent characters up to, but not including,\n> the next \"/\" character or the end of the input buffer.\n> 3. Finally, the output buffer is returned as the result of\n> remove_dot_segments.\n", "We already have an implementation of exactly that algorithm (there's an RFC citation in the docstring and everything) it's just incompletely applied. If you call `.click` with a proper relative reference, all those rules will be followed. (If you call it with an absolute URI then you just get the absolute URI back with no dot segment removal; that might be a bug.)\n", "I think it _is_ an issue because of this part of [5.2.2](http://tools.ietf.org/html/rfc3986#section-5.2.2):\n\n`R` is relative and `T` is base. So here If `R` is absolute (i.e. has scheme and authority), its path should be normalized anyway\n\n```\n if defined(R.scheme) then\n T.scheme = R.scheme;\n T.authority = R.authority;\n T.path = remove_dot_segments(R.path);\n T.query = R.query;\n```\n", "Hi,\nRfc3986, section 5.4.2 mentions the exact case presented in this issue and states that every uri parser should handle those cases consistently.\nI'm new around here, by the way, so, hello!\n", "Testing rfc3986, shows that this already works there @lukasa and we had been talking about using that in the past. @glyph maybe we should try to converge our libraries\n" ]
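The two-buffer algorithm quoted in that thread translates almost line for line into Python. The sketch below follows RFC 3986 §5.2.4 as written; it is an illustration of the pseudocode being discussed, not requests', rfc3986's, or Twisted's actual implementation:

```python
def remove_dot_segments(path):
    """Resolve "." and ".." segments per RFC 3986 section 5.2.4.

    Mirrors the RFC's input-buffer/output-buffer pseudocode: each loop
    iteration consumes a prefix of `path` and may append to `output`.
    """
    output = []
    while path:
        if path.startswith("../"):          # rule 2A
            path = path[3:]
        elif path.startswith("./"):         # rule 2A
            path = path[2:]
        elif path.startswith("/./"):        # rule 2B
            path = "/" + path[3:]
        elif path == "/.":                  # rule 2B (complete segment)
            path = "/"
        elif path.startswith("/../"):       # rule 2C: drop last output segment
            path = "/" + path[4:]
            if output:
                output.pop()
        elif path == "/..":                 # rule 2C (complete segment)
            path = "/"
            if output:
                output.pop()
        elif path in (".", ".."):           # rule 2D
            path = ""
        else:                               # rule 2E: move first segment across
            nxt = path.find("/", 1) if path.startswith("/") else path.find("/")
            if nxt == -1:
                output.append(path)
                path = ""
            else:
                output.append(path[:nxt])
                path = path[nxt:]
    return "".join(output)


# The RFC's own worked example:
print(remove_dot_segments("/a/b/c/./../../g"))  # /a/g
```

Applied to the path from the original report, `remove_dot_segments("/kennethreitz/requests/.")` yields `/kennethreitz/requests/`, which is the normalization curl performs and requests (at the time of the thread) did not.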
https://api.github.com/repos/psf/requests/issues/2981
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2981/labels{/name}
https://api.github.com/repos/psf/requests/issues/2981/comments
https://api.github.com/repos/psf/requests/issues/2981/events
https://github.com/psf/requests/pull/2981
129,122,256
MDExOlB1bGxSZXF1ZXN0NTczNDk4NzA=
2,981
Warn about encrypted keys in the docs.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-27T11:57:13Z
2021-09-08T05:01:03Z
2016-01-27T16:19:27Z
MEMBER
resolved
We don't support them!
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2981/reactions" }
https://api.github.com/repos/psf/requests/issues/2981/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2981.diff", "html_url": "https://github.com/psf/requests/pull/2981", "merged_at": "2016-01-27T16:19:27Z", "patch_url": "https://github.com/psf/requests/pull/2981.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2981" }
true
[ "See also #2312.\n" ]
https://api.github.com/repos/psf/requests/issues/2980
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2980/labels{/name}
https://api.github.com/repos/psf/requests/issues/2980/comments
https://api.github.com/repos/psf/requests/issues/2980/events
https://github.com/psf/requests/issues/2980
128,481,273
MDU6SXNzdWUxMjg0ODEyNzM=
2,980
request question
{ "avatar_url": "https://avatars.githubusercontent.com/u/3785409?v=4", "events_url": "https://api.github.com/users/rfyiamcool/events{/privacy}", "followers_url": "https://api.github.com/users/rfyiamcool/followers", "following_url": "https://api.github.com/users/rfyiamcool/following{/other_user}", "gists_url": "https://api.github.com/users/rfyiamcool/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/rfyiamcool", "id": 3785409, "login": "rfyiamcool", "node_id": "MDQ6VXNlcjM3ODU0MDk=", "organizations_url": "https://api.github.com/users/rfyiamcool/orgs", "received_events_url": "https://api.github.com/users/rfyiamcool/received_events", "repos_url": "https://api.github.com/users/rfyiamcool/repos", "site_admin": false, "starred_url": "https://api.github.com/users/rfyiamcool/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/rfyiamcool/subscriptions", "type": "User", "url": "https://api.github.com/users/rfyiamcool", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-25T07:46:14Z
2021-09-08T19:00:38Z
2016-01-25T08:06:50Z
NONE
resolved
requests.get('xxx', params={"id":111}) requests.get('xxx', params=json.dumps({"id":111})) These two are the same?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2980/reactions" }
https://api.github.com/repos/psf/requests/issues/2980/timeline
null
completed
null
null
false
[ "@rfyiamcool No. The first case does this: `xxx?id=111`. The second case does this: `xxx?%7B%22id%22:%20111%7D`.\n\nIn future, please ask questions on Stack Overflow: the GitHub issues page is strictly for bugs.\n" ]
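The difference described in that answer can be sketched with the standard library alone, no network required. This is an approximation of the two encodings, not requests' exact quoting (requests leaves a few more characters, such as `:`, unescaped):

```python
import json
from urllib.parse import urlencode, quote

params = {"id": 111}

# Case 1: a dict is form-encoded into key=value query pairs,
# producing a URL like xxx?id=111.
print(urlencode(params))            # id=111

# Case 2: json.dumps() yields a single string, not key/value pairs;
# it gets percent-encoded and appended as one opaque query blob.
print(quote(json.dumps(params)))    # %7B%22id%22%3A%20111%7D
```

This is why the two calls in the question are not equivalent: only the plain dict produces the query string most APIs expect.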
https://api.github.com/repos/psf/requests/issues/2979
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2979/labels{/name}
https://api.github.com/repos/psf/requests/issues/2979/comments
https://api.github.com/repos/psf/requests/issues/2979/events
https://github.com/psf/requests/issues/2979
128,423,683
MDU6SXNzdWUxMjg0MjM2ODM=
2,979
JSON content type for PUT method
{ "avatar_url": "https://avatars.githubusercontent.com/u/1684705?v=4", "events_url": "https://api.github.com/users/Neraste/events{/privacy}", "followers_url": "https://api.github.com/users/Neraste/followers", "following_url": "https://api.github.com/users/Neraste/following{/other_user}", "gists_url": "https://api.github.com/users/Neraste/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Neraste", "id": 1684705, "login": "Neraste", "node_id": "MDQ6VXNlcjE2ODQ3MDU=", "organizations_url": "https://api.github.com/users/Neraste/orgs", "received_events_url": "https://api.github.com/users/Neraste/received_events", "repos_url": "https://api.github.com/users/Neraste/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Neraste/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Neraste/subscriptions", "type": "User", "url": "https://api.github.com/users/Neraste", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-24T20:39:05Z
2021-09-08T19:00:38Z
2016-01-24T20:53:58Z
NONE
resolved
According to the [doc](http://docs.python-requests.org/en/latest/api/), you can specify to send a request with JSON content type for POST method. But there is nothing like this for PUT or PATCH methods. Is there a reason? Can this be implemented?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2979/reactions" }
https://api.github.com/repos/psf/requests/issues/2979/timeline
null
completed
null
null
false
[ "If you look at the documentation for [`requests.put`](http://docs.python-requests.org/en/latest/api/#requests.put) and [`requests.patch`](http://docs.python-requests.org/en/latest/api/#requests.patch) they take `**kwargs` that are passed to [`requests.request`](http://docs.python-requests.org/en/latest/api/#requests.request).\n\nThis means that `json=` is already a valid parameter for those methods.\n" ]
https://api.github.com/repos/psf/requests/issues/2978
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2978/labels{/name}
https://api.github.com/repos/psf/requests/issues/2978/comments
https://api.github.com/repos/psf/requests/issues/2978/events
https://github.com/psf/requests/issues/2978
128,419,712
MDU6SXNzdWUxMjg0MTk3MTI=
2,978
Inconsistent behaviour with https between urllib3 and requests.
{ "avatar_url": "https://avatars.githubusercontent.com/u/410950?v=4", "events_url": "https://api.github.com/users/jorourke/events{/privacy}", "followers_url": "https://api.github.com/users/jorourke/followers", "following_url": "https://api.github.com/users/jorourke/following{/other_user}", "gists_url": "https://api.github.com/users/jorourke/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jorourke", "id": 410950, "login": "jorourke", "node_id": "MDQ6VXNlcjQxMDk1MA==", "organizations_url": "https://api.github.com/users/jorourke/orgs", "received_events_url": "https://api.github.com/users/jorourke/received_events", "repos_url": "https://api.github.com/users/jorourke/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jorourke/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jorourke/subscriptions", "type": "User", "url": "https://api.github.com/users/jorourke", "user_view_type": "public" }
[]
closed
true
null
[]
null
9
2016-01-24T19:41:01Z
2021-09-08T16:00:35Z
2016-08-05T07:49:52Z
NONE
resolved
In trying to make a https connection as so: ``` shell (trustmile-api-p2710)jBeast:trustmile james$ python -c "import requests; r = requests.get('https://devapi.trustmile.com/static/index.html', verify=True)" Traceback (most recent call last): File "<string>", line 1, in <module> File "/Users/james/.virtualenvs/trustmile-api-p2710/lib/python2.7/site-packages/requests/api.py", line 67, in get return request('get', url, params=params, **kwargs) File "/Users/james/.virtualenvs/trustmile-api-p2710/lib/python2.7/site-packages/requests/api.py", line 53, in request return session.request(method=method, url=url, **kwargs) File "/Users/james/.virtualenvs/trustmile-api-p2710/lib/python2.7/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/Users/james/.virtualenvs/trustmile-api-p2710/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/Users/james/.virtualenvs/trustmile-api-p2710/lib/python2.7/site-packages/requests/adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: ("bad handshake: Error([('SSL routines', 'ssl3_get_server_certificate', 'certificate verify failed')],)",) ``` I have latest necessary packages installed: ``` shell (trustmile-api-p2710)jBeast:trustmile james$ pip freeze | egrep '(requests|urllib3|certifi)' certifi==2015.11.20.1 requests==2.9.1 urllib3==1.14 ``` However, urllib works in the same environment: ``` python import urllib3 import certifi http = urllib3.PoolManager( cert_reqs='CERT_REQUIRED', # Force certificate check. ca_certs=certifi.where(), # Path to the Certifi bundle. ) # You're ready to make verified HTTPS requests. try: r = http.request('GET', 'https://devapi.trustmile.com/static/index.html') print r.status except urllib3.exceptions.SSLError as e: print e ``` Output is status 200. Am I missing something? I tried to set the relevant environment variables. 
The cert is a legit one and its root cert is in the cacerts.pem in the certifi package. Thanks again for an awesome package I use all the time Kenneth et al!
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2978/reactions" }
https://api.github.com/repos/psf/requests/issues/2978/timeline
null
completed
null
null
false
[ "Out of interest, what version of OpenSSL are you using? Run `python -c 'import ssl; print ssl.OPENSSL_VERSION'`\n", "OpenSSL 0.9.8zg 14 July 2015\n", "Interesting. Does this problem occur if you don't pass `verify=True`?\n", "Yep, same issue\n", "Hmm. What other packages do you have installed?\n", "Ah, that must be it, created a clean env with just certifi, requests, and urllib3 in it. Will investigate further. I have a lot installed.\n", "ok, fixed in my env. Not sure which was the culprit but this worked\n\n``` shell\npip uninstall backports.ssl-match-hostname\npip uninstall pyOpenSSL\n```\n", "Yeah, so PyOpenSSL is the likely culprit. Requests, unlike urllib3, will automatically try to use PyOpenSSL if it's present. Can you reinstall it, and then run your urllib3 script with these two lines added to the top:\n\n``` python\nimport urllib3.contrib.pyopenssl\nurllib3.contrib.pyopenssl.inject_into_urllib3()\n```\n", "Your script would look like this:\n\n``` python\nimport urllib3\nimport urllib3.contrib.pyopenssl\nimport certifi\n\nurllib3.contrib.pyopenssl.inject_into_urllib3()\n\nhttp = urllib3.PoolManager(\n cert_reqs='CERT_REQUIRED', # Force certificate check.\n ca_certs=certifi.where(), # Path to the Certifi bundle.\n)\n\n# You're ready to make verified HTTPS requests.\ntry:\n r = http.request('GET', 'https://devapi.trustmile.com/static/index.html')\n print r.status\nexcept urllib3.exceptions.SSLError as e:\n print e\n```\n" ]
https://api.github.com/repos/psf/requests/issues/2977
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2977/labels{/name}
https://api.github.com/repos/psf/requests/issues/2977/comments
https://api.github.com/repos/psf/requests/issues/2977/events
https://github.com/psf/requests/issues/2977
128,412,088
MDU6SXNzdWUxMjg0MTIwODg=
2,977
JSON-ROA data ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/113206?v=4", "events_url": "https://api.github.com/users/birk/events{/privacy}", "followers_url": "https://api.github.com/users/birk/followers", "following_url": "https://api.github.com/users/birk/following{/other_user}", "gists_url": "https://api.github.com/users/birk/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/birk", "id": 113206, "login": "birk", "node_id": "MDQ6VXNlcjExMzIwNg==", "organizations_url": "https://api.github.com/users/birk/orgs", "received_events_url": "https://api.github.com/users/birk/received_events", "repos_url": "https://api.github.com/users/birk/repos", "site_admin": false, "starred_url": "https://api.github.com/users/birk/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/birk/subscriptions", "type": "User", "url": "https://api.github.com/users/birk", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-01-24T17:29:44Z
2021-09-08T19:00:39Z
2016-01-24T17:50:40Z
NONE
resolved
Hi, I'm trying to read JSON-ROA responses from an API that look like this: ``` json { "_json-roa": { "filled_with": "lots of data" }, "created_at": "2015-11-05T10:33:37.249Z", "creator_id": "a41b963a-e50e-40ad-9c37-dc5dfcc9a1de", "id": "c97e9d4c-aaff-415a-a996-4a7e79ce0c81", } ``` A call to `r.json()` recognizes `created_at`, `creator_id`, and `id` correctly but ignores `_json-roa`. `r.text` likewise shows only the last three nodes. If I open and read the same JSON not as a server response but as a static file, parsing it is no problem. Any ideas? Best, birk
{ "avatar_url": "https://avatars.githubusercontent.com/u/113206?v=4", "events_url": "https://api.github.com/users/birk/events{/privacy}", "followers_url": "https://api.github.com/users/birk/followers", "following_url": "https://api.github.com/users/birk/following{/other_user}", "gists_url": "https://api.github.com/users/birk/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/birk", "id": 113206, "login": "birk", "node_id": "MDQ6VXNlcjExMzIwNg==", "organizations_url": "https://api.github.com/users/birk/orgs", "received_events_url": "https://api.github.com/users/birk/received_events", "repos_url": "https://api.github.com/users/birk/repos", "site_admin": false, "starred_url": "https://api.github.com/users/birk/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/birk/subscriptions", "type": "User", "url": "https://api.github.com/users/birk", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2977/reactions" }
https://api.github.com/repos/psf/requests/issues/2977/timeline
null
completed
null
null
false
[ "If `r.text` only shows the last three nodes, that means that only the last three nodes are present. This means that for some reason the server is choosing not to send you that data.\n\nYou'll need to consult with your API documentation or server operator to determine why that data is absent.\n", "Thanks for the quick response. If I do the same request in the shell via curl I do get all the data from the server.\n", "So in this case the problem is likely to be one of user-agent filtering. Try using `curl -v` to see what headers curl sends, and then set the user-agent header to the same thing using requests.\n", "The header it was … I used `'accept': 'application/json'` but it has to be either `'accept': '*/*'` or correctly `'accept': 'application/json-roa+json'`.\n" ]
https://api.github.com/repos/psf/requests/issues/2976
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2976/labels{/name}
https://api.github.com/repos/psf/requests/issues/2976/comments
https://api.github.com/repos/psf/requests/issues/2976/events
https://github.com/psf/requests/pull/2976
128,077,424
MDExOlB1bGxSZXF1ZXN0NTY4NDM3OTU=
2,976
Fix api doc of debug logging for Python 3
{ "avatar_url": "https://avatars.githubusercontent.com/u/71619?v=4", "events_url": "https://api.github.com/users/shoma/events{/privacy}", "followers_url": "https://api.github.com/users/shoma/followers", "following_url": "https://api.github.com/users/shoma/following{/other_user}", "gists_url": "https://api.github.com/users/shoma/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/shoma", "id": 71619, "login": "shoma", "node_id": "MDQ6VXNlcjcxNjE5", "organizations_url": "https://api.github.com/users/shoma/orgs", "received_events_url": "https://api.github.com/users/shoma/received_events", "repos_url": "https://api.github.com/users/shoma/repos", "site_admin": false, "starred_url": "https://api.github.com/users/shoma/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/shoma/subscriptions", "type": "User", "url": "https://api.github.com/users/shoma", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-01-22T03:48:40Z
2021-09-08T05:01:03Z
2016-01-22T07:12:01Z
CONTRIBUTOR
resolved
`httplib` was renamed in Python 3.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2976/reactions" }
https://api.github.com/repos/psf/requests/issues/2976/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2976.diff", "html_url": "https://github.com/psf/requests/pull/2976", "merged_at": "2016-01-22T07:12:01Z", "patch_url": "https://github.com/psf/requests/pull/2976.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2976" }
true
[ "This then breaks for users on Python 2. Can you simply add comments explaining how to do this on Python 3 instead of removing the recipe for people using Python 2?\n", "@sigmavirus24 \n\nOk, I'll update.\n", "Thanks! :sparkles: :cake: :sparkles:\n" ]
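The PR above fixed the debug-logging recipe for Python 3, where `httplib` became `http.client`. A sketch of the Python 3 form of that recipe (note: on requests releases that no longer vendor urllib3, the logger name is `"urllib3"` rather than `"requests.packages.urllib3"`):

```python
import logging
import http.client

# httplib was renamed to http.client in Python 3; setting debuglevel
# makes the stdlib print the raw request and response lines.
http.client.HTTPConnection.debuglevel = 1

# Route urllib3's own debug output through the logging module as well.
logging.basicConfig()
logging.getLogger().setLevel(logging.DEBUG)
requests_log = logging.getLogger("requests.packages.urllib3")
requests_log.setLevel(logging.DEBUG)
requests_log.propagate = True
```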
https://api.github.com/repos/psf/requests/issues/2975
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2975/labels{/name}
https://api.github.com/repos/psf/requests/issues/2975/comments
https://api.github.com/repos/psf/requests/issues/2975/events
https://github.com/psf/requests/issues/2975
127,923,191
MDU6SXNzdWUxMjc5MjMxOTE=
2,975
Why retry doesn't work on session.post method
{ "avatar_url": "https://avatars.githubusercontent.com/u/1451096?v=4", "events_url": "https://api.github.com/users/demonguy/events{/privacy}", "followers_url": "https://api.github.com/users/demonguy/followers", "following_url": "https://api.github.com/users/demonguy/following{/other_user}", "gists_url": "https://api.github.com/users/demonguy/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/demonguy", "id": 1451096, "login": "demonguy", "node_id": "MDQ6VXNlcjE0NTEwOTY=", "organizations_url": "https://api.github.com/users/demonguy/orgs", "received_events_url": "https://api.github.com/users/demonguy/received_events", "repos_url": "https://api.github.com/users/demonguy/repos", "site_admin": false, "starred_url": "https://api.github.com/users/demonguy/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/demonguy/subscriptions", "type": "User", "url": "https://api.github.com/users/demonguy", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-01-21T13:23:15Z
2021-09-08T19:00:39Z
2016-01-21T13:29:33Z
NONE
resolved
I've pasted requests.packages.urllib3.util.Retry object to max_retries argument of HTTPAdapter But i found, it only works on session.get. How could i make it work on session.post ?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2975/reactions" }
https://api.github.com/repos/psf/requests/issues/2975/timeline
null
completed
null
null
false
[ "When you say it doesn't work on post, what does that mean? What are you expecting to see and what actually happens?\n", "it means. when i use session.post to access url = 'https://httpbin.org/status/500', it doesn't retry and just return an 500 error, not maxretry error.\n", "@demonguy That's not how retries work by default. Generally speaking, retries only affect actual connection issues that prevent us from sending the request or retrieving the response altogether. Normally, we do not issue retries on 500 errors. You can change that behaviour by initializing your `Retry` object with `status_forcelist=range(500, 600)`, as documented on the [Retry object API document](https://urllib3.readthedocs.org/en/latest/helpers.html#urllib3.util.retry.Retry).\n", "I am doing what you said, did you read my original post? i past \"requests.packages.urllib3.util.Retry\" object.\n\nAnd of course i passed status_forcelist, but it only works on get, not on post\n", "If you look at the API documentation, you'll notice that it says this:\n\n> method_whitelist (iterable) –\n> Set of uppercased HTTP method verbs that we should retry on.\n> By default, we only retry on methods which are considered to be indempotent (multiple requests with the same parameters end with the same state).\n\nPOST is not considered IDEMPOTENT. You'll need to extend the method whitelist with POST to allow max_retries to function there.\n", "@demonguy in the future **questions** are to be asked on [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests). Please do not ask a question in an issue again.\n", "Sorry. I search for the whole Request document, i should expect it's in urllib3 documents.\nReally sorry. \nAnd i did search Google and stackoverflow, maybe i use wrong keyword so i didn't get answer from that.\nSorry\n" ]
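The resolution above combines two `Retry` settings: `status_forcelist` to retry on 5xx responses, and the method allow-list to opt `POST` in, since urllib3 only retries idempotent methods by default. A sketch of that configuration (on urllib3 1.26+ the parameter is `allowed_methods`; older releases, including those current when this thread was written, called it `method_whitelist`):

```python
import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

# Retry up to 3 times on any 5xx status, and explicitly allow POST,
# which is excluded from urllib3's default idempotent-method set.
retry = Retry(
    total=3,
    status_forcelist=range(500, 600),
    allowed_methods=["GET", "POST"],
)
adapter = HTTPAdapter(max_retries=retry)

session = requests.Session()
session.mount("https://", adapter)
session.mount("http://", adapter)
```

With this session, a POST to an endpoint that returns 500 is retried up to the configured limit before a `RetryError` surfaces.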
https://api.github.com/repos/psf/requests/issues/2974
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2974/labels{/name}
https://api.github.com/repos/psf/requests/issues/2974/comments
https://api.github.com/repos/psf/requests/issues/2974/events
https://github.com/psf/requests/issues/2974
127,714,446
MDU6SXNzdWUxMjc3MTQ0NDY=
2,974
Response encoding defaulting to ISO-8859-1
{ "avatar_url": "https://avatars.githubusercontent.com/u/208952?v=4", "events_url": "https://api.github.com/users/eskerda/events{/privacy}", "followers_url": "https://api.github.com/users/eskerda/followers", "following_url": "https://api.github.com/users/eskerda/following{/other_user}", "gists_url": "https://api.github.com/users/eskerda/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/eskerda", "id": 208952, "login": "eskerda", "node_id": "MDQ6VXNlcjIwODk1Mg==", "organizations_url": "https://api.github.com/users/eskerda/orgs", "received_events_url": "https://api.github.com/users/eskerda/received_events", "repos_url": "https://api.github.com/users/eskerda/repos", "site_admin": false, "starred_url": "https://api.github.com/users/eskerda/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/eskerda/subscriptions", "type": "User", "url": "https://api.github.com/users/eskerda", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-20T15:59:55Z
2021-09-08T19:00:39Z
2016-01-20T16:04:35Z
NONE
resolved
Hello, For the last month I've been getting reported issues on badly encoded data (ie: https://github.com/eskerda/pybikes/issues/109) I've searched through the issues and also tried to pinpoint it to a particular change, but only one if found is from 2014. Has something changed recently? Can be reproduced as: ``` python import requests import re rgx = r'setEstacao\((.+)\)' resp = requests.get('http://www.ciclosampa.com.br/estacoes.php') print "Uses default encoding %r" % resp.encoding print re.findall(rgx, resp.text)[0] print "" resp.encoding = 'UTF-8' print "Uses set encoding %r" % resp.encoding print re.findall(rgx, resp.text)[0] ``` ``` Uses default encoding 'ISO-8859-1' -23.572208,-46.644283,"1","Estação Hcor","Rua Desembargador Eliseu Guilherme 14","5","11" Uses set encoding 'UTF-8' -23.572208,-46.644283,"1","Estação Hcor","Rua Desembargador Eliseu Guilherme 14","5","11" ``` Sorry if this has already been reported or resolved. Looked for issues but most changes on this are too old to be relevant.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2974/reactions" }
https://api.github.com/repos/psf/requests/issues/2974/timeline
null
completed
null
null
false
[ "This has not changed: requests has behaved like this for an extremely long time. See #2086 for a discussion about it.\n\nIf this used to work but has now stopped working, it's likely that the server used to send a `charset` field in the Content-Type header but no longer does.\n\nIf you know you're going to be handling HTML, you should look to extract the encoding from the HTML body. You can handle that automatically by using [this utility function from the requests toolbelt](https://toolbelt.readthedocs.org/en/latest/deprecated.html#requests_toolbelt.utils.deprecated.get_encodings_from_content).\n" ]
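The default described above can be shown without a live server: when no `charset` is present in the Content-Type header, requests assumes ISO-8859-1 for text bodies, so UTF-8 bytes decode as mojibake until `r.encoding` is corrected. Constructing a `Response` by hand like this is purely for demonstration, not how responses are normally created:

```python
import requests

# Simulate a response whose body is UTF-8 but whose headers gave no
# charset, leaving requests at its ISO-8859-1 default.
resp = requests.models.Response()
resp._content = "Estação".encode("utf-8")
resp.encoding = "ISO-8859-1"
print(resp.text)  # EstaÃ§Ã£o  (mojibake)

# Overriding the encoding, as in the report above, fixes decoding.
resp.encoding = "utf-8"
print(resp.text)  # Estação
```

For HTML bodies, the encoding declared in a `<meta>` tag can be extracted instead of guessing, e.g. with the toolbelt helper linked in the reply.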
https://api.github.com/repos/psf/requests/issues/2973
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2973/labels{/name}
https://api.github.com/repos/psf/requests/issues/2973/comments
https://api.github.com/repos/psf/requests/issues/2973/events
https://github.com/psf/requests/pull/2973
127,687,147
MDExOlB1bGxSZXF1ZXN0NTY2MTAwNTQ=
2,973
Stop falling back to distutils in setup.py
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" } ]
null
2
2016-01-20T13:53:40Z
2021-09-08T05:01:04Z
2016-01-20T13:54:58Z
CONTRIBUTOR
resolved
distutils has been deprecated for years now and pip is refusing to install projects using it. As such, _some_ users are reporting problems installing requests. Let's avoid this entirely by not falling back to distutils at all.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2973/reactions" }
https://api.github.com/repos/psf/requests/issues/2973/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2973.diff", "html_url": "https://github.com/psf/requests/pull/2973", "merged_at": "2016-01-20T13:54:58Z", "patch_url": "https://github.com/psf/requests/pull/2973.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2973" }
true
[ "/cc @dstufft (re: https://github.com/pypa/pip/issues/3384#issuecomment-173202320) although I'm starting to suspect that that is a version of requests packaged by the distro that is causing wichert's problems.\n", "Still, we just don't need this fallback now.\n" ]
https://api.github.com/repos/psf/requests/issues/2972
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2972/labels{/name}
https://api.github.com/repos/psf/requests/issues/2972/comments
https://api.github.com/repos/psf/requests/issues/2972/events
https://github.com/psf/requests/pull/2972
127,576,727
MDExOlB1bGxSZXF1ZXN0NTY1NDk3Mjk=
2,972
Return request & response with TooManyRedirects
{ "avatar_url": "https://avatars.githubusercontent.com/u/500774?v=4", "events_url": "https://api.github.com/users/munro/events{/privacy}", "followers_url": "https://api.github.com/users/munro/followers", "following_url": "https://api.github.com/users/munro/following{/other_user}", "gists_url": "https://api.github.com/users/munro/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/munro", "id": 500774, "login": "munro", "node_id": "MDQ6VXNlcjUwMDc3NA==", "organizations_url": "https://api.github.com/users/munro/orgs", "received_events_url": "https://api.github.com/users/munro/received_events", "repos_url": "https://api.github.com/users/munro/repos", "site_admin": false, "starred_url": "https://api.github.com/users/munro/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/munro/subscriptions", "type": "User", "url": "https://api.github.com/users/munro", "user_view_type": "public" }
[]
closed
true
null
[]
null
6
2016-01-20T00:53:26Z
2021-09-08T05:01:03Z
2016-01-21T23:01:58Z
CONTRIBUTOR
resolved
I'm using requests to fetch data from websites that I don't maintain, and sometimes websites have broken cyclic redirects, but still return useful responses. So I would like to grab the last cut off response, similar to the browser. The problem is that `TooManyRedirects` doesn't throw the ending response state for me to inspect, so this PR adds that.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2972/reactions" }
https://api.github.com/repos/psf/requests/issues/2972/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2972.diff", "html_url": "https://github.com/psf/requests/pull/2972", "merged_at": "2016-01-21T23:01:58Z", "patch_url": "https://github.com/psf/requests/pull/2972.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2972" }
true
[ "@Lukasa I'm +0.5 on this. The reason I'm not a full +1 is that we might start leaking sockets this way.\n", "@sigmavirus24 We shouldn't leak sockets here: the redirect logic consumes all the data from the connection, which should return it back to the connection pool, and that happens before we raise `TooManyRedirects`.\n", "Got a few small notes, but this looks really good! :cake:\n", "> Got a few small notes, but this looks really good! :cake:\n\nYay! Thanks a lot for the fast response! @sigmavirus24 @Lukasa \n", "LGTM. @Lukasa thoughts?\n", "LGTM.\n" ]
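With the change above merged, `TooManyRedirects` carries the final response, so a broken redirect loop can still yield a usable body via `exc.response`. Sketched here without a network round-trip by raising the exception by hand; in practice you would catch it around a `session.get()` call (`RequestException`, the base class, accepts the `response` keyword used below):

```python
import requests
from requests.exceptions import TooManyRedirects

# Stand-in for the last response seen before the redirect limit hit;
# built manually only so the example runs offline.
resp = requests.models.Response()
resp.status_code = 302

try:
    raise TooManyRedirects("Exceeded 30 redirects.", response=resp)
except TooManyRedirects as exc:
    # The attached response lets callers inspect the cut-off reply,
    # much like a browser showing the last page in a redirect loop.
    print(exc.response.status_code)  # 302
```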
https://api.github.com/repos/psf/requests/issues/2971
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2971/labels{/name}
https://api.github.com/repos/psf/requests/issues/2971/comments
https://api.github.com/repos/psf/requests/issues/2971/events
https://github.com/psf/requests/issues/2971
127,079,968
MDU6SXNzdWUxMjcwNzk5Njg=
2,971
SSL error: bad handshake
{ "avatar_url": "https://avatars.githubusercontent.com/u/7250762?v=4", "events_url": "https://api.github.com/users/Mohammed-Aadil/events{/privacy}", "followers_url": "https://api.github.com/users/Mohammed-Aadil/followers", "following_url": "https://api.github.com/users/Mohammed-Aadil/following{/other_user}", "gists_url": "https://api.github.com/users/Mohammed-Aadil/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Mohammed-Aadil", "id": 7250762, "login": "Mohammed-Aadil", "node_id": "MDQ6VXNlcjcyNTA3NjI=", "organizations_url": "https://api.github.com/users/Mohammed-Aadil/orgs", "received_events_url": "https://api.github.com/users/Mohammed-Aadil/received_events", "repos_url": "https://api.github.com/users/Mohammed-Aadil/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Mohammed-Aadil/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Mohammed-Aadil/subscriptions", "type": "User", "url": "https://api.github.com/users/Mohammed-Aadil", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-01-17T07:30:44Z
2021-09-08T19:00:40Z
2016-01-19T19:32:41Z
NONE
resolved
Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/home/dev-django1/.virtualenvs/logic_srv/local/lib/python2.7/site-packages/requests/api.py", line 68, in get return request('get', url, *_kwargs) File "/home/dev-django1/.virtualenvs/logic_srv/local/lib/python2.7/site-packages/requests/api.py", line 50, in request response = session.request(method=method, url=url, *_kwargs) File "/home/dev-django1/.virtualenvs/logic_srv/local/lib/python2.7/site-packages/requests/sessions.py", line 464, in request resp = self.send(prep, *_send_kwargs) File "/home/dev-django1/.virtualenvs/logic_srv/local/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, *_kwargs) File "/home/dev-django1/.virtualenvs/logic_srv/local/lib/python2.7/site-packages/requests/adapters.py", line 431, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [Errno bad handshake] [('SSL routines', 'SSL3_GET_SERVER_CERTIFICATE', 'certificate verify failed')] I do not know why it is coming in requests 2.6.0 while it is not coming in requests==2.2.1. This problem start arising yesterday when I hit it to google api on link https://www.googleapis.com/oauth2/v1/tokeninfo. I can't use verify=False as many stackoverflow thread is advising. And I'm pretty sure I have used ssl certificate properly in my web site too.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2971/reactions" }
https://api.github.com/repos/psf/requests/issues/2971/timeline
null
completed
null
null
false
[ "What OS are you using?\n", "Yeah im using linux. Btw I fixed the error by removing and installing virtualenv package, not sure why this happened but i got it running. Thank you for your reply :)\n" ]
https://api.github.com/repos/psf/requests/issues/2970
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2970/labels{/name}
https://api.github.com/repos/psf/requests/issues/2970/comments
https://api.github.com/repos/psf/requests/issues/2970/events
https://github.com/psf/requests/pull/2970
126,712,324
MDExOlB1bGxSZXF1ZXN0NTYwNzExODk=
2,970
Rebase to 2.9.1 changes
{ "avatar_url": "https://avatars.githubusercontent.com/u/2167844?v=4", "events_url": "https://api.github.com/users/hwms/events{/privacy}", "followers_url": "https://api.github.com/users/hwms/followers", "following_url": "https://api.github.com/users/hwms/following{/other_user}", "gists_url": "https://api.github.com/users/hwms/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hwms", "id": 2167844, "login": "hwms", "node_id": "MDQ6VXNlcjIxNjc4NDQ=", "organizations_url": "https://api.github.com/users/hwms/orgs", "received_events_url": "https://api.github.com/users/hwms/received_events", "repos_url": "https://api.github.com/users/hwms/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hwms/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hwms/subscriptions", "type": "User", "url": "https://api.github.com/users/hwms", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-14T18:02:38Z
2021-09-08T04:01:02Z
2016-01-21T19:58:26Z
NONE
resolved
I just rebased the proposed 3.0.0 to current master near 2.9.1 to further help with socks support there also :)
{ "avatar_url": "https://avatars.githubusercontent.com/u/2167844?v=4", "events_url": "https://api.github.com/users/hwms/events{/privacy}", "followers_url": "https://api.github.com/users/hwms/followers", "following_url": "https://api.github.com/users/hwms/following{/other_user}", "gists_url": "https://api.github.com/users/hwms/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/hwms", "id": 2167844, "login": "hwms", "node_id": "MDQ6VXNlcjIxNjc4NDQ=", "organizations_url": "https://api.github.com/users/hwms/orgs", "received_events_url": "https://api.github.com/users/hwms/received_events", "repos_url": "https://api.github.com/users/hwms/repos", "site_admin": false, "starred_url": "https://api.github.com/users/hwms/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/hwms/subscriptions", "type": "User", "url": "https://api.github.com/users/hwms", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2970/reactions" }
https://api.github.com/repos/psf/requests/issues/2970/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2970.diff", "html_url": "https://github.com/psf/requests/pull/2970", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2970.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2970" }
true
[ "As long as it pushes the 3.0 to current upgrades (and works of course) i am happy with it :)\n" ]
https://api.github.com/repos/psf/requests/issues/2969
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2969/labels{/name}
https://api.github.com/repos/psf/requests/issues/2969/comments
https://api.github.com/repos/psf/requests/issues/2969/events
https://github.com/psf/requests/issues/2969
126,671,871
MDU6SXNzdWUxMjY2NzE4NzE=
2,969
Cut a release to match urllib3-1.14?
{ "avatar_url": "https://avatars.githubusercontent.com/u/331338?v=4", "events_url": "https://api.github.com/users/ralphbean/events{/privacy}", "followers_url": "https://api.github.com/users/ralphbean/followers", "following_url": "https://api.github.com/users/ralphbean/following{/other_user}", "gists_url": "https://api.github.com/users/ralphbean/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ralphbean", "id": 331338, "login": "ralphbean", "node_id": "MDQ6VXNlcjMzMTMzOA==", "organizations_url": "https://api.github.com/users/ralphbean/orgs", "received_events_url": "https://api.github.com/users/ralphbean/received_events", "repos_url": "https://api.github.com/users/ralphbean/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ralphbean/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ralphbean/subscriptions", "type": "User", "url": "https://api.github.com/users/ralphbean", "user_view_type": "public" }
[]
closed
true
null
[]
null
12
2016-01-14T15:03:07Z
2021-09-08T16:00:36Z
2016-08-05T07:49:39Z
CONTRIBUTOR
resolved
Hi, we noticed that urllib3-1.14 was released right at the end of 2015. Are there plans to release a new version of requests bundling that? Downstream in Fedora we had a bug filed requesting urllib3-1.14, but with the way things stand now, we can't update to that until we have a matching requests release. https://bugzilla.redhat.com/show_bug.cgi?id=1295402
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 2, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 2, "url": "https://api.github.com/repos/psf/requests/issues/2969/reactions" }
https://api.github.com/repos/psf/requests/issues/2969/timeline
null
completed
null
null
false
[ "@ralphbean At some point, sure. We weren't generally planning on racing to new urllib3 versions, especially as really the only change in that urllib3 release is the SOCKS proxy support.\n\nIs there a particular rush to get urllib3 1.14 into Fedora?\n", "I think Nir Magnezi in that ticket was having issues setting up an OpenStack dep that may have pinned version 1.14 of urllib3. I'll ask for more details.\n", ":+1: for this. \nThere was another release in 1.14 as well: \"Fixed AppEngine handling of transfer-encoding header and bug in Timeout defaults checking.\" Together with this patch: https://github.com/sigmavirus24/requests-toolbelt/pull/119 and the use of requests_toolbelt, requests will actually work on appengine. \n\nSo it would be much appreciated if requests got updated to urllib3 1.14. \n", "So to be clear, we _will_ cut such a release. I'm just trying to work out how time-sensitive this release is.\n", "Thank you :). Looking forward to when it will be. \n", "As the one doing the changes to get urllib3/requests working on appengine, I would say \"not high priority\" on that count, as it's not blocked on this urllib3 import yet. Once we get https://github.com/sigmavirus24/requests-toolbelt/pull/119 submitted, and that branch integrated into master, then I'll come pushing on this thread again to get appengine functionality unblocked.\n", "Okay that went quicker than expected, based on extrapolation of past activity on my pull request. :) My changes are now integrated into a requests-toolbelt branch, conditioned against a hypothetical 2.10 version of requests containing the latest urllib3 functionality. I assume once requests cuts this release, @sigmavirus24 can then integrate the branch into mainline requests-toolbelt.\n\nI'll leave it to you guys to determine the time-sensitivity of appengine functionality.\n", "+1 - looking forward to the appengine fix\n", "+1 for SOCKS support. [requesocks](https://github.com/dvska/requesocks) is horribly out of date.\n", "+1 for SOCKS support as well.\n", "+1 for SOCKS support especially for using with tor directly without tsocks/proxychains in middle.\n", "+1 for SOCKS support, especially with the recent release of Docker 1.11.0 with SOCKS support, and the fact that docker-py can't support SOCKS until requests does.\n" ]
https://api.github.com/repos/psf/requests/issues/2968
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2968/labels{/name}
https://api.github.com/repos/psf/requests/issues/2968/comments
https://api.github.com/repos/psf/requests/issues/2968/events
https://github.com/psf/requests/issues/2968
126,354,819
MDU6SXNzdWUxMjYzNTQ4MTk=
2,968
how can i use keystore file in request?
{ "avatar_url": "https://avatars.githubusercontent.com/u/935632?v=4", "events_url": "https://api.github.com/users/vinsia/events{/privacy}", "followers_url": "https://api.github.com/users/vinsia/followers", "following_url": "https://api.github.com/users/vinsia/following{/other_user}", "gists_url": "https://api.github.com/users/vinsia/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vinsia", "id": 935632, "login": "vinsia", "node_id": "MDQ6VXNlcjkzNTYzMg==", "organizations_url": "https://api.github.com/users/vinsia/orgs", "received_events_url": "https://api.github.com/users/vinsia/received_events", "repos_url": "https://api.github.com/users/vinsia/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vinsia/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vinsia/subscriptions", "type": "User", "url": "https://api.github.com/users/vinsia", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-13T06:48:21Z
2021-09-08T20:00:47Z
2016-01-13T08:16:17Z
NONE
resolved
i got a keystore file, it contains much x.509 certificates. how to import them, while request something?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2968/reactions" }
https://api.github.com/repos/psf/requests/issues/2968/timeline
null
completed
null
null
false
[ "@vinsia At this time requests can only pass X509 certificates to OpenSSL from the filesystem: either a file of PEM-encoded certificates or a directory of them. If you serialize your keystore file out to PEM-encoding, you should be able to use the `verify` argument to use them.\n" ]
https://api.github.com/repos/psf/requests/issues/2967
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2967/labels{/name}
https://api.github.com/repos/psf/requests/issues/2967/comments
https://api.github.com/repos/psf/requests/issues/2967/events
https://github.com/psf/requests/issues/2967
126,214,154
MDU6SXNzdWUxMjYyMTQxNTQ=
2,967
SSL-Client-Certificate and Subject Alternative Names
{ "avatar_url": "https://avatars.githubusercontent.com/u/3438840?v=4", "events_url": "https://api.github.com/users/felskrone/events{/privacy}", "followers_url": "https://api.github.com/users/felskrone/followers", "following_url": "https://api.github.com/users/felskrone/following{/other_user}", "gists_url": "https://api.github.com/users/felskrone/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/felskrone", "id": 3438840, "login": "felskrone", "node_id": "MDQ6VXNlcjM0Mzg4NDA=", "organizations_url": "https://api.github.com/users/felskrone/orgs", "received_events_url": "https://api.github.com/users/felskrone/received_events", "repos_url": "https://api.github.com/users/felskrone/repos", "site_admin": false, "starred_url": "https://api.github.com/users/felskrone/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/felskrone/subscriptions", "type": "User", "url": "https://api.github.com/users/felskrone", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-01-12T16:14:02Z
2021-09-08T20:00:47Z
2016-01-12T18:08:10Z
NONE
resolved
I have a (Puppet-)generated SSL-certificate with a CN and several SANs. A stripped output of openssl-output looks like this: ``` bash # openssl x509 -in cert.crt -text Certificate: ... Signature Algorithm: sha256WithRSAEncryption Issuer: CN=Puppet CA: puppet.domain.com ... X509v3 extensions: ... X509v3 Subject Alternative Name: DNS:srv01.domain.com, DNS:srv02.domain.com, DNS:srv03.domain.com ... ``` Now when i use this certificate with requests, it always picks the CN to do the actual request. Meaning my remote always receives "puppet.domain.com" to verify the client-certificate against. Since the remote IS puppet.domain.com, and the client is srv01.domain.com, the verification always fails. Is there any way, to have requests use any of the SANs to do its request? Or is it my remote the actually only uses the CN to do verification?
{ "avatar_url": "https://avatars.githubusercontent.com/u/3438840?v=4", "events_url": "https://api.github.com/users/felskrone/events{/privacy}", "followers_url": "https://api.github.com/users/felskrone/followers", "following_url": "https://api.github.com/users/felskrone/following{/other_user}", "gists_url": "https://api.github.com/users/felskrone/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/felskrone", "id": 3438840, "login": "felskrone", "node_id": "MDQ6VXNlcjM0Mzg4NDA=", "organizations_url": "https://api.github.com/users/felskrone/orgs", "received_events_url": "https://api.github.com/users/felskrone/received_events", "repos_url": "https://api.github.com/users/felskrone/repos", "site_admin": false, "starred_url": "https://api.github.com/users/felskrone/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/felskrone/subscriptions", "type": "User", "url": "https://api.github.com/users/felskrone", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2967/reactions" }
https://api.github.com/repos/psf/requests/issues/2967/timeline
null
completed
null
null
false
[ "Sorry...I'm confused. When you say \"to do the actual request\", what do you mean?\n", "I meant when i do request.post(url, cert()..) that requests always picks the CN.\n\nBut i think thats not the problem here. Im quite certain that the remote (apache) or maybe even the application (foreman) has to look at the SANs to actually do the verification since the client-certificate is sent as is, isnt it?\n", "Correct: requests doesn't touch or extract any information at all from the client certificate. If the remote peer is failing to validate it, that's a configuration error with the remote peer.\n", "Great, thx :-)\n" ]
https://api.github.com/repos/psf/requests/issues/2966
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2966/labels{/name}
https://api.github.com/repos/psf/requests/issues/2966/comments
https://api.github.com/repos/psf/requests/issues/2966/events
https://github.com/psf/requests/issues/2966
125,993,142
MDU6SXNzdWUxMjU5OTMxNDI=
2,966
Consider using system trust stores by default in 3.0.0.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "fef2c0", "default": false, "description": null, "id": 60669570, "name": "Please Review", "node_id": "MDU6TGFiZWw2MDY2OTU3MA==", "url": "https://api.github.com/repos/psf/requests/labels/Please%20Review" }, { "color": "777777", "default": false, "description": null, "id": 162780722, "name": "Question/Not a bug", "node_id": "MDU6TGFiZWwxNjI3ODA3MjI=", "url": "https://api.github.com/repos/psf/requests/labels/Question/Not%20a%20bug" }, { "color": "fef2c0", "default": false, "description": null, "id": 298537994, "name": "Needs More Information", "node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information" } ]
closed
false
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
211
2016-01-11T17:15:47Z
2024-05-19T18:35:48Z
2024-05-19T18:35:47Z
MEMBER
null
It's been raised repeatedly, mostly by people using Linux systems, that it's annoying that requests doesn't use the system trust store and instead uses the one that certifi ships. This is an understandable position. I have some personal attachment to the certifi approach, but the other side of that argument definitely has a reasonable position too. For this reason, I'd like us to look into whether we should use the system trust store by default, and make certifi's bundle a fallback option. I have some caveats here: 1. If we move to the system trust store, we must do so on _all_ platforms: Linux must not be its own special snowflake. 2. We must have broad-based support for Linux and Windows. 3. We must be able to fall back to certifi cleanly. Right now it seems like the best route to achieving this would be to use [certitude](https://github.com/python-hyper/certitude). This currently has support for dynamically generating the cert bundle OpenSSL needs directly from the system keychain on OS X. If we added Linux and Windows support to that library, we may have the opportunity to switch to using certitude. Given @kennethreitz's bundling policy, we probably cannot unconditionally switch to certitude, because certitude depends on cryptography (at least on OS X). However, certitude could take the current privileged position that certifi takes, or be a higher priority than certifi, as an optional dependency that is used if present on the system. Thoughts? This is currently a RFC, so please comment if you have opinions. /cc @sigmavirus24 @alex @kennethreitz @dstufft @glyph @reaperhulk @morganfainberg
{ "avatar_url": "https://avatars.githubusercontent.com/u/18519037?v=4", "events_url": "https://api.github.com/users/sethmlarson/events{/privacy}", "followers_url": "https://api.github.com/users/sethmlarson/followers", "following_url": "https://api.github.com/users/sethmlarson/following{/other_user}", "gists_url": "https://api.github.com/users/sethmlarson/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sethmlarson", "id": 18519037, "login": "sethmlarson", "node_id": "MDQ6VXNlcjE4NTE5MDM3", "organizations_url": "https://api.github.com/users/sethmlarson/orgs", "received_events_url": "https://api.github.com/users/sethmlarson/received_events", "repos_url": "https://api.github.com/users/sethmlarson/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sethmlarson/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sethmlarson/subscriptions", "type": "User", "url": "https://api.github.com/users/sethmlarson", "user_view_type": "public" }
{ "+1": 89, "-1": 0, "confused": 0, "eyes": 0, "heart": 3, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 92, "url": "https://api.github.com/repos/psf/requests/issues/2966/reactions" }
https://api.github.com/repos/psf/requests/issues/2966/timeline
null
completed
null
null
false
[ "I think the system trust stores (or not) essentially boils down to whether you want requests to act the same across platforms, or whether you want it to act in line with the platform it is running on. I do not think that _either_ of these options are wrong (or right), just different trade offs.\n\nI think that it's not as simple on Windows as it is on Linux or OSX (although @tiran might have a better idea). I _think_ that Windows doesn't ship with all of the certificates available and you have to do something (use WinHTTP?) to get it to download any additional certificates on demand. I think that means that a brand new Windows install, if you attempt to dump the certificate store will be missing a great many certificates.\n\nOn Linux, you still have the problem that there isn't one single set location for the certificate files, the best you can do is try to heuristically guess at where it might be. This gets better on Python 2.7.9+ and Python 3.4+ since you can use `ssl.get_default_verify_paths()` to get what the default paths are, but you can't rely on that unless you drop 2.6, 2.7.8, and 3.3. In pip we attempt to discover the location of the system trust store (just by looping over some common file locations) and if we can't find it we fall back to certifi, and one problem that has come up is that sometimes we'll find a location, but it's an old outdated copy that isn't being updated by anything. People then get really confused because it works in their browser, it works with requests, but it doesn't in pip.\n\nI assume the fall back to certifi ensures that things will still work correctly on platforms that either don't ship certificates at all, or don't ship them by default and they aren't installed? If so, that's another possible niggle here that you'd want to think about. Some platforms, like say FreeBSD, don't ship them by default at all. 
So it's possible that people will have a requests using thing running just fine without the FreeBSD certificates installed, and they then install them (explicitly or implicitly) and suddenly they trust something different and the behavior of the program changes.\n\nAnyways, the desire seems reasonable to me and, if all of the little niggles get worked out, it really just comes down to a question of if requests wants to fall on the side of \"fitting in\" with a particular platform, or if it wants to prefer cross platform uniformity.\n", "> I think the system trust stores (or not) essentially boils down to whether you want requests to act the same across platforms, or whether you want it to act in line with the platform it is running on. I do not think that either of these options are wrong (or right), just different trade offs.\n\nI wouldn't say that either option is _completely_ wrong, but I do think that using the platform trust store is significantly right-er, due to the availability of tooling to adjust trust roots on the platform and the relative unavailability of any such tooling for requests or certifi. If you go into Keychain Access to add an anchor (or the Windows equivalent) nothing about requests makes it seem like it would be special, that it would be using a different set of trust roots than you had already configured for everything else.\n", "It depends if your audience are people who are familiar with a particular platform or not. I have no idea how to manage the trust store on Windows but I know how to manage the trust store for requests because requests currently chooses being internally consistent cross platform over being externally consistent with any particular platform. \n\nIOW this change makes it easier for people who are familiar with the system the software is running on at the cost of people who are not. 
\n\nSent from my iPhone\n\n> On Jan 11, 2016, at 4:29 PM, Glyph [email protected] wrote:\n> \n> I wouldn't say that either option is completely wrong, but I do think that using the platform trust store is significantly right-er\n", "> IOW this change makes it easier for people who are familiar with the system the software is running on at the cost of people who are not.\n\nIn the abstract, I disagree. Of course, we may have distribution or build toolchain issues which make people have to care about this fact, but if it works _properly_ (`pip install`s without argument, doesn't require C compiler shenanigans for the end user) then what is the penalty to people who are not familiar with the platform?\n", "It forces them to learn the differences of every platform they are running on. \n\nSent from my iPhone\n\n> On Jan 11, 2016, at 4:59 PM, Glyph [email protected] wrote:\n> \n> what is the penalty to people who are not familiar with the platform?\n", "So I am in favour of using the system store in general because in most cases [outside of dev] if you're relying on requests or a similar library you're going to expect it to work similar to the rest of the system/tooling. Asking someone to learn the tooling for the system they are deploying on is not unreasonable. If an application uses requests and needs to trust a specific cert (end-user story here), it is usual that it'll handle the installation for that platform at install time (aka OS X or Windows). \n\nFrom a developer perspective, it becomes a little more difficult but still not insurmountable as long as we approach this in a way that the developer has clear methods to continue with the same behaviour as today.\n\nAs discussed in IRC, perhaps the easiest method to ensure sanity is to really finish up and polish certitude(? was this the tool discussed?) 
so we can encapsulate the platform/system-specifics clearly and try and ensure requests logic is the same on all of the platforms.\n", "> It forces them to learn the differences of every platform they are running on.\n\nHow so? By default, it ought to get the trust roots you expect on every platform. It's not like I need to learn new and exciting things when I launch Safari vs. IE just to type `https://`...\n", "> I think that it's not as simple on Windows as it is on Linux or OSX (although @tiran might have a better idea). I think that Windows doesn't ship with all of the certificates available and you have to do something (use WinHTTP?) to get it to download any additional certificates on demand. I think that means that a brand new Windows install, if you attempt to dump the certificate store will be missing a great many certificates.\n\nYou are right about this. I have verified it on my nearly-pristine Windows VM; in a Python prompt, I do:\n\n``` python\n>>> import wincertstore\n>>> print(len(list(wincertstore.CertSystemStore(\"ROOT\"))))\n```\n\nand get \"21\". Visit some HTTPS websites, up-arrow/enter in the python interpreter, and now I get \"23\".\n", "There's some technical documentation here:\n\nhttps://technet.microsoft.com/en-us/library/bb457160.aspx\n", "And some more explanation here:\n\nhttp://unmitigatedrisk.com/?p=259\n", "Frustratingly, I can't find an API that just tells it to grab the certificate store; it seems that verifying a certificate chain that you don't have the root to is the only way that it adds certificates, and it adds them one at a time as necessary. It baffles me that Microsoft seems to consider storage for certificates a scarce resource.\n", "After hours of scouring MSDN, I give up. 
Hopefully someone else can answer this question: https://stackoverflow.com/questions/34732586/is-there-an-api-to-pre-retrieve-the-list-of-trusted-root-certificates-on-windows\n", "In my experience with pip, which attempts to discover the system store and if it can't find it falls back to a bundled copy, I have had to learn how the system store works on platforms that I have no intention on ever running. This is for a fairly simple method of detection (look for file systems) but it absolutely ends up that way. The simple fact is, in my experience most people have absolutely no idea how their system manages a trust store (and if it manages a trust store or not). \n\nSent from my iPhone\n\n> On Jan 11, 2016, at 5:22 PM, Glyph [email protected] wrote:\n> \n> How so? By default, it ought to get the trust roots you expect on every platform. It's not like I need to learn new and exciting things when I launch Safari vs. IE just to type https://...\n", "Look for files on the system*\n", "FWIW, If I could prevent downstream redistributors from forcing pip to use the system store, I would revert the change to look in system locations immediately and only ever use a bundled copy. The UX of that tends to be so much nicer, the only reason we started to trust the system trust stores is because redistributors do patch pip to use the system trust store, so you end up in a situation where people get different trust stores based on where their copy of pip came from (which is likely also a concern for requests).\n\nAs an additional datapoint, if I remember correctly, the browsers which are _not_ shipped with the OS tend to not use the system trust store either. According to Chrome's [Root Certificate Policy](https://www.chromium.org/Home/chromium-security/root-ca-policy) they will use the system trust store on OSX and on Windows but they won't use it on Linux. 
I believe that even where Chrome does use the system store, they still layer their own trust ontop of that to allow them to blacklist certificates if need be. I assume this capability is in place because they do not wholly trust the OS trust stores to remove compromised certificates. If I recall correctly, Firefox does not use the system trust store at all on any OS.\n\nAnother question is what even is the \"root trust store\" on a Linux. The closest that you can get is wherever the system provided OpenSSL (assuming they even provide OpenSSL) is configured to point to. However AFAIK there is no way to determine if you're using a system provided OpenSSL or a copy that someone installed (perhaps via Anaconda?), the additional copy may have stale certificates or no certificates available. If it's stale certificates, then you've successfully lowered the security of requests user's by attempting to follow the system trust store. If it's an empty trust store, how do you determine the difference between \"empty because I trust nothing\" and \"empty because my copy of OpenSSL isn't shipping them\" or do I have to manage the certificates I trust using my OS, unless I want to trust nothing then I have to manage the certificates I trust using requests?\n\nI've also found that talks about trying to use the platform certificate trust store on Linux (see [here](https://www.happyassassin.net/2015/01/12/a-note-about-ssltls-trusted-certificate-stores-and-platforms/)). I've not personally verified the information in this article, however it makes me feel very wary about trying to make using the system trust store anything but an exercise in frustration. \n", "I think our current approach is the correct one, considering who Requests was built for. \n\nThat being said, it wouldn't hurt to add more documentation/functionality around using system certs for \"advanced\" users. \n", "I think talking about configurability is maybe a red herring. 
It's useful – necessary even – in certain circumstances, but users who know they need that can usually figure it out.\n\nThe more significant issue is that trust root database updates, especially for end-user client software, are both infrequent and extremely important to do in a timely manner. `certifi` has no mechanism for automatically updating. Not only will it not come down in a system update, it can't even be done globally; you have to do it once per Python environment (virtualenv, install, home directory, etc, where it's installed).\n", "@glyph I think you may have hit the nail on the head there. That is a significant reason (and addresses the other issues outlined) to use a more centralized location for the cert store.\n", "The flip side is that you have trust stores like Debian which trusted CACert for a long time, and still trusts SPI even though neither of those have gone through any sort of real audit that you'd expect for a trusted CA.\n", "> The flip side is that you have trust stores like Debian which trusted CACert for a long time, and still trusts SPI even though neither of those have gone through any sort of real audit that you'd expect for a trusted CA.\n\nAs I understand it, Microsoft trusts their own CA, too, and Apple trusts theirs. SPI is just Debian's version of that, isn't it?\n", "SPI isn't run by Debian, it's a third party organization similar to that of the Software Freedom Conservancy that Debian happens to be a member of. It'd be more like Microsoft and Apple trusting the CA of their datacenter just because they happened to use them as their data center. 
In addition, I'm pretty sure that Apple and Microsoft have both passed a WebTrust audit for their root CAs.\n", "To be fair to Debian, I think the _current_ plan is to stop using SPI certificates for their infrastructure and switch to more generally trusted certificates and then stop including SPI and switch to using just the Mozilla bundle without any additions.\n\nThat being said, is it even true that they are shipping updates to them? Looking at [packages.debian.org for the ca-certificates packages](https://packages.debian.org/search?keywords=ca-certificates) it shows that the versions there are:\n- squeeze (oldoldstable): `20090814+nmu3squeeze1`\n- wheezy (oldstable): `20130119+deb7u1`\n- jessie (stable): `20141019`\n- stretch (testing): `20160104`\n- sid (unstable): `20160104`\n\nI haven't looked at the actual _contents_ of these packages, but the version numbers lead me to believe that they are not infact keeping the ca-certificates package up to date.\n\nIn addition to that, looking at the open bugs for ca-certificates there are bugs like [#721976](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=721976) which means that the ca-certificates store includes roots which are not valid for validating servers and are only valid for other topics (like email) which means you can't actually use the current ca-certificate package without massaging it to remove those certificates yourself.\n\nAnother issue [#808600](https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=808600) has Comodo requesting a removal of a particular root that they no longer consider to be inscope for the CAB \nForum's Baseline Requirements. That has been removed from testing and sid, but has not been removed from jessie, wheezy, or squeeze. The maintainers of ca-certificates claim they'll be requesting an upload to jessie and wheezy, but not to squeeze (which still has LTS suport).\n\nThat's just from spending a little bit of time looking at one, fairly popular, distribution. 
I imagine the long tail of the issues with the system provided bundle gets worse the further away from the popular distributions you get. It's not clear to me that it's a reasonable assumption that the certificates included in any random OS are going to be properly maintained.\n\nThe other elephant in the room is we're just assuming that because an update is available to their ca-certificates package that someone is going to have pulled it in. As far as I know, most (if not all?) of the Linux systems do not automatically update by default and require configuration to start to do so. There is likely to be a bigger chance of this with Docker in the picture. On the flip side, I think people generally try to update to the latest versions of their dependencies when working on a project.\n", "Debian's cert bundle is almost certainly like ubuntu's, which does not update to the Mozilla bundle that removed 1024-bit roots in order to avoid the pain like that which hit certifi. All of the out-of-date cert bundles are OpenSSL pre 1.0.2, which means they cannot correctly build the cert chain to a cross-signed root without having the 1024-bit cross-signing root still present. 
I suspect that's the real concern there.\n", "Maybe they should stop shipping an OpenSSL that can't correctly validate certificate chains.\n", "I reached out to Kurt Roebx about backporting the fixes for that, he said\nit was a thing he was looking at doing, I have no clue what the timeline is.\n\nOn Wed, Jan 13, 2016 at 7:27 AM, Donald Stufft [email protected]\nwrote:\n\n> Maybe they should stop shipping an OpenSSL that can't correctly validate\n> certificate chains.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2966#issuecomment-171276369\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n", "So, what if we made the behaviour of looking at the system CA bundle an extra, e.g., `pip install requests[system_ca]` where it would ship without our certificate bundle and instead use certifi. This allows us to keep on \"just work\" ing for our default user base while supporting the people who need to use the system bundle.\n", "Seems reasonable.\n", "I would start with adding an API for it, before worrying about how to\nenable it via installation, you need `requests.get(url,\nverify=<something>)` to exist first.\n\nOn Wed, Jan 13, 2016 at 7:41 AM, Ian Cordasco [email protected]\nwrote:\n\n> So, what if we made the behaviour of looking at the system CA bundle an\n> extra, e.g., pip install requests[system_ca] where it would ship without\n> our certificate bundle and instead use certifi. 
This allows us to keep on\n> \"just work\" ing for our default user base while supporting the people who\n> need to use the system bundle.\n> \n> —\n> Reply to this email directly or view it on GitHub\n> https://github.com/kennethreitz/requests/issues/2966#issuecomment-171279745\n> .\n\n## \n\n\"I disapprove of what you say, but I will defend to the death your right to\nsay it.\" -- Evelyn Beatrice Hall (summarizing Voltaire)\n\"The people's good is the highest law.\" -- Cicero\nGPG Key fingerprint: 125F 5C67 DFE9 4084\n", "Debian / Ubuntu have more issues. They ignore CKT_NSS_MUST_VERIFY_TRUST flag and throw all trust anchors in one PEM file. The flag overrides extended key usage (EKU) for root certs, e.g. only trust a root cert for S/MIME but not for TLS server auth.\n\nhttps://bugs.debian.org/cgi-bin/bugreport.cgi?bug=721976\nhttps://bugs.launchpad.net/ubuntu/+source/ca-certificates/+bug/1207004\n\nApple, Microsoft and Red Hat/Fedora have means to disable certs for a purpose. Red Hat only puts trust anchors for TLS server auth in the default cert bundle. With FreeDesktop.org's p11kit and PKCS#11 bindings the policies can even be modified by user. This doesn't work for OpenSSL yet because OpenSSL uses a static PEM file instead of PKCS#11 to fetch trust anchors.\n\n@glyph Does MS trust their own cert for all EKUs or just for some EKUs like code signing?\n", "@alex would something like\n\n``` py\nimport requests\nfrom requests import certs\n\nrequests.get(url, verify=certs.system_bundle_where())\n```\n\nBe a sufficiently good API?\n" ]
https://api.github.com/repos/psf/requests/issues/2965
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2965/labels{/name}
https://api.github.com/repos/psf/requests/issues/2965/comments
https://api.github.com/repos/psf/requests/issues/2965/events
https://github.com/psf/requests/issues/2965
125,929,971
MDU6SXNzdWUxMjU5Mjk5NzE=
2,965
HTTPAdapter ignored
{ "avatar_url": "https://avatars.githubusercontent.com/u/5319224?v=4", "events_url": "https://api.github.com/users/swmerko/events{/privacy}", "followers_url": "https://api.github.com/users/swmerko/followers", "following_url": "https://api.github.com/users/swmerko/following{/other_user}", "gists_url": "https://api.github.com/users/swmerko/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/swmerko", "id": 5319224, "login": "swmerko", "node_id": "MDQ6VXNlcjUzMTkyMjQ=", "organizations_url": "https://api.github.com/users/swmerko/orgs", "received_events_url": "https://api.github.com/users/swmerko/received_events", "repos_url": "https://api.github.com/users/swmerko/repos", "site_admin": false, "starred_url": "https://api.github.com/users/swmerko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/swmerko/subscriptions", "type": "User", "url": "https://api.github.com/users/swmerko", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2016-01-11T12:11:55Z
2021-09-08T20:00:48Z
2016-01-11T12:13:40Z
NONE
resolved
Hi, i'm trying to force ssl.PROTOCOL_TLSv1 as written in documentation: http://docs.python-requests.org/en/latest/user/advanced/#example-specific-ssl-version

```
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager
import ssl

class MyAdapter(HTTPAdapter):
    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)

import requests
s = requests.Session()
s.mount('https://', MyAdapter())
requests.get('https://services.mailup.com/')
```

but this gives me

```
requests.exceptions.SSLError: [SSL: SSLV3_ALERT_HANDSHAKE_FAILURE] sslv3 alert handshake failure (_ssl.c:600)
```

so i tried to patch requests/packages/urllib3/poolmanager.py with:

```
class PoolManager(RequestMethods):

    proxy = None

    def __init__(self, num_pools=10, headers=None, **connection_pool_kw):
        RequestMethods.__init__(self, headers)
        connection_pool_kw['ssl_version'] = ssl.PROTOCOL_TLSv1
        self.connection_pool_kw = connection_pool_kw
        self.pools = RecentlyUsedContainer(num_pools,
                                           dispose_func=lambda p: p.close())
```

and i get:

```
>>> import requests
>>> requests.get('https://services.mailup.com/')
<Response [403]>
```

Any ideas?

Thanks, Matteo
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2965/reactions" }
https://api.github.com/repos/psf/requests/issues/2965/timeline
null
completed
null
null
false
[ "@swmerko You have to route your request through the Session object. In your code example you create a Session object `s`, but then call `requests.get`. That's not how Session objects work. Change your last line to `s.get()` instead of `requests.get()` and your problem should be solved.\n", "Is there a way to define globally requests with my Adapter?\n", "No, requests deliberately maintains very little global state. Under the covers, each requests.\\* call creates a new Session to execute the request. If you really wanted to do this (and I strongly do not recommend this approach), you could monkeypatch `Session.__init__()` to mount your adapters over the default ones.\n", "Thank you very much!\n" ]
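Putting the accepted answer together with the original snippet, a working version looks roughly like this (the only change needed is routing the final call through the session object):

```python
import ssl

import requests
from requests.adapters import HTTPAdapter
from requests.packages.urllib3.poolmanager import PoolManager


class TLSv1Adapter(HTTPAdapter):
    """Transport adapter that forces TLSv1 on every pooled connection."""

    def init_poolmanager(self, connections, maxsize, block=False):
        self.poolmanager = PoolManager(num_pools=connections,
                                       maxsize=maxsize,
                                       block=block,
                                       ssl_version=ssl.PROTOCOL_TLSv1)


s = requests.Session()
s.mount('https://', TLSv1Adapter())
# Requests must go through the session for the adapter to apply:
# use s.get('https://services.mailup.com/'), not requests.get(...).
```

Note that `ssl.PROTOCOL_TLSv1` is deprecated in recent Python versions; the pattern of mounting a custom adapter is the part this issue hinges on.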
https://api.github.com/repos/psf/requests/issues/2964
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2964/labels{/name}
https://api.github.com/repos/psf/requests/issues/2964/comments
https://api.github.com/repos/psf/requests/issues/2964/events
https://github.com/psf/requests/pull/2964
125,830,552
MDExOlB1bGxSZXF1ZXN0NTU1Njg4MDI=
2,964
Python 3.5 : AttributeError: 'socket' object has no attribute 'getpeercert'
{ "avatar_url": "https://avatars.githubusercontent.com/u/2358471?v=4", "events_url": "https://api.github.com/users/FabriceSalvaire/events{/privacy}", "followers_url": "https://api.github.com/users/FabriceSalvaire/followers", "following_url": "https://api.github.com/users/FabriceSalvaire/following{/other_user}", "gists_url": "https://api.github.com/users/FabriceSalvaire/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FabriceSalvaire", "id": 2358471, "login": "FabriceSalvaire", "node_id": "MDQ6VXNlcjIzNTg0NzE=", "organizations_url": "https://api.github.com/users/FabriceSalvaire/orgs", "received_events_url": "https://api.github.com/users/FabriceSalvaire/received_events", "repos_url": "https://api.github.com/users/FabriceSalvaire/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FabriceSalvaire/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FabriceSalvaire/subscriptions", "type": "User", "url": "https://api.github.com/users/FabriceSalvaire", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2016-01-10T16:48:25Z
2021-09-08T05:01:05Z
2016-01-10T16:50:18Z
NONE
resolved
Traceback:

```
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/requests/api.py", line 61, in get
    r = yield from request('get', url, **kwargs)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/requests/api.py", line 49, in request
    r = yield from session.request(method=method, url=url, **kwargs)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/requests/sessions.py", line 465, in request
    resp = yield from self.send(prep, **send_kwargs)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/requests/sessions.py", line 579, in send
    r = yield from adapter.send(request, **kwargs)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/requests/adapters.py", line 364, in send
    timeout=timeout
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/urllib3/connectionpool.py", line 515, in urlopen
    body=body, headers=headers)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/urllib3/connectionpool.py", line 309, in _make_request
    yield from self._validate_conn(conn)
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/urllib3/connectionpool.py", line 730, in _validate_conn
    yield from conn.connect()
File "/opt/python-virtual-env/py35/lib/python3.5/site-packages/yieldfrom/urllib3/connection.py", line 295, in connect
    match_hostname(sock.getpeercert(),
AttributeError: 'socket' object has no attribute 'getpeercert'
```

sock instance is of type socket.socket
{ "avatar_url": "https://avatars.githubusercontent.com/u/2358471?v=4", "events_url": "https://api.github.com/users/FabriceSalvaire/events{/privacy}", "followers_url": "https://api.github.com/users/FabriceSalvaire/followers", "following_url": "https://api.github.com/users/FabriceSalvaire/following{/other_user}", "gists_url": "https://api.github.com/users/FabriceSalvaire/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/FabriceSalvaire", "id": 2358471, "login": "FabriceSalvaire", "node_id": "MDQ6VXNlcjIzNTg0NzE=", "organizations_url": "https://api.github.com/users/FabriceSalvaire/orgs", "received_events_url": "https://api.github.com/users/FabriceSalvaire/received_events", "repos_url": "https://api.github.com/users/FabriceSalvaire/repos", "site_admin": false, "starred_url": "https://api.github.com/users/FabriceSalvaire/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/FabriceSalvaire/subscriptions", "type": "User", "url": "https://api.github.com/users/FabriceSalvaire", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2964/reactions" }
https://api.github.com/repos/psf/requests/issues/2964/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2964.diff", "html_url": "https://github.com/psf/requests/pull/2964", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2964.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2964" }
true
[ "It is a mistake using \"New Pull Request\" from rdbhost:master. Sorry.\n" ]
https://api.github.com/repos/psf/requests/issues/2963
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2963/labels{/name}
https://api.github.com/repos/psf/requests/issues/2963/comments
https://api.github.com/repos/psf/requests/issues/2963/events
https://github.com/psf/requests/issues/2963
125,295,072
MDU6SXNzdWUxMjUyOTUwNzI=
2,963
ResourceWarning still triggered when warnings enabled
{ "avatar_url": "https://avatars.githubusercontent.com/u/796623?v=4", "events_url": "https://api.github.com/users/anarcat/events{/privacy}", "followers_url": "https://api.github.com/users/anarcat/followers", "following_url": "https://api.github.com/users/anarcat/following{/other_user}", "gists_url": "https://api.github.com/users/anarcat/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/anarcat", "id": 796623, "login": "anarcat", "node_id": "MDQ6VXNlcjc5NjYyMw==", "organizations_url": "https://api.github.com/users/anarcat/orgs", "received_events_url": "https://api.github.com/users/anarcat/received_events", "repos_url": "https://api.github.com/users/anarcat/repos", "site_admin": false, "starred_url": "https://api.github.com/users/anarcat/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/anarcat/subscriptions", "type": "User", "url": "https://api.github.com/users/anarcat", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2016-01-07T00:25:14Z
2021-09-08T20:00:47Z
2016-01-07T11:04:50Z
NONE
resolved
I can reproducibly get ResourceWarnings when running incomplete requests with [iter_lines](http://docs.python-requests.org/en/latest/api/#requests.Response.iter_lines). this is my test code:

``` python
#!/usr/bin/python3

from contextlib import closing
import re
import time
import warnings

import requests

print("requests %s" % requests.__version__)

max_bytes = 640*1024

def test_get_title_close(url):
    response = requests.get(url, stream=True)
    try:
        content = ''
        for line in response.iter_lines(decode_unicode=True):
            content += line
            if '</title>' in content or len(content) > max_bytes:
                break
    except (UnicodeDecodeError, TypeError):
        # Fail silently when data can't be decoded
        return
    finally:
        # need to close the connexion because we have not read all the data
        response.connection.close()
        response.connection.close()

    print('title found: %s' % re.sub(r'^.*<title>', '', re.sub(r'</title>.*$', '', content)))

def test_get_title_closing(url):
    with closing(requests.get(url, stream=True)) as response:
        try:
            content = ''
            for line in response.iter_lines(decode_unicode=True):
                content += line
                if '</title>' in content or len(content) > max_bytes:
                    break
        except (UnicodeDecodeError, TypeError):
            # Fail silently when data can't be decoded
            return
        finally:
            # need to close the connexion because we have not read all the data
            response.close()
            # also trying the undocumented ones
            response.connection.close()

    print('title found: %s' % re.sub(r'^.*<title>', '', re.sub(r'</title>.*$', '', content)))

warnings.filterwarnings('default')
test_get_title_closing('http://example.com/') # this should not warn, but does
test_get_title_close('http://example.com/') # this should not warn, but does also
```

it's taken from this pull request: https://github.com/sopel-irc/sopel/pull/988 - which tries to switch an irc bot from using a homegrown http library to the more standard "requests". the original code does not yield such warnings even though it also does HTTPS connections. 
here is the output:

```
$ python3 test_requests_fail.py
requests 2.9.1
title found: Example Domain
../test_requests_fail.py:53: ResourceWarning: unclosed <socket.socket fd=4, family=AddressFamily.AF_INET6, type=SocketType.SOCK_STREAM, proto=6, laddr=('2607:f2c0:f00f:8f00:ea9a:8fff:fe6e:f60', 39628, 0, 0), raddr=('2606:2800:220:1:248:1893:25c8:1946', 80, 0, 0)>
  test_get_title_closing('http://example.com/') # this should not warn, but does
title found: Example Domain
../test_requests_fail.py:54: ResourceWarning: unclosed <socket.socket fd=4, family=AddressFamily.AF_INET6, type=SocketType.SOCK_STREAM, proto=6, laddr=('2607:f2c0:f00f:8f00:ea9a:8fff:fe6e:f60', 39630, 0, 0), raddr=('2606:2800:220:1:248:1893:25c8:1946', 80, 0, 0)>
  test_get_title_close('http://example.com/') # this should not warn, but does also
```

notice how i tried my best to close the resources, using various methods, none of which work. there's something definitely fishy about the way this works.

this is on Debian Jessie 8.2, with the latest requests installed from pip (2.9.1), or jessie-backports (2.8.1-1~bpo8+1) and Python 3.4.2.

oddly enough, older requests versions do _not_ have that problem: 2.4.3 from plain old jessie is actually fine! so it sure looks like something broke since 2.4.3 in the fix proposed in #1882 and #2004.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 1, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/2963/reactions" }
https://api.github.com/repos/psf/requests/issues/2963/timeline
null
completed
null
null
false
[ "Requests works very hard to prevent you closing a response. This is because it gets in the way of our connection pooling. If you _really must_ close the connection, you need to close the underlying urllib3 connection: use `response.raw.close()`.\n\nThe reason the resource warnings are back is because we're much better about not returning partially complete connections to the connection pool. The lack of warnings in 2.5.4 was actually symptomatic of a problem: weirdly, the warnings are an indication that everything is working as intended.\n", "`response.raw.close()` does not work either here, i still get the warning.\n\ni don't get it: ResourceWarnings seem like a problem, not an indication that all is well. in this case, i deliberately want to close the connection before it finishes, because i do not want to download all of the request! how can i do that? and why do we leave sockets lying around like this? if it wasn't for the (not always deterministic) garbage collector, this would be a leaking resources like a sieve in a loop...\n\nin other words: how is having a warning a sign that all is well? or how can it be a problem that there are no warnings?\n\nand why are there functions in the API that basically do absolutely nothing? both the `close()` API and the `with closing(...)` suggestions are basically useless right now.\n\nso i think this is at least a documentation issue...\n", "Do you have a minimal reproduction case? Closing the fp really should work. Can you demonstrate it in a small chunk of code I can investigate please?\n", "i have a reproducible test case above. i tried adding `response.raw.close()` after `response.connection.close()`, without luck. 
but since you asked:\n\n``` python\n#!/usr/bin/python3\n\nfrom contextlib import closing\nimport re\nimport time\nimport warnings\n\nimport requests\n\nprint(\"requests %s\" % requests.__version__)\n\nmax_bytes = 640*1024\ndef test_get_title_close(url):\n response = requests.get(url, stream=True)\n\n try:\n content = ''\n for line in response.iter_lines(decode_unicode=True):\n content += line\n if '</title>' in content or len(content) > max_bytes:\n break\n except (UnicodeDecodeError, TypeError):\n # Fail silently when data can't be decoded\n return\n finally:\n # need to close the connexion because we have not read all the data\n response.connection.close()\n # also trying the undocumented ones\n response.close()\n response.raw.close()\n\n print('title found: %s' % re.sub(r'^.*<title>', '', re.sub(r'</title>.*$', '', content)))\n\ndef test_get_title_closing(url):\n with closing(requests.get(url, stream=True)) as response:\n try:\n content = ''\n for line in response.iter_lines(decode_unicode=True):\n content += line\n if '</title>' in content or len(content) > max_bytes:\n break\n except (UnicodeDecodeError, TypeError):\n # Fail silently when data can't be decoded\n return\n finally:\n # need to close the connexion because we have not read all the data\n response.connection.close()\n response.close()\n response.raw.close()\n\n print('title found: %s' % re.sub(r'^.*<title>', '', re.sub(r'</title>.*$', '', content)))\n\nwarnings.filterwarnings('default')\ntest_get_title_closing('http://example.com/') # this should not warn, but does\ntest_get_title_close('http://example.com/') # this should not warn, but does also\n```\n", "So the failure of `raw.close()` is because the http response object is claiming it _is_ closed. 
This requires a bit more digging on my end to work out why the object is lying.\n\nI'll take a look this weekend and see what I can see.\n", "Ok, so the problem here is that both the underlying HTTP connection _and_ the HTTP response have references to the socket, so the socket doesn't actually close until both references close it. Right now, urllib3's response `close` method doesn't do that, it only closes the response, and as a result the pooled connection remains alive until it passes out of scope.\n\nI consider that a urllib3 bug, so I've opened shazow/urllib3#779. In the meantime, you'll want to call both `response.raw.close()` and `response.raw._connection.close()`. Those two calls alone should solve your problem.\n\nThis is also why the documented API functions aren't working as you expect: they also call `response.raw.close()`, and are not expecting that that would fail to clean up the connection.\n\nThis should be resolved in a future release when a fix for the urllib3 issue is merged. Sorry about the bug!\n", "On 2016-01-09 11:56:02, Cory Benfield wrote:\n\n> This should be resolved in a future release when a fix for the urllib3 issue is merged. Sorry about the bug!\n\nshould this bug be kept open then?\n\n## thanks for the followup\n\nThe United States is a nation of laws:\nbadly written and randomly enforced.\n - Frank Zappa\n", "@anarcat why should we track an issue in a dependency?\n", "@sigmavirus24 is right. There is an issue open tracking this problem. We have zero code change required to use that fix, all we have to do is update our dependency (which we'll do in the next 2.X release just as part of the release process). The correct issue to track is the one on urllib3.\n", "okay, then i guess i'll watch that other issue.\n" ]
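Until a requests release picks up the urllib3 fix, the two close calls from the workaround above can be wrapped in a helper. This is a sketch only: `_connection` is a private urllib3 attribute and may change between releases, and the helper name is invented for illustration.

```python
import requests


def read_prefix(url, max_bytes=64 * 1024):
    """Read at most `max_bytes` of `url`, then release the socket even
    though the body was not fully consumed."""
    response = requests.get(url, stream=True)
    try:
        return response.raw.read(max_bytes, decode_content=True)
    finally:
        # Close both the response *and* the pooled connection; closing
        # only one of them leaves a live reference to the socket.
        response.raw.close()
        connection = getattr(response.raw, '_connection', None)
        if connection is not None:
            connection.close()
```

Once the dependency is updated, `response.close()` alone should suffice again.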
https://api.github.com/repos/psf/requests/issues/2962
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2962/labels{/name}
https://api.github.com/repos/psf/requests/issues/2962/comments
https://api.github.com/repos/psf/requests/issues/2962/events
https://github.com/psf/requests/issues/2962
125,270,260
MDU6SXNzdWUxMjUyNzAyNjA=
2,962
Stay in a queue
{ "avatar_url": "https://avatars.githubusercontent.com/u/1591920?v=4", "events_url": "https://api.github.com/users/Djokx/events{/privacy}", "followers_url": "https://api.github.com/users/Djokx/followers", "following_url": "https://api.github.com/users/Djokx/following{/other_user}", "gists_url": "https://api.github.com/users/Djokx/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Djokx", "id": 1591920, "login": "Djokx", "node_id": "MDQ6VXNlcjE1OTE5MjA=", "organizations_url": "https://api.github.com/users/Djokx/orgs", "received_events_url": "https://api.github.com/users/Djokx/received_events", "repos_url": "https://api.github.com/users/Djokx/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Djokx/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Djokx/subscriptions", "type": "User", "url": "https://api.github.com/users/Djokx", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2016-01-06T21:44:02Z
2021-09-08T20:00:48Z
2016-01-06T23:13:45Z
NONE
resolved
Hi, I've written a program which works perfectly well to seek information on a website. But when too many people are on the site, a queue is set up in order to avoid crashes of the server. The issue is that the script stays in the queue forever. I mean, in a browser, when the queue is there you join it automatically and after a few minutes you're on the website (even if you reload the page, you keep your place in the queue). However, if we open the site in a private browsing tab, it starts afresh. So, I assume it is a question of Session or Cookies (it's not related to the IP). Does anyone have an idea of how to solve this issue? I think it should be done with some Session/cookies function but I don't know how to do that. Thanks, Djokx.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2962/reactions" }
https://api.github.com/repos/psf/requests/issues/2962/timeline
null
completed
null
null
false
[ "@Djokx are you using the requests library in your project? This is a repository for a Python library called requests, not a repository to request help for arbitrary problems. If you are using this library, could you share some code so we can help you debug whatever problem it is that you're running into? It's not apparent what this has to do with this library at the moment.\n", "Hi, and thanks for the reply\n\nSo here's the function I use:\n\n``` python\nsession = requests.session()\n\ndef CompteMot(mot, site):\n global session\n r = session.post(site)\n P = r.text.upper();\n return P.count(mot)\n```\n\nAnd I have now these errors after a few loops in a while loop in which this function is called :\n\n> Traceback (most recent call last):\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 376, in _make_request\n> httplib_response = conn.getresponse(buffering=True)\n> TypeError: getresponse() got an unexpected keyword argument 'buffering'\n> \n> During handling of the above exception, another exception occurred:\n> \n> Traceback (most recent call last):\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 559, in urlopen\n> body=body, headers=headers)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 378, in _make_request\n> httplib_response = conn.getresponse()\n> File \"/usr/lib/python3.4/http/client.py\", line 1171, in getresponse\n> response.begin()\n> File \"/usr/lib/python3.4/http/client.py\", line 351, in begin\n> version, status, reason = self._read_status()\n> File \"/usr/lib/python3.4/http/client.py\", line 321, in _read_status\n> raise BadStatusLine(line)\n> http.client.BadStatusLine: ''\n> \n> During handling of the above exception, another exception occurred:\n> \n> Traceback (most recent call last):\n> File \"/usr/local/lib/python3.4/dist-packages/requests/adapters.py\", line 370, in send\n> 
timeout=timeout\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 609, in urlopen\n> _stacktrace=sys.exc_info()[2])\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/util/retry.py\", line 245, in increment\n> raise six.reraise(type(error), error, _stacktrace)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/packages/six.py\", line 309, in reraise\n> raise value.with_traceback(tb)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 559, in urlopen\n> body=body, headers=headers)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/packages/urllib3/connectionpool.py\", line 378, in _make_request\n> httplib_response = conn.getresponse()\n> File \"/usr/lib/python3.4/http/client.py\", line 1171, in getresponse\n> response.begin()\n> File \"/usr/lib/python3.4/http/client.py\", line 351, in begin\n> version, status, reason = self._read_status()\n> File \"/usr/lib/python3.4/http/client.py\", line 321, in _read_status\n> raise BadStatusLine(line)\n> requests.packages.urllib3.exceptions.ProtocolError: ('Connection aborted.', BadStatusLine(\"''\",))\n> \n> During handling of the above exception, another exception occurred:\n> \n> Traceback (most recent call last):\n> File \"<stdin>\", line 1, in <module>\n> File \"/home/scripts/ScriptReq.py\", line 155, in VerifyNbMot\n> Nb_mots = CompteMot(mot, site)\n> File \"/home/scripts/ScriptReq.py\", line 58, in CompteMot\n> r = session.post(site)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/sessions.py\", line 511, in post\n> return self.request('POST', url, data=data, json=json, *_kwargs)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/sessions.py\", line 468, in request\n> resp = self.send(prep, *_send_kwargs)\n> File \"/usr/local/lib/python3.4/dist-packages/requests/sessions.py\", line 576, in send\n> r = adapter.send(request, **kwargs)\n> File 
\"/usr/local/lib/python3.4/dist-packages/requests/adapters.py\", line 420, in send\n> raise ConnectionError(err, request=request)\n> requests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine(\"''\",))\n", "So it looks like the server is closing the connection without sending a response. Your best bet is to read about [retries](http://www.coglib.com/~icordasc/blog/2014/12/retries-in-requests.html) and set them up. Otherwise this isn't a bug. If you need assistance and are looking for the correct place to ask questions, please head on over to [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n" ]
https://api.github.com/repos/psf/requests/issues/2961
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2961/labels{/name}
https://api.github.com/repos/psf/requests/issues/2961/comments
https://api.github.com/repos/psf/requests/issues/2961/events
https://github.com/psf/requests/pull/2961
125,011,663
MDExOlB1bGxSZXF1ZXN0NTUxMTc5NzU=
2,961
Rename {Missing,Invalid}Schema to *Scheme
{ "avatar_url": "https://avatars.githubusercontent.com/u/134455?v=4", "events_url": "https://api.github.com/users/chadwhitacre/events{/privacy}", "followers_url": "https://api.github.com/users/chadwhitacre/followers", "following_url": "https://api.github.com/users/chadwhitacre/following{/other_user}", "gists_url": "https://api.github.com/users/chadwhitacre/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/chadwhitacre", "id": 134455, "login": "chadwhitacre", "node_id": "MDQ6VXNlcjEzNDQ1NQ==", "organizations_url": "https://api.github.com/users/chadwhitacre/orgs", "received_events_url": "https://api.github.com/users/chadwhitacre/received_events", "repos_url": "https://api.github.com/users/chadwhitacre/repos", "site_admin": false, "starred_url": "https://api.github.com/users/chadwhitacre/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chadwhitacre/subscriptions", "type": "User", "url": "https://api.github.com/users/chadwhitacre", "user_view_type": "public" }
[]
closed
true
null
[]
null
7
2016-01-05T17:22:26Z
2021-09-08T05:01:05Z
2016-01-05T17:23:21Z
CONTRIBUTOR
resolved
As @merwok pointed out at https://github.com/kennethreitz/requests/pull/512#commitcomment-1296187, schemes are what they're called, not schemas. I got tripped up by this at https://github.com/CenterForOpenScience/osf.io/pull/4745#issuecomment-169038417, where "InvalidSchema" made me think I was hitting a database error, when in fact it was a configuration error in the format of a URL. This PR is a redo of #2960 against `proposed/3.0.0` instead of `master`.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2961/reactions" }
https://api.github.com/repos/psf/requests/issues/2961/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2961.diff", "html_url": "https://github.com/psf/requests/pull/2961", "merged_at": "2016-01-05T17:23:21Z", "patch_url": "https://github.com/psf/requests/pull/2961.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2961" }
true
[ "\\o/ Thanks @whit537! :sparkles: :cake: :sparkles:\n", "Huzzah, thanks! :-)\n\n@Lukasa++\n", "@Lukasa please add release notes for this. I think it may be a good idea to have some shim for this as people may be catching `InvalidSchema` or `MissingSchema` exceptions and they won't be able to catch those anymore in 3.0\n", "I'm away from the laptop for a bit, but I will do.\n", "> I think it may be a good idea to have some shim for this as people may be catching `InvalidSchema` or `MissingSchema` exceptions and they won't be able to catch those anymore in 3.0\n\nIf you add a shim then you're basically backwards-compatible, right? (That's what I was getting at with https://github.com/kennethreitz/requests/pull/2960#issuecomment-169070702.)\n", "@whit537 yeah, it isn't _necessary_ for 3.0 though and there's no way to tell people to move to those new messages unless we forcibly break things for them though.\n\nWhat we could do is add `InvalidScheme` and `MissingScheme` as sub-classes of their now defunct relatives so that people can start (with 2.10.0) using those exceptions so they're future proofed for 3.0.0. I'm not sure if @Lukasa would agree that there is value in that though.\n", "I'm open to that plan. \n" ]
https://api.github.com/repos/psf/requests/issues/2960
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2960/labels{/name}
https://api.github.com/repos/psf/requests/issues/2960/comments
https://api.github.com/repos/psf/requests/issues/2960/events
https://github.com/psf/requests/pull/2960
125,000,851
MDExOlB1bGxSZXF1ZXN0NTUxMTE1MzI=
2,960
Rename {Missing,Invalid}Schema to *Scheme
{ "avatar_url": "https://avatars.githubusercontent.com/u/134455?v=4", "events_url": "https://api.github.com/users/chadwhitacre/events{/privacy}", "followers_url": "https://api.github.com/users/chadwhitacre/followers", "following_url": "https://api.github.com/users/chadwhitacre/following{/other_user}", "gists_url": "https://api.github.com/users/chadwhitacre/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/chadwhitacre", "id": 134455, "login": "chadwhitacre", "node_id": "MDQ6VXNlcjEzNDQ1NQ==", "organizations_url": "https://api.github.com/users/chadwhitacre/orgs", "received_events_url": "https://api.github.com/users/chadwhitacre/received_events", "repos_url": "https://api.github.com/users/chadwhitacre/repos", "site_admin": false, "starred_url": "https://api.github.com/users/chadwhitacre/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chadwhitacre/subscriptions", "type": "User", "url": "https://api.github.com/users/chadwhitacre", "user_view_type": "public" }
[ { "color": "eb6420", "default": false, "description": null, "id": 44501256, "name": "Breaking API Change", "node_id": "MDU6TGFiZWw0NDUwMTI1Ng==", "url": "https://api.github.com/repos/psf/requests/labels/Breaking%20API%20Change" }, { "color": "e11d21", "default": false, "description": null, "id": 78002701, "name": "Do Not Merge", "node_id": "MDU6TGFiZWw3ODAwMjcwMQ==", "url": "https://api.github.com/repos/psf/requests/labels/Do%20Not%20Merge" } ]
closed
true
null
[]
null
3
2016-01-05T16:30:37Z
2021-09-08T05:01:06Z
2016-01-05T17:15:13Z
CONTRIBUTOR
resolved
As @merwok pointed out at https://github.com/kennethreitz/requests/pull/512#commitcomment-1296187, schemes are what they're called, not schemas. I got tripped up by this at https://github.com/CenterForOpenScience/osf.io/pull/4745#issuecomment-169038417, where "InvalidSchema" made me think I was hitting a database error, when in fact it was a configuration error in the format of a URL.
{ "avatar_url": "https://avatars.githubusercontent.com/u/134455?v=4", "events_url": "https://api.github.com/users/chadwhitacre/events{/privacy}", "followers_url": "https://api.github.com/users/chadwhitacre/followers", "following_url": "https://api.github.com/users/chadwhitacre/following{/other_user}", "gists_url": "https://api.github.com/users/chadwhitacre/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/chadwhitacre", "id": 134455, "login": "chadwhitacre", "node_id": "MDQ6VXNlcjEzNDQ1NQ==", "organizations_url": "https://api.github.com/users/chadwhitacre/orgs", "received_events_url": "https://api.github.com/users/chadwhitacre/received_events", "repos_url": "https://api.github.com/users/chadwhitacre/repos", "site_admin": false, "starred_url": "https://api.github.com/users/chadwhitacre/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/chadwhitacre/subscriptions", "type": "User", "url": "https://api.github.com/users/chadwhitacre", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2960/reactions" }
https://api.github.com/repos/psf/requests/issues/2960/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2960.diff", "html_url": "https://github.com/psf/requests/pull/2960", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2960.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2960" }
true
[ "Thanks @whit537. Unfortunately, this is a breaking change so we can't accept it until 3.0.0: can you adjust this PR to point to the proposed/3.0.0 branch instead please?\n", "Sure thing. Closing this PR ...\n", "It occurs to me that it'd be possible to implement this in a backwards-compatible way, but since that would involve more than simply renaming the error classes (I guess you'd make an `InvalidScheme` class that subclasses `InvalidSchema`) it would take more time and might deserve a test or two. I can't commit to that work so I'm leaving this PR closed.\n" ]
https://api.github.com/repos/psf/requests/issues/2959
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2959/labels{/name}
https://api.github.com/repos/psf/requests/issues/2959/comments
https://api.github.com/repos/psf/requests/issues/2959/events
https://github.com/psf/requests/issues/2959
124,929,466
MDU6SXNzdWUxMjQ5Mjk0NjY=
2,959
CERTIFICATE_VERIFY_FAILED with python-requests 2.8.1 and curl, but works in browsers
{ "avatar_url": "https://avatars.githubusercontent.com/u/406946?v=4", "events_url": "https://api.github.com/users/progval/events{/privacy}", "followers_url": "https://api.github.com/users/progval/followers", "following_url": "https://api.github.com/users/progval/following{/other_user}", "gists_url": "https://api.github.com/users/progval/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/progval", "id": 406946, "login": "progval", "node_id": "MDQ6VXNlcjQwNjk0Ng==", "organizations_url": "https://api.github.com/users/progval/orgs", "received_events_url": "https://api.github.com/users/progval/received_events", "repos_url": "https://api.github.com/users/progval/repos", "site_admin": false, "starred_url": "https://api.github.com/users/progval/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/progval/subscriptions", "type": "User", "url": "https://api.github.com/users/progval", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2016-01-05T09:43:27Z
2021-09-08T20:00:49Z
2016-01-05T10:13:19Z
NONE
resolved
``` $ python3 Python 3.4.2 (default, Oct 8 2014, 10:45:20) [GCC 4.9.1] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import requests >>> requests.get('https://irestos.nuonet.fr/generation.php?crous=21&resto=351&ext=xml') Traceback (most recent call last): File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 559, in urlopen body=body, headers=headers) File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 345, in _make_request self._validate_conn(conn) File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 782, in _validate_conn conn.connect() File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/connection.py", line 250, in connect ssl_version=resolved_ssl_version) File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/util/ssl_.py", line 285, in ssl_wrap_socket return context.wrap_socket(sock, server_hostname=server_hostname) File "/usr/lib/python3.4/ssl.py", line 364, in wrap_socket _context=self) File "/usr/lib/python3.4/ssl.py", line 577, in __init__ self.do_handshake() File "/usr/lib/python3.4/ssl.py", line 804, in do_handshake self._sslobj.do_handshake() ssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600) During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/progval/.local/lib/python3.4/site-packages/requests/adapters.py", line 370, in send timeout=timeout File "/home/progval/.local/lib/python3.4/site-packages/requests/packages/urllib3/connectionpool.py", line 588, in urlopen raise SSLError(e) requests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600) During handling of the above exception, another exception occurred: Traceback (most recent call last): File 
"<stdin>", line 1, in <module> File "/home/progval/.local/lib/python3.4/site-packages/requests/api.py", line 69, in get return request('get', url, params=params, **kwargs) File "/home/progval/.local/lib/python3.4/site-packages/requests/api.py", line 50, in request response = session.request(method=method, url=url, **kwargs) File "/home/progval/.local/lib/python3.4/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/home/progval/.local/lib/python3.4/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/home/progval/.local/lib/python3.4/site-packages/requests/adapters.py", line 433, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:600) >>> requests.__version__ '2.8.1' ``` curl and openssl's CLI have the same issue: ``` $ curl https://irestos.nuonet.fr/generation.php\?crous\=21\&resto\=351\&ext\=xml curl: (60) SSL certificate problem: unable to get local issuer certificate ``` ``` $ openssl s_client -connect irestos.nuonet.fr:443 CONNECTED(00000003) depth=0 C = FR, ST = Picardie, L = AMIENS, O = CROUS d'Amiens-Picardie, OU = CROUS, CN = irestos.nuonet.fr verify error:num=20:unable to get local issuer certificate verify return:1 depth=0 C = FR, ST = Picardie, L = AMIENS, O = CROUS d'Amiens-Picardie, OU = CROUS, CN = irestos.nuonet.fr verify error:num=27:certificate not trusted verify return:1 depth=0 C = FR, ST = Picardie, L = AMIENS, O = CROUS d'Amiens-Picardie, OU = CROUS, CN = irestos.nuonet.fr verify error:num=21:unable to verify the first certificate verify return:1 --- Certificate chain 0 s:/C=FR/ST=Picardie/L=AMIENS/O=CROUS d'Amiens-Picardie/OU=CROUS/CN=irestos.nuonet.fr i:/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA SSL CA 3 --- Server certificate -----BEGIN CERTIFICATE----- […] -----END CERTIFICATE----- subject=/C=FR/ST=Picardie/L=AMIENS/O=CROUS 
d'Amiens-Picardie/OU=CROUS/CN=irestos.nuonet.fr issuer=/C=NL/ST=Noord-Holland/L=Amsterdam/O=TERENA/CN=TERENA SSL CA 3 --- No client certificate CA names sent --- SSL handshake has read 2022 bytes and written 421 bytes --- New, TLSv1/SSLv3, Cipher is ECDHE-RSA-AES256-GCM-SHA384 Server public key is 2048 bit Secure Renegotiation IS supported Compression: NONE Expansion: NONE SSL-Session: Protocol : TLSv1.2 Cipher : ECDHE-RSA-AES256-GCM-SHA384 Session-ID: F603E70CBC9F7B2B5033280D6F72334EC63FA7F727464620B4790BA477556B25 Session-ID-ctx: Master-Key: C912001D02A8076AB864D9D51B2A056F76B49CD27B8A29EB7632EBA3EBB4124B1F908FAAF7CFE05028C51DAA07658762 Key-Arg : None PSK identity: None PSK identity hint: None SRP username: None TLS session ticket lifetime hint: 300 (seconds) TLS session ticket: 0000 - 46 7b 46 24 40 2f 6d ed-51 8c 7c e9 29 0a 55 99 F{F$@/m.Q.|.).U. 0010 - 20 61 4b 43 2a 9f 9e f8-15 0a 63 b3 e5 e7 20 75 aKC*.....c... u 0020 - d5 51 5c 8a 7b 26 cd 32-02 83 69 e6 ef 06 0d c6 .Q\.{&.2..i..... 0030 - 8a 35 40 82 d5 1b d3 c4-3e 57 dd 98 4d 29 f0 2d .5@.....>W..M).- 0040 - ed 62 d0 a5 ec 0e 41 1c-d4 61 e9 a2 d6 8f 2e 3e .b....A..a.....> 0050 - 29 a6 1e 83 43 40 4a 36-ac 01 6f f3 2f 6b e9 49 )[email protected]./k.I 0060 - b7 05 44 ff 57 5a e3 c6-8d 93 25 d1 ed 3f 7d 48 ..D.WZ....%..?}H 0070 - 8b dd 1b 3e f7 0f 36 3f-54 6a ac 36 8f a1 c0 97 ...>..6?Tj.6.... 0080 - ee ff 80 bf 52 a8 61 c3-e5 71 1e 4e 51 e3 d7 1f ....R.a..q.NQ... 0090 - 83 a3 f6 d1 79 9b e8 b4-b9 e7 d7 e0 d1 b4 1e e0 ....y........... 00a0 - 28 88 e0 5b e6 67 bd e1-96 50 5a 48 13 05 de b1 (..[.g...PZH.... 00b0 - 28 0b 1d e6 dd e1 d2 2f-8e 45 73 a7 8c 06 f7 47 (....../.Es....G Start Time: 1451986602 Timeout : 300 (sec) Verify return code: 21 (unable to verify the first certificate) --- ``` But the same URL works fine with Firefox 43 and Chromium 47.0.2526.80. System: Debian 8.2 (Jessie)
{ "avatar_url": "https://avatars.githubusercontent.com/u/406946?v=4", "events_url": "https://api.github.com/users/progval/events{/privacy}", "followers_url": "https://api.github.com/users/progval/followers", "following_url": "https://api.github.com/users/progval/following{/other_user}", "gists_url": "https://api.github.com/users/progval/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/progval", "id": 406946, "login": "progval", "node_id": "MDQ6VXNlcjQwNjk0Ng==", "organizations_url": "https://api.github.com/users/progval/orgs", "received_events_url": "https://api.github.com/users/progval/received_events", "repos_url": "https://api.github.com/users/progval/repos", "site_admin": false, "starred_url": "https://api.github.com/users/progval/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/progval/subscriptions", "type": "User", "url": "https://api.github.com/users/progval", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2959/reactions" }
https://api.github.com/repos/psf/requests/issues/2959/timeline
null
completed
null
null
false
[ "This is almost certainly because the site is not sending its intermediate certificates.\n\nGenerally, when configuring a site for TLS, the site author should ensure that the TLS handshake sends both the leaf certificate (the one valid for that site) _and_ any intermediate certificates between the leaf and the root. This is because clients may not have an up to date list of all the intermediate certificates in the world, and it's generally unwise to assume that they do. This does cause problems in some browsers: for example, Firefox 43 on my Mac also fails to validate the certificate chain.\n\nAre you able to contact the administrator of the server?\n", "I will try. Thanks for the help.\n\n(Closing the issue, as it is not a problem in requests)\n" ]
https://api.github.com/repos/psf/requests/issues/2958
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2958/labels{/name}
https://api.github.com/repos/psf/requests/issues/2958/comments
https://api.github.com/repos/psf/requests/issues/2958/events
https://github.com/psf/requests/pull/2958
124,792,493
MDExOlB1bGxSZXF1ZXN0NTQ5OTQyOTM=
2,958
Add equality functions for authentication handlers
{ "avatar_url": "https://avatars.githubusercontent.com/u/1901599?v=4", "events_url": "https://api.github.com/users/Malizor/events{/privacy}", "followers_url": "https://api.github.com/users/Malizor/followers", "following_url": "https://api.github.com/users/Malizor/following{/other_user}", "gists_url": "https://api.github.com/users/Malizor/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Malizor", "id": 1901599, "login": "Malizor", "node_id": "MDQ6VXNlcjE5MDE1OTk=", "organizations_url": "https://api.github.com/users/Malizor/orgs", "received_events_url": "https://api.github.com/users/Malizor/received_events", "repos_url": "https://api.github.com/users/Malizor/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Malizor/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Malizor/subscriptions", "type": "User", "url": "https://api.github.com/users/Malizor", "user_view_type": "public" }
[]
closed
true
null
[]
null
8
2016-01-04T16:39:11Z
2021-09-08T05:01:02Z
2016-01-30T03:08:26Z
CONTRIBUTOR
resolved
A use case for this is mocking. If you have a function that wrap a call like `requests.post('http://my_url', auth=HTTPBasicAuth(user, password))` and mock requests.post in your tests, calling `assert_called_once_with('http://my_url', auth=HTTPBasicAuth(user, password))` will fail. Implementing equality functions solves this use case.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2958/reactions" }
https://api.github.com/repos/psf/requests/issues/2958/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2958.diff", "html_url": "https://github.com/psf/requests/pull/2958", "merged_at": "2016-01-30T03:08:26Z", "patch_url": "https://github.com/psf/requests/pull/2958.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2958" }
true
[ "Was there already a bug or discussion of this? I didn't notice it if there was.\n\nI'm pretty sure there's no other practical use of this which makes me very conflicted about accepting this.\n\nYes it would be nice to be able to use `assert_called_once_with` but there are many other ways to do exactly that while also being more thorough/correct about what the mocked out request was. Specifically, I'm not convinced that the HTTPDigestAuth equality is correct.\n\nAs an entirely separate note, I'm pretty sure you only need to implement `__eq__` or `__ne__`, not both.\n", "I have some concerns about defining equality on auth handlers. One thing almost all auth handlers do is set hooks on a given request, but the auth handler equality functions don't (and cannot check for that). So while a may be equal to b, they're fundamentally very unlike each other, and cannot be used in the place of each other.\n\nDoes equality not fall back on identity? Is it not possible to simply confirm that it was called with a specific auth handler?\n", "Hi,\n\nI did not found a bug or discussion about this either.\n\nAbout `__ne__()`, the python 2.7 documentation states:\n\n> Accordingly, when defining `__eq__()`, one should also define `__ne__()` so that the operators will behave as expected.\n\nThis is no longer the case in python 3, where `__ne__()` is indeed not mandatory.\n", "> Does equality not fall back on identity? Is it not possible to simply confirm that it was called with a specific auth handler?\n\n@Lukasa you're correct. That said it's not exactly simple. 
Consider a poorly written API wrapper where someone does something like:\n\n``` py\nclass MyWrapper(object):\n def __init__(self, username, password):\n self.username = username\n self.password = password\n self.session = requests.Session()\n\n def list_objects(self, *args, **kwargs):\n return session.get(url, auth=HTTPBasicAuth(self.username, self.password))\n```\n\nObviously if ^ were better written it'd be easier to test the way you describe, but we know nothing about @Malizor's case (or anyone else's) and we always get push back for encouraging better design/testing practices.\n\n> the python 2.7 documentation states ...\n\nAh, I forgot that the always ancient Python 2 has silly things like this.\n", "Well my specific use case is similar to your example @sigmavirus24. It is a wrapper except it actually does some things before (build params depending on args…) and after (error handling, result formatting).\n\n@Lukasa : HTTPBasicAuth and HTTPProxyAuth just add a constant header on each request, so equality should not cause any problem here.\n\nHTTPDigestAuth is indeed more complicated with it's hook system. Perhaps I should only compare the username and password?\n", "@Malizor you could just do\n\n``` py\nclass MyWrapper(object):\n def __init__(self, username, password):\n self.basic_auth = HTTPBasicAuth(username, password)\n self.digest_auth = HTTPDigestAuth(username, password)\n\n def list_objects(self):\n return self.session.get(url, auth=self.basic_auth)\n```\n\nAnd then\n\n``` py\nself.client.session.assert_called_once_with(url, auth=self.client.basic_auth)\n```\n", "Well in fact it is not in a class, it's a simple function.\n\nSomething like this:\n\n``` python\ndef list_objects(user, password):\n return self.session.get(url, auth=HTTPDigestAuth(username, password))\n```\n\nwith a few more lines of course ;)\n", "@Malizor this seems like a fine usecase to me. 
I'm going to merge this in, then simplify the digest check to simply verify the parameters: username, password. \n\nIn other news, I hadn't looked at that digest code in a long time! I had a blast writing that long ago. So many memories. \n" ]
https://api.github.com/repos/psf/requests/issues/2957
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2957/labels{/name}
https://api.github.com/repos/psf/requests/issues/2957/comments
https://api.github.com/repos/psf/requests/issues/2957/events
https://github.com/psf/requests/issues/2957
124,580,502
MDU6SXNzdWUxMjQ1ODA1MDI=
2,957
Persistent Connection suffer from delay of Nagle
{ "avatar_url": "https://avatars.githubusercontent.com/u/4146182?v=4", "events_url": "https://api.github.com/users/zhangi/events{/privacy}", "followers_url": "https://api.github.com/users/zhangi/followers", "following_url": "https://api.github.com/users/zhangi/following{/other_user}", "gists_url": "https://api.github.com/users/zhangi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zhangi", "id": 4146182, "login": "zhangi", "node_id": "MDQ6VXNlcjQxNDYxODI=", "organizations_url": "https://api.github.com/users/zhangi/orgs", "received_events_url": "https://api.github.com/users/zhangi/received_events", "repos_url": "https://api.github.com/users/zhangi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zhangi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhangi/subscriptions", "type": "User", "url": "https://api.github.com/users/zhangi", "user_view_type": "public" }
[]
closed
true
null
[]
null
5
2016-01-02T08:00:31Z
2021-09-08T20:00:49Z
2016-01-02T10:53:27Z
NONE
resolved
Not sure if this is a bug from urllib3 or Requests, the problem is that there is an extra delay introduced for all subsequent requests other than the first one when reusing a session object. The test are performed in Ubuntu 14 to a http server running on the localhost ``` python with requests.Session() as session: session.get(url) # this takes <1ms session.get(url) # this takes around 40ms session.get(url) # this takes around 40ms ``` Similar case was observed in httplib2 with Python 2.6(https://code.google.com/p/httplib2/issues/detail?id=91), and was fixed in Python 2.7.
{ "avatar_url": "https://avatars.githubusercontent.com/u/4146182?v=4", "events_url": "https://api.github.com/users/zhangi/events{/privacy}", "followers_url": "https://api.github.com/users/zhangi/followers", "following_url": "https://api.github.com/users/zhangi/following{/other_user}", "gists_url": "https://api.github.com/users/zhangi/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/zhangi", "id": 4146182, "login": "zhangi", "node_id": "MDQ6VXNlcjQxNDYxODI=", "organizations_url": "https://api.github.com/users/zhangi/orgs", "received_events_url": "https://api.github.com/users/zhangi/received_events", "repos_url": "https://api.github.com/users/zhangi/repos", "site_admin": false, "starred_url": "https://api.github.com/users/zhangi/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/zhangi/subscriptions", "type": "User", "url": "https://api.github.com/users/zhangi", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2957/reactions" }
https://api.github.com/repos/psf/requests/issues/2957/timeline
null
completed
null
null
false
[ "There should not be a Nagle's algorithm delay, because by default we disable Nagle's algorithm. Can you please answer the questions from our [CONTRIBUTING.md](https://github.com/kennethreitz/requests/blob/master/CONTRIBUTING.md), specifically:\n\n> Tell us **what version of Requests you're using**, and **how you installed it**.\n", "version: 2.9.1\ninstalled by: sudo pip install requests --upgrade\npython version: 2.7.3\nos: Ubuntu 14\n", "So [we disable Nagle's algorithm by default](https://github.com/kennethreitz/requests/blob/master/requests/packages/urllib3/connection.py#L97), which means that your diagnosis of this cannot be right.\n\nAre you familiar with wireshark or tcpdump?\n", "Let me investigate more, maybe it has something to do with server side. I will close it first.\n", "To be clear, if the server-side has delayed ACKs turned on it may cause this problem.\n" ]
https://api.github.com/repos/psf/requests/issues/2956
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2956/labels{/name}
https://api.github.com/repos/psf/requests/issues/2956/comments
https://api.github.com/repos/psf/requests/issues/2956/events
https://github.com/psf/requests/issues/2956
124,484,256
MDU6SXNzdWUxMjQ0ODQyNTY=
2,956
API feature - POST request to HTML element by id
{ "avatar_url": "https://avatars.githubusercontent.com/u/11060902?v=4", "events_url": "https://api.github.com/users/aewhatley/events{/privacy}", "followers_url": "https://api.github.com/users/aewhatley/followers", "following_url": "https://api.github.com/users/aewhatley/following{/other_user}", "gists_url": "https://api.github.com/users/aewhatley/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/aewhatley", "id": 11060902, "login": "aewhatley", "node_id": "MDQ6VXNlcjExMDYwOTAy", "organizations_url": "https://api.github.com/users/aewhatley/orgs", "received_events_url": "https://api.github.com/users/aewhatley/received_events", "repos_url": "https://api.github.com/users/aewhatley/repos", "site_admin": false, "starred_url": "https://api.github.com/users/aewhatley/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/aewhatley/subscriptions", "type": "User", "url": "https://api.github.com/users/aewhatley", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-31T16:29:29Z
2021-09-08T20:00:49Z
2015-12-31T18:16:37Z
NONE
resolved
Right now, the requests.post() function only appears to send post requests to HTML elements by their name. Are there any plans of creating something like a requests.post_to_name() and requests.post_to_id(), so that the user can make post requests to HTML elements via their id instead, similar to the API used in Splinter for accessing HTML elements?
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2956/reactions" }
https://api.github.com/repos/psf/requests/issues/2956/timeline
null
completed
null
null
false
[ "@aewhatley requests is an _HTTP_ client and has nothing to do with _HTML_. POST requests are made to URLs, not HTML elements. As such, we would not be adding any such methods because they wouldn't make sense as part of our API.\n\nThanks for the suggestion but I think you're a little confused about how requests works.\n" ]
https://api.github.com/repos/psf/requests/issues/2955
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2955/labels{/name}
https://api.github.com/repos/psf/requests/issues/2955/comments
https://api.github.com/repos/psf/requests/issues/2955/events
https://github.com/psf/requests/issues/2955
124341446
MDU6SXNzdWUxMjQzNDE0NDY=
2,955
Issue with Lets Encrypt certificates
{ "avatar_url": "https://avatars.githubusercontent.com/u/16189630?v=4", "events_url": "https://api.github.com/users/yhorian/events{/privacy}", "followers_url": "https://api.github.com/users/yhorian/followers", "following_url": "https://api.github.com/users/yhorian/following{/other_user}", "gists_url": "https://api.github.com/users/yhorian/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yhorian", "id": 16189630, "login": "yhorian", "node_id": "MDQ6VXNlcjE2MTg5NjMw", "organizations_url": "https://api.github.com/users/yhorian/orgs", "received_events_url": "https://api.github.com/users/yhorian/received_events", "repos_url": "https://api.github.com/users/yhorian/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yhorian/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yhorian/subscriptions", "type": "User", "url": "https://api.github.com/users/yhorian", "user_view_type": "public" }
[]
closed
true
null
[]
null
13
2015-12-30T12:24:20Z
2021-09-08T18:00:40Z
2015-12-30T13:12:57Z
NONE
resolved
Having problems getting Requests to work with ssl certificates issued by lets encrypt. The error message is very unhelpful in narrowing down what the issue is exactly. Lets see if I can post the right error this time: File "E:\anaconda27\lib\site-packages\requests\api.py", line 67, in get return request('get', url, params=params, *_kwargs) File "E:\anaconda27\lib\site-packages\requests\api.py", line 53, in request return session.request(method=method, url=url, *_kwargs) File "E:\anaconda27\lib\site-packages\requests\sessions.py", line 468, in request resp = self.send(prep, *_send_kwargs) File "E:\anaconda27\lib\site-packages\requests\sessions.py", line 576, in send r = adapter.send(request, *_kwargs) File "E:\anaconda27\lib\site-packages\requests\adapters.py", line 447, in send raise SSLError(e, request=request) requests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2955/reactions" }
https://api.github.com/repos/psf/requests/issues/2955/timeline
null
completed
null
null
false
[ "I previously submitted a post with error messages testing httplib2 and other libraries, which are having similar issues but aren't providing much more information on why they're returning this error when browser and a PHP script aren't having this problem.\n", "@yhorian1 Aha, there we go!\n\nSo, the problem here is almost certainly that your OpenSSL is too old. PHP is likely succeeding because it doesn't verify certificates at all or because it's not using OpenSSL.\n\nOpenSSL versions less than 1.0.2 don't backtrack when trying to build up certificate chains, which means that if it encounters a cross-signed certificate it is unable to check whether or not it has a non-cross-signed version of that certificate in the trust store. That's what's happening in this case I suspect: Let's Encrypt is currently a cross-signed CA.\n\nDo you mind providing a specific URL so I can validate my suspicion?\n", "https://evil-source.co.uk/ is the site I was testing it.\n", "Thanks. So this is working fine on my OS X machine, using Python 2.7.11 with OpenSSL 1.0.2e. Can you let me know what version of requests you're using, where you got it from, and whether you have `certifi` installed?\n", "Testing on python 3.4:\ncertifi 2015.11.20.1\nrequests 2.9.1\n\nTesting on python 2.7:\ncertifi 2015.11.20.1\nrequests 2.9.1\n\nbuilt using anaconda on win10\n", "Interesting. Can you try using `requests.get('https://evil-source.co.uk', verify=certifi.old_where())`? 
(You'll need to import requests and certifi).\n", " requests.get('https://evil-source.co.uk', verify=certifi.old_where())\n File \"E:\\anaconda27\\lib\\site-packages\\requests\\api.py\", line 67, in get\n return request('get', url, params=params, *_kwargs)\n File \"E:\\anaconda27\\lib\\site-packages\\requests\\api.py\", line 53, in request\n return session.request(method=method, url=url, *_kwargs)\n File \"E:\\anaconda27\\lib\\site-packages\\requests\\sessions.py\", line 468, in request\n resp = self.send(prep, *_send_kwargs)\n File \"E:\\anaconda27\\lib\\site-packages\\requests\\sessions.py\", line 576, in send\n r = adapter.send(request, *_kwargs)\n File \"E:\\anaconda27\\lib\\site-packages\\requests\\adapters.py\", line 447, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)\n\nProcess finished with exit code 1\n\nNo joy. Checked in python 3.4 and 3.5 as well, openssl is 1.0.2d\n", "Now that's _very_ interesting.\n\nDo you have PyOpenSSL, ndg-httpsclient, and pyasn1 installed?\n\nCan you find where anaconda has put the OpenSSL binary? \n", "Ugh, so checked logs and figured it out. \n\nThere was a routing policy that was causing the ISP to re-route requests internally, causing the server to issue the wrong page. Hence the failure. Tried from a different location with similar setup - works fine. VPN switched on, works fine. This is possibly one of the more frustrating network policies I've ever encountered - even if the DNS says the site is in the same block they should be treating it as a public request, not a local!\n\nSorry for wasting your time!\n", "That's definitely a perplexing setup, well debugged on your end! No need to apologise: you didn't waste my time, you needed help to get past the possibility that your libraries were doing something wrong. 
I'm just glad you were able to come to a resolution!\n", "I have similar problem, and it fails from two different locations and two different OS.\nOn my local machine (OSX), on fresh virtualenv:\n\nWithout certifi:\n\n```\n>>> r = requests.get('http://36apor.se')\nTraceback (most recent call last):\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 516, in urlopen\n body=body, headers=headers)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 304, in _make_request\n self._validate_conn(conn)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 722, in _validate_conn\n conn.connect()\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connection.py\", line 229, in connect\n ssl_version=resolved_ssl_version)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/util/ssl_.py\", line 123, in ssl_wrap_socket\n return context.wrap_socket(sock, server_hostname=server_hostname)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 376, in wrap_socket\n _context=self)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 747, in __init__\n self.do_handshake()\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 983, in do_handshake\n self._sslobj.do_handshake()\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 628, in do_handshake\n self._sslobj.do_handshake()\nssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File 
\"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/adapters.py\", line 362, in send\n timeout=timeout\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 543, in urlopen\n raise SSLError(e)\nrequests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/api.py\", line 60, in get\n return request('get', url, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/api.py\", line 49, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 457, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 595, in send\n history = [resp for resp in gen] if allow_redirects else []\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 595, in <listcomp>\n history = [resp for resp in gen] if allow_redirects else []\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 189, in resolve_redirects\n allow_redirects=False,\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 569, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n```\n\nWith certifi:\n\n```\n>>> r = requests.get('http://36apor.se', verify=certifi.old_where())\nTraceback (most recent 
call last):\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 516, in urlopen\n body=body, headers=headers)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 304, in _make_request\n self._validate_conn(conn)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 722, in _validate_conn\n conn.connect()\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connection.py\", line 229, in connect\n ssl_version=resolved_ssl_version)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/util/ssl_.py\", line 123, in ssl_wrap_socket\n return context.wrap_socket(sock, server_hostname=server_hostname)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 376, in wrap_socket\n _context=self)\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 747, in __init__\n self.do_handshake()\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 983, in do_handshake\n self._sslobj.do_handshake()\n File \"/usr/local/Cellar/python3/3.5.1/Frameworks/Python.framework/Versions/3.5/lib/python3.5/ssl.py\", line 628, in do_handshake\n self._sslobj.do_handshake()\nssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/adapters.py\", line 362, in send\n timeout=timeout\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/packages/urllib3/connectionpool.py\", line 543, in urlopen\n raise 
SSLError(e)\nrequests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/api.py\", line 60, in get\n return request('get', url, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/api.py\", line 49, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 457, in request\n resp = self.send(prep, **send_kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 595, in send\n history = [resp for resp in gen] if allow_redirects else []\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 595, in <listcomp>\n history = [resp for resp in gen] if allow_redirects else []\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 189, in resolve_redirects\n allow_redirects=False,\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/sessions.py\", line 569, in send\n r = adapter.send(request, **kwargs)\n File \"/Users/mysz/Library/Python/2.7/lib/python/site-packages/requests/adapters.py\", line 420, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n```\n\nAnd from other location with Ubuntu Xenial:\n\nWithout certifi:\n\n```\n>>> r = requests.get('http://36apor.se')\nTraceback (most recent call last):\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 578, in urlopen\n chunked=chunked)\n File 
\"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 351, in _make_request\n self._validate_conn(conn)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 814, in _validate_conn\n conn.connect()\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connection.py\", line 289, in connect\n ssl_version=resolved_ssl_version)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py\", line 308, in ssl_wrap_socket\n return context.wrap_socket(sock, server_hostname=server_hostname)\n File \"/usr/lib/python3.5/ssl.py\", line 376, in wrap_socket\n _context=self)\n File \"/usr/lib/python3.5/ssl.py\", line 748, in __init__\n self.do_handshake()\n File \"/usr/lib/python3.5/ssl.py\", line 984, in do_handshake\n self._sslobj.do_handshake()\n File \"/usr/lib/python3.5/ssl.py\", line 629, in do_handshake\n self._sslobj.do_handshake()\nssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/adapters.py\", line 403, in send\n timeout=timeout\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 604, in urlopen\n raise SSLError(e)\nrequests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/api.py\", line 71, in get\n return request('get', url, params=params, **kwargs)\n File 
\"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/api.py\", line 57, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 606, in send\n history = [resp for resp in gen] if allow_redirects else []\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 606, in <listcomp>\n history = [resp for resp in gen] if allow_redirects else []\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 179, in resolve_redirects\n **adapter_kwargs\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/adapters.py\", line 477, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n```\n\nAnd without certifi:\n\n```\n>>> r = requests.get('http://36apor.se', verify=certifi.old_where())\nTraceback (most recent call last):\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 578, in urlopen\n chunked=chunked)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 351, in _make_request\n self._validate_conn(conn)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 814, in _validate_conn\n conn.connect()\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connection.py\", line 289, in connect\n ssl_version=resolved_ssl_version)\n File 
\"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/util/ssl_.py\", line 308, in ssl_wrap_socket\n return context.wrap_socket(sock, server_hostname=server_hostname)\n File \"/usr/lib/python3.5/ssl.py\", line 376, in wrap_socket\n _context=self)\n File \"/usr/lib/python3.5/ssl.py\", line 748, in __init__\n self.do_handshake()\n File \"/usr/lib/python3.5/ssl.py\", line 984, in do_handshake\n self._sslobj.do_handshake()\n File \"/usr/lib/python3.5/ssl.py\", line 629, in do_handshake\n self._sslobj.do_handshake()\nssl.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/adapters.py\", line 403, in send\n timeout=timeout\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/packages/urllib3/connectionpool.py\", line 604, in urlopen\n raise SSLError(e)\nrequests.packages.urllib3.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n File \"<stdin>\", line 1, in <module>\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/api.py\", line 71, in get\n return request('get', url, params=params, **kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/api.py\", line 57, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 475, in request\n resp = self.send(prep, **send_kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 606, in send\n history = [resp for resp in gen] if allow_redirects else []\n File 
\"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 606, in <listcomp>\n history = [resp for resp in gen] if allow_redirects else []\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 179, in resolve_redirects\n **adapter_kwargs\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/sessions.py\", line 585, in send\n r = adapter.send(request, **kwargs)\n File \"/home/mysz/aportest/venv-2/lib/python3.5/site-packages/requests/adapters.py\", line 477, in send\n raise SSLError(e, request=request)\nrequests.exceptions.SSLError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:645)\n```\n\nCertificates on this domain is issued using letsencrypt.\nOn OSX, I have OpenSSL 0.9.8zh 14 Jan 2016, and on Ubuntu there is OpenSSL 1.0.2g-fips 1 Mar 2016.\n", "Well, the site in question is weird.\n\nThe validation failing here is actually for `https://36monkeys.com/`, which is where the original site redirects to. That site doesn't load in Chrome because it negotiates HTTP/2 or SPDY but doesn't actually define a secure-enough connection, which is moderately amusing, but it _does_ load in Safari.\n\nBut the problem is revealed by loading the site using OpenSSL. The server presents the following certificate chain:\n\n```\n---\nCertificate chain\n 0 s:/CN=36monkeys.com\n i:/C=US/O=Let's Encrypt/CN=Let's Encrypt Authority X3\n---\n```\n\nThat is, it presents only the leaf certificate, without presenting the `Let's Encrypt Authority X3` certificate. 
That is not enough: the reason Safari and friends can validate the certificate is because they ship with a list of intermediate certificates, but most operating systems and clients do not.\n\nThe Let's Encrypt [docs say this too](https://certbot.eff.org/docs/using.html#where-are-my-certificates):\n\n> Please note, that you must use either `chain.pem` or `fullchain.pem`.\n\nThis server is not doing that: it's serving only `cert.pem`.\n\nPlease contact the server administrator and ask them to reconfigure their server.\n", "@Lukasa You are absolutely right, server config was broken. I fixed it and now it works fine - thanks!\n" ]
https://api.github.com/repos/psf/requests/issues/2954
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2954/labels{/name}
https://api.github.com/repos/psf/requests/issues/2954/comments
https://api.github.com/repos/psf/requests/issues/2954/events
https://github.com/psf/requests/issues/2954
124339435
MDU6SXNzdWUxMjQzMzk0MzU=
2,954
HTTPS verification for Let's Encrypt certificates
{ "avatar_url": "https://avatars.githubusercontent.com/u/16189630?v=4", "events_url": "https://api.github.com/users/yhorian/events{/privacy}", "followers_url": "https://api.github.com/users/yhorian/followers", "following_url": "https://api.github.com/users/yhorian/following{/other_user}", "gists_url": "https://api.github.com/users/yhorian/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yhorian", "id": 16189630, "login": "yhorian", "node_id": "MDQ6VXNlcjE2MTg5NjMw", "organizations_url": "https://api.github.com/users/yhorian/orgs", "received_events_url": "https://api.github.com/users/yhorian/received_events", "repos_url": "https://api.github.com/users/yhorian/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yhorian/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yhorian/subscriptions", "type": "User", "url": "https://api.github.com/users/yhorian", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-30T12:01:29Z
2021-09-08T20:00:50Z
2015-12-30T12:04:52Z
NONE
resolved
I've been trying to get requests to work with sites using Let's Encrypt SSL certificates but getting a very unhelpful error: ``` File "E:\anaconda27\lib\site-packages\httplib2\__init__.py", line 1609, in request (response, content) = self._request(conn, authority, uri, request_uri, method, body, headers, redirections, cachekey) File "E:\anaconda27\lib\site-packages\httplib2\__init__.py", line 1351, in _request (response, content) = self._conn_request(conn, request_uri, method, body, headers) File "E:\anaconda27\lib\site-packages\httplib2\__init__.py", line 1272, in _conn_request conn.connect() File "E:\anaconda27\lib\site-packages\httplib2\__init__.py", line 1059, in connect raise SSLHandshakeError(e) httplib2.SSLHandshakeError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590) ``` I've replicated this in Python 3.4 and 3.5, with lib/ssl.py failing.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2954/reactions" }
https://api.github.com/repos/psf/requests/issues/2954/timeline
null
completed
null
null
false
[ "@yhorian1 Your bug is in httplib2. This is not httplib2.\n" ]
https://api.github.com/repos/psf/requests/issues/2953
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2953/labels{/name}
https://api.github.com/repos/psf/requests/issues/2953/comments
https://api.github.com/repos/psf/requests/issues/2953/events
https://github.com/psf/requests/pull/2953
124332472
MDExOlB1bGxSZXF1ZXN0NTQ3OTE0ODg=
2,953
SOCKS Proxy Support
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
{ "closed_at": "2016-09-06T00:07:25Z", "closed_issues": 1, "created_at": "2015-12-15T14:43:38Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/27", "id": 1461386, "labels_url": "https://api.github.com/repos/psf/requests/milestones/27/labels", "node_id": "MDk6TWlsZXN0b25lMTQ2MTM4Ng==", "number": 27, "open_issues": 0, "state": "closed", "title": "2.10.0", "updated_at": "2016-09-06T14:26:04Z", "url": "https://api.github.com/repos/psf/requests/milestones/27" }
27
2015-12-30T10:46:50Z
2021-09-08T04:01:02Z
2016-04-29T22:03:04Z
MEMBER
resolved
This pull request _finally_ adds SOCKS proxy support to the Requests module, pretty much four years _to the day_ since we were first asked (in #324 for those who want to follow it). This pull request relies on a pull request I have opened against urllib3, so we can't land it until urllib3 1.14.1 is released containing that code. However, once that's done this PR should be ready to rock. I'm setting this up for the 2.10.0 milestone as I see no reason to wait for 3.0.0 for SOCKS support. Resolves #324, #723, #1147, #1982, #2156, #2425, and #2562.
{ "avatar_url": "https://avatars.githubusercontent.com/u/119893?v=4", "events_url": "https://api.github.com/users/kennethreitz/events{/privacy}", "followers_url": "https://api.github.com/users/kennethreitz/followers", "following_url": "https://api.github.com/users/kennethreitz/following{/other_user}", "gists_url": "https://api.github.com/users/kennethreitz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/kennethreitz", "id": 119893, "login": "kennethreitz", "node_id": "MDQ6VXNlcjExOTg5Mw==", "organizations_url": "https://api.github.com/users/kennethreitz/orgs", "received_events_url": "https://api.github.com/users/kennethreitz/received_events", "repos_url": "https://api.github.com/users/kennethreitz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/kennethreitz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/kennethreitz/subscriptions", "type": "User", "url": "https://api.github.com/users/kennethreitz", "user_view_type": "public" }
{ "+1": 3, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 3, "url": "https://api.github.com/repos/psf/requests/issues/2953/reactions" }
https://api.github.com/repos/psf/requests/issues/2953/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2953.diff", "html_url": "https://github.com/psf/requests/pull/2953", "merged_at": "2016-04-29T22:03:04Z", "patch_url": "https://github.com/psf/requests/pull/2953.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2953" }
true
[ ":pray: \n", ":+1: on the code and :+1: on the milestone.\n", "@Lukasa what do you think about having a `socks` extra for requests that installs `PySocks` and whatever else?\n", "Done. =)\n", "Does PySocks require any compilation during installation? We should consider vendoring it if we want to consider this an actual feature of Requests. \n", "PySocks requires no compilation. However, we'll have to rewrite import paths in urllib3 if you want to vendor PySocks.\n", "Well, if it's going to be used by a very small subset of users (likely the case), it's likely fine to leave it as an optional dependency.\n\nWe could pack it in without modifying urllib3's code, technically:\n\n(`packages/__init__.py`)\n\n``` python\ntry:\n from . import socks\n sys.modules['socks'] = socks\nexcept ImportError:\n try:\n import socks\n except ImportError:\n pass\n```\n\nThis is obviously not the best idea, but it does function properly. \n", "Mucking with sys.modules in the past has caused us enough headaches that I'd rather not go too far down that path again\n", "I don't disagree :)\n", "@Lukasa is this was merged or not? Btw, how to configure the socks proxy?\n", "This has not been merged.\n\nWhen it is released, you can configure the socks proxy as you would configure other proxies: `proxies={'http': 'socks5://localhost:8888/'}`, for example.\n", "@Lukasa so it should work like\n\n``` python\n#proxy\n # SOCKS5 proxy for HTTP/HTTPS\n proxies = {\n 'http' : \"socks5://myproxy:9191\",\n 'https' : \"socks5://myproxy:9191\"\n }\n\n #headers\n headers = {\n\n }\n\n url='http://icanhazip.com/'\n res = requests.get(url, headers=headers, proxies=proxies)\n```\n", "Correct. 
Of course, that won't work unless you use a patched version of requests that contains both this patch _and_ an updated urllib3, _and_ you install the appropriate optional dependencies.\n", "@Lukasa uhm, so supposed to use on google appengine, I have to wait the `devappserver2` python sdk update...\n", "On GAE I suspect using a SOCKS proxy is going to take a while. I should also note that to use a SOCKS proxy on GAE will require that you enable socket support, which as I understand it is substantially more expensive than simply using the standard httplib replacement.\n", "btw, In the meanwhile I'm doing\n\n``` shell\n>>> import requesocks\n>>> import requests\n>>> session = requesocks.session()\n>>> session.proxies = {\n... 'http' : \"socks5://myproxy.net:9050\",\n... 'https' : \"socks5://smyproxy.net:9050\"\n... }\n>>> url='http://icanhazip.com/'\n>>> resp = requests.get(url)\n>>> resp\n<Response [200]>\n>>> resp.text\nu'89.97.90.230\\n'\n>>> resp.text.strip()\nu'89.97.90.230'\n>>> resp = session.get(url)\n>>> resp.text.strip()\nu'171.25.193.78'\n>>> \n```\n\nthat in most envs will work ( a part again appengine that lacks of `pwd` support 😡\n", "Is this done? Let's merge it if so. \n", "We need an updated urllib3. I think @shazow is expecting to release this week.\n", "Released!\n", "Sweet, let's do it. \n", "We'll include it in the next release, which can therefore happen this week (or even today).\n", "@kennethreitz I'm away from my laptop, so if you're in a merging mood you can land the urllib3 update and then merge this.\n\nPlease note that it makes everyone's life easier if you merge in urllib3 directly from the brand new tag (1.15). Just a suggestion!\n", "The anticipation is killing me.\n", "What's [WIP] about this? 
Previous comments seem to suggest it's done, just waiting for the release.\n\n(Also, referencing https://github.com/shazow/urllib3/pull/762 and #478)\n", ":sparkles: :cake: :sparkles:\n", "![DJ Khaled](https://media.giphy.com/media/RDbZGZ3O0UmL6/giphy.gif)\n", "Released, announced. \n" ]
https://api.github.com/repos/psf/requests/issues/2952
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2952/labels{/name}
https://api.github.com/repos/psf/requests/issues/2952/comments
https://api.github.com/repos/psf/requests/issues/2952/events
https://github.com/psf/requests/pull/2952
124,273,309
MDExOlB1bGxSZXF1ZXN0NTQ3NjEwMzk=
2,952
Discard 0-byte chunks from request body iterators
{ "avatar_url": "https://avatars.githubusercontent.com/u/4432952?v=4", "events_url": "https://api.github.com/users/tipabu/events{/privacy}", "followers_url": "https://api.github.com/users/tipabu/followers", "following_url": "https://api.github.com/users/tipabu/following{/other_user}", "gists_url": "https://api.github.com/users/tipabu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/tipabu", "id": 4432952, "login": "tipabu", "node_id": "MDQ6VXNlcjQ0MzI5NTI=", "organizations_url": "https://api.github.com/users/tipabu/orgs", "received_events_url": "https://api.github.com/users/tipabu/received_events", "repos_url": "https://api.github.com/users/tipabu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/tipabu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/tipabu/subscriptions", "type": "User", "url": "https://api.github.com/users/tipabu", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-29T21:51:44Z
2021-09-08T05:01:06Z
2015-12-29T22:06:16Z
NONE
resolved
Previously, an empty chunk in a request body iterator would be faithfully translated to a 0-length chunk on the wire, which the server would interpret as the end of the transfer. That is, a request like ``` def data(): yield 'foo' yield '' yield 'bar' requests.put(url, data=data()) ``` ... would cause the server to receive a chunk for 'foo', receive an empty chunk, and close the connection before receiving 'bar'. Further, if `Connection: keep-alive` is set, the server will attempt (and fail) to parse the next chunk, serve a 400, and close the connection. Now, the empty chunk will be ignored, and the server should receive all of the actual bytes emitted from the iterable.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2952/reactions" }
https://api.github.com/repos/psf/requests/issues/2952/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2952.diff", "html_url": "https://github.com/psf/requests/pull/2952", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2952.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2952" }
true
[ "This is a breaking change that cannot be merged for 2.x but has been fixed for 3.0.0 in #2631 by @neosab.\n" ]
https://api.github.com/repos/psf/requests/issues/2951
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2951/labels{/name}
https://api.github.com/repos/psf/requests/issues/2951/comments
https://api.github.com/repos/psf/requests/issues/2951/events
https://github.com/psf/requests/issues/2951
124,191,541
MDU6SXNzdWUxMjQxOTE1NDE=
2,951
TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'
{ "avatar_url": "https://avatars.githubusercontent.com/u/13664257?v=4", "events_url": "https://api.github.com/users/vkosuri/events{/privacy}", "followers_url": "https://api.github.com/users/vkosuri/followers", "following_url": "https://api.github.com/users/vkosuri/following{/other_user}", "gists_url": "https://api.github.com/users/vkosuri/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vkosuri", "id": 13664257, "login": "vkosuri", "node_id": "MDQ6VXNlcjEzNjY0MjU3", "organizations_url": "https://api.github.com/users/vkosuri/orgs", "received_events_url": "https://api.github.com/users/vkosuri/received_events", "repos_url": "https://api.github.com/users/vkosuri/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vkosuri/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vkosuri/subscriptions", "type": "User", "url": "https://api.github.com/users/vkosuri", "user_view_type": "public" }
[]
closed
true
null
[]
null
11
2015-12-29T10:22:09Z
2021-09-08T20:00:51Z
2015-12-29T18:30:34Z
NONE
resolved
While sending huge amount requests to Unit rejecting requests, However I am using max_retries=20 still i am getting this error message ``` File "/home/mkosuri/venv/local/lib/python2.7/site-packages/RequestsLibrary/RequestsKeywords.py", line 258, in post_request response = self._post_request(session, uri, data, headers, files, redir, timeout) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/RequestsLibrary/RequestsKeywords.py", line 556, in _post_request allow_redirects=allow_redirects) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 511, in post return self.request('POST', url, data=data, json=json, **kwargs) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 468, in request resp = self.send(prep, **send_kwargs) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/requests/sessions.py", line 576, in send r = adapter.send(request, **kwargs) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/requests/adapters.py", line 370, in send timeout=timeout File "/home/mkosuri/yang_gen/venv/local/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py", line 609, in urlopen _stacktrace=sys.exc_info()[2]) File "/home/mkosuri/venv/local/lib/python2.7/site-packages/requests/packages/urllib3/util/retry.py", line 226, in increment total -= 1 ``` Any workaround for this issue?
{ "avatar_url": "https://avatars.githubusercontent.com/u/13664257?v=4", "events_url": "https://api.github.com/users/vkosuri/events{/privacy}", "followers_url": "https://api.github.com/users/vkosuri/followers", "following_url": "https://api.github.com/users/vkosuri/following{/other_user}", "gists_url": "https://api.github.com/users/vkosuri/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vkosuri", "id": 13664257, "login": "vkosuri", "node_id": "MDQ6VXNlcjEzNjY0MjU3", "organizations_url": "https://api.github.com/users/vkosuri/orgs", "received_events_url": "https://api.github.com/users/vkosuri/received_events", "repos_url": "https://api.github.com/users/vkosuri/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vkosuri/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vkosuri/subscriptions", "type": "User", "url": "https://api.github.com/users/vkosuri", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2951/reactions" }
https://api.github.com/repos/psf/requests/issues/2951/timeline
null
completed
null
null
false
[ "Do you receive it always? Under what conditions do you receive it? What does your code look like? What version of requests are you using? From where did you install it?\n", "Do you receive it always? \n\n> Yes\n\nUnder what conditions do you receive it? \n\n> I am sending around different 5K combination of GET, PUT, PATCH, POST \n\nWhat does your code look like?\n\n``` Python\nresp = session.get(self._get_url(session, uri),\n headers=headers,\n params=params,\n cookies=self.cookies, timeout=self.timeout,\n allow_redirects=allow_redirects) \n```\n\nsimilarly for all other sessions\n\nWhat version of requests are you using?\n\n> 2.8\n\nFrom where did you install it?\n\n> I am using Jenkins virtual environments, through pip install --upgrade -r requirements\n> requests >= 2.8\n", "So you're not under any circumstances setting the `retries` parameter, or any other similar parameter? Do you have values set on the session? Do you have a custom transport adapter?\n", "I am using HTTPAdapter for max_retries, while creating session\n\n``` Python\ns = session = requests.Session()\nif max_retries > 0:\n a = requests.adapters.HTTPAdapter(max_retries=max_retries)\n s.mount('https://', a)\n s.mount('http://', a)\n```\n", "Ok, can you please print the values of `max_retries` for each adapter before each request? Both the value and the type. 
Something like this:\n\n``` python\nfor k, v in s.adapters.items():\n print \"Adapter %s: max_retries is %s (%s)\" % (k, v, type(v))\n```\n", "Send a GET request on the session object found using the\n\n```\nStart / End / Elapsed: 20151229 17:24:56.627 / 20151229 17:24:56.636 / 00:00:00.009\n17:24:56.629 INFO Starting new HTTP connection (7): 10.11.13.96 \n17:24:56.635 INFO Adapter https://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>)\nAdapter http://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>) \n17:24:56.636 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n\nSend a POST request on the session object found using the\n\n```\nStart / End / Elapsed: 20151229 17:24:56.644 / 20151229 17:24:56.653 / 00:00:00.009\n17:24:56.645 INFO Starting new HTTP connection (8): 10.11.13.96 \n17:24:56.652 INFO Adapter https://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>)\nAdapter http://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>) \n17:24:56.652 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n\nSend a PATCH request on the session object found using the\n\n```\nStart / End / Elapsed: 20151229 17:24:56.683 / 20151229 17:24:56.688 / 00:00:00.005\n17:24:56.684 INFO Starting new HTTP connection (10): 10.11.13.96 \n17:24:56.688 INFO Adapter https://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>)\nAdapter http://: max_retries is <requests.adapters.HTTPAdapter object at 0x7f6ca703a250> (<class 'requests.adapters.HTTPAdapter'>) \n17:24:56.688 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n", "Oh, I apologise, my sample code was wrong (this is what I get for 
responding to bug reports before caffeine). Try this one:\n\n``` python\nfor k, v in s.adapters.items():\n print \"Adapter %s: max_retries is %s (%s)\" % (k, v.max_retries, type(v.max_retries))\n```\n", "Send a GET request on the session object found using the\n\n```\nStart / End / Elapsed: 20151229 17:36:24.093 / 20151229 17:36:24.099 / 00:00:00.006\n17:36:24.094 INFO Starting new HTTP connection (9): 10.11.11.179 \n17:36:24.098 INFO Adapter https://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>)\nAdapter http://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>) \n17:36:24.098 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n\nSend a POST request on the session object found using the\n\n```\n17:36:24.109 INFO Starting new HTTP connection (10): 10.11.11.179 \n17:36:24.113 INFO Adapter https://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>)\nAdapter http://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>) \n17:36:24.114 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n\nSend a PATCH request on the session object found using the\n\n```\n17:36:24.421 INFO Starting new HTTP connection (30): 10.11.11.179 \n17:36:24.430 INFO Adapter https://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>)\nAdapter http://: max_retries is Retry(total=20, connect=None, read=None, redirect=None) (<class 'requests.packages.urllib3.util.retry.Retry'>) \n17:36:24.431 FAIL TypeError: unsupported operand type(s) for -=: 'unicode' and 'int'\n```\n", "Hmm, well this error looks impossible. 
Can you turn on debug level logging for the `requests` package please, and then show me the logs from requests just before the error?\n", "Hey @vkosuri what is the library providing \"RequestsLibrary/RequestsKeywords.py\" in your traceback? It seems like you're using something else that calls requests. If you have a link to that source, I'd like to inspect that as well.\n", "Sorry, The root cause of the issue is, Robot framework scalar variable are Unicode values, we have to explicitly type cast variable while passing to other library procedures. https://github.com/robotframework/robotframework/issues/2273 \n\nI resolved this issue by type casting max_retries to integer\n\n``` Python\ns = session = requests.Session()\nif max_retries > 0:\n a = requests.adapters.HTTPAdapter(max_retries=int(max_retries))\n s.mount('https://', a)\n s.mount('http://', a\n```\n\nThank all your support.\n" ]
https://api.github.com/repos/psf/requests/issues/2950
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2950/labels{/name}
https://api.github.com/repos/psf/requests/issues/2950/comments
https://api.github.com/repos/psf/requests/issues/2950/events
https://github.com/psf/requests/issues/2950
124,181,847
MDU6SXNzdWUxMjQxODE4NDc=
2,950
Unable to upload chunks to Flask
{ "avatar_url": "https://avatars.githubusercontent.com/u/5459596?v=4", "events_url": "https://api.github.com/users/johaven/events{/privacy}", "followers_url": "https://api.github.com/users/johaven/followers", "following_url": "https://api.github.com/users/johaven/following{/other_user}", "gists_url": "https://api.github.com/users/johaven/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/johaven", "id": 5459596, "login": "johaven", "node_id": "MDQ6VXNlcjU0NTk1OTY=", "organizations_url": "https://api.github.com/users/johaven/orgs", "received_events_url": "https://api.github.com/users/johaven/received_events", "repos_url": "https://api.github.com/users/johaven/repos", "site_admin": false, "starred_url": "https://api.github.com/users/johaven/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/johaven/subscriptions", "type": "User", "url": "https://api.github.com/users/johaven", "user_view_type": "public" }
[]
closed
true
null
[]
null
32
2015-12-29T08:46:10Z
2015-12-31T14:26:03Z
2015-12-29T13:35:40Z
NONE
null
I do not know if it comes from flask or requests , I link this issue posted on Flask issues https://github.com/mitsuhiko/flask/issues/1668
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2950/reactions" }
https://api.github.com/repos/psf/requests/issues/2950/timeline
null
completed
null
null
false
[ "Are you sure the headers you posted in the Flask issue are right? They look totally invalid. \n", "Yes, these headers are received by Flask.\nI'm working on localhost, request between \"requests\" and \"Flask\" are send directly.\n\nHowever I just tested this same case on a server with (flask / uwsgi / nginx and it works,uwsgi seems to put the content in a buffer before writing but that's another problem\n\nIs this a content decoding problem between requests and flask ? I don't know yet.\n\nFor information on my server (flask / nginx / uwsgi) I have the same header (but Content-Length is set):\n\n> Transfer-Encoding: chunked\n> Content-Length: 30617496\n> User-Agent: python-requests/2.8.1\n> Connection: keep-alive\n> Host: xxx.com\n> Accept: _/_\n> Content-Type: \n> Accept-Encoding: gzip, deflate\n", "That data is also wrong: transfer-encoding chunked should not be sent with content-length, really.\n\nI suspect Flask's built-in web server may have problems with chunked transfer encoding. Why are you using the generator instead of passing the file object directly to requests?\n", "I planned to attach a progress indicator and it is not possible if I pass the file object.\n", "@yoyoprs Sure it is.\n\n``` python\nclass ProgressFileWrapper(object):\n def __init__(self, file):\n self._file = file\n\n def read(self, amt=None):\n print \"reading\"\n return self._file.read(amt)\n\n def __getattr__(self, name):\n return getattr(self._file, name)\n```\n", "if i do this as the advanced example:\n\n``` python\nwith open(localpath, 'rb') as f:\n r = self.session.post(url, data=f, allow_redirects=False, params=params)\n```\n\ni lost the chunked transfer encoding:\n\n> Content-Length: 25468\n> User-Agent: python-requests/2.8.1\n> Connection: keep-alive\n> Host: xxx.com\n> Accept: _/_\n> Content-Type: \n> Accept-Encoding: gzip, deflate\n", "@yoyoprs That's because that's the defined behaviour of requests. 
This is why I have suggested using file objects directly, or via the wrapping proxy defined above.\n", "@Lukasa I do not understand the usefulness of the wrapper (except for progression).\n\nMy transfer is not streamed, (except if I control the sending of blocks in the proxy object, we agree) but the server does not receive header that tells it that the transfer is chunked. Should I force the header before send the request? Without this header, it will be complicated for components like uwsgi to understand the request and to avoid content buffering (although in my case for now).\n\nIt is not stated anywhere in the documentation it is not possible to stream with post method the contents of a file (using a generator or not) and I still do not understand why Flask (without nginx and uwsgi) can not read the stream\n", "@yoyoprs I don't understand what you're talking about.\n\nA file object passed to the `data` argument _is_ streamed, as the documentation [clearly states](http://docs.python-requests.org/en/latest/user/advanced/#streaming-uploads). The exact chunk of code you posted above streams the file.\n\nIn this context, what we mean by _streamed_ is that Requests does not load the file entirely into memory: instead, we loop over the file sending 8kB worth of data with each send to the socket.\n\nWhat you're talking about is not streaming, it is _chunked encoding_. Chunked encoding is a different notion: it refers to the way the data is constructed and sent on the wire. Once again, it is possible to do this: sending a generator does that exactly. This is why I believe your problem is not with requests: I believe that someone in your _server_ stack is adding in a content-length header. Requests most certainly is not. 
To prove it, let me use the following script to upload some data to http2bin:\n\n``` python\nimport requests\n\ndef gen():\n with open('/path/to/large/logfile', 'rb') as f:\n while True:\n data = f.read(1024)\n if not data:\n break\n yield data\nsession = requests.Session()\nr = session.post('http://http2bin.org/post', data=gen())\nprint r.status_code\nprint r.content\n```\n\nI captured the data I sent using Wireshark, which records _exactly_ the bytes on the wire, without any application processing. This says my request looked like this:\n\n```\nPOST /post HTTP/1.1\nHost: http2bin.org\nTransfer-Encoding: chunked\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: python-requests/2.9.0\n\n400\nmachine-0: 2015-06-28 14:06:00 INFO juju.cmd supercommand.go:37 running jujud [1.24.0-trusty-amd64 gc]\nmachine-0: 2015-06-28 14:06:00 DEBUG juju.agent agent.go:425 read agent config, format \"1.18\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.cmd.jujud machine.go:419 machine agent machine-0 start (1.24.0-trusty-amd64 [gc])\nmachine-0: 2015-06-28 14:06:00 DEBUG juju.wrench wrench.go:112 couldn't read wrench directory: stat /var/lib/juju/wrench: no such file or directory\nmachine-0: 2015-06-28 14:06:00 INFO juju.cmd.jujud upgrade.go:87 no upgrade steps required or upgrade steps for 1.24.0 have already been run.\nmachine-0: 2015-06-28 14:06:00 INFO juju.network network.go:194 setting prefer-ipv6 to false\nmachine-0: 2015-06-28 14:06:00 INFO juju.worker runner.go:261 start \"api\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.worker runner.go:261 start \"statestarter\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.api apiclient.go:331 dialing \"wss://localhost:17070/environment/adad29a3-1105-4483-863b-9496d5495c82/api\"\nmachine-\n400\n0: 2015-06-28 14:06:00 INFO juju.worker runner.go:261 start \"termination\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.worker runner.go:261 start \"state\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.api apiclient.go:339 error 
dialing \"wss://localhost:17070/environment/adad29a3-1105-4483-863b-9496d5495c82/api\": websocket.Dial wss://localhost:17070/environment/adad29a3-1105-4483-863b-9496d5495c82/api: dial tcp 127.0.0.1:17070: connection refused\nmachine-0: 2015-06-28 14:06:00 ERROR juju.worker runner.go:219 exited \"api\": unable to connect to \"wss://localhost:17070/environment/adad29a3-1105-4483-863b-9496d5495c82/api\"\nmachine-0: 2015-06-28 14:06:00 INFO juju.worker runner.go:253 restarting \"api\" in 3s\nmachine-0: 2015-06-28 14:06:00 DEBUG juju.service discovery.go:118 discovered init system \"upstart\" from local host\nmachine-0: 2015-06-28 14:06:00 DEBUG juju.cmd.jujud machine.go:1312 mongodb service is installed\nmachine-0: 2015-06-28 14:06:00 INFO juju.mongo open.go:125 dialled mongo successfully on address \"127.0.\n\n[ many more chunks of data stripped]\n```\n\nThis is a chunked transfer encoded body, as shown by the headers. There is no content length. The body is correctly encoded.\n\nRequests is _not_ getting this wrong. This is why I asked about your headers.\n\nNote, however, that WSGI provides no specific guidance on how a WSGI server should handle chunked uploads. Therefore, I recommend you look at the interface between uWSGI and Flask.\n", "@Lukasa sorry for my broken English, on the flask side issue I mention \"chunked\" and not on this one, my mistake.\nWhat interests me is to successfully write directly content of chunked transfer on Flask side and so now I know it is not \"requests\" the problem (i was redirected here ...)\n\nThere is a good chance that Flask adds this header, but it does not explain why without uwsgi interface (with it works), it is impossible to read the stream (Flask would require the Content-Length to start streaming ? 
Maybe ...)\n\nI will make several tests (without/with uwsgi interface) and try this: http://uwsgi-docs.readthedocs.org/en/latest/Transformations.html#streaming-vs-buffering\n\nThank you for your patience really.\n", "@yoyoprs do I understand correctly that we can close this now?\n", "@sigmavirus24 yes you can :)\n", "Okay @yoyoprs. I'll be in #python-requests on chat.freenode.net (IRC network) if you want to some one to chat with about this while you're debugging.\n", "@Lukasa Just a question off issue ... with this method:\n\n``` python\nwith open(localpath, 'rb') as f:\n r = self.session.post(url, data=f, allow_redirects=False, params=params)\n```\n\nDo you have a bug when the file size is 0 byte ?\n", "We did: [it was fixed in 2.9.0](https://github.com/kennethreitz/requests/blob/master/HISTORY.rst#290-2015-12-15).\n", "I use 2.9.1 ...\n", "Did you update? Because the headers you posted above said you were using 2.8.1. If you did update to 2.9.1, what problem are you seeing?\n", "Yes i have updated. I get a ConnectionError when file size is 0 byte.\n", "What ConnectionError?\n", " File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 511, in post\n return self.request('POST', url, data=data, json=json, *_kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 468, in request\n resp = self.send(prep, *_send_kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\sessions.py\", line 576, in send\n r = adapter.send(request, **kwargs)\n File \"C:\\Python27\\lib\\site-packages\\requests\\adapters.py\", line 426, in send\n raise ConnectionError(err, request=request)\nrequests.exceptions.ConnectionError: ('Connection aborted.', BadStatusLine(\"''\",))\n", "So this suggests the server doesn't like the request at all. Do you have any logs from your server?\n", "On server side (Timeout): \"POST /api/.../ HTTP/1.1\" 408 0 \"-\" \"python-requests/2.9.1\"\nWith others files (where size is > 0), it works fine. 
\n", "408 is a weird request timeout issue, so it seems like something in your server stack is getting this wrong. Do you want to confirm that your flask app will handle this properly?\n\nRequests is sending a totally valid body in this case, specifically an empty chunked encoded body.\n\n```\nPOST /post HTTP/1.1\nHost: http2bin.org\nTransfer-Encoding: chunked\nConnection: keep-alive\nAccept-Encoding: gzip, deflate\nAccept: */*\nUser-Agent: python-requests/2.9.0\n\n0\n\n```\n\nThis means that the problem is server-side.\n", "I just tested, everything works with version 2.8.1\n\nI think the answer is:\nhttp://nginx.org/en/docs/http/ngx_http_uwsgi_module.html#uwsgi_request_buffering\nuwsgi_request_buffering is off on my nginx, to allow Flask write directly without pre-buffer data ...\n\nWhen you turn on Transfer-Encoding Chunked in 2.9.1, by default nginx want to put content in buffer ... and there is nothing to put in buffer (because no content) so he waits until the timeout ...\n", "nginx should be able to cope with an empty chunked body, and if it can't that's a bug with nginx.\n", "That seems normal that nginx wait chunked data, it does not know the content length.\nThere is a way to delete header \"Transfer-Encoding\" before send the post to avoid this type of case ?\n", "@yoyoprs This is _not_ normal. The way to signal the end of a chunked body is with a zero-length chunk (specifically, the bytes `0\\r\\n\\r\\n`). Requests has done that: the body is signaled to be complete, and it just happens to have zero length.\n\nnginx _must_ cope with this, it's a specification violation for it not to.\n", "Thank's for these details.\nWould there be a workaround ?\n", "Yes: do not send zero-length files. =) If `requests.utils.super_len()` returns `0` when passed the file object, simply don't send it.\n", "Or stay with 2.8.1 for life: p\nI reach my goals with a prepared query, thank you gain ;)\n" ]
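The workaround agreed on at the end of this thread — do not send zero-byte files at all — comes down to a length check before the POST. A minimal sketch (the pure-stdlib `body_length` helper below merely stands in for `requests.utils.super_len`, which the thread recommends; the guard itself is illustrative):

```python
import io
import os


def body_length(obj):
    """Best-effort length of an upload body (bytes, an in-memory buffer,
    or a real file object), similar in spirit to requests.utils.super_len."""
    if isinstance(obj, (bytes, bytearray)):
        return len(obj)
    if hasattr(obj, "getbuffer"):        # in-memory buffer such as io.BytesIO
        return len(obj.getbuffer())
    if hasattr(obj, "fileno"):           # file object backed by a real file
        return os.fstat(obj.fileno()).st_size
    raise TypeError("cannot determine length of %r" % (obj,))


def should_upload(obj):
    """Skip zero-byte bodies, avoiding the empty chunked upload that the
    nginx/uwsgi stack described above answered with a 408."""
    return body_length(obj) > 0
```

Used as a guard around the original snippet: `with open(localpath, 'rb') as f:` then only call `session.post(url, data=f, ...)` when `should_upload(f)` is true.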
https://api.github.com/repos/psf/requests/issues/2949
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2949/labels{/name}
https://api.github.com/repos/psf/requests/issues/2949/comments
https://api.github.com/repos/psf/requests/issues/2949/events
https://github.com/psf/requests/issues/2949
124,130,987
MDU6SXNzdWUxMjQxMzA5ODc=
2,949
Session's Authorization header isn't sent on redirect
{ "avatar_url": "https://avatars.githubusercontent.com/u/287161?v=4", "events_url": "https://api.github.com/users/jwineinger/events{/privacy}", "followers_url": "https://api.github.com/users/jwineinger/followers", "following_url": "https://api.github.com/users/jwineinger/following{/other_user}", "gists_url": "https://api.github.com/users/jwineinger/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jwineinger", "id": 287161, "login": "jwineinger", "node_id": "MDQ6VXNlcjI4NzE2MQ==", "organizations_url": "https://api.github.com/users/jwineinger/orgs", "received_events_url": "https://api.github.com/users/jwineinger/received_events", "repos_url": "https://api.github.com/users/jwineinger/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jwineinger/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jwineinger/subscriptions", "type": "User", "url": "https://api.github.com/users/jwineinger", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" } ]
closed
true
null
[]
null
37
2015-12-28T21:44:14Z
2021-12-07T23:00:27Z
2021-09-08T22:43:04Z
NONE
resolved
I'm using requests to hit developer-api.nest.com and setting an Authorization header with a bearer token. On some requests, that API responds with an 307 redirect. When that happens, I still need the Authorization header to be sent on the subsequent request. I've tried using `requests.get()` as well as a session. I suppose I could work around this by not allowing redirects, detecting the 307 and then issuing the new request myself but I'm wondering if this is a bug. Should I expect that the Authorization header would be sent on all requests made within the context of a session? ``` python In [41]: s = requests.Session() In [42]: s.headers Out[42]: {'Accept': '*/*', 'Accept-Encoding': 'gzip, deflate', 'Connection': 'keep-alive', 'User-Agent': 'python-requests/2.7.0 CPython/3.4.3 Darwin/15.2.0'} In [43]: s.headers['Authorization'] = "Bearer <snip>" In [45]: s.get("https://developer-api.nest.com/devices/thermostats/") Out[45]: <Response [401]> In [46]: s.get("https://developer-api.nest.com/devices/thermostats/") Out[46]: <Response [200]> In [49]: Out[45].history Out[49]: [<Response [307]>] In [50]: Out[46].history Out[50]: [] In [51]: Out[45].request.headers Out[51]: {'Accept': '*/*', 'Accept-Encoding': 'gzip, deflate', 'Connection': 'keep-alive', 'User-Agent': 'python-requests/2.7.0 CPython/3.4.3 Darwin/15.2.0'} In [52]: Out[46].request.headers Out[52]: {'Accept': '*/*', 'Accept-Encoding': 'gzip, deflate', 'Connection': 'keep-alive', 'User-Agent': 'python-requests/2.7.0 CPython/3.4.3 Darwin/15.2.0', 'Authorization': 'Bearer <snip>'} ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 3, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 3, "url": "https://api.github.com/repos/psf/requests/issues/2949/reactions" }
https://api.github.com/repos/psf/requests/issues/2949/timeline
null
completed
null
null
false
[ "Where is the redirect to?\n", "Ah, a different domain. firebase-apiserver03-tah01-iad01.dapi.production.nest.com\n", "Yup, so that's somewhat deliberate: we're very aggressive with stripping authorization headers when redirected to a new host. This is a safety feature to deal with [CVE 2014-1829](http://www.cve.mitre.org/cgi-bin/cvename.cgi?name=2014-1829), which was caused by us persisting headers on off-host redirects.\n\nHowever, from a certain perspective we've still got a bug here, because you set the Authorization header on the `Session`, not the request. In principle, what this means is \"I don't care where the redirect goes, add the header\". I still _think_ I'd rather have this approach, which at least ensures that we're not open to any form of attack, even if it makes this specific instance somewhat trickier. However, I'm open to being convinced that we're being too paranoid here.\n", "> However, I'm open to being convinced that we're being too paranoid here.\n\nI'm less open to being convinced but willing to listen.\n\nThat said, a separate Auth mechanism could be written to persist such headers across _allowed_ domains which would necessitate us doing some work in [`rebuild_auth`](https://github.com/kennethreitz/requests/blob/cbb0830ee9072d7d0bf8f28efe4e5825d69a9eb4/requests/sessions.py#L203).\n", "I wont argue with the safety of how it works right now. It would be nice to have some mechanism to opt into the \"unsafe\" behavior though. Perhaps setting a base domain to persist those headers to (nest.com, in this case) or perhaps a list of domains that are OK to send them to. \n", "Yeah that's far more complexity than the core of requests will ever provide though. That's why I'm wondering if a separate Auth class/handler might work best for this sort of thing. 
I'm not convinced it will work though because I'm fairly certain that we do not unconditionally call `prepare_auth`.\n", "It won't work in the standard model because we don't unconditionally call `prepare_auth`. However, a Transport Adapter could be used to fulfil this role, even if it is a slightly unusual use of that API.\n", "I think a TA is absolutely the wrong thing to recommend here though.\n", "- If auth is provided to a session, it should be sent for every request that session makes.\n- Perhaps we should remove `session.auth`. It's not particularly useful.\n", "> If auth is provided to a session, it should be sent for every request that session makes.\n\nI fundamentally disagree. Sessions aren't used for a single domain, if they were, I'd have no problem with this. \n\n> Perhaps we should remove session.auth. It's not particularly useful.\n\nI think it is useful. I think it would be better if assigning a tuple to it were not allowed though. I'd rather see a Auth class that specifies which domains to use it for. We could just adopt [the AuthHandler](https://toolbelt.readthedocs.org/en/latest/authentication.html#authhandler) from requests-toolbelt which allows people to specify credentials for a domain when using requests. This provides a slightly more secure way of handling Session based authentication. 
The downside is that it requires users opt-in to that kind of authentication though.\n", "I also need this to be fixed so I can make headers persist for redirects.\n", "@jtherrmann If this is an auth header, the easiest way to work around the problem is to set a session-level auth handler that simply always puts the header you want on the request.\n", "Has any progress or additional consideration been given to this?\nI am running in to the same issue.\n", "@ethanroy No additional consideration beyond my suggestion of using a session-level auth handler, in the comment directly above yours.\n", "Related: if a session redirects and strips auth, calling `get` again reapplies the auth and uses the cached redirect. So knock twice and you get in. Intended behaviour?\n\n```\n>>> s = requests.Session()\n>>> s.headers.update({\"Authorization\": \"Token {}\".format(API_TOKEN)})\n>>> s.get(url)\n\n<Response [403]>\n\n>>> s.get(url)\n\n<Response [200]>\n```\n", "@GregBakker Yes, ish. It's a confluence of intended behaviours. However, this bug notes that the original 403 shouldn't happen.\n", "@Lukasa when you say \"the easiest way to work around the problem is to set a session-level auth handler,\" is that something that works today? Based on what I'm seeing in the code, the answer is no but your wording makes me wonder if I'm missing something. You're talking about setting the Session auth attribute, right?\n", "Yeah, that _should_ work.\n", "@jwineinger so how did you end up getting around this problem? it still seems to behave the same.\n", "There's two Nest-specific workarounds. \n\nOne is to pass the `auth` parameter with the access_token rather than using the Authorization header. 
I found this on https://gist.github.com/tylerdave/409ffa08e1d47b1a1e23\n\nAnother is to save a dictionary with the headers you'd use, don't follow redirects, and then make a second request passing in the headers again:\n\n``` python\n headers = {'Authorization': 'Bearer ' + access_token, 'Content-Type': 'application/json'}\n initial_response = requests.get('https://developer-api.nest.com', headers=headers, allow_redirects=False)\n if initial_response.status_code == 307:\n api_response = requests.get(initial_response.headers['Location'], headers=headers, allow_redirects=False)\n```\n", "I encountered this same problem and got around it by overriding the `rebuild_auth` method in a custom `requests.Session` implementation:\r\n\r\n```python\r\nfrom requests import Session\r\n\r\nclass CustomSession(Session):\r\n def rebuild_auth(self, prepared_request, response):\r\n return\r\n\r\ns = CustomSession()\r\ns.get(url, auth=(\"username\", \"password\"))\r\n```", "@sigmavirus24 what is wrong with @gabriel-loo's [solution](https://github.com/requests/requests/issues/2949#issuecomment-288858676)? Security concerns?", "@j08lue yes. Please read the thread. There are CVE's associated with **not** stripping authentication before following arbitrary redirects to a new domain. Think about the problem this way:\r\n\r\nI'm making requests to `api.github.com` and an attacker manages to make me follow a redirect to `another-domain.com` that they control and I pass along my token with write access to my repositories (including requests) then it can appear as if I'm making commits to requests when in fact they are making those commits via the API. They can include code in Requests that will weaken its security posture and possibly actively harm you. 
That's what could happen when you unconditionally send your authentication credentials on every redirect.\r\n\r\nEven so, let's say the redirect isn't malicious, are you actually comfortable leaking your credentials for a service to another company or service? The original service may store confidential data for you, your customers, or something else. Even if the new domain that you've been redirected to doesn't use your credentials but potentially logs them as unexpected data, someone who attacks them and can retrieve those logs can then use your credentials against the original domain if they can puzzle out where they belong. Are you **really** willing to take that risk?", "Thanks for the illustration, @sigmavirus24. If this concern ultimately prohibits forwarding sensitive headers to redirects, then why is this thread still open? I could not think of a more appropriate error than the one you get (403), so there is no bug need for action here, is there? Or what [did you have in mind](https://github.com/requests/requests/issues/2949#issuecomment-245575963), @Lukasa?", "I was hitting this issue recently when working with a non-public API. The security concerns totally make sense as a reason for stripping auth out on redirects. I think a solution like @gabriel-loo's is something folks can consider if they believe they're in a safe enough environment to do so. Or the session level handler. Or find another way to work around by skipping the redirect entirely as suggested above, if that's possible. So in line with the view this isn't really a bug.\r\n\r\nHowever, I burned more time than I probably needed to confused about why a handful of other non-Python HTTP clients _did_ pass on the auth header and were working fine when this was not. One suggestion: it might be nice to issue a warning via `warnings` here to make it more clear to callers when the header is present and being stripped. 
I'd imagine it's rare that this is something a caller would _not_ want to be warned about.\r\n ", "@tlantz normally that would seem pretty reasonable. Requests as a project (as well as urllib3, one of its dependencies) has caused a significant amount of ire when it issues any sort of warning whether via the warnings module or via logging. Further, the warnings module is for things that people should take action on, for example, not using a version of Python that has been compiled against a recent version of OpenSSL.\r\n\r\nIn most cases, this behaviour isn't as problematic as, for example, being unable to verify a ceritificate for a TLS connection. That obviously doesn't help you or anyone else who has expressed their genuine and valid frustration on this issue. With that in mind, I wonder if it wouldn't be better to attempt to log this at the `DEBUG` level. If someone is using logging (generally a decent practice) and enables that level it show up for them. Futher, given how *little* Requests itself logs, this will be fairly prominent as a debug log. Does that seem like a fair trade-off?", "Yeah, that seems like a totally fair tradeoff. Reasoning around `warnings` makes sense to me. I think by the time you've been puzzled for 30 minutes or so you're usually adding `logging` around your own stuff anyway at `DEBUG`, so I think a `DEBUG` message would hit 95% of the cases where people are stuck trying to figure out what's not working.", "I use a session to hold the Authorization header, but it is not being sent in a redirect\r\nrequests (2.18.4)", "A coworker and I spent at least a couple hours debugging an issue directly related to this behavior. My use case is redirecting an API call from `api.my-example-site.org` to `www.api.my-example-site.org`. 
The headers were being stripped on the redirect.\r\n\r\n**If this is intended behavior (or if it won't be changed in the near future), can we please at least add it to the documentation?** I read and re-read the docs trying to figure out what I was doing incorrectly, and I even read through all the code in the `Request` class. If I had seen a warning about this behavior in the documentation, I would have fixed my issue in a couple minutes (which is the time it took after I found this thread). Perhaps we were reading in the wrong part of the documentation, however.", "Hi @ndmeiri, we do have a call out on this in the quick-start guide for Requests under the [Custom Headers](http://docs.python-requests.org/en/master/user/quickstart/#custom-headers) heading. If you feel there's a better place to put this, we're happy to review any suggestions you have. I would prefer we move that to a separate issue or PR though since it's not directly related to this ticket. Thanks!" ]
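The compromise floated in this thread — keep stripping `Authorization` on cross-host redirects by default, but let a user opt specific hosts back in — is ultimately just a hostname comparison. A sketch of that decision logic (the `TRUSTED_HOSTS` allow-list and the helper name are invented for illustration; stock requests only ever compares the two hostnames inside `Session.rebuild_auth`):

```python
from urllib.parse import urlparse

# Hypothetical allow-list of hosts permitted to share one set of credentials.
TRUSTED_HOSTS = frozenset({
    "developer-api.nest.com",
    "firebase-apiserver03-tah01-iad01.dapi.production.nest.com",
})


def keep_auth_on_redirect(original_url, redirect_url, trusted=TRUSTED_HOSTS):
    """True if the Authorization header may survive a redirect.

    Same host: always safe.  Cross-host: only if both ends are explicitly
    trusted, preserving the CVE-2014-1829 mitigation for everything else.
    """
    old_host = urlparse(original_url).hostname
    new_host = urlparse(redirect_url).hostname
    if old_host == new_host:
        return True
    return old_host in trusted and new_host in trusted
```

A custom `Session` could consult a check like this from an overridden `rebuild_auth`, rather than skipping the stripping unconditionally as the workaround quoted in the thread does.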
https://api.github.com/repos/psf/requests/issues/2948
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2948/labels{/name}
https://api.github.com/repos/psf/requests/issues/2948/comments
https://api.github.com/repos/psf/requests/issues/2948/events
https://github.com/psf/requests/pull/2948
124,080,084
MDExOlB1bGxSZXF1ZXN0NTQ2NjIwMDE=
2,948
advanced: use "client.*" to designate client certificate
{ "avatar_url": "https://avatars.githubusercontent.com/u/631446?v=4", "events_url": "https://api.github.com/users/vincentbernat/events{/privacy}", "followers_url": "https://api.github.com/users/vincentbernat/followers", "following_url": "https://api.github.com/users/vincentbernat/following{/other_user}", "gists_url": "https://api.github.com/users/vincentbernat/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vincentbernat", "id": 631446, "login": "vincentbernat", "node_id": "MDQ6VXNlcjYzMTQ0Ng==", "organizations_url": "https://api.github.com/users/vincentbernat/orgs", "received_events_url": "https://api.github.com/users/vincentbernat/received_events", "repos_url": "https://api.github.com/users/vincentbernat/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vincentbernat/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vincentbernat/subscriptions", "type": "User", "url": "https://api.github.com/users/vincentbernat", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-28T15:22:45Z
2021-09-08T05:01:06Z
2015-12-28T15:58:54Z
CONTRIBUTOR
resolved
Using "server.crt" is confusing, as one may try to supply the server certificate when it is really the client certificate that belongs here. Instead, use "client.cert", "client.key" and "client.pem".
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2948/reactions" }
https://api.github.com/repos/psf/requests/issues/2948/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2948.diff", "html_url": "https://github.com/psf/requests/pull/2948", "merged_at": "2015-12-28T15:58:54Z", "patch_url": "https://github.com/psf/requests/pull/2948.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2948" }
true
[ "Good spot, thanks! :sparkles: :cake: :sparkles:\n" ]
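For reference alongside this doc rename: requests' `cert=` parameter accepts either a single path to a combined PEM or a `(cert, key)` tuple. A tiny helper sketch (file names follow the convention this PR adopts; the paths are placeholders and `cert_kwarg` itself is made up for illustration):

```python
def cert_kwarg(combined_pem=None, cert_file=None, key_file=None):
    """Build the value passed as ``cert=`` to requests.

    requests accepts either a single path to a PEM file containing both the
    client certificate and its private key, or a (cert, key) pair of paths.
    """
    if combined_pem is not None:
        return combined_pem
    return (cert_file, key_file)


# e.g. requests.get(url, cert=cert_kwarg(combined_pem="client.pem"))
# or   requests.get(url, cert=cert_kwarg(cert_file="client.cert",
#                                        key_file="client.key"))
```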
https://api.github.com/repos/psf/requests/issues/2947
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2947/labels{/name}
https://api.github.com/repos/psf/requests/issues/2947/comments
https://api.github.com/repos/psf/requests/issues/2947/events
https://github.com/psf/requests/issues/2947
124,023,695
MDU6SXNzdWUxMjQwMjM2OTU=
2,947
No protection against line breaks in header values
{ "avatar_url": "https://avatars.githubusercontent.com/u/555363?v=4", "events_url": "https://api.github.com/users/dsavenko/events{/privacy}", "followers_url": "https://api.github.com/users/dsavenko/followers", "following_url": "https://api.github.com/users/dsavenko/following{/other_user}", "gists_url": "https://api.github.com/users/dsavenko/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/dsavenko", "id": 555363, "login": "dsavenko", "node_id": "MDQ6VXNlcjU1NTM2Mw==", "organizations_url": "https://api.github.com/users/dsavenko/orgs", "received_events_url": "https://api.github.com/users/dsavenko/received_events", "repos_url": "https://api.github.com/users/dsavenko/repos", "site_admin": false, "starred_url": "https://api.github.com/users/dsavenko/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/dsavenko/subscriptions", "type": "User", "url": "https://api.github.com/users/dsavenko", "user_view_type": "public" }
[]
closed
true
null
[]
null
10
2015-12-28T06:17:25Z
2021-09-08T15:00:54Z
2016-09-06T00:07:58Z
NONE
resolved
Hi, The following call leads to 'another-header' appearing in the headers sent to the server: ``` >>> r = requests.get('https://httpbin.org/get', params={'q': 'query'}, headers={'my-header': 'a\r\nanother-header: b'}) ``` I expected either the line break to be escaped in some way, or an exception to be raised. A new header silently appearing in the request doesn't look right. Requests version is 2.9.1, installed with pip.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2947/reactions" }
https://api.github.com/repos/psf/requests/issues/2947/timeline
null
completed
null
null
false
[ "Ugh, yeah, good spot. This isn't really ideal: while what occurs here is kinda exactly what you'd expect if you have an understanding of the system, it provides a nice shiny footgun for developers who aren't expecting it, potentially allowing for header injection if they're passing user-provided data directly to a different service. We should probably remove it.\n\nIn general, I'm inclined to want to throw errors here, resisting the temptation to guess, but @sigmavirus24 or @kennethreitz may disagree and instead want us to simply strip line terminating characters from headers.\n", "Ugh this is tough. I want to raise exceptions here as well. I'm a bit torn as to when we should introduce those exceptions though. Let's look at it like this: I _think_ most people who are using requests will not be affected by us introducing a new exception around this problem because I _think_ most people aren't going to encounter this. That said, it's still a backwards incompatible change, and the above thinking is what has caused urllib3 to break requests a couple times this year and both of the projects to break other projects (e.g., OpenStack). (That is not to say that I think we're going to break OpenStack with this change, just that we've been fairly confident in the past year that changing X wouldn't hurt anyone and then it does.)\n", "Yeah, so we need to decide whether we can either a) break our normal release policy in order to throw exceptions here, or b) treat this as a strong impetus to move to 3.0.0 in the relatively short term.\n", "I think we can but we shouldn't (break our normal release policy).\n", "So @sigmavirus24, are you thinking that we should attempt to release 3.0.0 sometime in January including a fix for this issue?\n", "When I try the request as described above I get the following traceback. So the request does not succeed, so is this issue still valid? Or would we want to turn the `ValueError` into an exception thrown by the request library? 
I used requests 2.9.1. \n\n```\n File \"/Users/jonathan/.virtualenvs/test/lib/python2.7/site-packages/requests/adapters.py\", line 376, in send\n timeout=timeout\n File \"/Users/jonathan/.virtualenvs/test/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py\", line 559, in urlopen\n body=body, headers=headers)\n File \"/Users/jonathan/.virtualenvs/test/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py\", line 353, in _make_request\n conn.request(method, url, **httplib_request_kw)\n File \"/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 1053, in request\n self._send_request(method, url, body, headers)\n File \"/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 1092, in _send_request\n self.putheader(hdr, value)\n File \"/usr/local/Cellar/python/2.7.10/Frameworks/Python.framework/Versions/2.7/lib/python2.7/httplib.py\", line 1031, in putheader\n raise ValueError('Invalid header value %r' % (one_value,))\nValueError: Invalid header value 'a\\r\\nanother-header: b'\n```\n", "Because requests runs on a wide range of platforms, we should really be defensive against this possibility rather than relying on httplib to save us. However, the issue is not as urgent as it originally seems. =)\n", "Ok :). I looked through the exceptions that requests currently has. But it seems none of the exceptions match particularly well. I saw that there is\n\n```\nclass InvalidURL(RequestException, ValueError):\n \"\"\" The URL provided was somehow invalid. \"\"\"\n```\n\nSo perhaps making an exception called `InvalidHeader` would be a good idea? The reason I'm asking is because I thought that this issue might be a good start to contribute to Requests :)\n", "I couldn't find a pull request for this, so I threw an initial pass with PR #3366.\n", "Thanks for fixing this @nateprewitt!\n" ]
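The guard that PR #3366 eventually contributed boils down to refusing CR/LF in header values before the request is built. A stand-alone sketch of that check (simplified relative to whatever validation requests itself ships):

```python
def check_header_value(name, value):
    """Reject header values containing CR or LF, closing the injection
    shown in this issue ('a\\r\\nanother-header: b')."""
    bad = (b"\r", b"\n") if isinstance(value, bytes) else ("\r", "\n")
    if any(ch in value for ch in bad):
        raise ValueError("invalid header value for %r: %r" % (name, value))
    return value
```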
https://api.github.com/repos/psf/requests/issues/2946
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2946/labels{/name}
https://api.github.com/repos/psf/requests/issues/2946/comments
https://api.github.com/repos/psf/requests/issues/2946/events
https://github.com/psf/requests/issues/2946
123,902,291
MDU6SXNzdWUxMjM5MDIyOTE=
2,946
getting unexpected output
{ "avatar_url": "https://avatars.githubusercontent.com/u/3628354?v=4", "events_url": "https://api.github.com/users/L1ghtn1ng/events{/privacy}", "followers_url": "https://api.github.com/users/L1ghtn1ng/followers", "following_url": "https://api.github.com/users/L1ghtn1ng/following{/other_user}", "gists_url": "https://api.github.com/users/L1ghtn1ng/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/L1ghtn1ng", "id": 3628354, "login": "L1ghtn1ng", "node_id": "MDQ6VXNlcjM2MjgzNTQ=", "organizations_url": "https://api.github.com/users/L1ghtn1ng/orgs", "received_events_url": "https://api.github.com/users/L1ghtn1ng/received_events", "repos_url": "https://api.github.com/users/L1ghtn1ng/repos", "site_admin": false, "starred_url": "https://api.github.com/users/L1ghtn1ng/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/L1ghtn1ng/subscriptions", "type": "User", "url": "https://api.github.com/users/L1ghtn1ng", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-12-26T00:44:53Z
2021-09-08T20:00:51Z
2015-12-26T11:30:11Z
NONE
resolved
So I am creating a script to interact with an API for https://haveibeenpwned.com/API/v2 (link to the API docs) so when I run this from the example on the site in python3 this works as expected. ``` #!/usr/bin/python3 import requests import pprint headers = {'User-agent': 'Have I been pwn module'} API = 'https://haveibeenpwned.com/api/v2/breachedaccount/[email protected]' request = requests.get(API, headers=headers) res = request.json() pprint.pprint(res) ``` but when changing out the [email protected] email address to my personal email, I should get `404 Not found — the account could not be found and has therefore not been pwned` status code but get this instead ``` Traceback (most recent call last): File "/home/jay/.PyCharm50/config/scratches/scratch_4", line 11, in <module> res = request.json() File "/usr/lib/python3.5/site-packages/requests/models.py", line 808, in json return complexjson.loads(self.text, **kwargs) File "/usr/lib/python3.5/site-packages/simplejson/__init__.py", line 505, in loads return _default_decoder.decode(s) File "/usr/lib/python3.5/site-packages/simplejson/decoder.py", line 370, in decode obj, end = self.raw_decode(s) File "/usr/lib/python3.5/site-packages/simplejson/decoder.py", line 400, in raw_decode return self.scan_once(s, idx=_w(s, idx).end()) simplejson.scanner.JSONDecodeError: Expecting value: line 1 column 1 (char 0) Process finished with exit code 1 ``` Hope its not me being silly?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2946/reactions" }
https://api.github.com/repos/psf/requests/issues/2946/timeline
null
completed
null
null
false
[ "Can you let us know what your email address is? Do you feel comfortable with that?\n", "@Lukasa I would prefer not to but if you under the API var put your email address you should hit the same issue I am having as long as you are not in any of the breach lists\n", "@L1ghtn1ng So in both Python 3 and Python 2 I find that the response has no body if there have been no breaches. As a result the JSON decoders fail to decode correctly. You should check the status code (`res.status_code`) before using `r.json()`.\n" ]
https://api.github.com/repos/psf/requests/issues/2945
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2945/labels{/name}
https://api.github.com/repos/psf/requests/issues/2945/comments
https://api.github.com/repos/psf/requests/issues/2945/events
https://github.com/psf/requests/pull/2945
123,316,909
MDExOlB1bGxSZXF1ZXN0NTQyNzg1Mjg=
2,945
Docs: clarify the precedence of `auth=` over `netrc`
{ "avatar_url": "https://avatars.githubusercontent.com/u/235903?v=4", "events_url": "https://api.github.com/users/ibnIrshad/events{/privacy}", "followers_url": "https://api.github.com/users/ibnIrshad/followers", "following_url": "https://api.github.com/users/ibnIrshad/following{/other_user}", "gists_url": "https://api.github.com/users/ibnIrshad/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ibnIrshad", "id": 235903, "login": "ibnIrshad", "node_id": "MDQ6VXNlcjIzNTkwMw==", "organizations_url": "https://api.github.com/users/ibnIrshad/orgs", "received_events_url": "https://api.github.com/users/ibnIrshad/received_events", "repos_url": "https://api.github.com/users/ibnIrshad/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ibnIrshad/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ibnIrshad/subscriptions", "type": "User", "url": "https://api.github.com/users/ibnIrshad", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-21T16:57:08Z
2021-09-08T05:01:07Z
2015-12-21T18:49:52Z
NONE
resolved
This closes #2062 by clarifying in the docs which auth header takes precedence:

1. `auth=`
2. `.netrc`
3. `headers=`

This precedence order is already tested in test_requests.py, in the test_basicauth_with_netrc method. Perhaps we should add further tests for non-basic auth schemes.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2945/reactions" }
https://api.github.com/repos/psf/requests/issues/2945/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2945.diff", "html_url": "https://github.com/psf/requests/pull/2945", "merged_at": "2015-12-21T18:49:52Z", "patch_url": "https://github.com/psf/requests/pull/2945.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2945" }
true
[ "Thanks! :sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/2944
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2944/labels{/name}
https://api.github.com/repos/psf/requests/issues/2944/comments
https://api.github.com/repos/psf/requests/issues/2944/events
https://github.com/psf/requests/pull/2944
123,201,647
MDExOlB1bGxSZXF1ZXN0NTQyMTY2NjI=
2,944
Clarification of .netrc and auth= precedence
{ "avatar_url": "https://avatars.githubusercontent.com/u/235903?v=4", "events_url": "https://api.github.com/users/ibnIrshad/events{/privacy}", "followers_url": "https://api.github.com/users/ibnIrshad/followers", "following_url": "https://api.github.com/users/ibnIrshad/following{/other_user}", "gists_url": "https://api.github.com/users/ibnIrshad/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ibnIrshad", "id": 235903, "login": "ibnIrshad", "node_id": "MDQ6VXNlcjIzNTkwMw==", "organizations_url": "https://api.github.com/users/ibnIrshad/orgs", "received_events_url": "https://api.github.com/users/ibnIrshad/received_events", "repos_url": "https://api.github.com/users/ibnIrshad/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ibnIrshad/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ibnIrshad/subscriptions", "type": "User", "url": "https://api.github.com/users/ibnIrshad", "user_view_type": "public" }
[]
closed
true
null
[]
null
0
2015-12-21T03:00:40Z
2021-09-08T05:01:08Z
2015-12-21T03:01:15Z
NONE
resolved
This closes #2062 by clarifying which auth header takes precedence: 1. auth= 2. .netrc 3. headers=
{ "avatar_url": "https://avatars.githubusercontent.com/u/235903?v=4", "events_url": "https://api.github.com/users/ibnIrshad/events{/privacy}", "followers_url": "https://api.github.com/users/ibnIrshad/followers", "following_url": "https://api.github.com/users/ibnIrshad/following{/other_user}", "gists_url": "https://api.github.com/users/ibnIrshad/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ibnIrshad", "id": 235903, "login": "ibnIrshad", "node_id": "MDQ6VXNlcjIzNTkwMw==", "organizations_url": "https://api.github.com/users/ibnIrshad/orgs", "received_events_url": "https://api.github.com/users/ibnIrshad/received_events", "repos_url": "https://api.github.com/users/ibnIrshad/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ibnIrshad/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ibnIrshad/subscriptions", "type": "User", "url": "https://api.github.com/users/ibnIrshad", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2944/reactions" }
https://api.github.com/repos/psf/requests/issues/2944/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2944.diff", "html_url": "https://github.com/psf/requests/pull/2944", "merged_at": null, "patch_url": "https://github.com/psf/requests/pull/2944.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2944" }
true
[]
https://api.github.com/repos/psf/requests/issues/2943
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2943/labels{/name}
https://api.github.com/repos/psf/requests/issues/2943/comments
https://api.github.com/repos/psf/requests/issues/2943/events
https://github.com/psf/requests/issues/2943
123,200,811
MDU6SXNzdWUxMjMyMDA4MTE=
2,943
Add json parameter to put methods
{ "avatar_url": "https://avatars.githubusercontent.com/u/763608?v=4", "events_url": "https://api.github.com/users/ibuchanan/events{/privacy}", "followers_url": "https://api.github.com/users/ibuchanan/followers", "following_url": "https://api.github.com/users/ibuchanan/following{/other_user}", "gists_url": "https://api.github.com/users/ibuchanan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ibuchanan", "id": 763608, "login": "ibuchanan", "node_id": "MDQ6VXNlcjc2MzYwOA==", "organizations_url": "https://api.github.com/users/ibuchanan/orgs", "received_events_url": "https://api.github.com/users/ibuchanan/received_events", "repos_url": "https://api.github.com/users/ibuchanan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ibuchanan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ibuchanan/subscriptions", "type": "User", "url": "https://api.github.com/users/ibuchanan", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-12-21T02:50:21Z
2021-09-08T20:00:53Z
2015-12-21T16:57:50Z
NONE
resolved
I enjoy both `requests.post(json=payload)` and `session.post(json=payload)`. I would like to have the same for both `requests.put()` and `session.put()`. In a cursory examination of the code, it looks like a simple change. I would be happy to create a pull request for it. Is there a reason the convenience wasn't already propagated to PUTs?
{ "avatar_url": "https://avatars.githubusercontent.com/u/763608?v=4", "events_url": "https://api.github.com/users/ibuchanan/events{/privacy}", "followers_url": "https://api.github.com/users/ibuchanan/followers", "following_url": "https://api.github.com/users/ibuchanan/following{/other_user}", "gists_url": "https://api.github.com/users/ibuchanan/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/ibuchanan", "id": 763608, "login": "ibuchanan", "node_id": "MDQ6VXNlcjc2MzYwOA==", "organizations_url": "https://api.github.com/users/ibuchanan/orgs", "received_events_url": "https://api.github.com/users/ibuchanan/received_events", "repos_url": "https://api.github.com/users/ibuchanan/repos", "site_admin": false, "starred_url": "https://api.github.com/users/ibuchanan/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/ibuchanan/subscriptions", "type": "User", "url": "https://api.github.com/users/ibuchanan", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2943/reactions" }
https://api.github.com/repos/psf/requests/issues/2943/timeline
null
completed
null
null
false
[ "> Is there a reason the convenience wasn't already propagated to PUTs?\n\nIt was. =) All our keyword arguments work on _all_ verbs, which means `json=` works on PUT as well as POST. The only thing we didn't do was add it as an explicit keyword argument so that it appears in the documentation. Would you like to make that change?\n", "Sure enough! I had tried it but when my tests failed, I thought that was the cause. But, as is usually the case, it was my code. :)\n\nThe lack of an explicit parameter doesn't bother me, so I'll pass for now.\n\nThanks for the quick response!\n" ]
https://api.github.com/repos/psf/requests/issues/2942
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2942/labels{/name}
https://api.github.com/repos/psf/requests/issues/2942/comments
https://api.github.com/repos/psf/requests/issues/2942/events
https://github.com/psf/requests/issues/2942
123,200,656
MDU6SXNzdWUxMjMyMDA2NTY=
2,942
How to POST a very Big file with 'chunk' mode?
{ "avatar_url": "https://avatars.githubusercontent.com/u/4278114?v=4", "events_url": "https://api.github.com/users/jchluo/events{/privacy}", "followers_url": "https://api.github.com/users/jchluo/followers", "following_url": "https://api.github.com/users/jchluo/following{/other_user}", "gists_url": "https://api.github.com/users/jchluo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jchluo", "id": 4278114, "login": "jchluo", "node_id": "MDQ6VXNlcjQyNzgxMTQ=", "organizations_url": "https://api.github.com/users/jchluo/orgs", "received_events_url": "https://api.github.com/users/jchluo/received_events", "repos_url": "https://api.github.com/users/jchluo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jchluo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jchluo/subscriptions", "type": "User", "url": "https://api.github.com/users/jchluo", "user_view_type": "public" }
[]
closed
true
null
[]
null
3
2015-12-21T02:49:02Z
2021-09-08T20:00:53Z
2015-12-21T09:49:27Z
NONE
resolved
I want to Post to a very big file to the server, will requests use the "chunk" mode ? if >>> f = open("very big file") >>> requests.post(url, data=f) From the `PreparedRequest.prepare_body` method, file length can get from `util.super_len`, then the `'Content-Length'` will be a number, which make chunk mode False. Is these any way to use the chunk mode when post a big file ? Thank you!
{ "avatar_url": "https://avatars.githubusercontent.com/u/4278114?v=4", "events_url": "https://api.github.com/users/jchluo/events{/privacy}", "followers_url": "https://api.github.com/users/jchluo/followers", "following_url": "https://api.github.com/users/jchluo/following{/other_user}", "gists_url": "https://api.github.com/users/jchluo/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/jchluo", "id": 4278114, "login": "jchluo", "node_id": "MDQ6VXNlcjQyNzgxMTQ=", "organizations_url": "https://api.github.com/users/jchluo/orgs", "received_events_url": "https://api.github.com/users/jchluo/received_events", "repos_url": "https://api.github.com/users/jchluo/repos", "site_admin": false, "starred_url": "https://api.github.com/users/jchluo/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/jchluo/subscriptions", "type": "User", "url": "https://api.github.com/users/jchluo", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2942/reactions" }
https://api.github.com/repos/psf/requests/issues/2942/timeline
null
completed
null
null
false
[ "Requests will not chunk the file, but neither will it read it all into memory. Instead, it will \"stream\" it: that is, send it from the file directly to the network. Do you really specifically need chunked-transfer-encoding here? What does chunked transfer encoding get you that the streaming method does not?\n", "Thank you @Lukasa . My mistake, I thought requests has to read it all into memory before send to network.In fact requests just \"stream\" it, which is cool.Thank you!\n", "@jchluo in the future, please do **not** use the **issue** tracker for **questions**. Use [StackOverflow](https://stackoverflow.com/questions/tagged/python-requests).\n" ]
https://api.github.com/repos/psf/requests/issues/2941
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2941/labels{/name}
https://api.github.com/repos/psf/requests/issues/2941/comments
https://api.github.com/repos/psf/requests/issues/2941/events
https://github.com/psf/requests/issues/2941
123,151,294
MDU6SXNzdWUxMjMxNTEyOTQ=
2,941
COC?
{ "avatar_url": "https://avatars.githubusercontent.com/u/1840865?v=4", "events_url": "https://api.github.com/users/alison985/events{/privacy}", "followers_url": "https://api.github.com/users/alison985/followers", "following_url": "https://api.github.com/users/alison985/following{/other_user}", "gists_url": "https://api.github.com/users/alison985/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/alison985", "id": 1840865, "login": "alison985", "node_id": "MDQ6VXNlcjE4NDA4NjU=", "organizations_url": "https://api.github.com/users/alison985/orgs", "received_events_url": "https://api.github.com/users/alison985/received_events", "repos_url": "https://api.github.com/users/alison985/repos", "site_admin": false, "starred_url": "https://api.github.com/users/alison985/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/alison985/subscriptions", "type": "User", "url": "https://api.github.com/users/alison985", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-20T10:12:16Z
2021-09-08T20:00:54Z
2015-12-20T11:55:10Z
NONE
resolved
Hi, I'm just wondering what your code of conduct is for this project? Thanks, Alison
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 1, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 1, "url": "https://api.github.com/repos/psf/requests/issues/2941/reactions" }
https://api.github.com/repos/psf/requests/issues/2941/timeline
null
completed
null
null
false
[ "Hi @alison985!\n\nCurrently our relatively minor code of conduct is [defined here](http://docs.python-requests.org/en/latest/dev/contributing/). There is interest amongst some of the maintainers in moving to something based on the [Contributor Covenant](http://contributor-covenant.org/), given that both myself and @sigmavirus24 maintain personal projects that use it, but I'm unwilling to change that by fiat: I'd want consensus from the whole team.\n\nRegardless, I believe that the Contributor Covenant _implicitly_ applies to this project, and certainly that's the standard to which the maintainers hold themselves.\n" ]
https://api.github.com/repos/psf/requests/issues/2940
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2940/labels{/name}
https://api.github.com/repos/psf/requests/issues/2940/comments
https://api.github.com/repos/psf/requests/issues/2940/events
https://github.com/psf/requests/issues/2940
123,109,372
MDU6SXNzdWUxMjMxMDkzNzI=
2,940
Add response.closed attribute
{ "avatar_url": "https://avatars.githubusercontent.com/u/76945?v=4", "events_url": "https://api.github.com/users/vitorbaptista/events{/privacy}", "followers_url": "https://api.github.com/users/vitorbaptista/followers", "following_url": "https://api.github.com/users/vitorbaptista/following{/other_user}", "gists_url": "https://api.github.com/users/vitorbaptista/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vitorbaptista", "id": 76945, "login": "vitorbaptista", "node_id": "MDQ6VXNlcjc2OTQ1", "organizations_url": "https://api.github.com/users/vitorbaptista/orgs", "received_events_url": "https://api.github.com/users/vitorbaptista/received_events", "repos_url": "https://api.github.com/users/vitorbaptista/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vitorbaptista/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vitorbaptista/subscriptions", "type": "User", "url": "https://api.github.com/users/vitorbaptista", "user_view_type": "public" }
[]
closed
true
null
[]
null
4
2015-12-19T20:35:13Z
2021-09-08T20:00:54Z
2015-12-21T11:54:38Z
NONE
resolved
I'm able to explicitly close a response by calling `response.close()`, but as far as I could see, the only way to check whether a response is closed is to call `response.raw.closed`. This feels like a violation of the Law of Demeter, so I would suggest adding a `response.closed` attribute. If you agree, I'd be happy to send a pull request with this change.
{ "avatar_url": "https://avatars.githubusercontent.com/u/76945?v=4", "events_url": "https://api.github.com/users/vitorbaptista/events{/privacy}", "followers_url": "https://api.github.com/users/vitorbaptista/followers", "following_url": "https://api.github.com/users/vitorbaptista/following{/other_user}", "gists_url": "https://api.github.com/users/vitorbaptista/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/vitorbaptista", "id": 76945, "login": "vitorbaptista", "node_id": "MDQ6VXNlcjc2OTQ1", "organizations_url": "https://api.github.com/users/vitorbaptista/orgs", "received_events_url": "https://api.github.com/users/vitorbaptista/received_events", "repos_url": "https://api.github.com/users/vitorbaptista/repos", "site_admin": false, "starred_url": "https://api.github.com/users/vitorbaptista/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/vitorbaptista/subscriptions", "type": "User", "url": "https://api.github.com/users/vitorbaptista", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2940/reactions" }
https://api.github.com/repos/psf/requests/issues/2940/timeline
null
completed
null
null
false
[ "@vitorbaptista Out of interest, why do you care if the response object is closed?\n", "I have a class that internally uses requests, but not through contexts (I have to keep the response around for a while, so I can't use contexts). I want to write a test to ensure that the response is closed after the object is destroyed.\n\nRight now, I'm testing it using `response.raw.closed`, which is OK, but as `response.close()` already exists, it seems like `response.closed` should exist as well.\n", "So, the reason I asked is that `response.close()` doesn't do what you think it does. Because we pool our connections, the method doesn't necessarily close anything at all: it just returns the socket connection (that may or may not still be open) to the connection pool.\n\nIt should be totally safe to repeatedly call `response.close()`: the method should be idempotent. For that reason, a `closed()` property isn't really necessary: just call `close` instead of checking!\n", "hmmm, interesting... but still, I need to check in the tests that the response was closed. `response.raw.closed` isn't ideal, but works fine. I could also use mocking to check that `response.close()` was called, but using `response.raw.closed` is better IMHO.\n\nIn the end, I created this issue just because it was a bit surprising for `response` to have `close()` but not `closed`. It looked like an easy change. There're other easy ways to do what I wanted to do, though, so it doesn't add much.\n\nI'll close this issue for now. If you think this change is worthwhile, please reopen and I'd be happy to write a pull request solving it.\n\nThanks for your help :+1:\n" ]
https://api.github.com/repos/psf/requests/issues/2939
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2939/labels{/name}
https://api.github.com/repos/psf/requests/issues/2939/comments
https://api.github.com/repos/psf/requests/issues/2939/events
https://github.com/psf/requests/issues/2939
123,086,277
MDU6SXNzdWUxMjMwODYyNzc=
2,939
Multiple Location headers in response are concatenated
{ "avatar_url": "https://avatars.githubusercontent.com/u/1921370?v=4", "events_url": "https://api.github.com/users/L2501/events{/privacy}", "followers_url": "https://api.github.com/users/L2501/followers", "following_url": "https://api.github.com/users/L2501/following{/other_user}", "gists_url": "https://api.github.com/users/L2501/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/L2501", "id": 1921370, "login": "L2501", "node_id": "MDQ6VXNlcjE5MjEzNzA=", "organizations_url": "https://api.github.com/users/L2501/orgs", "received_events_url": "https://api.github.com/users/L2501/received_events", "repos_url": "https://api.github.com/users/L2501/repos", "site_admin": false, "starred_url": "https://api.github.com/users/L2501/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/L2501/subscriptions", "type": "User", "url": "https://api.github.com/users/L2501", "user_view_type": "public" }
[]
closed
true
null
[]
{ "closed_at": null, "closed_issues": 29, "created_at": "2013-11-17T11:29:34Z", "creator": { "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }, "description": null, "due_on": null, "html_url": "https://github.com/psf/requests/milestone/20", "id": 487518, "labels_url": "https://api.github.com/repos/psf/requests/milestones/20/labels", "node_id": "MDk6TWlsZXN0b25lNDg3NTE4", "number": 20, "open_issues": 12, "state": "open", "title": "3.0.0", "updated_at": "2024-05-19T18:43:00Z", "url": "https://api.github.com/repos/psf/requests/milestones/20" }
28
2015-12-19T13:50:05Z
2021-09-08T15:00:54Z
2016-09-06T00:07:12Z
NONE
resolved
Tested with requests 2.9: when an HTTP server responds with multiple Location headers in a 301 redirect, the locations are concatenated, e.g.

Location: http://foo.com
Location: http://foo.com

results in a redirect to "http://foo.com,http://foo.com". Pretty sure this behavior is not present in any other HTTP clients.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2939/reactions" }
https://api.github.com/repos/psf/requests/issues/2939/timeline
null
completed
null
null
false
[ "A web server MUST NOT respond with multiple location header fields. From [RFC 7320 Section 3.2.2](https://tools.ietf.org/html/rfc7230#section-3.2.2):\n\n> A sender MUST NOT generate multiple header fields with the same field name in a message unless either the entire field value for that header field is defined as a comma-separated list [i.e., #(values)] or the header field is a well-known exception (as noted below).\n\nThere is exactly _one_ well-known exception: `Set-Cookie`. `Location` is not `Set-Cookie`, so that doesn't apply. So it would only be acceptable to send multiple `Location` headers if the syntax allowed multiple comma-separated values. `Location` is defined in [RFC 7231 Section 7.1.2](https://tools.ietf.org/html/rfc7231#section-7.1.2):\n\n> Location = URI-reference\n> \n> The field value consists of a single URI-reference.\n\nThe server in this example is violating the HTTP specification. In the face of such violation, a user-agent is simply not required to take any specific action. In particular, while the example you provided above has a very simple course of action (both URLs are the same), if both URLs were _not_ the same requests would be unable to take any sensible action: the situation is too ambiguous to allow a correct decision.\n\nAt the level of requests actually catching this and treating it as an error condition is tricky, because requests only sees the header string `http://foo.com,http://foo.com`, which perversely does parse as a valid URL in the case of most Python libraries because in principle the comma is an allowed character in the host part of the URL. This puts us in a bit of a bind. However, `urllib3`'s `HTTPHeaderDict` structure could enforce this requirement and treat this case as an error on receiving it. That might be a sensible way to go, but it's us going out of our way to provide a slightly nicer error when a server does something it really really shouldn't do. 
@sigmavirus24, thoughts?\n", "I guess then the web server is doing this to break python-urllib clients as it does not seem to affect web browsers or curl\n", "@L2501 To be clear, we _could_ handle this case. I just don't think we should. The server is very wrong, and it is not our job to work around the failures of that server. In this instance there is a clear acceptable solution, but the general case is not solve able. We can handle this, and we're looking at how best to do that, but if you want to find someone to blame (and your tone suggests you do) then I recommend directing your blame at the website operator whose website is broken, rather than us.\n", "Well i just expected it to only count the first header and igonre the rest to prevent CRLF Injection attacks...\n", "Yes, @L2501. Please continue to act aggressively towards us for not having encountered such a poorly configured server. Please don't take any action to correct that server.\n\n---\n\n@Lukasa There was a proposed version of the Header Dict that ignored multiple headers for anything other Set-Cookie headers. We could fix this faster in requests by using `.getlist` on the original Header Dict from urllib3 and throwing an exception if it returns more than one value.\n", "@L2501 That doesn't prevent CRLF injection attacks, it just keeps functioning in the face of them. My proposed solution above would actually avoid _some portion_ CRLF injection attacks by throwing an exception when an invalid header block is received. Of course, we cannot _generally_ prevent CRLF injection attacks and also re-use TCP connections, so any client that _does_ re-use TCP connections is vulnerable to a well-crafted CRLF injection attack. Generally speaking, defending against CRLF injection attacks is the responsibility of the server operator, not the client.\n\n@sigmavirus24 The better long-term fix might be to promote urllib3's `HeaderDict` to use in requests. 
The risk with doing it your way is that we assume that the `raw` object exists and is a `urllib3` one. I suppose we could have a conditional path that _attempts_ to use urllib3's but falls back to using ours.\n\nWe could in fact do that, and then plan in 3.0.0 to promote urllib3's `HeaderDict` to use in requests which would allow us to remove the conditional code path in 3.0.0.\n", "> The risk with doing it your way is that we assume that the raw object exists and is a urllib3 one.\n\nAre you thinking about a potential AppEngineAdapter case where it returns an object that is nothing like urllib3 and further breaks urllib3's contract? We can fix this on the adapter level for requests instead of making requests handle that complexity through a whole lot of conditional cases.\n", "@sigmavirus24 No, I'm thinking of test libraries or anything else that returns a `Response` object from a `HTTPAdapter` subclass that is not actually backed by a urllib3 response object.\n", "@Lukasa I'm not as concerned about them. Any test library that hasn't been half-written and thrown up on PyPI will handle this just fine (requests-mock, betamax, vcrpy, all do just fine).\n", "@sigmavirus24 Be that as it may, in general our code assumes that [`.raw` may not come from urllib3](https://github.com/kennethreitz/requests/blob/fc8fa1aa265bb14d59c68eb68a179bce17953967/requests/models.py#L657-L667), and we should endeavour to maintain that if possible.\n", "So, that code was added in https://github.com/kennethreitz/requests/pull/1425 but there wasn't much justification provided at the time for having a conditional like that. 
The only other remotely similar piece of the library that I'm aware of is [our cookie handling](https://github.com/kennethreitz/requests/blob/f7cb7962417f6e106a8ead214fb556e63bb71014/requests/cookies.py#L116..L130) which just refuses to do anything if it isn't anything like a `urllib3` response.\n\nWe've held an illusion of urllib3 being an \"implementation\" detail but never supported anything other than that in reality. The likelihood that the raw response will not be a urllib3 object seems very close to 0 to me, but I'm open to being convinced otherwise.\n", "@sigmavirus24 The common situation is where the response object is `StringIO` or something similar. I think going out of our way to avoid breaking that is a good idea. If we were going straight to 3.0.0 I'd be fine with breaking it, but ideally we'd deal with this sooner.\n", "> The common situation is where the response object is StringIO or something similar.\n\nI understand we're hoping to catch the case where it's more file-like than `urllib3.HTTPResponse` like, but I'm not sure how common that case is, is what I'm saying.\n\nI also don't want to just remove support for this, but I'd like to see much better encapsulation of these shims honestly.\n", "Agreed, I don't think we've done well enough in the past, but sadly I also don't think that's an excuse for making the mistake here. We know it's a risk, we should account for it.\n\nThat said, it's also fairly easy to have a fallback.\n", "@sigmavirus24\n\n> There was a proposed version of the Header Dict that ignored multiple headers for anything other Set-Cookie headers. We could fix this faster in requests by using `.getlist` on the original Header Dict from urllib3 and throwing an exception if it returns more than one value.\n\nThis would break the list-like headers though? 
The ones defined as `#(values)`, which RFC 7230 [permits](https://tools.ietf.org/html/rfc7230#section-3.2.2) to split into multiple fields.\n", "@vfaronov That isn't being proposed _in general_, it's being proposed purely for the `Location` header.\n", "Supporting multiple location headers is /not/ valid practice, see [here](http://stackoverflow.com/questions/11309444/can-the-location-header-be-used-for-multiple-resource-locations-in-a-201-created), and should be rejected.\n\nHowever, supporting multiple headers in general is valid as per RFC2616, and requests should support that, see [here](http://stackoverflow.com/questions/4371328/are-duplicate-http-response-headers-acceptable).\n", "@foxx I appreciate that you were trying to be helpful, but please read the discussion on this issue before commenting. If you look at the first comment in this issue you'll see that it begins with:\n\n> A web server MUST NOT respond with multiple location header fields.\n\nThe proposed action in this thread is that requests should error on this condition, rather than doing what it currently does which is to attempt to parse the multiple `Location` fields as though they were one. Note also that that behaviour means that requests already _does_ support multiple headers.\n\nFinally, RFC 2616 is deprecated. RFC 7230 and RFC 7231 have replaced it. =)\n", "> please read the discussion on this issue before commenting.\n\nFwiw, I'd spent about 10 minutes reading through and researching before I posted, but it looks like I misread some of the further comments, apologies.\n\n> The proposed action in this thread is that requests should error on this condition\n\nGiven that the RFCs are pretty clear on the situation of multiple `Location` headers, e.g. MUST NOT, I would have thought the default would be to raise a parser error? \n\n> Finally, RFC 2616 is deprecated. RFC 7230 and RFC 7231 have replaced it. 
=)\n\nOh dang, my bad.\n", "Yeah, that would be ideal, but we don't own the parser: we use `httplib`. That's a source of ongoing frustration, but the upshot is that we have to do this validation higher up the stack, in requests itself, and the nature of the datastructures and relationships used make it somewhat tricky to do that.\n", "Actually just been thinking some more about this, and perhaps it makes more sense for requests to follow what the majority of other libraries and browsers are doing? Merging these multiple headers is certainly not the correct behaviour, but I'm not sure what the alternative should be.\n\nIn use cases where requests is being used as a scraper, you'd expect requests to behave in the same way as the browser, this is despite the RFCs treating multiple location headers with the same severity as a missing header colon.\n\nSome relevant details [here](https://bugzilla.mozilla.org/show_bug.cgi?id=655389), and [here](http://www.gossamer-threads.com/lists/nanog/users/175684).\n\nApparently chrome throws an error, as does Firefox.\n\n```\nError code: ERR_RESPONSE_HEADERS_MULTIPLE_LOCATION \n```\n\nIf the major browsers are throwing errors, then perhaps requests should too? \n\nThoughts?\n", "Just tried this myself in Chrome;\n\n```\n➜ ~ nc -l 8080\nGET / HTTP/1.1\nHost: 127.0.0.1:8080\nConnection: keep-alive\nAccept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8\nUpgrade-Insecure-Requests: 1\nUser-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.111 Safari/537.36\nAccept-Encoding: gzip, deflate, sdch\nAccept-Language: en-US,en;q=0.8\n\nHTTP/1.1 200 OK\nContent-Length: 0\nConnection: Closed\nLocation: http://example.com\nLocation: http://example.com/2\nLocation: http://example.com/3\n```\n\n![image](https://cloud.githubusercontent.com/assets/651797/12616007/d680ce00-c500-11e5-811a-7d6357d18cb1.png)\n", "That would be a 3.0 change but I'm amenable to that. 
@Lukasa, what about you?\n", "I think that's what was already proposed here. =)\n", "We should consider the possibility of raising this error only if we are following redirects (e.g. utilizing the `Location` header). If the `Location` header isn't going to trigger any automatic behavior in the client, then perhaps representing the (broken) multiple headers the way we do is okay, as it would allow the user to still access to resource, instead of making it impossible. \n", "Yup, the error would be raised in the `resolve_redirects` code?\n", "@kennethreitz Yup agreed\n", "This was fixed in 3.0 by #3417. I'm closing this as a result and tagging the correct milestone for it.\n" ]
https://api.github.com/repos/psf/requests/issues/2938
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2938/labels{/name}
https://api.github.com/repos/psf/requests/issues/2938/comments
https://api.github.com/repos/psf/requests/issues/2938/events
https://github.com/psf/requests/issues/2938
123,062,910
MDU6SXNzdWUxMjMwNjI5MTA=
2,938
Unicode decode error with PATCH request
{ "avatar_url": "https://avatars.githubusercontent.com/u/4306733?v=4", "events_url": "https://api.github.com/users/scossu/events{/privacy}", "followers_url": "https://api.github.com/users/scossu/followers", "following_url": "https://api.github.com/users/scossu/following{/other_user}", "gists_url": "https://api.github.com/users/scossu/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/scossu", "id": 4306733, "login": "scossu", "node_id": "MDQ6VXNlcjQzMDY3MzM=", "organizations_url": "https://api.github.com/users/scossu/orgs", "received_events_url": "https://api.github.com/users/scossu/received_events", "repos_url": "https://api.github.com/users/scossu/repos", "site_admin": false, "starred_url": "https://api.github.com/users/scossu/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/scossu/subscriptions", "type": "User", "url": "https://api.github.com/users/scossu", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-19T05:06:23Z
2021-09-08T20:00:54Z
2015-12-19T08:39:01Z
NONE
resolved
In requests 2.9.0: ``` >>> requests.patch('https://example.org', data=bytes('ô', 'utf-8')) Traceback (most recent call last): File "<stdin>", line 1, in <module> File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/api.py", line 133, in patch return request('patch', url, data=data, **kwargs) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/api.py", line 53, in request return session.request(method=method, url=url, **kwargs) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/sessions.py", line 454, in request prep = self.prepare_request(req) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/sessions.py", line 388, in prepare_request hooks=merge_hooks(request.hooks, self.hooks), File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/models.py", line 296, in prepare self.prepare_body(data, files, json) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/models.py", line 447, in prepare_body body = self._encode_params(data) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/models.py", line 84, in _encode_params return to_native_string(data) File "/usr/local/combine/virtualenv/lib/python3.4/site-packages/requests/utils.py", line 700, in to_native_string out = string.decode(encoding) UnicodeDecodeError: 'ascii' codec can't decode byte 0xc3 in position 0: ordinal not in range(128) ``` This seems to be a problem with all strings containing Unicode characters. It works without problems in 2.8.1.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2938/reactions" }
https://api.github.com/repos/psf/requests/issues/2938/timeline
null
completed
null
null
false
[ "This is a duplicate of #2933 and #2930, and is fixed in #2931. Thanks for the report, but in future please check both open and **closed** issues before filing a bug, as the issue may already be known. A new release of requests will be made on Monday that contains the fix for this issue.\n\nThanks! :sparkles: :cake: :sparkles:\n" ]
https://api.github.com/repos/psf/requests/issues/2937
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2937/labels{/name}
https://api.github.com/repos/psf/requests/issues/2937/comments
https://api.github.com/repos/psf/requests/issues/2937/events
https://github.com/psf/requests/pull/2937
122,917,318
MDExOlB1bGxSZXF1ZXN0NTQwODM1NDM=
2,937
Release 2.9.1
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[ { "color": "009800", "default": false, "description": null, "id": 44501218, "name": "Ready To Merge", "node_id": "MDU6TGFiZWw0NDUwMTIxOA==", "url": "https://api.github.com/repos/psf/requests/labels/Ready%20To%20Merge" } ]
closed
true
null
[]
null
16
2015-12-18T09:56:33Z
2021-09-08T05:01:07Z
2015-12-21T14:51:21Z
MEMBER
resolved
This release is scheduled for Monday. I'm aiming to set this up now so that it's an easy release to push on Monday. @shazow, what are the odds that we can get a urllib3 patch release 1.13.1 containing the fix for shazow/urllib3#761 by Monday? I'd owe you lots of :custard: if we could get it! :heart:
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2937/reactions" }
https://api.github.com/repos/psf/requests/issues/2937/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2937.diff", "html_url": "https://github.com/psf/requests/pull/2937", "merged_at": "2015-12-21T14:51:21Z", "patch_url": "https://github.com/psf/requests/pull/2937.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2937" }
true
[ "Note: please don't merge this until we're ready to release on Monday! We may land other patches between now and then, and we'll want to make sure the changelog is updated!\n\nFor those who are interested: the reason we're enforcing a delay between 2.9.0 and 2.9.1 is to try to land as many regression fixes as possible in a single release, rather than rapidly releasing several versions as issues become apparent to us. Now that requests is a widely used project it becomes important to be as reliable as possible in our release management to avoid making people's lives too hard.\n", "@Lukasa are you thinking about making an exception to [our release document](https://github.com/kennethreitz/requests/blob/master/docs/community/release-process.rst#hotfix-releases) in this case?\n", "@sigmavirus24 Yeah, good question, but I think I am, given that the urllib3 release would itself be a patch release. Any concerns there? @ralphbean @eriol?\n", "@Lukasa Could do. Could try to sneak in SOCKS support too, if you wanna.\n\nAlso I noticed some of your changelist includes an update note to the urllib3 version, some don't. Dunno if that's intentional.\n", "@shazow I'd rather leave SOCKS out for now: makes 1.13.1 very clearly a simple bugfix release.\n\nYeah, the missing changelist notes are in error, we should try to fix it.\n", "Sounds good, I'll push out v1.13.1 before monday (maybe today).\n", "@Lukasa yeah, if it's just a couple small bug-fixes, that's fine by me. I think we've been bitten in the past by urllib3 having a lot more than just bug fixes when we pulled it in.\n", ":+1: here. I just finished packaging 2.9.0. Bumping to 2.9.1/1.13.1 shouldn't be an issue. Thanks!\n", "@ralphbean Frankly I don't recommend releasing 2.9.0, it had enough nasty bugs that going straight to 2.9.1 is going to be the better approach.\n", ":+1: from me. 
I will skip 2.9.0 since 2.9.1 is going to be released soon, thanks!\n", "urllib3 v1.13.1 is live.\n", "> Frankly I don't recommend releasing 2.9.0, it had enough nasty bugs that going straight to 2.9.1 is going to be the better approach.\n\nAcknowledged. I attached a comment to let people know on the 1.9.0 [Fedora updates](https://bodhi.fedoraproject.org/updates/python-requests).\n", "\\o/ Thanks @shazow!\n", "@Lukasa is this still a WIP? I believe you wanted to cut this today\n", "Heh, was just waiting for you. No longer a WIP, should be good to go.\n", ":sparkles: :cake: :sparkles: \n\n:sparkler: :shipit: \n" ]
https://api.github.com/repos/psf/requests/issues/2936
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2936/labels{/name}
https://api.github.com/repos/psf/requests/issues/2936/comments
https://api.github.com/repos/psf/requests/issues/2936/events
https://github.com/psf/requests/pull/2936
122,911,202
MDExOlB1bGxSZXF1ZXN0NTQwODA2NjE=
2,936
Handle bytes and unicode URLs for netloc
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
[]
closed
true
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
[ { "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" } ]
null
1
2015-12-18T09:23:48Z
2021-09-08T05:01:08Z
2015-12-19T17:34:34Z
MEMBER
resolved
Spotted this on my local machine running tests on Python 3. Annoyingly, we didn't see this on the CI rig because the Jenkins box doesn't have a netrc file, so it never gets that far in the function. However, tests preparing bytes URLs fail here on Python 3 because the splitter is the wrong type. This change ensures that we use the appropriate type. Feels like this is safe for 2.9.1.
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2936/reactions" }
https://api.github.com/repos/psf/requests/issues/2936/timeline
null
null
false
{ "diff_url": "https://github.com/psf/requests/pull/2936.diff", "html_url": "https://github.com/psf/requests/pull/2936", "merged_at": "2015-12-19T17:34:34Z", "patch_url": "https://github.com/psf/requests/pull/2936.patch", "url": "https://api.github.com/repos/psf/requests/pulls/2936" }
true
[ "Woot! :+1: \n\nThanks @Lukasa (also YAY OUR CI IS WORKING AGAIN)\n" ]
https://api.github.com/repos/psf/requests/issues/2935
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2935/labels{/name}
https://api.github.com/repos/psf/requests/issues/2935/comments
https://api.github.com/repos/psf/requests/issues/2935/events
https://github.com/psf/requests/issues/2935
122,853,114
MDU6SXNzdWUxMjI4NTMxMTQ=
2,935
KeyError in connectionpool?
{ "avatar_url": "https://avatars.githubusercontent.com/u/56894?v=4", "events_url": "https://api.github.com/users/Kronuz/events{/privacy}", "followers_url": "https://api.github.com/users/Kronuz/followers", "following_url": "https://api.github.com/users/Kronuz/following{/other_user}", "gists_url": "https://api.github.com/users/Kronuz/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Kronuz", "id": 56894, "login": "Kronuz", "node_id": "MDQ6VXNlcjU2ODk0", "organizations_url": "https://api.github.com/users/Kronuz/orgs", "received_events_url": "https://api.github.com/users/Kronuz/received_events", "repos_url": "https://api.github.com/users/Kronuz/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Kronuz/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Kronuz/subscriptions", "type": "User", "url": "https://api.github.com/users/Kronuz", "user_view_type": "public" }
[ { "color": "e10c02", "default": false, "description": null, "id": 117744, "name": "Bug", "node_id": "MDU6TGFiZWwxMTc3NDQ=", "url": "https://api.github.com/repos/psf/requests/labels/Bug" }, { "color": "fef2c0", "default": false, "description": null, "id": 298537994, "name": "Needs More Information", "node_id": "MDU6TGFiZWwyOTg1Mzc5OTQ=", "url": "https://api.github.com/repos/psf/requests/labels/Needs%20More%20Information" } ]
closed
true
null
[]
null
44
2015-12-18T00:38:06Z
2021-09-08T19:00:30Z
2016-03-09T08:17:03Z
NONE
resolved
I'm getting a weird error during this call `conn = old_pool.get(block=False)` in connectionpool.py:410. The Error in Sentry is: `Queue in get, KeyError: (1, True)` And this is the traceback, which doesn't make sense, as `requests/packages/urllib3/connectionpool.py` catches that Empty exception; plus I have no idea what that KeyError is coming from ... any ideas?: ``` python Stacktrace (most recent call last): File "pac/business/views.py", line 49, in get_eticket response = requests.get(url, timeout=timeout, verify=False) File "requests/api.py", line 69, in get return request('get', url, params=params, **kwargs) File "requests/api.py", line 54, in request session.close() File "requests/sessions.py", line 649, in close v.close() File "requests/adapters.py", line 264, in close self.poolmanager.clear() File "requests/packages/urllib3/poolmanager.py", line 99, in clear self.pools.clear() File "requests/packages/urllib3/_collections.py", line 93, in clear self.dispose_func(value) File "requests/packages/urllib3/poolmanager.py", line 65, in <lambda> dispose_func=lambda p: p.close()) File "requests/packages/urllib3/connectionpool.py", line 410, in close conn = old_pool.get(block=False) File "python2.7/Queue.py", line 165, in get raise Empty ```
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2935/reactions" }
https://api.github.com/repos/psf/requests/issues/2935/timeline
null
completed
null
null
false
[ "@Kronuz What version of requests are you using, please?\n", "Sorry, forgot to say. I'm using v2.8.1, from PyPI.\n", "Are you running Python 3?\n", "Nope, I'm using Python 2.7.11\n", "Ok, so I'm inclined to believe that the traceback you're showing me cannot be right. It just doesn't seem to make any sense at all. Are you able to see this live?\n", "This is the Sentry link:\nhttps://sentry.dubalu.com/pac/finkok/group/7500/\n", "Well, that traceback looks impossible. Short of some insane monkeypatching, I'm not sure how this TB could be right.\n", "Does sentry always elide the full path of things as well? That's just frustrating. The path to an installed version of requests is _not_ the same as the path to Queue.py.\n", "@Kronuz can you give us an idea of what other packages are installed on your system? Something else may be monkey patching the stdlib and causing this issue.\n", "It's Django and many other things. Celery, REST Framework... But searching around there doesn't seem to be any monkey patching of the standard libs. But it's true it seems \"impossible\"\n", "@Kronuz things like eventlet and gevent may be used in one of your dependencies and they tend to monkey-patch Queue. This is why we asked for your dependencies. Many packages do things without telling the end-user.\n\nFurther, it's absolutely bizarre that you're getting a `KeyError` when the error raised is `Queue.Empty`.\n", "Yes, it is really weird. And it has happened in our development boxes from time to time as well as in production. I'm not using gevent or eventlet, but I'll try to dig further to see if there's anything else that could be patching Queue.\n", "I got the same traceback, using python 2.7.5. 
so I changed into urllib2 to avoid it.\n\n```\n File \"Bee.git/src/agent/custom/article_parser.py\", line 23, in get_html\n r = requests.get(url, timeout=10)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 67, in get\n return request('get', url, params=params, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/api.py\", line 53, in request\n return session.request(method=method, url=url, **kwargs)\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 350, in __exit__\n self.close()\n File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 649, in close\n v.close()\n File \"/usr/lib/python2.7/site-packages/requests/adapters.py\", line 270, in close\n self.poolmanager.clear()\n File \"/usr/lib/python2.7/site-packages/requests/packages/urllib3/poolmanager.py\", line 100, in clear\n self.pools.clear()\n File \"/usr/lib/python2.7/site-packages/requests/packages/urllib3/_collections.py\", line 94, in clear\n self.dispose_func(value)\n File \"/usr/lib/python2.7/site-packages/requests/packages/urllib3/poolmanager.py\", line 66, in <lambda>\n dispose_func=lambda p: p.close())\n File \"/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py\", line 410, in close\n conn = old_pool.get(block=False)\n File \"/usr/lib64/python2.7/Queue.py\", line 165, in get\n raise Empty\n```\n", "This traceback remains impossible. Can anyone get a regular reproduction scenario?\n", "if i run the function alone, it works good.\n\nI change the \"close\" function in connectionpool.py. 
it seems this code can not catch Empty.\n\n``` python\n def close(self):\n \"\"\"\n Close all pooled connections and disable the pool.\n \"\"\"\n # Disable access to the pool\n old_pool, self.pool = self.pool, None\n\n try:\n while True:\n print '-'*10\n conn = old_pool.get(block=False)\n print [conn]\n if conn:\n print conn.__dict__\n conn.close()\n\n except Empty:\n print 'queue empty!!!!!!!!'\n import traceback\n print traceback.format_exc()\n pass # Done.\n```\n\nget follow message\n\n```\n----------\n[<requests.packages.urllib3.connection.HTTPConnection object at 0x113e390>]\n{'_HTTPConnection__state': 'Idle', '_buffer': [], '_tunnel_headers': {}, '_tunnel_host': None, 'sock': <socket._socketobject object at 0x10e8ec0>, 'port': 80, 'strict': True, 'host': 'news.bitauto.com', '_tunnel_port': None, 'timeout': 10, 'socket_options': [(6, 1, 1)], 'source_address': None, '_HTTPConnection__response': <httplib.HTTPResponse instance at 0x1143200>, '_method': 'GET'}\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\nTraceback (most recent call last):\n......\n```\n\nthis line didn`t run print 'queue empty!!!!!!!!'\n\nafter i change \"except Empty\" to \"except Exception\", the problem does not show ,\n\n```\n----------\n[<requests.packages.urllib3.connection.HTTPConnection object at 0x7f8e444fc390>]\n{'_HTTPConnection__state': 'Idle', '_buffer': [], '_tunnel_headers': {}, '_tunnel_host': None, 'sock': <socket._socketobject object at 0x7f8e444a6e50>, 'port': 80, 'strict': True, 'host': 'news.bitauto.com', '_tunnel_port': None, 'timeout': 10, 'socket_options': [(6, 1, 1)], 'source_address': None, '_HTTPConnection__response': <httplib.HTTPResponse instance at 0x7f8e44501290>, '_method': 
'GET'}\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\n[None]\n----------\nqueue empty!!!!!!!!\nTraceback (most recent call last):\n File \"/usr/lib/python2.7/site-packages/requests/packages/urllib3/connectionpool.py\", line 411, in close\n conn = old_pool.get(block=False)\n File \"/usr/lib64/python2.7/Queue.py\", line 165, in get\n raise Empty\nEmpty\n```\n\nI don`t know why...\n", "@fqroot Are you using gevent or eventlet? Or anything that monkeypatches the `Queue` module?\n", "I have removed all the code about gevent.\n", "@fqroot So, to be clear, you are _not_ using gevent? Because if anything monkeypatches the `Queue` module to change its idea of what `Empty` is then this code is at risk of failing.\n", "@fqroot, I haven't had the chance to try it myself yet, but why don't you catch `Exception as exc` And then inspect `exc` to see what the type of it is and where it's source code has been defined.\n", "I have also stumbled upon problems sort of like this one when imports fail at a certain time or during circular imports, which then leave closures with variables pointing to half constructed modules; such objects in turn generally become `None` and some times become a whole different type object from there on.\n\nThis could be the case of `Empty` in the `exc` variable in your `except Exception as exc`. Ceck where the `exc` object type's code was defined (to see if it was monkey patched) and compare the `exc` type's `id()` against `Empty` to see if it's any different, or even `None` and please comment about your findings\n", "@Kronuz OK, I will debug it tommrow. 
maybe there is a relationship with the module \"importlib\", I think\n", "@Kronuz \n\n``` python\n def close(self):\n \"\"\"\n Close all pooled connections and disable the pool.\n \"\"\"\n # Disable access to the pool\n old_pool, self.pool = self.pool, None\n\n try:\n while True:\n conn = old_pool.get(block=False)\n if conn:\n conn.close()\n\n # except Empty:\n except Exception as exc:\n print 'a', type(exc)\n print 'a', type(Empty) \n print 'a', id(exc)\n print 'a', id(Empty)\n pass # Done.\n```\n\noutput\n\n```\na <class 'Queue.Empty'>\na <class 'Queue.Empty'> ### it should be <type 'type'>\na 140610585058592\na 140610585058592\n```\n\nand I have found where the err code lead to my problem. \n", "@fqroot, where? ...Wait, what? how?\n\nIt's really weird... why doesn't the `except Empty` catches it then?\n", "@Kronuz I noticed that there is an error code in other part of my project\nsimple like\n\n``` python\ndef handle():\n try:\n pass # some code\n except Exception, Queue.Empty: # may drink too much :)\n pass\n\n```\n\nthe handle function was created by module importlib\nso Queue.Empty is not `<type \"type\">` any longer and become a instance of Queue.Empty\nThis explanation is my guess. \n", "@fqroot, I still don't get it :/\n\nCould you please elaborate a bit more? I'm getting `KeyError` instead of `Empty` (or so the log in Sentry says)... What where you getting there?\n", "@Kronuz My situation may not suit you.\nwhat type exception you get in that close function?\n", "@fqroot, I'm getting `KeyError((1, True))` ...for some reason\n", "@Kronuz I don't know how to help you...\n", "To anyone who happens upon this thread in the future, a few things:\n1. @Kronuz and @fqroot are having similar (on the face) but very different problems\n2. 
@fqroot is using Python 2 in which you could do\n \n ``` py\n try:\n # ...\n except ExceptionClass, exc:\n # Use exc as the instance of ExceptionClass\n ```\n \n when they meant to use\n \n ``` py\n try:\n # ...\n except (ExceptionClass1, ExceptionClass2, ExceptionClass3):\n # Catch one of those exception classes but do not store it as a local\n ```\n \n this is why `Queue.Empty` was no longer of type `type` in @fqroot's case and why he was seeing the `Queue.Empty` exception as being unhandled by requests/urllib3.\n3. @Kronuz is getting an entirely different exception from code that should only ever raise a `Queue.Empty` exception. At the time of this writing, it is still unclear how this is either happening or even possible.\n", "I still keep getting this error (the original one I posted) ...and I honestly have no idea why or how to figure it out.\n\nWhat are your best guesses?\n" ]
https://api.github.com/repos/psf/requests/issues/2934
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2934/labels{/name}
https://api.github.com/repos/psf/requests/issues/2934/comments
https://api.github.com/repos/psf/requests/issues/2934/events
https://github.com/psf/requests/issues/2934
122,744,855
MDU6SXNzdWUxMjI3NDQ4NTU=
2,934
DigestAuth doesn't re-auth with provided WWW-Authenticate
{ "avatar_url": "https://avatars.githubusercontent.com/u/39889?v=4", "events_url": "https://api.github.com/users/yarikoptic/events{/privacy}", "followers_url": "https://api.github.com/users/yarikoptic/followers", "following_url": "https://api.github.com/users/yarikoptic/following{/other_user}", "gists_url": "https://api.github.com/users/yarikoptic/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/yarikoptic", "id": 39889, "login": "yarikoptic", "node_id": "MDQ6VXNlcjM5ODg5", "organizations_url": "https://api.github.com/users/yarikoptic/orgs", "received_events_url": "https://api.github.com/users/yarikoptic/received_events", "repos_url": "https://api.github.com/users/yarikoptic/repos", "site_admin": false, "starred_url": "https://api.github.com/users/yarikoptic/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/yarikoptic/subscriptions", "type": "User", "url": "https://api.github.com/users/yarikoptic", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-12-17T14:29:27Z
2021-09-08T20:00:55Z
2015-12-17T16:02:04Z
NONE
resolved
I am trying to use DigestAuth but it seems that requests doesn't try to complete the negotiation: ``` In [19]: url = 'http://crawdad.org//download/cambridge/haggle/imote-trace1.tar.gz' *In [20]: r = requests.get(url, auth=requests.auth.HTTPDigestAuth(user, passw)) INFO:requests.packages.urllib3.connectionpool:Starting new HTTP connection (1): crawdad.org send: 'GET //download/cambridge/haggle/imote-trace1.tar.gz HTTP/1.1\r\nHost: crawdad.org\r\nConnection: keep-alive\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nUser-Agent: python-requests/2.8.1\r\n\r\n' reply: 'HTTP/1.1 401 Unauthorized\r\n' header: Date: Thu, 17 Dec 2015 14:20:14 GMT header: Server: Apache/2.4.10 (Fedora) header: WWW-Authenticate: Basic realm="Dartmouth wireless-network traces" header: Content-Length: 88 header: Connection: close header: Content-Type: text/html; charset=iso-8859-1 DEBUG:requests.packages.urllib3.connectionpool:"GET //download/cambridge/haggle/imote-trace1.tar.gz HTTP/1.1" 401 88 In [21]: r.status_code Out[21]: 401 In [22]: requests.__version__ Out[22]: '2.8.1' ``` note how it goes with wget (credentials are within ~/.netrc, I have masked out ETag and Authorization fields in the trace below): ``` $> wget -S --debug http://crawdad.org//download/cambridge/haggle/imote-trace1.tar.gz DEBUG output created by Wget 1.16.3 on linux-gnu. URI encoding = ‘UTF-8’ --2015-12-17 09:26:07-- http://crawdad.org//download/cambridge/haggle/imote-trace1.tar.gz Host ‘crawdad.org’ has not issued a general basic challenge. Resolving crawdad.org (crawdad.org)... 129.170.213.101 Caching crawdad.org => 129.170.213.101 Connecting to crawdad.org (crawdad.org)|129.170.213.101|:80... connected. Created socket 4. Releasing 0x00005578fbb08ee0 (new refcount 1). 
---request begin--- GET //download/cambridge/haggle/imote-trace1.tar.gz HTTP/1.1 User-Agent: Wget/1.16.3 (linux-gnu) Accept: */* Accept-Encoding: identity Host: crawdad.org Connection: Keep-Alive ---request end--- HTTP request sent, awaiting response... ---response begin--- HTTP/1.1 401 Unauthorized Date: Thu, 17 Dec 2015 14:26:07 GMT Server: Apache/2.4.10 (Fedora) WWW-Authenticate: Basic realm="Dartmouth wireless-network traces" Content-Length: 88 Connection: close Content-Type: text/html; charset=iso-8859-1 ---response end--- HTTP/1.1 401 Unauthorized Date: Thu, 17 Dec 2015 14:26:07 GMT Server: Apache/2.4.10 (Fedora) WWW-Authenticate: Basic realm="Dartmouth wireless-network traces" Content-Length: 88 Connection: close Content-Type: text/html; charset=iso-8859-1 Closed fd 4 Auth scheme found 'Basic' Auth param list ' realm="Dartmouth wireless-network traces"' Auth param realm=Dartmouth wireless-network traces Authentication selected: Basic realm="Dartmouth wireless-network traces" Inserted ‘crawdad.org’ into basic_authed_hosts Found crawdad.org in host_name_addresses_map (0x5578fbb08ee0) Connecting to crawdad.org (crawdad.org)|129.170.213.101|:80... connected. Created socket 4. Releasing 0x00005578fbb08ee0 (new refcount 1). ---request begin--- GET //download/cambridge/haggle/imote-trace1.tar.gz HTTP/1.1 User-Agent: Wget/1.16.3 (linux-gnu) Accept: */* Accept-Encoding: identity Host: crawdad.org Connection: Keep-Alive Authorization: Basic XXXXXXXXXXXXXXXXXX== ---request end--- HTTP request sent, awaiting response... 
---response begin--- HTTP/1.1 200 OK Date: Thu, 17 Dec 2015 14:26:08 GMT Server: Apache/2.4.10 (Fedora) Last-Modified: Tue, 17 Oct 2006 21:49:43 GMT ETag: "XXXX-XXXXXXXXXXXXXX" Accept-Ranges: bytes Content-Length: 29345 Connection: close Content-Type: application/x-gzip ---response end--- HTTP/1.1 200 OK Date: Thu, 17 Dec 2015 14:26:08 GMT Server: Apache/2.4.10 (Fedora) Last-Modified: Tue, 17 Oct 2006 21:49:43 GMT ETag: "XXXX-XXXXXXXXXXXXXX" Accept-Ranges: bytes Content-Length: 29345 Connection: close Content-Type: application/x-gzip Length: 29345 (29K) [application/x-gzip] Saving to: ‘imote-trace1.tar.gz.4’ imote-trace1.tar.gz.4 100%[==========================================================>] 28.66K --.-KB/s in 0.002s Closed fd 4 2015-12-17 09:26:08 (13.3 MB/s) - ‘imote-trace1.tar.gz.4’ saved [29345/29345] ``` P.S. if you are to decide to access that file, http://crawdad.org/joinup.html is the registration page
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2934/reactions" }
https://api.github.com/repos/psf/requests/issues/2934/timeline
null
completed
null
null
false
[ "The reason you're not authenticating is because the remote service isn't doing Digest auth, it's doing basic. Try `r = requests.get(url, auth=(user, passw))`.\n", "Thank you @Lukasa . I guess something else went wrong for me originally (when I thought I did use basic auth), and I did misread wget output (which first \"senses\" which authentication mechanism to use). Sorry about the noise\n" ]
https://api.github.com/repos/psf/requests/issues/2933
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2933/labels{/name}
https://api.github.com/repos/psf/requests/issues/2933/comments
https://api.github.com/repos/psf/requests/issues/2933/events
https://github.com/psf/requests/issues/2933
122,599,228
MDU6SXNzdWUxMjI1OTkyMjg=
2,933
Problem with bytes body in request 2.9 in Py 3.5
{ "avatar_url": "https://avatars.githubusercontent.com/u/1050156?v=4", "events_url": "https://api.github.com/users/lmazuel/events{/privacy}", "followers_url": "https://api.github.com/users/lmazuel/followers", "following_url": "https://api.github.com/users/lmazuel/following{/other_user}", "gists_url": "https://api.github.com/users/lmazuel/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/lmazuel", "id": 1050156, "login": "lmazuel", "node_id": "MDQ6VXNlcjEwNTAxNTY=", "organizations_url": "https://api.github.com/users/lmazuel/orgs", "received_events_url": "https://api.github.com/users/lmazuel/received_events", "repos_url": "https://api.github.com/users/lmazuel/repos", "site_admin": false, "starred_url": "https://api.github.com/users/lmazuel/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/lmazuel/subscriptions", "type": "User", "url": "https://api.github.com/users/lmazuel", "user_view_type": "public" }
[]
closed
true
null
[]
null
2
2015-12-16T20:55:44Z
2021-09-08T20:00:55Z
2015-12-16T21:19:07Z
NONE
resolved
Hello, Our library suddenly stops to works in Python 3 for our users after `requests` 2.9 release, for instance: https://github.com/Azure/azure-storage-python/issues/89 I investigated the stack trace, and I found that when the body of the requests contains "real" bytes ("real" stands for OSImage, PDF,... not something readable), the code breaks. Especially, `requests` 2.9 in `requests.models#_encode_params` calls a new method `requests.utils#to_native_string` which tries to convert my bytes data using an ASCII encoder which is not possible if data are "real" bytes. This is the interesting part of the stacktrace extracting from the previously mentioned bug ``` self.response = self.session.request(self.method, self.uri, data=request_body, headers=self.headers, timeout=self.timeout) File "/usr/local/lib/python3.5/site-packages/requests/sessions.py", line 454, in request prep = self.prepare_request(req) File "/usr/local/lib/python3.5/site-packages/requests/sessions.py", line 388, in prepare_request hooks=merge_hooks(request.hooks, self.hooks), File "/usr/local/lib/python3.5/site-packages/requests/models.py", line 296, in prepare self.prepare_body(data, files, json) File "/usr/local/lib/python3.5/site-packages/requests/models.py", line 447, in prepare_body body = self._encode_params(data) File "/usr/local/lib/python3.5/site-packages/requests/models.py", line 84, in _encode_params return to_native_string(data) File "/usr/local/lib/python3.5/site-packages/requests/utils.py", line 700, in to_native_string out = string.decode(encoding) UnicodeDecodeError: 'ascii' codec can't decode byte 0xd0 in position 0: ordinal not in range(128) ``` What do you think? Tell me if I can help for anything. Thank you,
{ "avatar_url": "https://avatars.githubusercontent.com/u/240830?v=4", "events_url": "https://api.github.com/users/sigmavirus24/events{/privacy}", "followers_url": "https://api.github.com/users/sigmavirus24/followers", "following_url": "https://api.github.com/users/sigmavirus24/following{/other_user}", "gists_url": "https://api.github.com/users/sigmavirus24/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/sigmavirus24", "id": 240830, "login": "sigmavirus24", "node_id": "MDQ6VXNlcjI0MDgzMA==", "organizations_url": "https://api.github.com/users/sigmavirus24/orgs", "received_events_url": "https://api.github.com/users/sigmavirus24/received_events", "repos_url": "https://api.github.com/users/sigmavirus24/repos", "site_admin": false, "starred_url": "https://api.github.com/users/sigmavirus24/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/sigmavirus24/subscriptions", "type": "User", "url": "https://api.github.com/users/sigmavirus24", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2933/reactions" }
https://api.github.com/repos/psf/requests/issues/2933/timeline
null
completed
null
null
false
[ "This is already fixed in #2931. In the future, please check open _and_ closed issues before filing new issues.\n", "Sorry, yes I forgot to check closed issues.\nThank you for the report.\n" ]
https://api.github.com/repos/psf/requests/issues/2932
https://api.github.com/repos/psf/requests
https://api.github.com/repos/psf/requests/issues/2932/labels{/name}
https://api.github.com/repos/psf/requests/issues/2932/comments
https://api.github.com/repos/psf/requests/issues/2932/events
https://github.com/psf/requests/issues/2932
122,560,251
MDU6SXNzdWUxMjI1NjAyNTE=
2,932
PyPI doesn't know about python 3 support
{ "avatar_url": "https://avatars.githubusercontent.com/u/528022?v=4", "events_url": "https://api.github.com/users/mrterry/events{/privacy}", "followers_url": "https://api.github.com/users/mrterry/followers", "following_url": "https://api.github.com/users/mrterry/following{/other_user}", "gists_url": "https://api.github.com/users/mrterry/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/mrterry", "id": 528022, "login": "mrterry", "node_id": "MDQ6VXNlcjUyODAyMg==", "organizations_url": "https://api.github.com/users/mrterry/orgs", "received_events_url": "https://api.github.com/users/mrterry/received_events", "repos_url": "https://api.github.com/users/mrterry/repos", "site_admin": false, "starred_url": "https://api.github.com/users/mrterry/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/mrterry/subscriptions", "type": "User", "url": "https://api.github.com/users/mrterry", "user_view_type": "public" }
[]
closed
true
null
[]
null
1
2015-12-16T17:26:32Z
2021-09-08T20:00:56Z
2015-12-16T17:34:29Z
NONE
resolved
PyPI doesn't seem to know that requests supports Python 3 (3.3, 3.4, & 3.5!). As a result, [caniusepython3](https://caniusepython3.com/) thinks `requests` is a Python 3 blocker and `requests` is in red on the [Python 3 Wall of Superpowers](https://python3wos.appspot.com/). Requests is awesome and it should get the credit it deserves.
{ "avatar_url": "https://avatars.githubusercontent.com/u/1382556?v=4", "events_url": "https://api.github.com/users/Lukasa/events{/privacy}", "followers_url": "https://api.github.com/users/Lukasa/followers", "following_url": "https://api.github.com/users/Lukasa/following{/other_user}", "gists_url": "https://api.github.com/users/Lukasa/gists{/gist_id}", "gravatar_id": "", "html_url": "https://github.com/Lukasa", "id": 1382556, "login": "Lukasa", "node_id": "MDQ6VXNlcjEzODI1NTY=", "organizations_url": "https://api.github.com/users/Lukasa/orgs", "received_events_url": "https://api.github.com/users/Lukasa/received_events", "repos_url": "https://api.github.com/users/Lukasa/repos", "site_admin": false, "starred_url": "https://api.github.com/users/Lukasa/starred{/owner}{/repo}", "subscriptions_url": "https://api.github.com/users/Lukasa/subscriptions", "type": "User", "url": "https://api.github.com/users/Lukasa", "user_view_type": "public" }
{ "+1": 0, "-1": 0, "confused": 0, "eyes": 0, "heart": 0, "hooray": 0, "laugh": 0, "rocket": 0, "total_count": 0, "url": "https://api.github.com/repos/psf/requests/issues/2932/reactions" }
https://api.github.com/repos/psf/requests/issues/2932/timeline
null
completed
null
null
false
[ "Annoyingly, our trove classifiers appear to have disappeared again. I've re-uploaded them. Thanks!\n" ]