pythondev
help
this assumes you are using Python 3.6+. If you are below 3.6, replace the f-string with the correct string formatting for your version
2019-05-13T17:16:42.006300
Clemmie
pythondev_help_Clemmie_2019-05-13T17:16:42.006300
1,557,767,802.0063
23,521
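Clemmie's point about the 3.6+ requirement can be sketched as follows; the names `name` and `count` are hypothetical placeholders, not from the thread:

```python
# f-strings need Python 3.6+; str.format works on older versions too.
name, count = "widget", 3
modern = f"{name}: {count}"            # Python 3.6+ only
legacy = "{}: {}".format(name, count)  # equivalent on older Pythons
assert modern == legacy
print(modern)
```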
pythondev
help
oh, that's completely different from my 2nd answer :stuck_out_tongue:
2019-05-13T17:17:34.007200
Walton
pythondev_help_Walton_2019-05-13T17:17:34.007200
1,557,767,854.0072
23,522
pythondev
help
there are also superfluous parens in mine, one sec
2019-05-13T17:17:59.007500
Clemmie
pythondev_help_Clemmie_2019-05-13T17:17:59.007500
1,557,767,879.0075
23,523
pythondev
help
None
2019-05-13T17:17:59.007600
Walton
pythondev_help_Walton_2019-05-13T17:17:59.007600
1,557,767,879.0076
23,524
pythondev
help
yours makes much more sense
2019-05-13T17:18:34.008100
Walton
pythondev_help_Walton_2019-05-13T17:18:34.008100
1,557,767,914.0081
23,525
pythondev
help
it might take some tweaking - I copy/pasted and didn’t read it too closely, but you get the gist I think. gotta run - good luck!
2019-05-13T17:19:03.008700
Clemmie
pythondev_help_Clemmie_2019-05-13T17:19:03.008700
1,557,767,943.0087
23,526
pythondev
help
Thanks so much!
2019-05-13T17:19:22.008900
Walton
pythondev_help_Walton_2019-05-13T17:19:22.008900
1,557,767,962.0089
23,527
pythondev
help
hey all. I'm stumbling my way through writing a script to alert me when an item comes back in stock, using beautifulsoup. I'm stuck on a problem diving down into elements in the page. If I print the output of: ```results = soup.find_all("div", class_="inventory_holder")``` I get the div element that contains the span element I'm after. But that span element isn't in the output.
2019-05-13T18:35:30.012800
Ashely
pythondev_help_Ashely_2019-05-13T18:35:30.012800
1,557,772,530.0128
23,528
pythondev
help
For reference, here's the code as I see it in the browser:
2019-05-13T18:37:04.013400
Ashely
pythondev_help_Ashely_2019-05-13T18:37:04.013400
1,557,772,624.0134
23,529
pythondev
help
So I'm trying to get my hands on that span element called 'cta_price'.
2019-05-13T18:37:34.014100
Ashely
pythondev_help_Ashely_2019-05-13T18:37:34.014100
1,557,772,654.0141
23,530
pythondev
help
Any suggestions as to why bs4 isn't pulling that from the div?
2019-05-13T18:38:04.014600
Ashely
pythondev_help_Ashely_2019-05-13T18:38:04.014600
1,557,772,684.0146
23,531
pythondev
help
(edited code above, searching divs not spans)
2019-05-13T18:46:22.015400
Ashely
pythondev_help_Ashely_2019-05-13T18:46:22.015400
1,557,773,182.0154
23,532
pythondev
help
I wonder if the bs4 parser is getting confused by the quotes-within-quotes used in that tag. Does the tag output include those correctly?
2019-05-13T18:56:27.016200
Sasha
pythondev_help_Sasha_2019-05-13T18:56:27.016200
1,557,773,787.0162
23,533
pythondev
help
```<div class="inventory_holder" data-a2c='{"itemId":"3074457345619260819", "prodId":"3074457345619260818", "isGiftCard":"false", "isEcarePack":"false","serialNumber":"", "qty":"1", "popup":"false", "targetUrl":"accessories"}' data-omni='{"itemId": "3074457345619260819", "productLine":"BQ", "prodId": "3074457345619260818", "site":"hhos", "subSection": "cart", "xsellAttr": "", "addToCartMethod":"pdp", "parentId":"", "location":""}' data-ui='{"styleUI":"pdp", "linkUrl":"","isComingSoonSku":false}' id="inventory_holder_3074457345619260819_pdp"> </div> Process finished with exit code 0```
2019-05-13T19:02:22.016400
Ashely
pythondev_help_Ashely_2019-05-13T19:02:22.016400
1,557,774,142.0164
23,534
pythondev
help
You meant in the script, or in the browser? Above is the script output
2019-05-13T19:02:40.016800
Ashely
pythondev_help_Ashely_2019-05-13T19:02:40.016800
1,557,774,160.0168
23,535
pythondev
help
Cool, looks like it's transforming the quotes correctly.
2019-05-13T19:04:59.017300
Sasha
pythondev_help_Sasha_2019-05-13T19:04:59.017300
1,557,774,299.0173
23,536
pythondev
help
Is there any chance that the `span` content you want is being dynamically inserted by Javascript in the browser and isn't actually in the HTML you're feeding to bs4?
2019-05-13T19:11:07.018000
Sasha
pythondev_help_Sasha_2019-05-13T19:11:07.018000
1,557,774,667.018
23,537
pythondev
help
guys where is the error:
2019-05-13T20:07:39.018300
Karol
pythondev_help_Karol_2019-05-13T20:07:39.018300
1,557,778,059.0183
23,538
pythondev
help
That's peculiar. Any chance you have a `pandas.py` file in your directory that might be masking the real Pandas library? Can you `print(pd.read_csv)` to see what Python thinks that is?
2019-05-13T20:14:19.020900
Sasha
pythondev_help_Sasha_2019-05-13T20:14:19.020900
1,557,778,459.0209
23,539
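Sasha's shadowing check can be done generically by asking where Python actually loaded a module from. A minimal sketch, demonstrated with the stdlib `json` module as a stand-in for pandas (the helper name `module_origin` is hypothetical):

```python
import json  # stand-in for any module you suspect is shadowed, e.g. pandas

def module_origin(mod):
    """Return the file a module was actually loaded from."""
    return getattr(mod, "__file__", "<built-in>")

# If this prints a path inside your project directory instead of
# site-packages, a local file is shadowing the real library.
print(module_origin(json))
```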
pythondev
help
Hello guys, is there any way a Python script can work as a listener for receiving emails? Basically, I want to filter the emails and trigger a Jenkins job. But first, I want to configure a listener which actually receives the emails and filters them. Is it possible?
2019-05-13T20:16:23.022700
Hai
pythondev_help_Hai_2019-05-13T20:16:23.022700
1,557,778,583.0227
23,540
pythondev
help
it prints the path of the file, this is very weird. I'm using Colaboratory
2019-05-13T20:23:19.022800
Karol
pythondev_help_Karol_2019-05-13T20:23:19.022800
1,557,778,999.0228
23,541
pythondev
help
Any chance that it's remembering something from a previous session where you accidentally typed `pd.read_csv = ...`?
2019-05-13T20:27:34.023000
Sasha
pythondev_help_Sasha_2019-05-13T20:27:34.023000
1,557,779,254.023
23,542
pythondev
help
no chance, I closed and reopened it
2019-05-13T20:29:34.023200
Karol
pythondev_help_Karol_2019-05-13T20:29:34.023200
1,557,779,374.0232
23,543
pythondev
help
I'm out of ideas, I'm afraid, beyond "Collaboratory is weirdly broken". :sweat:
2019-05-13T20:32:38.023400
Sasha
pythondev_help_Sasha_2019-05-13T20:32:38.023400
1,557,779,558.0234
23,544
pythondev
help
hahaha yeah it sucks, I created a new notebook from scratch and it worked, damn it
2019-05-13T20:33:53.023600
Karol
pythondev_help_Karol_2019-05-13T20:33:53.023600
1,557,779,633.0236
23,545
pythondev
help
¯\_(ツ)_/¯
2019-05-13T20:34:41.023800
Sasha
pythondev_help_Sasha_2019-05-13T20:34:41.023800
1,557,779,681.0238
23,546
pythondev
help
<@Sasha> Sorry, broke for dinner. I'm a little embarrassed to say I have no idea, or how to find out. I'll dig into it. If it is dynamically generated, any suggestion for how you'd pull it in to a script?
2019-05-13T20:55:05.025100
Ashely
pythondev_help_Ashely_2019-05-13T20:55:05.025100
1,557,780,905.0251
23,547
pythondev
help
One way to test is to turn off Javascript in your browser and see what the page source looks like in that case. You can also try printing out the data that you're giving to bs4.
2019-05-13T20:59:54.026000
Sasha
pythondev_help_Sasha_2019-05-13T20:59:54.026000
1,557,781,194.026
23,548
pythondev
help
If it is dynamically generated, you're in more <#C5PHT9EGK|webscraping> territory, and might need to use something like Selenium to behave more like a browser.
2019-05-13T21:00:51.027100
Sasha
pythondev_help_Sasha_2019-05-13T21:00:51.027100
1,557,781,251.0271
23,549
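One stdlib way to test Sasha's theory is to check whether a class name appears anywhere in the static HTML you are feeding to the parser. A minimal sketch using `html.parser` rather than bs4 (the class names come from the messages above; `ClassFinder` and `classes_in` are hypothetical helpers):

```python
from html.parser import HTMLParser

class ClassFinder(HTMLParser):
    """Collect every value appearing in a class attribute."""
    def __init__(self):
        super().__init__()
        self.classes = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name == "class" and value:
                self.classes.update(value.split())

def classes_in(html):
    finder = ClassFinder()
    finder.feed(html)
    return finder.classes

# Mimics the script output above: the div is there, the span is not.
static_html = '<div class="inventory_holder" id="inventory_holder_x"> </div>'
print("inventory_holder" in classes_in(static_html))  # True
print("cta_price" in classes_in(static_html))         # False: likely injected by JS
```

If the class is missing from the static HTML but visible in the browser, JavaScript is inserting it, and bs4 alone will never see it.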
pythondev
help
Yep, sure enough. The 'out of stock' button the span element sits on top of doesn't even load. Good call.
2019-05-13T21:05:43.028000
Ashely
pythondev_help_Ashely_2019-05-13T21:05:43.028000
1,557,781,543.028
23,550
pythondev
help
I'll take my case to the scrapers. Thanks for the point in the right direction.
2019-05-13T21:06:07.028500
Ashely
pythondev_help_Ashely_2019-05-13T21:06:07.028500
1,557,781,567.0285
23,551
pythondev
help
hello everyone, i want help regarding a text messaging service, which one should i implement in my project?
2019-05-14T02:20:27.030400
Eboni
pythondev_help_Eboni_2019-05-14T02:20:27.030400
1,557,800,427.0304
23,552
pythondev
help
Twilio is good and has a very simple api <@Eboni>
2019-05-14T03:48:37.031400
Conchita
pythondev_help_Conchita_2019-05-14T03:48:37.031400
1,557,805,717.0314
23,553
pythondev
help
<@Conchita> okay thanks
2019-05-14T04:53:34.032000
Eboni
pythondev_help_Eboni_2019-05-14T04:53:34.032000
1,557,809,614.032
23,554
pythondev
help
Hi folks, I'm trying to create a list of lists using comprehension to no avail. I want `[[foo, bar], [foo, bar], [foo, bar]]` where `foo` and `bar` come from a function call.. I tried something like this and variations of it: ``` [[a, b] for a, b in (func_a(), func_b())] ``` But that doesn't work, obviously :slightly_smiling_face:
2019-05-14T05:47:20.034200
Jolynn
pythondev_help_Jolynn_2019-05-14T05:47:20.034200
1,557,812,840.0342
23,555
pythondev
help
how many [foo, bar] do u want?
2019-05-14T05:50:26.034800
Lolita
pythondev_help_Lolita_2019-05-14T05:50:26.034800
1,557,813,026.0348
23,556
pythondev
help
[[func_a(), func_b()] for _ in range(how_many)]
2019-05-14T05:51:24.035400
Lolita
pythondev_help_Lolita_2019-05-14T05:51:24.035400
1,557,813,084.0354
23,557
pythondev
help
can you elaborate what you said? Do you have any reference so that i can go through.
2019-05-14T05:52:02.035500
Darcie
pythondev_help_Darcie_2019-05-14T05:52:02.035500
1,557,813,122.0355
23,558
pythondev
help
Do you understand what "external" means? How it differs from "internal"?
2019-05-14T05:52:40.036300
Chester
pythondev_help_Chester_2019-05-14T05:52:40.036300
1,557,813,160.0363
23,559
pythondev
help
Well it's actually wrapped in a for loop already: ```
for node in xml.findall('element'):
    self.my_list = [[a, b] for a, b in (node.get('a'), node.get('b'))]
```
2019-05-14T05:53:22.037300
Jolynn
pythondev_help_Jolynn_2019-05-14T05:53:22.037300
1,557,813,202.0373
23,560
pythondev
help
Sorry for not mentioning that :slightly_smiling_face:
2019-05-14T05:53:34.037600
Jolynn
pythondev_help_Jolynn_2019-05-14T05:53:34.037600
1,557,813,214.0376
23,561
pythondev
help
I guess you want to do `my_list = [[node.get('a'), node.get('b')] for node in xml.findall('element')]`
2019-05-14T05:55:44.038600
Lolita
pythondev_help_Lolita_2019-05-14T05:55:44.038600
1,557,813,344.0386
23,562
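Lolita's one-liner as a runnable sketch, with a tiny XML document standing in for the real input (element and attribute names follow the snippet above; the values are made up):

```python
import xml.etree.ElementTree as ET

xml = ET.fromstring(
    "<root>"
    "<element a='foo1' b='bar1'/>"
    "<element a='foo2' b='bar2'/>"
    "</root>"
)

# One [a, b] pair per matching node -- no outer for loop needed.
my_list = [[node.get("a"), node.get("b")] for node in xml.findall("element")]
print(my_list)  # [['foo1', 'bar1'], ['foo2', 'bar2']]
```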
pythondev
help
Oh yeah, that seems more logical..
2019-05-14T05:57:35.038900
Jolynn
pythondev_help_Jolynn_2019-05-14T05:57:35.038900
1,557,813,455.0389
23,563
pythondev
help
Thanks!
2019-05-14T05:58:29.039100
Jolynn
pythondev_help_Jolynn_2019-05-14T05:58:29.039100
1,557,813,509.0391
23,564
pythondev
help
i am using `asyncio.ensure_future` to run multiple tasks. In the task, under a certain condition i need everything (all tasks) to sleep and wait for 15 minutes. so i use `time.sleep(900)`, however i get the following timeout error after 20-30 minutes. ```concurrent.futures._base.TimeoutError``` Why is it timing out and not waking after 900s?
2019-05-14T07:22:05.041800
Pura
pythondev_help_Pura_2019-05-14T07:22:05.041800
1,557,818,525.0418
23,565
pythondev
help
If you need to sleep in asyncio-based apps, you should use `asyncio.sleep` instead
2019-05-14T07:23:04.042400
Chester
pythondev_help_Chester_2019-05-14T07:23:04.042400
1,557,818,584.0424
23,566
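The difference Chester is pointing at can be sketched like this: `asyncio.sleep` suspends only the current task and yields control back to the event loop, so other tasks keep running (with `time.sleep`, nothing else would run). The names `worker` and `main` are hypothetical:

```python
import asyncio

async def worker(name, results):
    # asyncio.sleep suspends only this coroutine; the event loop
    # keeps driving every other task in the meantime.
    await asyncio.sleep(0.01)
    results.append(name)

async def main():
    results = []
    # Both workers sleep concurrently, so total time is ~0.01s, not ~0.02s.
    await asyncio.gather(worker("a", results), worker("b", results))
    return results

print(asyncio.run(main()))
```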
pythondev
help
Also, shameless plug: <https://github.com/malinoff/aionursery>
2019-05-14T07:23:20.042800
Chester
pythondev_help_Chester_2019-05-14T07:23:20.042800
1,557,818,600.0428
23,567
pythondev
help
What if i needed the entire process to sleep because i need to wait until some rate limit has reset.
2019-05-14T08:21:43.043400
Pura
pythondev_help_Pura_2019-05-14T08:21:43.043400
1,557,822,103.0434
23,568
pythondev
help
Uhm, why?
2019-05-14T08:21:58.043600
Chester
pythondev_help_Chester_2019-05-14T08:21:58.043600
1,557,822,118.0436
23,569
pythondev
help
I need to wait until the rate limit has reset before continuing with the tasks
2019-05-14T08:22:13.043900
Pura
pythondev_help_Pura_2019-05-14T08:22:13.043900
1,557,822,133.0439
23,570
pythondev
help
so i set a sleep to try and do this
2019-05-14T08:22:23.044100
Pura
pythondev_help_Pura_2019-05-14T08:22:23.044100
1,557,822,143.0441
23,571
pythondev
help
You still want the event loop to roll. With `time.sleep` you essentially block everything from running, including various IO readiness checks, which is really really bad. You don't want to do this in an asyncio-based application
2019-05-14T08:23:14.044800
Chester
pythondev_help_Chester_2019-05-14T08:23:14.044800
1,557,822,194.0448
23,572
pythondev
help
Okay
2019-05-14T08:24:35.045000
Pura
pythondev_help_Pura_2019-05-14T08:24:35.045000
1,557,822,275.045
23,573
pythondev
help
but if i need to have some feature to wait for 15 minutes before continuing running the tasks/co-routines, how could i do this
2019-05-14T08:37:24.045300
Pura
pythondev_help_Pura_2019-05-14T08:37:24.045300
1,557,823,044.0453
23,574
pythondev
help
pause all tasks for 15 minutes depending on some condition
2019-05-14T08:38:03.045500
Pura
pythondev_help_Pura_2019-05-14T08:38:03.045500
1,557,823,083.0455
23,575
pythondev
help
Backups: How are you guys implementing backups on your production sites? I'm using a postgres database.
2019-05-14T08:39:05.045800
Tatum
pythondev_help_Tatum_2019-05-14T08:39:05.045800
1,557,823,145.0458
23,576
pythondev
help
RDS
2019-05-14T08:39:16.046000
Chester
pythondev_help_Chester_2019-05-14T08:39:16.046000
1,557,823,156.046
23,577
pythondev
help
It's kind of a weird requirement. What if one of the tasks is in the middle of a DB transaction?
2019-05-14T08:39:47.046100
Chester
pythondev_help_Chester_2019-05-14T08:39:47.046100
1,557,823,187.0461
23,578
pythondev
help
if you are on AWS, use RDS.
2019-05-14T08:41:23.046600
Karoline
pythondev_help_Karoline_2019-05-14T08:41:23.046600
1,557,823,283.0466
23,579
pythondev
help
ye true
2019-05-14T08:41:34.046900
Pura
pythondev_help_Pura_2019-05-14T08:41:34.046900
1,557,823,294.0469
23,580
pythondev
help
I'm currently on GCS.
2019-05-14T08:41:41.047200
Tatum
pythondev_help_Tatum_2019-05-14T08:41:41.047200
1,557,823,301.0472
23,581
pythondev
help
there must be a better way to separate this code then
2019-05-14T08:41:42.047300
Pura
pythondev_help_Pura_2019-05-14T08:41:42.047300
1,557,823,302.0473
23,582
pythondev
help
GCS? Google cloud?
2019-05-14T08:42:10.047800
Chester
pythondev_help_Chester_2019-05-14T08:42:10.047800
1,557,823,330.0478
23,583
pythondev
help
I imagine they have a similar managed DB service.
2019-05-14T08:42:12.047900
Karoline
pythondev_help_Karoline_2019-05-14T08:42:12.047900
1,557,823,332.0479
23,584
pythondev
help
Yup. I'll take a poke around. It may be worth migrating to AWS as you can't take chances with backing up data. Thanks!
2019-05-14T08:42:44.049100
Tatum
pythondev_help_Tatum_2019-05-14T08:42:44.049100
1,557,823,364.0491
23,585
pythondev
help
set up a DB, import a dump from your current database, do some stuff, then restore from backup so you are comfortable with the process and how it works.
2019-05-14T08:42:58.049400
Karoline
pythondev_help_Karoline_2019-05-14T08:42:58.049400
1,557,823,378.0494
23,586
pythondev
help
because i need someway to pause these operations because my app is rate limited
2019-05-14T08:43:00.049500
Pura
pythondev_help_Pura_2019-05-14T08:43:00.049500
1,557,823,380.0495
23,587
pythondev
help
Yes, Google Cloud Services
2019-05-14T08:43:15.050100
Tatum
pythondev_help_Tatum_2019-05-14T08:43:15.050100
1,557,823,395.0501
23,588
pythondev
help
I'd say you don't need to pause operations. You need to stop scheduling operations when you face a rate limiting action.
2019-05-14T08:43:43.050700
Chester
pythondev_help_Chester_2019-05-14T08:43:43.050700
1,557,823,423.0507
23,589
pythondev
help
at this point with RDS we're comfortable enough with the restore process that we actually use it for troubleshooting as well - we'll replicate prod for really hard to reproduce issues and test against that, it's stupid easy.
2019-05-14T08:43:45.050900
Karoline
pythondev_help_Karoline_2019-05-14T08:43:45.050900
1,557,823,425.0509
23,590
pythondev
help
RDS is also nice because it's your lovely postgres under the hood. There is no vendor lock-in, really. You can just `pg_dump` your whole data any time, and migrate somewhere else
2019-05-14T08:44:43.051600
Chester
pythondev_help_Chester_2019-05-14T08:44:43.051600
1,557,823,483.0516
23,591
pythondev
help
hmm interesting, what if the operation you're currently on faced a rate limit action and you need to repeat it?
2019-05-14T08:44:56.051900
Pura
pythondev_help_Pura_2019-05-14T08:44:56.051900
1,557,823,496.0519
23,592
pythondev
help
We also had to increase the size of our prod database and it was as easy as editing the size of the disk allocated to it. It's an incredible product really.
2019-05-14T08:45:32.053100
Karoline
pythondev_help_Karoline_2019-05-14T08:45:32.053100
1,557,823,532.0531
23,593
pythondev
help
Put it in a waiting queue; replay the queue when it's possible
2019-05-14T08:45:46.053500
Chester
pythondev_help_Chester_2019-05-14T08:45:46.053500
1,557,823,546.0535
23,594
pythondev
help
<@Tatum> FWIW, my company’s on GCP/GKE and not sure how the ops team set it up, but there’s definitely lots of backing up going on with the dbs
2019-05-14T08:45:56.054200
Hiroko
pythondev_help_Hiroko_2019-05-14T08:45:56.054200
1,557,823,556.0542
23,595
pythondev
help
I wish all aws products were like RDS...
2019-05-14T08:46:10.054600
Chester
pythondev_help_Chester_2019-05-14T08:46:10.054600
1,557,823,570.0546
23,596
pythondev
help
so I doubt you need to go from GC to AWS just for this thing
2019-05-14T08:46:16.054900
Hiroko
pythondev_help_Hiroko_2019-05-14T08:46:16.054900
1,557,823,576.0549
23,597
pythondev
help
haha yeah it definitely stands out as one of their best designed. not everything is so nice for sure
2019-05-14T08:46:27.055200
Karoline
pythondev_help_Karoline_2019-05-14T08:46:27.055200
1,557,823,587.0552
23,598
pythondev
help
yeah my guess is there's something similar on google, it's pretty fundamental
2019-05-14T08:46:44.056000
Karoline
pythondev_help_Karoline_2019-05-14T08:46:44.056000
1,557,823,604.056
23,599
pythondev
help
At the moment i scheduled operations in batches like this ```
for i, batch in enumerate(batches):
    tasks = [asyncio.ensure_future(fetch_user_objects(apps, session, user_id=users, tweet_mode="extended")) for users in batch]
    for t in tasks:
        d = await t
``` Do you recommend another method? i batch request 10 at a time otherwise the process is so slow
2019-05-14T08:46:57.056300
Pura
pythondev_help_Pura_2019-05-14T08:46:57.056300
1,557,823,617.0563
23,600
pythondev
help
Thanks guys! Much appreciated. I found Google Cloud has a page mapping their matching services to AWS's. <https://cloud.google.com/free/docs/map-aws-google-cloud-platform>
2019-05-14T08:47:39.057400
Tatum
pythondev_help_Tatum_2019-05-14T08:47:39.057400
1,557,823,659.0574
23,601
pythondev
help
awesome, I'd be curious to hear your experience after you play with it for a bit.
2019-05-14T08:48:49.057900
Karoline
pythondev_help_Karoline_2019-05-14T08:48:49.057900
1,557,823,729.0579
23,602
pythondev
help
also, don't forget, you don't really have a backup until you've restored from it. so make sure you test out your restore process.
2019-05-14T08:49:14.058700
Karoline
pythondev_help_Karoline_2019-05-14T08:49:14.058700
1,557,823,754.0587
23,603
pythondev
help
So true. I recently tried a backup script and couldn't restore it. :scream_cat: Luckily it was all fake data I inserted in a development environment
2019-05-14T08:51:55.060300
Tatum
pythondev_help_Tatum_2019-05-14T08:51:55.060300
1,557,823,915.0603
23,604
pythondev
help
I guess `asyncio.gather` would be easier
2019-05-14T08:52:15.060400
Chester
pythondev_help_Chester_2019-05-14T08:52:15.060400
1,557,823,935.0604
23,605
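Chester's `asyncio.gather` suggestion applied to Pura's batch loop might look like this; `fetch_user_objects` keeps the name from the thread but its body here is a made-up stand-in for the real HTTP call:

```python
import asyncio

async def fetch_user_objects(user_id):
    # Stand-in for the real rate-limited HTTP request in the thread above.
    await asyncio.sleep(0)
    return {"user_id": user_id}

async def run_batches(batches):
    results = []
    for batch in batches:
        # gather awaits the whole batch concurrently and returns results
        # in order, replacing the ensure_future list + await loop.
        results.extend(await asyncio.gather(
            *(fetch_user_objects(u) for u in batch)
        ))
    return results

print(asyncio.run(run_batches([[1, 2], [3]])))
```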
pythondev
help
i mean, how would you implement the “stop scheduling if `fetch_user_objects` hit a rate limit action” part
2019-05-14T08:52:57.061000
Pura
pythondev_help_Pura_2019-05-14T08:52:57.061000
1,557,823,977.061
23,606
pythondev
help
People can be in three groups:
• ones who don't make backups yet
• ones who do
• ones who test the restoration procedure
2019-05-14T08:53:33.061600
Chester
pythondev_help_Chester_2019-05-14T08:53:33.061600
1,557,824,013.0616
23,607
pythondev
help
ah the problem is that rate limit action might be for 1 app, but i need to stop scheduling if all the apps hit the rate limit
2019-05-14T08:53:33.061700
Pura
pythondev_help_Pura_2019-05-14T08:53:33.061700
1,557,824,013.0617
23,608
pythondev
help
haha yes.
2019-05-14T08:53:42.062100
Karoline
pythondev_help_Karoline_2019-05-14T08:53:42.062100
1,557,824,022.0621
23,609
pythondev
help
oh, and people who think RAID is backup. lol
2019-05-14T08:53:50.062500
Karoline
pythondev_help_Karoline_2019-05-14T08:53:50.062500
1,557,824,030.0625
23,610
pythondev
help
the `fetch_user_objects` takes in `apps` as a parameter so i could check to see if all apps are rate limited in this function by adding some flag and if they are return some error response. i would have to make apps a global variable so that the state is persisted?
2019-05-14T08:54:36.062600
Pura
pythondev_help_Pura_2019-05-14T08:54:36.062600
1,557,824,076.0626
23,611
pythondev
help
&gt; how would you implement the “stop scheduling if the fetch_user_object hit a rate limit action” part
Just put an `await asyncio.sleep()` right before your `asyncio.ensure_future` calls.
2019-05-14T08:54:37.062800
Chester
pythondev_help_Chester_2019-05-14T08:54:37.062800
1,557,824,077.0628
23,612
pythondev
help
I guess they fall into the first group :slightly_smiling_face:
2019-05-14T08:55:06.063300
Chester
pythondev_help_Chester_2019-05-14T08:55:06.063300
1,557,824,106.0633
23,613
pythondev
help
i only want to sleep based on some condition, wouldn't that be sleeping before everything?
2019-05-14T08:55:10.063400
Pura
pythondev_help_Pura_2019-05-14T08:55:10.063400
1,557,824,110.0634
23,614
pythondev
help
Sure, you need to do a conditional sleep. Is that a problem? :slightly_smiling_face:
2019-05-14T08:57:19.063900
Chester
pythondev_help_Chester_2019-05-14T08:57:19.063900
1,557,824,239.0639
23,615
pythondev
help
yes because i need to return a response from one of these ensure futures first
2019-05-14T08:57:49.064100
Pura
pythondev_help_Pura_2019-05-14T08:57:49.064100
1,557,824,269.0641
23,616
pythondev
help
```
for i, batch in enumerate(batches):
    if rate_limited():
        await asyncio.sleep(some_time)
    tasks = [asyncio.ensure_future(fetch_user_objects(apps, session, user_id=users, tweet_mode="extended")) for users in batch]
    for t in tasks:
        d = await t
```
2019-05-14T08:58:04.064300
Chester
pythondev_help_Chester_2019-05-14T08:58:04.064300
1,557,824,284.0643
23,617
pythondev
help
and this rate_limited() function could loop through all the apps and check if the flag `self.rate_limit` is True
2019-05-14T08:58:58.064900
Pura
pythondev_help_Pura_2019-05-14T08:58:58.064900
1,557,824,338.0649
23,618
pythondev
help
this self.rate_limit = True is set within `fetch_user_objects` if it hits a 429 error from the http request
2019-05-14T08:59:21.065100
Pura
pythondev_help_Pura_2019-05-14T08:59:21.065100
1,557,824,361.0651
23,619
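Pura's flag idea can be sketched like this; the `App` class, `rate_limited`, and `maybe_wait` are hypothetical stand-ins for the per-credential clients in the thread (only the all-apps-limited condition and the non-blocking sleep are from the discussion above):

```python
import asyncio

class App:
    """Hypothetical per-credential client; rate_limit is set on a 429."""
    def __init__(self):
        self.rate_limit = False

def rate_limited(apps):
    # Pause only when *every* app has hit its limit, per the thread above.
    return all(app.rate_limit for app in apps)

async def maybe_wait(apps, delay=900):
    if rate_limited(apps):
        # Non-blocking: other tasks keep running while this one waits.
        await asyncio.sleep(delay)

apps = [App(), App()]
print(rate_limited(apps))  # False: nobody is limited yet
apps[0].rate_limit = apps[1].rate_limit = True
print(rate_limited(apps))  # True: time to back off
```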
pythondev
help
the last problem i think i see here is… this ensure_future is a batch of 10 requests. If on the 5th request everything is rate limited, i would need to sleep…
2019-05-14T09:00:23.065800
Pura
pythondev_help_Pura_2019-05-14T09:00:23.065800
1,557,824,423.0658
23,620