Columns: repo_name (string, lengths 4–136), issue_id (string, lengths 5–10), text (string, lengths 37–4.84M)
microsoft/CryptoNets
657049895
Title: 'error budget is zero' exception & noise budget reduction under different encryption parameters Question: username_0: Dear authors, thank you very much for releasing the code. May I ask two questions? 1. I hit the 'error budget is zero' exception when running the CryptoNets code. 2. Could you explain how to use CryptoTracker in your code to track the noise budget reduction of a single operation? My goal is to know the noise budget reduction for each operation under different degrees N and plaintext moduli. Could you kindly give me any ideas on how to achieve this? Thanks in advance for your response.<issue_closed> Status: Issue closed
httpwg/httpbis-issues
725491455
Title: Refining age for 1.1 proxy chains (Trac #212) Question: username_0: Migrated from https://trac.ietf.org/ticket/212 ```json { "status": "closed", "changetime": "2012-03-22T01:52:45", "_ts": "1332381165103275", "description": "A lot of the text in 2616's caching section \n-- especially around age calculation -- was explanatory, not spec \ntext. Most of it has been removed in p6-07, but some still remains, \nand my take is that this is one example of this.\n\nAs you've noticed, the algorithm is very conservative; i.e., it will \nalways err on the side of considering something older than it actually \nis. This is annoying when the freshness lifetime is short (or the skew \nvery large), as you point out making things that could have been \ncacheable uncachable.\n\nSince transit time is already accounted for here (related issue: <http://tools.ietf.org/wg/httpbis/trac/ticket/29 \n >), it seems like the wild card that you have to deal with is HTTP/ \n1.0 caches not emitting Age headers.\n\nJust having a 1.1 origin isn't necessarily good enough, because there \ncould (in theory) be a 1.0 cache interposed somewhere along the way; \nremember, they aren't required to emit Age nor Via, and while the next \nhop towards the UA *should* record it in the Via header (presuming \nit's 1.1), as we know not everyone sends them.\n\nI'm wondering if it's good enough to specify that if:\n - your next hop is a proxy AND it sends a Via header that's all \n1.1, OR\n - your next hop is the origin, and if the Via header is present \nit's all 1.1\nyou can calculate age using the age header without trying to account \nfor hidden 1.0 caches in the chain (using Date).\n\nThis does have the potential to mess up in a few circumstances, but \nAFAIK 1.0 caches will produce Age anyway; e.g. Squid. Most of the \nother caches deployed are going to be either accelerators/CDNs (which \nalready do unholy things with Date and Age; see Edith Cohen's paper \nfrom a while back), or interception caches, which are responsible for \nany problems they cause anyway.\n\nThoughts? An alternative would be to reduce the Date portion of the \ncalculation to a SHOULD-level requirement.\n\n\nSee also \nhttp://www.w3.org/mid/033b01cb0188$fdc07660$f9416320$@org\nand previous in thread.\n\nproposal:\n\nresponse_delay = response_time - request_time;\ncorrected_age_value = age_value + response_delay;\nif (reponse_version_1_1 && !response_via_1_0) {\n\tcorrected_initial_age = corrected_age_value;\n} else {\n\tapparent_age = max(0, response_time - date_value);\n\tcorrected_initial_age = max(apparent_age, corrected_age_value);\n}\t\nresident_time = now - response_time;\ncurrent_age = corrected_initial_age + resident_time;\n\n\n", "reporter": "<EMAIL>", "cc": "", "resolution": "fixed", "time": "2010-06-02T06:55:15", "component": "p6-cache", "summary": "Refining age for 1.1 proxy chains", "priority": "later", "keywords": "", "milestone": "18", "owner": "<EMAIL>", "type": "design", "severity": "Active WG Document" } ```
LeaVerou/multirange
163549096
Title: ghost.value returns a single value instead of an array Question: username_0: The ghost input returns a single value, even though it has the `multiple` attribute. That is confusing if we try to handle the `input` event and read `this.value`. I am trying multirange in [prama](https://github.com/username_0/prama); it is a very graceful solution for inputs, thanks @username_1! Answers: username_1: The idea is that you wouldn't be using the ghost input at all in your code. But I see your use case; multirange should fire an `input` event on the main slider when either handle is modified.
ReactiveX/RxSwift
105719901
Title: Is the combineLatest resultSelector redundant? Question: username_0: It seems that `combineLatest` could be simplified to exclude `resultSelector`, given that `combineLatest(a, b, fab)` is exactly equivalent to `combineLatest(a, b).map(fab)`? Is this correct? I did notice that the other Rx implementations also include this function, so maybe consistency is a more pressing concern, or I'm missing something obvious. Answers: username_1: Hi @username_0, I understand why you feel that `resultSelector` should be excluded. I'm not sure I can convince you that it's not an excess function argument, but I can share the reasoning for why it's there, so maybe that will clear things up. While I was working on this project I tried to really think about which parts of Rx are general cross-platform concepts, and which ones are just per-platform adaptations. I'm not trying to say this is some kind of universal truth, but rather to explain why things are the way they are. * the name says `combineLatest`, not `latestTuple`, so the `combine` part is unclear without a way to specify how elements are combined. * if we were to exclude `resultSelector`, in a statistically convincing number of cases we would need a `map` behind every `combineLatest` anyway, because tuples aren't usually useful on their own. * writing `combineLatest(a, b, +)` is as clear as, but more concise than, `combineLatest(a, b).map(+)`. * even though we try to make operators as cheap as possible, one operator still has better performance than two, and a smaller stack trace footprint. * this way is consistent with other Rx implementations. * there is also a version of `combineLatest` that takes an array of observables and a resultSelector that accepts an array of elements, so if we were to exclude the selector, we would end up with a different API for a function that shares the name. There are maybe other reasons I can't think of right now, but these were enough for me. Hope this clears things up :) username_0: Thanks for the comprehensive answer, I can see that your reasoning is well thought out! Status: Issue closed
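The equivalence discussed above can also be sketched outside Swift; here is a minimal RxPY (Python) analogy in which `combine_latest` emits tuples and a following `map` plays the role of the result selector. This is an analogy only, not the RxSwift API, and the sources and values are made up for the example:

```python
import rx
from rx import operators as rxop

a = rx.of(1, 2, 3)
b = rx.of(10)

# combine_latest emits tuples of the latest values from each source;
# mapping over the tuple stands in for a resultSelector argument.
rx.combine_latest(a, b).pipe(
    rxop.map(lambda pair: pair[0] + pair[1]),
).subscribe(print)
```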
HexTree/curry-bot
509398852
Title: Command to auto-create a multitwitch of all runners currently in voice chat Question: username_0: The tricky bit is how to fetch and update Discord users' Twitch usernames. We could provide functionality to allow Discord users to add their Twitch username via commands, which would get written to a permanent database. Alternatively, we can look into Discord roles and labels and see if we can use those somehow. Status: Issue closed Answers: username_0: Closing, as it doesn't seem like it will practically make things easier.
lovelmh13/myBlog
952675817
Title: crontab running a shell script reports an error: node: command not found Question: username_0: ## Scenario: A crontab scheduled task runs the node command inside a shell script and mail delivers the output, and the error node: command not found appears. However, running the shell script directly executes node without any problem. The code is as follows:
```bash
#!/usr/bin/env bash
node /home/work/jarvis-service/shell/sendMobDailyMessage.js
```
## Troubleshooting: Investigation showed that nvm is installed to manage node, and checking the environment variables revealed that node's path lives under nvm. Executing with the node path from nvm solved the problem. The code is as follows:
```bash
#!/usr/bin/env bash
~/.nvm/versions/node/v14.14.0/bin/node /home/work/jarvis-service/shell/sendMobDailyMessage.js
```
VKCOM/VKUI
862676352
Title: [Question] Question: username_0: Tooltip is currently an experimental component. Will it be developed further? And will there be behaviour to show the message when the cursor hovers over the wrapped component? Answers: username_1: It will. There will. username_0: Thanks! Approximately when? username_1: We can't give a timeline right now :( Status: Issue closed
neta540/ArduinoThreadedGSM
347964401
Title: sendSMS examples Question: username_0: Hi there, thanks for making this library. Could you add an sms send example. You mention that the string to pass to the method should be a PDU but it's not particularly clear what format this should take without an example. Answers: username_1: The PDU format should be a hexadecimal string of the encoded PDU you would like to send. This library uses SMS PDU mode. When sending a message, the message payload has to be encoded to PDU. In most cases that would require additional code or a library for encoding, and that does not belong here. Similarly, to parse or read an incoming message, you'd have to decode the received PDU. For information about PDU mode take a look at [http://www.gsm-modem.de/sms-pdu-mode.html](http://www.gsm-modem.de/sms-pdu-mode.html). This link has example of sending a message in PDU mode and how to encode it. In their example **0011000B916407281553F80000AA0AE8329BFD4697D9EC37** will send a message "hellohello" to 0708251358. username_0: Hi thanks for replying so quickly. Given the complexity and specificity of PDU encoding, would it not be prudent to incorporate a PDU parser and create an overloaded version of the sendSMS method which merely takes number and message as two char arrays? If I was to do this, would you be interested in merging it into your repo? username_1: I chose not to integrate a PDU encoder/decoder module since I'd prefer this kind of module not to be specific only to this library. In my opinion, a universal library that is capable of generating a PDU message and parsing it, regardless of GSM library, device, or a modem model, would be much more beneficial to the Arduino community, than having it as a part of this specific library. It would be nice to see a good, resource-friendly PDU library written for Arduino devices. In case you were to write such library, more information about the sending process could be written in the readme, or even adding a new example, referencing to that library. username_0: Thanks for your response Neta. I understand your decision. I'll be trying https://github.com/tardigrade888/c-pdu. If I have problems with that lib, I'll fork it and make it compatible with your methods. I really like the way you've designed it. It's surprising to see so few people use it on platformio, I hope it gets more recognition. username_2: Hi, I also thanks for making this library. Generally it’s working except sending SMS. I have read above about PDU and I have properly formatted message (I’ve check it on various online PDU decoder/encoder) but nothing happens. Maybe I’ve put it in wrong place ? Just below SIM900.setHandlers() in yours ArduinoThreadedGSM example. Could you add another example how to use sendSMS() function? Please… Ps. How to turn on debug mode in ThreadedGSM.h? username_1: 1. Make sure your PDU message is correct, perhaps the receiver number is wrong. I can't remember how the receiver number has to be formatted. I suppose it has to be an international phone number, I can't remember at the moment whether a `+` sign prefix is necessary or should be omitted. 2. Debug will be printed to serial when you add a line such as: ```c++ #define THREADEDGSM_DEBUG Serial1 ``` You will have to define it before including the library code with `#include <ThreadedGSM.h>` As in the example: ```c++ #define THREADEDGSM_DEBUG SerialDebug #include <ThreadedGSM.h> ``` You will need to replace `SerialDebug` with the Serial to output debug information to. username_2: Finally it’s working! 
There was two problems. The first one was a little funny. I’m using Visual Studio Code with PlatformIO extension and I install ThreadedGSM using it’s feature believing it’s the newest version. But as I discover later – it wasn’t. So, believing it’s the newest version, I start to study Your code and I reach a conclusion that there is no way it can work (sending SMS). First of all there was lack of “SMS.OutboxMsgContents = PDU;” statement in void sendSMS(String& PDU) function. But sending SMS still wasn’t working. Then I “discover” on your Github page, that missing statement was already corrected by You, but some how I miss it. So, I download the “github version” of the ThreadedGSM, but it doesn’t help. The second problem -> In my case it was with the part of the Outbox() function which calculate TPDU_length (https://www.developershome.com/sms/cmgsCommand4.asp). In my case PDU always contain SMSC number in international format, so simply (SMS.OutboxMsgContents.length()-16) / 2 is doing the job. Thank You for your effort and time, it spare a lot of my time. ;) username_1: A fix seems to be needed here. I don't like numbers like 16 appearing in the code when they can be of a different value than 16. What was the value of `smsc_length` in your case? Can we use `smsc_length` to get the right formula for the final value? You're welcome to create a PR with a fix which will work for all SMSC numbers and will calculate the 16 for you. maybe even `smsc_length` is 16 or 8, I am not sure. I don't have any board to test it, so please let me know. Thank you. username_2: I don't have a needed knowledge to do this. But I think, that I figure it out and for You it'll be 1 minutes of coding. SMS message encoded to PDU contain two parts - the SMSC part (contain SMSC number) and TPDU part (whatever it’s mean). We need to separate this two parts and count length of the TPDU part which is parameter to AT+CMGS command. The SMSC part contain three sub-fields. First octet is the length of the second and third sub-fields together. Second sub-field it’s type of SMSC number and it may be 0x81 or 0x91. 0x81 - the SMSC number assigned to the third sub-field can be either "85290000000" (country code included) or "90000000" (country code omitted) 0x91 - SMSC number assigned to the third sub-field should be "85290000000" By the truth it doesn’t matter for us. The third sub-field is SMSC number. Here are two PDU’s with SMSC number and without: With SMSC number ------------------------- AT+CMGS=25 SMSC part TPDU part 07 91 8498674523F1 11000B918421436587F90000AA0CC8329BFD06DDDF72363904 07 means that the second and third sub-fields together contain 14 hexadecimal digits in "918498674523F1" and each hexadecimal digit represents 4 bits. So, there are totally 7 octets. That's why the value of the first sub-field is 0x07. Plus length of “07” and we got 16 digits. SMSC#: +48897654321 Receipient#:+48123456789 Validity:Rel 4d TP_PID:00 TP_DCS:00 TP_DCS-popis:Uncompressed Text No class Alphabet:Default Hello world! Length:12 Without SMSC number ------------------------- AT+CMGS=25 SMSC part TPDU part 00 11000B918421436587F90000AA0CC8329BFD06DDDF72363904 There is nothing to explain I think. SMSC# Receipient:+48123456789 Validity:Rel 4d TP_PID:00 TP_DCS:00 TP_DCS-popis:Uncompressed Text No class Alphabet:Default Hello world! Length:12 username_1: Fixed in 0.9.2 https://github.com/username_1/ArduinoThreadedGSM/commit/d1764c265faebbcb4f7d1daa37e04581818635ab Please submit your results with 0.9.2. Thank you. 
username_2: With SMSC number given it's working OK, but without it - NO. With SMSC number given AT+CMGS=65 but without it - AT+CMGS=71 and this is wrong because TPDU part was the same in both cases. username_1: ```c++ #include <stdio.h> #include <string.h> int main(void) { const char pdu[] = "0011000B918421436587F90000AA0CC8329BFD06DDDF72363904"; int smsc_length = 0; char smsc_len_b; for (int i = 0; i <= 1; i++) { smsc_len_b = pdu[i]; smsc_length <<= 4; if (smsc_len_b >= '0' && smsc_len_b <= '9') smsc_length |= smsc_len_b - '0'; else if (smsc_len_b >= 'a' && smsc_len_b <= 'f') smsc_length |= smsc_len_b - 'a' + 10; else if (smsc_len_b >= 'A' && smsc_len_b <= 'F') smsc_length |= smsc_len_b - 'A' + 10; } printf("pdu is: %s\n", pdu); printf("smsc_length: %d\n", smsc_length); int cmgs = (strlen(pdu) / 2 - (smsc_length + 1)); printf("cmgs: %d\n", cmgs); return 0; } ``` Both return: **cmgs: 25** username_2: Forgive me please, I was wrong :(. I forgot that in function preparing PDU, I hard-coded 07 91 at the beginning, whether SMSC number was given or not. I corrected this and it seems that everything is OK. SMS are sent in both cases. Sorry! username_1: Great, then I close the issue. Fixed in 0.9.2 https://github.com/username_1/ArduinoThreadedGSM/commit/d<PASSWORD>eb<PASSWORD> Status: Issue closed
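To make the length rule discussed in this thread concrete, here is a small Python sketch (the helper name is made up for illustration); the two PDUs are the ones quoted above, and both should yield an AT+CMGS value of 25:

```python
def cmgs_length(pdu_hex: str) -> int:
    # The first octet of the PDU is the SMSC info length (address type + number octets).
    # AT+CMGS wants the TPDU length: total octets minus the SMSC part, i.e. minus
    # (1 + smsc_len): the length octet itself plus the octets it announces.
    smsc_len = int(pdu_hex[0:2], 16)
    total_octets = len(pdu_hex) // 2
    return total_octets - (1 + smsc_len)

# PDU without an SMSC number (first octet 00), from the thread above
assert cmgs_length("0011000B918421436587F90000AA0CC8329BFD06DDDF72363904") == 25
# The same TPDU with the SMSC part from the thread prepended (07 91 8498674523F1)
assert cmgs_length("07918498674523F1"
                   "11000B918421436587F90000AA0CC8329BFD06DDDF72363904") == 25
```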
nuxt-community/hapi-nuxt
455719508
Title: Feature: Prevent recompiling in dev mode when a server file changes Question: username_0: Every time I change a file inside my server folder, the whole package runs in dev mode and recompiles. I'm using `nodemon` to start my app. I appreciate any suggestions. Thanks for this awesome plugin. Answers: username_1: It would be better to develop the API-hosting Hapi server separately.
cyclonephp/DB
2883208
Title: Constraint violation exceptions Question: username_0: The database executors and prepared executors should throw cyclone\db\ConstraintException instances when the SQL statement fails due to a database constraint violation. This already works in cyclone\db\executor\PostgresExecutor. To achieve this the error messages need to be parsed. The constraint exceptions should store the following information: * constraint type (unique/primary/foreign) * the affected columns * the constraint name * the conflicting values (optional)<issue_closed> Status: Issue closed
ReactiveX/RxPY
646326351
Title: zip operator should complete if a single upstream source completes Question: username_0: The following observable `y` ... ``` python import rx from rx import operators as rxop y = rx.range(2).pipe( rxop.zip(rx.defer(lambda _: y)), rxop.merge(rx.just(0)), rxop.share(), ) y.subscribe(print, on_completed=lambda: print('completed')) ``` ... never completes. My suggestion: The zip operator should complete if any upstream source completed and the corresponding queue is empty. That is how it is implemented in [RxJava](https://github.com/ReactiveX/RxJava/blob/3.x/src/main/java/io/reactivex/rxjava3/internal/operators/observable/ObservableZip.java). Answers: username_1: I agree that zip should complete in such cases, because no item will be emitted after one source completes. Some applications may use the current behavior to wait for all streams to complete, but probably most people expect it to complete immediately. username_2: Hi @username_0, I created a PR #537 Status: Issue closed
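To make the suggestion above concrete, here is a minimal RxPY sketch of the proposed semantics. The `Subject` source and its values are illustrative only, and the comments describe the *suggested* behaviour, not what the library did at the time of the report:

```python
import rx
from rx import operators as rxop
from rx.subject import Subject

other = Subject()  # a source that never completes on its own

rx.range(2).pipe(      # emits 0 and 1, then completes
    rxop.zip(other),
).subscribe(print, on_completed=lambda: print('completed'))

other.on_next('a')  # pairs with 0 -> prints (0, 'a')
other.on_next('b')  # pairs with 1 -> prints (1, 'b'); range(2)'s queue is now empty
# Under the proposed behaviour, zip would complete here, because the completed
# source can never contribute another pair. At the time of the report it kept
# waiting for `other` to complete as well.
```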
LaunchPadLab/decanter
147851259
Title: DateTime parsing Question: username_0: I'm trying to pass "10:00 am" to the date_time parser and I'm getting an invalid date error. It looks like the parser is calling `Date.strptime` on my "10:00 am" string. Should it be using `DateTime.parse` instead? Answers: username_1: The datetime parser currently uses Date.strptime. It should use DateTime, and the default format should include the time. username_1: PR #21 submitted for @francirp Status: Issue closed
Azure/azure-signalr
382471929
Title: ASP.NET SignalR KeepAlive Time Question: username_0: Using the SignalR 2.4.0 Nuget package along with Microsoft.Azure.SignalR.AspNet 1.0.0-preview1-10259, the keep alive message time does not change when set on GlobalHost.Configuration.KeepAlive. It appears to be fixed at roughly 15 seconds when connected with websockets. Is this intentional or is there another way to set the KeepAlive message time? Here is the applicable C# code in Startup.cs with KeepAlive set before calling MapAzureSignalR: GlobalHost.Configuration.KeepAlive = TimeSpan.FromSeconds(4); app.MapAzureSignalR("/signalr", "myapp", new HubConfiguration() { EnableDetailedErrors = false, EnableJavaScriptProxies = false, EnableJSONP = false }, map => { map.AccessTokenLifetime = TimeSpan.FromDays(1); map.ConnectionCount = 5; GlobalHost.DependencyResolver.Register( typeof(JsonSerializer), () => JsonSerializerFactory.Value); }); Answers: username_1: This is not yet configurable.
adminfaces/admin-template
306246030
Title: Breadcrumb navigation breaks extensionless URLs when not using link attribute Question: username_0: ##### Issue Overview Using extensionless URLs results in the suffix `.xhtml` being appended to links generated by the breadcrumb navigation when the `link` attribute is not used. This is due to the usage of the `viewId` from JSFs view root. ##### Current Behaviour When using extensionless URLs (using OmniFaces or OCPSoft Rewrite) the Breadcrumb navigation will use the `viewId` (before it has been rewritten) including the configured extension (such as `.xhtml`). For example, if the URL `/dashboard` was accessed, and after that the url `/otherDashboard` the breadcrumb navigation will contain a link to `dashboard.xhtml`. ##### Expected Behaviour The suffix `.xhtml` is not automatically appended when using the Breadcrumb navigation and no `link` attribute is given (for example using a new config property automatically removing the `.xhtml` extension). ##### How to reproduce Configure extensionless URLs using OmniFaces or OCPSoft Rewrite rules, so that the JSF pages are reachable without `.jsf` or `.xhtml` extensions. A sample project or code may help. Can it be reproduced in [admin-starter](https://github.com/adminfaces/admin-starter)? ##### Additional Information * Affected version: 1.0.0.RC12 / 1.0.0.RC13-SNAPSHOT ##### Workaround Provide explicit links using the `link` attribute.<issue_closed> Status: Issue closed
segmentio/analytics-ios
89900303
Title: Unable to find a specification for `TestFlightSDK (= 3.0.2)` Question: username_0: I'm trying to integrate this into my iOS app, and when adding the Analytics pod, and run pod install, I get the following error ~~~sh Analyzing dependencies [!] Unable to find a specification for `TestFlightSDK (= 3.0.2)` depended upon by `Analytics/TestFlight` ~~~ And this is what my Podfile looks like ~~~ source 'https://github.com/CocoaPods/Specs.git' platform :ios, "8.0" target "MyApp" do pod 'TTTAttributedLabel' pod 'PureLayout' pod 'MTJSONUtils' pod 'KZPropertyMapper' pod 'MHPrettyDate' pod 'AFNetworking' pod 'UALogger' pod 'InstagramKit', '3.6.3' pod 'Mixpanel' pod 'Lookback' pod 'Instabug' pod 'Analytics' end ~~~~ Anyone knows how to get around this issue? Thanks! Answers: username_1: Can you try hardcoding the version? `pod 'Analytics', '1.11.13'` username_1: Let me know if you still see this error with `pod 'Analytics', '~> 2.0.0'` Status: Issue closed
Razer2015/NordeaTunnusluvut_Patched
490951096
Title: Version 1.7 Question: username_0: Nordea has updated the Codes app again, and the old version has already started showing warnings, so going by past experience there are now about two weeks to get the new version working on the phone. The newest version appears to use a native library that is called from the Dalvik code. Is there any information yet on whether removing the LineageOS/root detection can be done by changing only the Dalvik code (e.g. by removing the call that routes a certain function into the native library)? If I can't get the Codes app working on a LineageOS phone, I'd rather switch banks than phones... Answers: username_1: Those native libraries are exactly the painful part for me, because I don't know how to debug them. As far as I remember, Nordea Pay already used a check in the native libraries in its early days, and at first it could be patched by modifying the smali code (overwriting the value returned by the native library). At some point that changed, though, and it no longer worked. I don't remember whether it was this app or some other one, but the app in question was shut down from native code (it crashed) if the phone was rooted. I still don't really have proper free time, so I can't do much investigating. A couple of hours doesn't seem to be enough, because at least in the previous version there were several checks. I'll try to carve out time from somewhere so I can test this, but as things stand I won't have much. There's also the fact that nearly all banks seem to have these root checks in their apps. I don't know whether Magisk Hide would work for bypassing them? I have no experience with that approach myself, but I've heard elsewhere that it can fool those checks quite well.
cucumber/cucumber-jvm
693979413
Title: Is it possible to freely choose to run Cucumber at the feature or scenario level and implement failed-retry functionality? Question: username_0: I would like to: ![Snipaste_2020-09-05_12-30-45](https://user-images.githubusercontent.com/66766526/92297619-d7dbb880-ef73-11ea-8c5c-948305ad8c89.png) Status: Issue closed Answers: username_1: Please don't use screenshots to report problems with code. [You can find a long list of reasons here](https://meta.stackoverflow.com/questions/285551/why-not-upload-images-of-code-errors-when-asking-a-question). That said, no, this is not possible. Parallel execution is handled by a combination of JUnit and Surefire. Neither supports parallel execution of `ParentRunner` (i.e. @RunWith(Cucumber.class)) implementations below the first level. username_2: Hi, I'd like some guidance on how to develop the retry. I have the time to work on it and make a PR.
rossfuhrman/_why_the_lucky_markov
449458790
Title: Ronaldo could no longer lives in the bottle, it’s the paperwork behind the counter and earnestly say to the class, instead of in such nightmarish custody. In another, related fantasy, I am having a seat on a giant incongruous and poorly-dyed market. Question: username_0: Toot: Ronaldo could no longer lives in the bottle, it’s the paperwork behind the counter and earnestly say to the class, instead of in such nightmarish custody. In another, related fantasy, I am having a seat on a giant incongruous and poorly-dyed market. One comment = 1 upvote. Sometime after this gets 2 upvotes, it will be posted to the main account at https://mastodon.xyz/@_why_toots
marcoschwartz/aREST
177748239
Title: Memory running out in while(true) block Question: username_0: Hi, why is a block like this inside a custom function rebooting my ESP8266?
```
int receiveCommand(String command){
  unsigned long timeStart = millis();
  while(true){
    // My code goes here
    // If nothing happens within 5 seconds, break out of this loop.
    if((millis() - timeStart) > TIMEOUT){
      break;
    }
  }
  return 1;
}
```
I would like to block the request for up to 5 seconds while waiting for user interaction with the device. Thanks, Victor
rmosolgo/graphql-ruby
313442136
Title: [1.8] Inheriting arguments from input object Question: username_0: If I inherit from another input object the arguments are not passed down. If I include GraphQL::Schema::Member::HasArguments on the input object I am inheriting from - then everything works as expected. Pretty sure it has something to do with this in the HasArguments module: ![image](https://user-images.githubusercontent.com/3682518/38636459-3baef418-3d96-11e8-9420-c2191c47dab5.png) which is returning an empty hash Answers: username_1: aww bummer, thanks for opening the issue! I bet that check could be changed to `if superclass.respond_to?(:arguments)`, I updated a bunch of the other ones but must have missed one :S Status: Issue closed
jennyWan278/issue
331079951
Title: jQuery optimization Question: username_0: [Reference link](http://www.cnblogs.com/xiaohuochai/p/8270561.html) ### 9 ways to optimize jQuery code * Use the right selector * Fastest selectors: id selectors and element tag selectors * Slower selectors: class selectors * Slowest selectors: pseudo-class selectors and attribute selectors * Understand the parent-child relationship and select child elements from the parent element * $('.child', $parent) * $parent.find('.child') * $parent.children('.child') * $('#parent > .child') * $('#parent .child') * $('.child', $('#parent'))
IFRCGo/go-frontend
568158384
Title: [Staging] PER: stop benchmarks from being mandatory Question: username_0: ## Issue / Expected behaviour As per request from the PER team, the _benchmarks_ in a PER form should not be mandatory **but** the _components_ should be. So within a form that is filled by a user (Account -> PER forms -> Choose National Society from dropdown -> click on one of the "Area" forms), the benchmarks should not be mandatory fields (in the screenshot below 2.X) but the final box within a component ("Component Y performance") should remain mandatory. ![image](https://user-images.githubusercontent.com/52411237/74917911-95cc9a00-53c8-11ea-9779-2a20bdcf8861.png) ## Steps to reproduce This concerns the forms highlighted below as each of them contains components and benchmarks. ![image](https://user-images.githubusercontent.com/52411237/74843481-42a50980-532c-11ea-836e-5037bf63e994.png) ## Criticality/Urgency As some historical PER data, that is to be migrated to GO, does not have all the benchmarks, this is a blocker for the upload of prior PER data. Answers: username_1: On the frontend, we need to: - Modify the `checkFormFilled` method to allow the `Benchmark Status` fields to be optional - https://github.com/IFRCGo/go-frontend/blob/develop/app/assets/scripts/components/per-forms/per-form.js#L231. There's probably a few ways to do this - ideally would be to add another property in the form data to specify `required` as either `True` or `False`. An alternative would be to always allow Yes / No questions to be optional. @necoline would be great to quickly get your thoughts on best approach here. On the backend: - Around here: https://github.com/IFRCGo/go-api/blob/master/per/views.py#L112 - ideally we would just not be sending values for things that are not filled in for the frontend, so instances of FormData for questions not filled in would just not exist. We just need to test thoroughly that if the backend does not contain data for some questions for a form, all aspects of the frontend work fine - this is a bit hard to tell for certain without some more digging and testing, but I will start working on making the Benchmark Status questions optional on the frontend, not save those answers to the db, and then do a thorough check of what might break as a consequence. username_2: @username_1 advises that this will take a significant chunk more time to build. So for now, this ticket is going ON HOLD, and I'm assigning @LukeCaley to confirm whether to proceed or whether to close ticket and leave all as mandatory Status: Issue closed username_2: This change request is CANCELLED - @LukeCaley has agreed with PER team that existing rules will stay as-is (i.e. all components and benchmarks are mandatory). thanks for investigating and working on this @username_1 but no code check in required. Closing ticket.
umijs/umi
615083205
Title: Switching routes shows a brief white-screen flash Question: username_0: ## What happens? The first time I switch to a certain route (with nothing cached), a white screen flashes by (milliseconds in the browser, but very noticeable when embedded in an Android app). A single-page app shouldn't have this problem when switching routes, right? Does umi have a configuration option for this? ## Environment - **Umi version**: 2.9.0 - **Node version**: 12.16.0 - **OS**: mac Answers: username_1: The pages are probably loaded on demand; that always happens then. Turn off on-demand loading and only the first screen will start more slowly. username_0: @username_1 On-demand loading is not enabled. username_2: My guess is that on the browser side the comparison of nodes between server and client fails, causing the component to re-render; that is, the component renders only once only when the components rendered on the server and in the browser have the same props and DOM structure. username_1: Please provide a reproduction demo. Status: Issue closed
cesc-park/attend2u
455465902
Title: ImportError: libcublas.so.8.0: cannot open shared object file: No such file or directory Question: username_0: Hey, I'm getting an error while training on the data after extraction. Can you help me with it? ![Capture](https://user-images.githubusercontent.com/42184498/59392837-c0287b00-8d96-11e9-85f7-b476e0506d95.JPG) Answers: username_1: I guess it's because your cudatoolkit is 9.* or 10.*. If you use anaconda, you can simply run: `conda install cudatoolkit==8.0` or some other version. Or instead, you can use an updated tensorflow like `conda install tensorflow-gpu==1.12.0rc2`. Status: Issue closed username_2: @username_1 Thanks for the pointer. I saw from requirements.txt that the source code requires tensorflow-gpu version 1.1.0 to run. Does tensorflow-gpu version 1.12 actually work for this source code?
webkom/lego-webapp
376484738
Title: Both sections are selected at the same time Question: username_0: At https://abakus.no/quotes/, if one goes to the unapproved section, both sections will look selected. <img width="407" alt="screenshot 2018-11-01 at 18 09 32" src="https://user-images.githubusercontent.com/23152018/47867277-5f4cc380-de01-11e8-896a-f9bf1475638e.png"> Answers: username_1: Closed due to #1377 Status: Issue closed
unexpectedpanda/retool
867966697
Title: Sega - Mega Drive - Genesis - 2 Compilation Roms can be removed Question: username_0: These 2 Roms are redundant. The 1st contains the 6 titles listed and the 2nd contains the 2 titles listed. + Arcade Legends Sega Mega Drive ~ Arcade Legends Sega Genesis ~ Mega Drive Play TV (World) - Altered Beast (USA, Europe) - Dr. Robotnik's Mean Bean Machine (USA) - Flicky (USA, Europe) - Golden Axe (World) (Rev A) - Kid Chameleon (USA, Europe) - Sonic The Hedgehog (USA, Europe) + Arcade Legends Street Fighter II' - Special Champion Edition ~ Mega Drive Play TV 3 (World) - Daimakaimura ~ Ghouls'n Ghosts (Japan, USA) - Street Fighter II' - Special Champion Edition (USA) Answers: username_0: Updated ^ username_0: Sports Games boots straight to Super Volleyball. There's no game selection or any indication of one aside from the size of the rom. I personally booted every compilation rom listed in this issue and verified the games listed above manually. I tested Sports Games on 3 emulators all with the same results. username_1: Cheers for the work on this. Cross region compilations are complicated, so it might be some time before I get to this issue.
oursky/oursky-site
92825810
Title: Animated stars Question: username_0: My two cents: maybe it would be even better to change the animated stars under “We craft state of the art applications” on the homepage to a gold/yellow colour? Answers: username_1: Actually, quite a few people I sent this website to told me they feel the stars are weird / meaningless to them... hmmm @username_2 username_2: I have updated this section with the client's quote Status: Issue closed
iranianpep/ajax-live-search
265525775
Title: Split code (client <--> server) Question: username_0: Hi, I'm trying to split the code: one part on the client side, the other on the server side, with communication done through an API. Does anyone have any recommendations about what should live on the client side and what on the server side? Basically, the idea is to use the API to search the server DB and return the results to the client side. Cheers, Nema
RSS-Bridge/rss-bridge
565805461
Title: nala_cat - Instagram Bridge failed with error 429 Question: username_0: Error message: `The requested resource cannot be found! Please make sure your input parameters are correct! cUrl error: (0) PHP error: ` Query string: `action=display&bridge=Instagram&context=Username&u=nala_cat&media_type=all&format=Html` Version: `dev.2019-12-01` Answers: username_1: It works on my side. Could you upgrade to the latest release ? Status: Issue closed username_2: Closing because `HTTP 429` is a generic rate limiter.
idaholab/raven
617525905
Title: [TASK] Cross-directory prereqs Question: username_0: -------- Issue Description -------- It would be helpful to be able to make a test a prerequisite test for a second tests in another directory. For example: ``` DirA/ inputA1.xml inputA2.xml tests DirB/ inputB1.xml inputB2.xml tests ``` and I would like to make test inputB2 depend on test inputA1. This is needed specifically for HERON where a set of ARMA ROMs are trained, then many tests in other directories make use of those ROMs. For now, static versions of the serialized ROM have been added to the repository, but this does not assure continuous integration. ---------------- For Change Control Board: Issue Review ---------------- This review should occur before any development is performed as a response to this issue. - [x] 1. Is it tagged with a type: defect or task? - [x] 2. Is it tagged with a priority: critical, normal or minor? - [x] 3. If it will impact requirements or requirements tests, is it tagged with requirements? - [x] 4. If it is a defect, can it cause wrong results for users? If so an email needs to be sent to the users. - [x] 5. Is a rationale provided? (Such as explaining why the improvement is needed or why current code is wrong.) ------- For Change Control Board: Issue Closure ------- This review should occur when the issue is imminently going to be closed. - [ ] 1. If the issue is a defect, is the defect fixed? - [ ] 2. If the issue is a defect, is the defect tested for in the regression test system? (If not explain why not.) - [ ] 3. If the issue can impact users, has an email to the users group been written (the email should specify if the defect impacts stable or master)? - [ ] 4. If the issue is a defect, does it impact the latest release branch? If yes, is there any issue tagged with release (create if needed)? - [ ] 5. If the issue is being closed without a pull request, has an explanation of why it is being closed been provided?
medly/medly-components
1120542940
Title: bug: multi-select not closed on dropdown icon click Question: username_0: ### Describe the bug When you have more than one multi-select on a screen and use the chevron (dropdown) icon to open several of them, they all stay open, but the focus is only on the one opened by clicking its input field rather than its chevron icon. ### To Reproduce Provide step-by-step instructions to reproduce the bug: 1. Go to [Multi-select](https://www.medlycomponents.com/?path=/docs/core-multiselect--default-story) 2. Click on the `chevron icon` (dropdown) of multiple multi-selects 3. See the error ### Expected behavior When a click lands outside a multi-select, including on the `chevron icon` of another one, the other multi-selects should lose focus and close. ### Screenshots <img width="1766" alt="Screenshot 2022-02-01 at 4 49 43 PM" src="https://user-images.githubusercontent.com/43134750/151959510-dd86216b-fdc4-43f7-ac38-7fe892e1f926.png"> ### Package versions - Latest version as of `1 feb 2022` on [Medly components](medlycomponents.com) ### Additional context This only happens when the multi-select is opened by clicking the `chevron icon` and not by clicking the input field.
sciyoshi/redmine-slack
311864030
Title: Message goes to projects that have not set a Slack channel Question: username_0: When a Redmine wiki is updated, a message is sent to other channels even though notifications are not set up for the project. Project A has no channel setting in Redmine. Project B has channel settings in Redmine. When updating Project A's wiki, a message is still sent to Slack. redmine-slack version: 0.2 Thanks in advance.
harvard-lil/capstone
430457197
Title: Researcher unable to request temporary access twice Question: username_0: - An expiration date is set on the researcher's unlimited access the first time they get unlimited access - access expires - researcher requests access again at some point in time in the future - access is "granted", shows expired expiration date, access is useless<issue_closed> Status: Issue closed
nextstrain/ncov
867897429
Title: logistic growth needs units and symmetric color map Question: username_0: Our logistic growth coloring ![image](https://user-images.githubusercontent.com/8379168/116118140-4c0ecc00-a6bd-11eb-919c-4b25cdbc221d.png) should specify the unit in the legend (presumably %/week) and ideally would use a color map where no-growth has a neutral color. Answers: username_1: It would be good to update the "Color by" title to be `Logistic growth per year`. In terms of color ramp, I think this requires an Auspice fix. We had originally considered a `range` for https://github.com/nextstrain/augur/blob/master/augur/data/schema-export-v2.json#L157, but it was never implemented. Given that this is a continuous coloring, we don't have a way to enforce a color ramp at the moment. cc @username_2 username_2: A couple of improvements should really help here: - [ ] Allow `unit` to be specified in the `colorings` (e.g. "%/year") and then Auspice would add this to the legend entries - [ ] We currently allow non-continuous scales to provide a scale where values are given hexes, and missing values are given greys. For continuous scales we could allow one to define certain values and have d3 interpolate between them. E.g. for this example the following would look good:
```js
{
  "key": "logistic_growth",
  "title": "Logistic growth",
  "unit": "%/year",
  "type": "continuous",
  "scale": [
    ["-20", "#3350ff"],   // blue
    ["-0.11", "#38d076"], // green
    ["-0.1", "#d0d0d0"],  // grey
    ["0.1", "#d0d0d0"],   // grey
    ["0.11", "#f7e944"],  // yellow
    ["10", "#ff0808"]     // red
  ]
},
```
username_2: @huddlej -- I'm confident that the spec in #1340 will be available in live nextstrain.org within the next few days, but to close this issue we'll need a `scale` and `legend` entry added to the coloring info for `logistic_growth`. P.S. There's something funny going on with the dataset JSON that's produced from the pipeline. The coloring definition for `logistic_growth` (and others, like `S1_mutations`) is repeated in the dataset JSON. Presumably the first one comes from the `global_auspice_config.json` (or similar) and the second is added to the array by one of our scripts.
ex-aws/ex_aws_s3
342971030
Title: rm_rf/1 for directory removal. Question: username_0: I'm looking for an operation for S3 directory removal. Locally `File.rm_rf!/1` was used, but unfortunately there seem to be no directory operations in `ex_aws_s3`. If there's no direct way to do it, then how about a function that receives a folder and removes all the files in it recursively? I think I can make a PR after some discussion. Answers: username_1: Hey @username_0, the main goal of the ExAws.S3 library is to provide an Elixir wrapper to the S3 API, and the S3 API does not have anything like `rm -rf`. This is in part because S3 doesn't actually have directories. However I think you can achieve what you want simply enough:
```
bucket = "some-bucket"

bucket
|> ExAws.S3.list_objects(prefix: "path/to/dir")
|> ExAws.stream!
|> Enum.map(fn %{key: key} ->
  bucket
  |> ExAws.S3.delete_object(key)
  |> ExAws.request!
end)
```
As you can see it's relatively simple to combine existing features from this library to achieve a lot of things. username_0: Thanks for the response. Your comment was helpful enough for me to rewrite my own S3 policy. Happily, an S3 prefix can be applied to `path + prefix_of_filename`. For example, `upload/images/8/89846b0a-8e62-4bd0-8e77-6e4e8d7a6af4_i1_original.jpg` and `upload/images/8/89846b0a-8e62-4bd0-8e77-6e4e8d7a6af4_i1_thumb.jpg` can both be found by `list_objects(bucket, prefix: "upload/images/8/89846b0a-8e62-4bd0-8e77-6e4e8d7a6af4")`. So I replaced the storage policy: instead of a folder per UUID, a folder per first character of the UUID, for better performance and readability. And how about changing some words in the documentation? For example, `delete_object(bucket, object, opts \\ [])` could be `delete_object(bucket, object_key, opts \\ [])`, because I got confused and tried to pass the `body.content` of `list_objects/2`. Status: Issue closed
Esri/crowdsource-reporter-scripts
291994690
Title: Exception raised when no substitutions provided Question: username_0: At the line below, substitutions can be None, which raises an exception falsely reporting that there was a failure reading the email template: https://github.com/Esri/crowdsource-reporter-scripts/blob/single-tool/servicefunctions.py#L207
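A minimal, hypothetical Python sketch of the kind of guard the report suggests; the function name and template handling here are illustrative only and are not the project's actual code:

```python
def apply_substitutions(template, substitutions=None):
    # Treat a missing substitutions value as "no substitutions" instead of letting
    # iteration over None raise and be misreported as a template-read failure.
    for placeholder, value in (substitutions or {}).items():
        template = template.replace(placeholder, value)
    return template
```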
go1com/moodle-mod_goone
495664318
Title: Invalid name of the upgrade function Question: username_0: It seems the current upgrade code has been generated by an older version of the plugins skeleton generator that contained a bug https://github.com/username_0/moodle-tool_pluginskel/issues/90 As a result, the db/upgrade.php defines a function `xmldb_mod_goone_upgrade()` which should however be named `xmldb_goone_upgrade()`. The current code would fail with ``` Fatal error: Call to undefined function xmldb_goone_upgrade() in .../lib/upgradelib.php on line 726 ``` Answers: username_1: Thanks @username_0, resolved this by renaming the function. Status: Issue closed
MCJack123/craftos2
557131768
Title: fs.list() crash Question: username_0: **Bug description** Running fs.list() without arguments inside the interactive lua prompt crashes the craftos-pc emulator. (Running fs.list("/") runs just fine.) **To Reproduce** Steps to reproduce the behavior: 1. Open craftos-pc from the terminal by typing craftos 2. Open the interactive lua prompt inside craftos 3. Type fs.list() 4. Craftos-pc crashes, with the following std-out error in the terminal: ``` terminate called after throwing an instance of 'std::logic_error' what(): basic_string::_M_construct null not valid Aborted (core dumped) ``` **Expected behavior** I don't remember what the CC behaviour is (lua error/listing working directory?). But definitely not crashing craftos. **Environment (please complete the following information):** - OS: Linux mint 19.3, x86-64 - Version latest from the ubuntu ppa - Compiled from source? No - file system: ext4 **Further notes/testing** - It happens when starting it using the .desktop shortcut, but the error message is not seen - the crash happens when fs.list() is inside a lua script saved on the disk and executed - running apt list --installed | grep lua outputs: ``` liblua5.1-0/bionic,now 5.1.5-8.1build2 amd64 [installed,automatic] libluajit-5.1-2/bionic,now 2.1.0~beta3+dfsg-5.1 amd64 [installed,automatic] libluajit-5.1-common/bionic,bionic,now 2.1.0~beta3+dfsg-5.1 all [installed,automatic] libtexlua52/bionic-updates,bionic-security,now 2017.20170613.44572-8ubuntu0.1 amd64 [installed,automatic] libtexluajit2/bionic-updates,bionic-security,now 2017.20170613.44572-8ubuntu0.1 amd64 [installed,automatic] texlive-luatex/bionic,bionic,now 2017.20180305-1 all [installed,automatic] ``` Before installing craftos-pc, I only had luajit installed, I assume craftOS downloaded lua 5.1 (is it liblua5.1?) Let me know if you need more info. Answers: username_1: * [`fs.list`](https://wiki.computercraft.cc/Fs.list) requires one argument - the path to list; for example `fs.list("")` will list the root directory. * I forgot to check for invalid arguments in `fs.list`, so it should now error if you pass an invalid (or no) argument. Status: Issue closed
TijnVdVelde/4S_Handleidingen
1005150075
Title: 3a. Add a column, DB counter +1 Question: username_0: Keep track of the number of visits to each manual. Add a column in the DB that holds a counter for every manual. On every view of a manual (when a manual's page is opened), increment the counter by 1.<issue_closed> Status: Issue closed
allegro/allegro-api
570799239
Title: The /sale/offers resource Question: username_0: The /sale/offers resource (the offer list) returns offers with the old description. Is it possible to pass information in the request parameters that we only want offers with the new description, or is there some piece of information by which we can recognize which offer has the old description? Because fetching any details of an offer with the old description through the /sale/offers/{{offer.id}} resource ends with a 404 error. Answers: username_1: What do you mean by "old description"? Can you give a concrete example of the situation you described? We return status 404 for offers that have been archived or deleted (drafts). username_0: Indeed, the offer had been archived. Thank you for the explanation.
Project-3-UPC-DDV-BCN/Project3
321865578
Title: Campaign should be a mission Question: username_0: * Description: The Campaign Button should say mission instead * Severity: C: Gameplay/art issues but don't prevent you from playing * Steps to reproduce: Look at the main menu UI, it is the first option. * Screenshot: ![image](https://user-images.githubusercontent.com/17142220/39862499-4f3452b6-5444-11e8-941b-5a0dce5dcf50.png) Answers: username_1: Should be fixed @username_0 Status: Issue closed
ikedaosushi/tech-news
373026673
Title: You still don't know how amazing 侍エンジニア塾 (Samurai Engineer Juku) is Question: username_0: You still don't know how amazing 侍エンジニア塾 is.<br> What is 侍エンジニア塾? Do you know 侍エンジニア塾? If you're a programmer, it's that site that shows up near the top of the results whenever you google something. Just the other day, 侍エンジニア塾 was getting flamed pretty hard. みなさんやっ<br> https://ift.tt/2S3uvFD
369076539
Title: Wrong response when resource already exists Question: username_0: **Describe the bug** To me it looks like the response from the EnMasse API server, in case a resource already exists, is wrong: ~~~ Error from server (AlreadyExists): buildconfigs.build.openshift.io "fabric8-s2i-java-custom-build" already exists Error from server (Conflict): Failure executing: POST at: https://kubernetes.default.svc/api/v1/namespaces/enmasse/configmaps. Message: configmaps "hono.default" already exists. Received status: Status(apiVersion=v1, code=409, details=StatusDetails(causes=[], group=null, kind=configmaps, name=hono.default, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=configmaps "hono.default" already exists, metadata=ListMeta(resourceVersion=null, selfLink=null, additionalProperties={}), reason=AlreadyExists, status=Failure, additionalProperties={}). Error from server (Conflict): User 'mqtt' already exists in [[mqtt]] ~~~ As you can see, the first request (a build config) returns `AlreadyExists`, where the EnMasse request returns `Conflict`. **To Reproduce** Create some resource twice. **Expected behavior** I would expect the result `AlreadyExists`. **Screenshots** See log above. Answers: username_1: Is this a consistent behavior for all Kubernetes resources? I'm not sure what http response code is used to get AlreadyExists. username_0: It looks to me that way: ~~~ ➜ openshift_s2i git:(feature/enmasse_crd_1) ✗ oc process -f hono-template.yml | oc create -f - messaginguser.user.enmasse.io "default.http" created Error from server (AlreadyExists): configmaps "hono-configuration" already exists Error from server (AlreadyExists): secrets "hono-secrets" already exists Error from server (AlreadyExists): configmaps "hono-example-data" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "centos" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "fabric8-s2i-java" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "fabric8-s2i-java-custom" already exists Error from server (AlreadyExists): buildconfigs.build.openshift.io "fabric8-s2i-java-custom-build" already exists Error from server (Conflict): Failure executing: POST at: https://kubernetes.default.svc/api/v1/namespaces/enmasse/configmaps. Message: configmaps "hono.default" already exists. Received status: Status(apiVersion=v1, code=409, details=StatusDetails(causes=[], group=null, kind=configmaps, name=hono.default, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=configmaps "hono.default" already exists, metadata=ListMeta(resourceVersion=null, selfLink=null, additionalProperties={}), reason=AlreadyExists, status=Failure, additionalProperties={}). 
Error from server (Conflict): User 'mqtt' already exists in [[mqtt]] Error from server (AlreadyExists): imagestreams.image.openshift.io "influxdb" already exists Error from server (AlreadyExists): persistentvolumeclaims "influxdb-pvc" already exists Error from server (AlreadyExists): deploymentconfigs.apps.openshift.io "influxdb" already exists Error from server (AlreadyExists): services "influxdb" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "hono-adapter-mqtt-vertx" already exists Error from server (AlreadyExists): buildconfigs.build.openshift.io "hono-adapter-mqtt-vertx-build" already exists Error from server (AlreadyExists): secrets "hono-mqtt-secrets" already exists Error from server (AlreadyExists): configmaps "hono-adapter-mqtt-vertx-config" already exists Error from server (AlreadyExists): deploymentconfigs.apps.openshift.io "hono-adapter-mqtt-vertx" already exists Error from server (AlreadyExists): services "hono-adapter-mqtt-vertx" already exists Error from server (Invalid): Service "hono-adapter-mqtt-vertx-nodeport" is invalid: spec.ports[0].nodePort: Invalid value: 31883: provided port is already allocated Error from server (AlreadyExists): routes.route.openshift.io "hono-adapter-mqtt-vertx" already exists Error from server (AlreadyExists): routes.route.openshift.io "hono-adapter-mqtt-vertx-sec" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "hono-adapter-http-vertx" already exists Error from server (AlreadyExists): buildconfigs.build.openshift.io "hono-adapter-http-vertx-build" already exists Error from server (AlreadyExists): secrets "hono-http-secrets" already exists Error from server (AlreadyExists): configmaps "hono-adapter-http-vertx-config" already exists Error from server (AlreadyExists): deploymentconfigs.apps.openshift.io "hono-adapter-http-vertx" already exists Error from server (AlreadyExists): services "hono-adapter-http-vertx" already exists Error from server (AlreadyExists): routes.route.openshift.io "hono-adapter-http-vertx" already exists Error from server (AlreadyExists): routes.route.openshift.io "hono-adapter-http-vertx-sec" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "hono-service-device-registry" already exists Error from server (AlreadyExists): buildconfigs.build.openshift.io "hono-service-device-registry-build" already exists Error from server (AlreadyExists): configmaps "hono-service-device-registry-config" already exists Error from server (AlreadyExists): persistentvolumeclaims "hono-device-registry-pvc" already exists Error from server (AlreadyExists): deploymentconfigs.apps.openshift.io "hono-service-device-registry" already exists Error from server (AlreadyExists): services "hono-service-device-registry" already exists Error from server (AlreadyExists): routes.route.openshift.io "hono-service-device-registry-https" already exists Error from server (AlreadyExists): imagestreams.image.openshift.io "hono-service-auth" already exists Error from server (AlreadyExists): buildconfigs.build.openshift.io "hono-service-auth-build" already exists Error from server (AlreadyExists): configmaps "hono-service-auth-config" already exists Error from server (AlreadyExists): deploymentconfigs.apps.openshift.io "hono-service-auth" already exists Error from server (AlreadyExists): services "hono-service-auth" already exists ~~~ username_0: Setting the log level to 10, it get the following: ~~~ I1011 13:42:42.446140 969 round_trippers.go:405] POST 
https://XXX:8443/apis/template.openshift.io/v1/namespaces/hono/templates 409 Conflict in 131 milliseconds I1011 13:42:42.446160 969 round_trippers.go:411] Response Headers: I1011 13:42:42.446167 969 round_trippers.go:414] Date: Thu, 11 Oct 2018 11:42:42 GMT I1011 13:42:42.446173 969 round_trippers.go:414] Cache-Control: no-store I1011 13:42:42.446178 969 round_trippers.go:414] Content-Type: application/json I1011 13:42:42.446183 969 round_trippers.go:414] Content-Length: 264 I1011 13:42:42.446354 969 request.go:897] Response Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"templates.template.openshift.io \"hono-tenant\" already exists","reason":"AlreadyExists","details":{"name":"hono-tenant","group":"template.openshift.io","kind":"templates"},"code":409} I1011 13:42:42.446558 969 helpers.go:201] server response object: [{ "kind": "Status", "apiVersion": "v1", "metadata": {}, "status": "Failure", "message": "error when creating \"hono-tenant-template.yml\": templates.template.openshift.io \"hono-tenant\" already exists", "reason": "AlreadyExists", "details": { "name": "hono-tenant", "group": "template.openshift.io", "kind": "templates" }, "code": 409 }] F1011 13:42:42.446660 969 helpers.go:119] Error from server (AlreadyExists): error when creating "hono-tenant-template.yml": templates.template.openshift.io "hono-tenant" already exists ~~~ So it looks you still get `409` but with an additional response. username_1: Ok, I think what is going on is that kubernetes sets 'reason' in the response to 'AlreadyExists', while EnMasse sets 'Conflict'. Should be easy to fix. username_2: I am curious why the contents of the [reason-phrase](https://tools.ietf.org/html/rfc7231#section-6.1) is significant to you? I wouldn't expect an application to base processing decisions on the reason phrase as the values are "only recommendations -- they can be replaced by local equivalents without affecting the protocol". EnMasse is already using the recommended reason-phrase for the 409 code "Conflict". username_0: To me: ~~~ Error from server (AlreadyExists): buildconfigs.build.openshift.io "fabric8-s2i-java-custom-build" already exists ~~~ is more clear than: ~~~ Error from server (Conflict): Failure executing: POST at: https://kubernetes.default.svc/api/v1/namespaces/enmasse/configmaps. Message: configmaps "hono.default" already exists. Received status: Status(apiVersion=v1, code=409, details=StatusDetails(causes=[], group=null, kind=configmaps, name=hono.default, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=configmaps "hono.default" already exists, metadata=ListMeta(resourceVersion=null, selfLink=null, additionalProperties={}), reason=AlreadyExists, status=Failure, additionalProperties={}). ~~~ If `oc` works that way, I would support that properly. Status: Issue closed
craffel/pretty-midi
181032126
Title: Round in BPM-to-microseconds-per-beat conversion Question: username_0: e.g. here: https://github.com/username_0/pretty-midi/blob/master/pretty_midi/pretty_midi.py#L1246 This is how it was done when using the `python-midi` backend, so it is currently this way for backwards compatibility. `mido` rounds, which makes a lot more sense: https://github.com/olemb/mido/blob/master/mido/midifiles_meta.py#L137 We should switch to using this convenience function.
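For context, a minimal Python sketch of the arithmetic under discussion (illustrative only, not the actual pretty_midi or mido code): MIDI set-tempo events store microseconds per quarter note, and 60000000 / bpm rarely divides evenly, so truncating and rounding can differ by one microsecond.

```python
# Illustrative sketch of the truncate-vs-round difference discussed above;
# not the actual pretty_midi or mido implementation.
def bpm_to_usecs_truncated(bpm):
    return int(60000000 / bpm)           # old python-midi-style behaviour

def bpm_to_usecs_rounded(bpm):
    return int(round(60000000 / bpm))    # rounding, as mido does

print(bpm_to_usecs_truncated(121.0))  # 495867
print(bpm_to_usecs_rounded(121.0))    # 495868
```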
cs2103-ay1819s2-w10-2/main
413959146
Title: Familiarization of Different Components of Projects Question: username_0: To make our workflow easier, we split the below components such every component is known in-depth by at least one person. Although every command implemented by a person will be done so from end-to-end. any queries with regards to a particular component can be directed to the specialist in charge. Logic - Jothi & Daryl Model - Dehui Storage - Shune UI - Jeff
jlippold/tweakCompatible
414072154
Title: `Clear Badges` working on iOS 12.1 Question: username_0: ``` { "packageId": "com.clezz.clearbadges", "action": "working", "userInfo": { "arch32": false, "packageId": "com.clezz.clearbadges", "deviceId": "iPhone9,4", "url": "http://cydia.saurik.com/package/com.clezz.clearbadges/", "iOSVersion": "12.1", "packageVersionIndexed": true, "packageName": "Clear Badges", "category": "Tweaks", "repository": "BigBoss", "name": "Clear Badges", "installed": "1.0.1", "packageIndexed": true, "packageStatusExplaination": "A matching version of this tweak for this iOS version could not be found. Please submit a review if you choose to install.", "id": "com.clezz.clearbadges", "commercial": false, "packageInstalled": true, "tweakCompatVersion": "0.1.0", "shortDescription": "Double-tap app icon while it jiggles", "latest": "1.0.1", "author": "<NAME>", "packageStatus": "Unknown" }, "base64": "<KEY> "chosenStatus": "working", "notes": "" } ``` Answers: username_0: Works iPhone 7+ iOS 12.1 Status: Issue closed
Creators-of-Create/Create
871125212
Title: Deployer causing a game-crash & world destruction. Question: username_0: So, the issue appears to be as follows: Anytime a deployer, of any sort bearing any item in the 1.16.4 version interacts with a spot that contains a block aside from air, the game will immediately crash. This has happened when: I used a deployer holding a name-tag on a nixie tube I used a deployer holding an Oak Log to place said log, and then it crashed when it repeated the process (While the log was still present, and holding a new log) I used a deployer with a Redstone Block to place said block, then on interacting with an empty hand, the game still crashed. Is there a way to fix this issue? They have a wide range of possible uses, but at this stage I'm approaching avoiding the block altogether. Answers: username_1: update to 1.16.5 username_0: I did exactly that and it worked just fine, far as I've tested. Thank you! username_1: you're welcome! close this issue btw, as it is resolved now Status: Issue closed
department-of-veterans-affairs/va.gov-team
1073726901
Title: Add Support Response Templates to AVA Question: username_0: ## Issue Description _Now that we are handling support issues in AVA, we want to add our response templates there for more efficient handing of cases_ --- ## Tasks - [ ] _Add support response template to AVA backend_ ## Acceptance Criteria - [ ] _All applicable support templates have been added to AVA_ --- ## How to configure this issue - [ ] **Attached to a Milestone** (when will this be completed?) - [ ] **Attached to an Epic** (what body of work is this a part of?) - [ ] **Labeled with Team** (`product support`, `analytics-insights`, `operations`, `service-design`, `tools-be`, `tools-fe`) - [ ] **Labeled with Practice Area** (`backend`, `frontend`, `devops`, `design`, `research`, `product`, `ia`, `qa`, `analytics`, `contact center`, `research`, `accessibility`, `content`) - [ ] **Labeled with Type** (`bug`, `request`, `discovery`, `documentation`, etc.)
soujava/vagas-java
478043430
Title: [São Paulo, <NAME> - SP] Full Stack Java / AWS / Vuejs Developer Question: username_0: Requirements: Knowledge of Java and AWS; 2 years of experience; Experience with Vue and Vuex; Experience with SPAs; Strong command of HTML, CSS and JavaScript; CSS methodologies (BEM, Atomic, OOCSS, ...); CSS pre-processors (Sass, PostCSS, Less, ...); Understanding of how the browser works; Knowledge of HTTP, HTTP/2 and REST APIs; Automated tests (Jest, Jasmine, Selenium); Ability to use Git; Writing clean, readable, cross-browser and cross-device code; SEO; Performance; Nice to have: Accessibility; ES6 / ES7; Webpack; JavaScript design patterns;
nodejs/nodejs.org
386966404
Title: Re-enstate link to NiM in debug documentation. Question: username_0: As no response has been received to https://github.com/nodejs/nodejs.org/issues/1908 since nearly a week, I am opening this issue. @username_6 Would be great to get some feedback on this. For the record, I've compared NiM's Chrome web store page with that of 10 different companies, all of whom have items in Google's Chrome web store, FOUR (EBay, Paypal, Walmart, and LinkedIn, of whom are among "10 Global Companies Using Node.js in Production" according to Google. And none of which I would think the Node.js community would have a problem "recommend"ing if that is what it means to add a link on the https://nodejs.org/en/docs/guides/debugging-getting-started/ page. I have compiled current pictures of all 10 companies’ chrome web store pages, as well as the privacy policy pages for each of the 5 which actually have them referenced, and uploaded them to Google photos (see link below). As far as the other 5, there are a number of valid reasons why they may not have privacy policy links, just as NiM did not, but I don’t want to digress. https://photos.app.goo.gl/y4TN5dT1JeNbED2KA I can say with strong conviction that data collected by NiM is minuscule by every measure when compared to that collected by these larger companies. Further, no data has ever been used for ill purposes and will not ever deviate from that found in our new privacy policy (link again: https://app.termly.io/document/privacy-policy/04164179-f943-4e87-ac8b-5afd0367dc6c#infocollect) And again, @ea167’s statement that “this requirement was added recently.” is simply NOT the case as can be determined by reading the GitHub project history. THIS is where/when the extra permissions were added on Feb 9, 2017, close to 2 years ago. As the removal was partly predicated on that statement, I bring it up again. I do regret not having a privacy policy in place sooner. To my defense, unlike these larger organizations, I don’t really have spare man/woman power to go around. That said, the focus for me has been on trying to make NiM great for the betterment of like-minded Node.js developers. I should point out that I make zero monetary gain from NiM, and it is a side project for me. So, if they weren’t before, I believe that all of the bases are now covered. I would like to get the Chrome web store link for NiM back and would be more than happy to submit a PR for such. Please advise... -Respectfully to all Answers: username_0: Thank you again, Myles, for nudging this. username_1: I'm fine in having that link in. username_0: Not sure my messages were going thru on https://www.youtube.com/c/nodejs+foundation/live. I will submit a PR for the change. username_2: I would prefer not recommending _any_ browser extensions, for security reasons. username_3: Perhaps we should keep it generic and mention that the Chrome Web store provides additional tools that help with debugging. We can even add a link with a keyword search that would bring up NiM, but would also bring up other extensions that satisfy the keywords. Nevertheless, if NiM were the first of many such helpers and would be closely associated with the keywords it would, at least for the beginning, be the first search result. We can then maybe tune the keywords to keep a sufficient distance between making a specific recommendation while allowing other tools to show up as they are being developed, and being too generic. 
username_2: @username_3 that seems ok to me, but imo we shouldn't _actively_ promote the usage of browser extensions for this task. A note that those exist (without explicitly calling people to install those, and without linking to any specific one) would be fine in my opinion. username_0: I sort of get the hand waving and wide eyes about "browser extensions" and "security reasons" by some but in all fairness, as developers, I think we are held to a slightly higher level of sophistication as it relates to understanding how to stay safe. Using that same safety paradigm, should we pull all references to any Inspector Clients from the debugging-getting-started page?! After all, we are all aware of the NPM "security" issues of late, and I am safe to say that a full-on executable has much more ability to wreak havoc on a system vs a Javascript sandboxed web application. NiM wasn't even listed as a primary option but rather a sub-option to Chrome DevTools, a very suitable place for it. Of course, we shouldn't remove listings to other Inspector Clients. If anything, more should be added if they are deemed as helpful as NiM has proven to be (see the rating/feedback). Many have asked or commented on the fact that they wished they had found NiM sooner! This conversation seems to be diverging from the initial issue (privacy policy). The issue was resolved over two weeks ago. And now it would be great to reinstate the URL. username_0: @nodejs/tsc has delegated the issue back to the web working group. Apparently that was the wrong place to surface this issue. Can someone from that group chime in please. @username_6 Status: Issue closed username_4: This was discussed in the TSC meeting this week: Consensus was that it is not the time to escalate to TSC and let website team lead the discussion, reach a discussion. username_5: As no response has been received to https://github.com/nodejs/nodejs.org/issues/1908 since nearly a week, I am opening this issue. @username_6 Would be great to get some feedback on this. For the record, I've compared NiM's Chrome web store page with that of 10 different companies, all of whom have items in Google's Chrome web store, FOUR (EBay, Paypal, Walmart, and LinkedIn, of whom are among "10 Global Companies Using Node.js in Production" according to Google. And none of which I would think the Node.js community would have a problem "recommend"ing if that is what it means to add a link on the https://nodejs.org/en/docs/guides/debugging-getting-started/ page. I have compiled current pictures of all 10 companies’ chrome web store pages, as well as the privacy policy pages for each of the 5 which actually have them referenced, and uploaded them to Google photos (see link below). As far as the other 5, there are a number of valid reasons why they may not have privacy policy links, just as NiM did not, but I don’t want to digress. https://photos.app.goo.gl/y4TN5dT1JeNbED2KA I can say with strong conviction that data collected by NiM is minuscule by every measure when compared to that collected by these larger companies. Further, no data has ever been used for ill purposes and will not ever deviate from that found in our new privacy policy (link again: https://app.termly.io/document/privacy-policy/04164179-f943-4e87-ac8b-5afd0367dc6c#infocollect) And again, @ea167’s statement that “this requirement was added recently.” is simply NOT the case as can be determined by reading the GitHub project history. 
THIS is where/when the extra permissions were added on Feb 9, 2017, close to 2 years ago. As the removal was partly predicated on that statement, I bring it up again. I do regret not having a privacy policy in place sooner. To my defense, unlike these larger organizations, I don’t really have spare man/woman power to go around. That said, the focus for me has been on trying to make NiM great for the betterment of like-minded Node.js developers. I should point out that I make zero monetary gain from NiM, and it is a side project for me. So, if they weren’t before, I believe that all of the bases are now covered. I would like to get the Chrome web store link for NiM back and would be more than happy to submit a PR for such. Please advise... -Respectfully to all https://chrome.google.com/webstore/detail/nodejs-v8-inspector-manag/gnhhdgbaldcilmgcpfddgdbkhjohddkj username_5: Reopening as I don't think that this issue needs to be closed username_2: Yes, I believe that closing it was probably accidential. username_6: (First of all sorry for the delay in my answer. It was Christmas time after all and I've been busy like the rest of you, I guess.) I'm still strongly opposed to add this extension back to the list of recommendations in its current form. For the record again, **the email address of the user** (taken from the Google account used to install the extension on Chrome) **is sent every second to Google Analytics.** There is no need to collect personal data for the extension to work, and this can be considered a violation of the GDPR. To sum up the arguments you brought up against this so far: * "We added a visible link to our privacy policy, which the user accepts with downloading and using out extension." – That's a good thing, although the privacy policy mixes use of your website and the app. And it doesn't inform the user about the email collection. * "We want to collect the data to maybe contact the users in the future" – as @oncletom [already stated before](https://github.com/nodejs/nodejs.org/issues/1908#issuecomment-442070908), GDPR is about being given consent by the user. So if you want to collect data not required for using this extension, it has to be explicitly opt-in. If you just wanted to have statistical data about the daily usage of your extension, there'd be other options (like using a hash which can't be used to personally identify the user). * "Other companies are doing things much worse." – sorry, but I won't even discuss this statement. Now, if those changes are made I'd be willing to reconsider my position. On the other hand, we're already discussing more broadly in #1963, #1964 and #1965 what kind of third party content should be collected on the Node.js website at all (also making sure that it doesn't mean this content is promoted/endorsed by the Node.js Foundation in any way). /cc'ig @Trott (who rose two of the issues mentioned above) and @amiller-gh (as this is also relevant for @nodejs/website-redesign). Status: Issue closed username_0: So in the name of (... Greyskull?!), I've removed the emails from Google Analytics. My initial thought was to encrypt the data but ultimately the extra work isn't worth it to me. I don't profit from this and honestly, it's just not worth it to spend any more of my time on it. Please see the PR I merged into my own code base. AS WELL, I've pushed an updated version to the Chrome Web Store that tracks the change. That should kill all the friendly birds with just one stone! Cheers. 
username_0: As no response has been received to https://github.com/nodejs/nodejs.org/issues/1908 since nearly a week, I am opening this issue. @username_6 Would be great to get some feedback on this. For the record, I've compared NiM's Chrome web store page with that of 10 different companies, all of whom have items in Google's Chrome web store, FOUR (EBay, Paypal, Walmart, and LinkedIn, of whom are among "10 Global Companies Using Node.js in Production" according to Google. And none of which I would think the Node.js community would have a problem "recommend"ing if that is what it means to add a link on the https://nodejs.org/en/docs/guides/debugging-getting-started/ page. I have compiled current pictures of all 10 companies’ chrome web store pages, as well as the privacy policy pages for each of the 5 which actually have them referenced, and uploaded them to Google photos (see link below). As far as the other 5, there are a number of valid reasons why they may not have privacy policy links, just as NiM did not, but I don’t want to digress. https://photos.app.goo.gl/y4TN5dT1JeNbED2KA I can say with strong conviction that data collected by NiM is minuscule by every measure when compared to that collected by these larger companies. Further, no data has ever been used for ill purposes and will not ever deviate from that found in our new privacy policy (link again: https://app.termly.io/document/privacy-policy/04164179-f943-4e87-ac8b-5afd0367dc6c#infocollect) And again, @ea167’s statement that “this requirement was added recently.” is simply NOT the case as can be determined by reading the GitHub project history. THIS is where/when the extra permissions were added on Feb 9, 2017, close to 2 years ago. As the removal was partly predicated on that statement, I bring it up again. I do regret not having a privacy policy in place sooner. To my defense, unlike these larger organizations, I don’t really have spare man/woman power to go around. That said, the focus for me has been on trying to make NiM great for the betterment of like-minded Node.js developers. I should point out that I make zero monetary gain from NiM, and it is a side project for me. So, if they weren’t before, I believe that all of the bases are now covered. I would like to get the Chrome web store link for NiM back and would be more than happy to submit a PR for such. Please advise... -Respectfully to all https://chrome.google.com/webstore/detail/nodejs-v8-inspector-manag/gnhhdgbaldcilmgcpfddgdbkhjohddkj username_6: Waiting for the PR to land, closing this. Status: Issue closed username_7: Well, it seems like the "Sending Email to Google Analytics" which was removed in December apparently, [is back](https://github.com/username_0/NiM/commit/c7678cd2746c06dfb4d18c9c41a56ad14c1b9310#diff-f9b12bd72f40c9b8ed144484f0f12a5fR143) as of 29 days ago and it's currently in master: https://github.com/username_0/NiM/blob/master/background.js#L151 We should probably edit the docs and mention that this is still happening and it's a recurring issue that apparently comes back! @username_6 shall we reopen this issue? (I just discovered this problem in https://github.com/nodejs/node/issues/28185 a few hours ago on my own, and @joyeecheung guided me to here.) PS. A friendly note to the author of the plugin: Your great work is much appreciated. But as I suggested in my own ticket, maybe replacing the email address with a randomly generated UUID, stored in the client-side for your tracking consistency, can make everyone feel better. 
:) username_6: Thanks for reopening, let's discuss matters over there. username_0: Please note that encryption was added per the GDPR requirement of google and was accepted as a solution to the complaint way back when this issue arose the first time. No changes have been made specific/surrounding this issue since then. Please note that. This is in no way some attempt to sneak anything past anyone. I'm still going thru the code but I saw this and wanted to address it asap. username_0: Again, just as last time someone raised a concern about this code issue, it is/was nothing NEW. And with the help of those on the node teams, including even bringing this to a TSC meeting, I thought I took every applicable action/step to make things compliant. The extensive GitHub history shows every action I've taken including adding notices, encryption, and removing things. The steps I took back then were validated when after MUCH time (and I can only assume --inspect ion of the code changes) NiM was reinstated. I think I said it before, but there are no nefarious things going on. I really hope the solution this time is not simply chopping the head off and instead if there is still a VALID issue, that it be addressed and I am willing to make MORE changes to move in the direction of the greatest good. If the solutions that were good enough back then are no longer good enough, or there is some problem with the encryption scheme I implemented... please let me know. On Wed, Jun 12, 2019 at 8:47 AM 667 <<EMAIL>> wrote: > Please note that encryption was added per the GDPR requirement of google > and was accepted as a solution to the complaint way back when this issue > arose the first time. No changes have been made specific/surrounding this > issue since then. Please note that. This is in no way some attempt to > sneak anything past anyone. > > I'm still going thru the code but I saw this and wanted to address it asap. > > username_7: Let's centralize our conversation in https://github.com/nodejs/nodejs.org/issues/1908. Thanks.
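As a side note on the random-UUID suggestion made earlier in this thread: the idea is language-agnostic (the extension itself is JavaScript); a rough Python sketch, with an assumed storage location, could look like this:

```python
# Sketch of the "random client id instead of email" idea; the file path and
# JSON layout here are assumptions for illustration, not NiM's actual code.
import json, os, uuid

ID_FILE = os.path.expanduser("~/.analytics_client_id.json")

def get_client_id():
    if os.path.exists(ID_FILE):
        with open(ID_FILE) as f:
            return json.load(f)["client_id"]
    client_id = str(uuid.uuid4())          # random, carries no personal data
    with open(ID_FILE, "w") as f:
        json.dump({"client_id": client_id}, f)
    return client_id

print(get_client_id())  # stable across runs, so usage stats stay consistent
```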
poise/poise-ruby
300788581
Title: No SCL repository package for redhat 7.4 when trying to install Ruby 2.4 Question: username_0: Attempting to setup `2.4` runtime, produces the following error: ` Generated at 2018-02-27 20:49:32 +0000 PoiseLanguages::Error: No SCL repoistory package for redhat 7.4 /var/chef/cache/cookbooks/poise-languages/files/halite_gem/poise_languages/scl/mixin.rb:43:in `block in scl_package' /var/chef/cache/cookbooks/poise-languages/files/halite_gem/poise_languages/scl/mixin.rb:42:in `tap' /var/chef/cache/cookbooks/poise-languages/files/halite_gem/poise_languages/scl/mixin.rb:42:in `scl_package' /var/chef/cache/cookbooks/poise-languages/files/halite_gem/poise_languages/scl/mixin.rb:26:in `install_scl_package' /var/chef/cache/cookbooks/poise-ruby/files/halite_gem/poise_ruby/ruby_providers/scl.rb:45:in `install_ruby' /var/chef/cache/cookbooks/poise-ruby/files/halite_gem/poise_ruby/ruby_providers/base.rb:44:in `block in action_install' /var/chef/cache/cookbooks/poise/files/halite_gem/poise/helpers/subcontext_block.rb:54:in `instance_eval' /var/chef/cache/cookbooks/poise/files/halite_gem/poise/helpers/subcontext_block.rb:54:in `subcontext_block' /var/chef/cache/cookbooks/poise/files/halite_gem/poise/helpers/notifying_block.rb:67:in `notifying_block' /var/chef/cache/cookbooks/poise-ruby/files/halite_gem/poise_ruby/ruby_providers/base.rb:43:in `action_install' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/provider.rb:171:in `run_action' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource.rb:591:in `run_action' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/runner.rb:70:in `run_action' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/runner.rb:98:in `block (2 levels) in converge' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/runner.rb:98:in `each' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/runner.rb:98:in `block in converge' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/resource_list.rb:94:in `block in execute_each_resource' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/stepable_iterator.rb:114:in `call_iterator_block' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/stepable_iterator.rb:85:in `step' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/stepable_iterator.rb:103:in `iterate' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/stepable_iterator.rb:55:in `each_with_index' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/resource_collection/resource_list.rb:92:in `execute_each_resource' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/runner.rb:97:in `converge' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/client.rb:718:in `block in converge' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/client.rb:713:in `catch' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/client.rb:713:in `converge' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/client.rb:752:in `converge_and_save' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/client.rb:286:in `run' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:291:in `block in fork_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:279:in `fork' 
/opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:279:in `fork_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:244:in `block in run_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/local_mode.rb:44:in `with_server_connectivity' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:232:in `run_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application/client.rb:469:in `sleep_then_run_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application/client.rb:458:in `block in interval_run_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application/client.rb:457:in `loop' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application/client.rb:457:in `interval_run_chef_client' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application/client.rb:441:in `run_application' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/lib/chef/application.rb:59:in `run' /opt/chef/embedded/lib/ruby/gems/2.4.0/gems/chef-13.6.4/bin/chef-client:26:in `<top (required)>' /bin/chef-client:58:in `load' /bin/chef-client:58:in `<main>' ` rh-ruby24 is available in the SCL repository provided by Red Hat. Answers: username_0: Actually looks like the issue resides in https://github.com/poise/poise-languages/blob/master/lib/poise_languages/scl/mixin.rb Should I resubmit issue there? username_1: What version of the cookbook are you using? Status: Issue closed username_1: Oh never mind, it looks like the 2.4 SCL never made it into a release. You'll have to either set the package overrides or use it from master until that happens.
godotengine/godot
810144019
Title: Crash, when using set_script and emit_signal on AnimatedSprite Question: username_0: **Godot version:** Godot 3.2.4 rc 2 **OS/device including version:** Ubuntu 20.04 **Issue description:** When executing ``` extends Node2D var q_AnimatedSprite : AnimatedSprite = AnimatedSprite.new() func _ready() -> void: add_child(q_AnimatedSprite) func _process(_delta : float) -> void: if randi() % 2 == 0: print("Executing AnimatedSprite::set_script") var p_object_0 = Reference.new() q_AnimatedSprite.set_script(p_object_0) if randi() % 2 == 0: print("Executing AnimatedSprite::emit_signal") q_AnimatedSprite.emit_signal("") ``` Godot crashes and shows this backtrace ``` core/object.cpp:1490:46: runtime error: member call on null pointer of type 'struct Script' core/object.cpp:1490:46: runtime error: member access within null pointer of type 'struct Script' [1] /lib/x86_64-linux-gnu/libc.so.6(+0x46210) [0x7f1b6f4b5210] (??:0) [2] Object::emit_signal(StringName const&, Variant const**, int) (/home/rafal/Pulpit/godot3.2/core/object.cpp:1192 (discriminator 12)) [3] Object::_emit_signal(Variant const**, int, Variant::CallError&) (/home/rafal/Pulpit/godot3.2/core/object.cpp:1179) [4] MethodBindVarArg<Object>::call(Object*, Variant const**, int, Variant::CallError&) (/home/rafal/Pulpit/godot3.2/./core/method_bind.h:345 (discriminator 4)) [5] Object::call(StringName const&, Variant const**, int, Variant::CallError&) (/home/rafal/Pulpit/godot3.2/core/object.cpp:919 (discriminator 1)) [6] Variant::call_ptr(StringName const&, Variant const**, int, Variant*, Variant::CallError&) (/home/rafal/Pulpit/godot3.2/core/variant_call.cpp:1129 (discriminator 1)) [7] GDScriptFunction::call(GDScriptInstance*, Variant const**, int, Variant::CallError&, GDScriptFunction::CallState*) (/home/rafal/Pulpit/godot3.2/modules/gdscript/gdscript_function.cpp:1089) [8] GDScriptInstance::call_multilevel(StringName const&, Variant const**, int) (/home/rafal/Pulpit/godot3.2/modules/gdscript/gdscript.cpp:1254) [9] Node::_notification(int) (/home/rafal/Pulpit/godot3.2/scene/main/node.cpp:58) [10] Node::_notificationv(int, bool) (/home/rafal/Pulpit/godot3.2/./scene/main/node.h:46 (discriminator 14)) [11] CanvasItem::_notificationv(int, bool) (/home/rafal/Pulpit/godot3.2/./scene/2d/canvas_item.h:166 (discriminator 3)) [12] Node2D::_notificationv(int, bool) (/home/rafal/Pulpit/godot3.2/./scene/2d/node_2d.h:38 (discriminator 3)) [13] Object::notification(int, bool) (/home/rafal/Pulpit/godot3.2/core/object.cpp:931) [14] SceneTree::_notify_group_pause(StringName const&, int) (/home/rafal/Pulpit/godot3.2/scene/main/scene_tree.cpp:988) [15] SceneTree::idle(float) (/home/rafal/Pulpit/godot3.2/scene/main/scene_tree.cpp:528 (discriminator 3)) [16] Main::iteration() (/home/rafal/Pulpit/godot3.2/main/main.cpp:2113) [17] OS_X11::run() (/home/rafal/Pulpit/godot3.2/platform/x11/os_x11.cpp:3634) [18] /usr/bin/godot(main+0x125) [0x16e0bfb] (/home/rafal/Pulpit/godot3.2/platform/x11/godot_x11.cpp:57) [19] /lib/x86_64-linux-gnu/libc.so.6(__libc_start_main+0xf3) [0x7f1b6f4960b3] (??:0) [20] /usr/bin/godot(_start+0x2e) [0x16e0a1e] (??:?) ``` Answers: username_1: will make a PR for this in some time
BigAppleSoftball/ratingsManager
83474530
Title: Player Ratings Complete Spreadsheet doesn't match Team Spreadsheet Question: username_0: Hi Paige, Can you check the Main Spreadsheet with all teams: http://basl-manager.herokuapp.com/ratings vs Cyclones (example): http://basl-manager.herokuapp.com/teams/307/ratings I don't think it's syncing up: <NAME> is a 12 (new rating) on Cyclones page but is a 13 on All Teams Page. The all teams page is what i want to submit to reggie Status: Issue closed Answers: username_1: I think I fixed this do you still see it? username_0: Will check and let you know. Thanks!
DeflatedPickle/JustBiteTheDust
249873349
Title: Error while loading up Question: username_0: I got this error while loading the game: Caught exception from justbitethedust (java.lang.IllegalArgumentException: The name justbitethedust:psi_nugget has been registered twice, for com.deflatedpickle.justbitethedust.items.ItemBase@35bc1511 and com.deflatedpickle.justbitethedust.items.ItemBase@36a81475.) The instance has over 140 mods. Answers: username_1: Sorry for not commenting earlier, pretty sure you'll have stopped using this, but thank you for reporting it. It never occurred to me to check if a material had already been registered. Status: Issue closed
thriftrw/thriftrw-go
174878584
Title: Code generation should capitalize abbreviations Question: username_0: And I think there's a good reason for that. For example, if we have `foo.thrift`: ``` thrift struct User { 1: required i32 ZIPCode 2: required i32 groupId 3: required bool hidden } ``` We should definitely see the capitalization `User.ZIPCode` in our Go object, since that stays true to the thrift definition (this is not currently the case). I'm less sure if we can extend this to capitalize params like `groupId` -> `GroupID` without accidentally capitalizing `hidden` -> `HIDden`, which is probably why generation tools are generally exempt from the capitalization rule. Answers: username_1: We leave attributes that are capitalized alone. In the above example, we leave `ZIPCode` as-is. Changing `groupId` to `groupID` is a bit complicated with a lot of corner cases. It's simpler to leave it as the user specified. Based on our conversation offline: There is a bug in thriftrw where all-caps attributes which are not known abbreviations are converted into PascalCase. We should leave those alone too. username_1: Related: https://github.com/thriftrw/thriftrw-go/issues/208 Status: Issue closed username_1: This is sort of resolved to the extent that it's feasible to do so. Handling all possible cases for this is not very easy. Maybe we can address it with a `go.name` annotation. See also #223.
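To illustrate why an explicit list of known initialisms is the usual way around the `hidden` -> `HIDden` problem discussed above, here is a small Python sketch (not thriftrw's actual algorithm; the initialism set is just an example):

```python
import re

# Example subset of initialisms, in the spirit of Go linters; not thriftrw's list.
KNOWN_INITIALISMS = {"id", "url", "api", "http", "zip"}

def export_name(field):
    """Turn a Thrift field name into an exported Go-style identifier,
    uppercasing only whole words that are known initialisms."""
    words = re.findall(r"[A-Z]+(?![a-z])|[A-Z]?[a-z]+|\d+", field)
    return "".join(w.upper() if w.lower() in KNOWN_INITIALISMS else w.capitalize()
                   for w in words)

print(export_name("groupId"))  # GroupID
print(export_name("hidden"))   # Hidden  (not HIDden)
print(export_name("ZIPCode"))  # ZIPCode
```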
grpc/grpc
623015102
Title: Issue while building bazel with grpc_deps Question: username_0: **What version of gRPC and what language are you using?** bazel - 3.1.0 **What operating system (Linux, Windows,...) and version?** MacOS Catalina 10.15.4 **What did you do?** I added the following to the WORKSPACE. ``` load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive") http_archive( name = "com_github_grpc_grpc", sha256 = "b391a327429279f6f29b9ae7e5317cd80d5e9d49cc100e6d682221af73d984a6", strip_prefix = "grpc-93e8830070e9afcbaa992c75817009ee3f4b63a0", # v1.24.3 with fixes urls = ["https://github.com/grpc/grpc/archive/93e8830070e9afcbaa992c75817009ee3f4b63a0.zip"], ) load("@com_github_grpc_grpc//bazel:grpc_deps.bzl", "grpc_deps") grpc_deps() ``` **What did you expect to see?** A successful build. The goal is to generate .pb with rpc services **What did you see instead?** Error while Bazel build: `no such package '@zlib//': The repository '@zlib' could not be resolved and referenced by '//external:madler_zlib'` Status: Issue closed Answers: username_1: Hi, I'm sorry for closing this down. This forum is dedicated to bugs and feature requests. Please address questions to wider communities like stack overflow.
Nephera/FFRecruiter
539010787
Title: Job Base Classes Question: username_0: Base job classes are causing a lot of headaches for the project (there are a lot of exceptions that are being raised, much like BLU) which are hindering further development. While they're nice for clarity for new players (those who may not know that ROG eventually turns into NIN), the base classes are largely unnecessary. These classes should be omitted from all forms and validation checks going forward. Should include a legend or a question in the FAQ to direct new players to the correct jobs to use.<issue_closed> Status: Issue closed
bespokeinteractive/pharmacyapp
154170142
Title: Issue drug to patient Question: username_0: 1. Allow posting of comment to array & finally to save 2. Save the drug added to db & cashier 3. Redirect back to the find patient page after it's done 4. Don't put print thing on the finish button
planningcenter/developers
857017731
Title: people.v2.events.email.updated does not give old email that was updated Question: username_0: **Affected Product** Which product does this bug affect? Webhooks **Describe the bug** people.v2.events.email.updated webhook does not list the old email address that was changed. All the "relationship" URLs in the webhook only point to the updated email, not the old one. **To Reproduce** 1. Create an email address on a Person record 2. Change the email address **Expected behavior** I would expect the webhook to contain the old and updated email address. To update downstream systems, the old email is necessary. Without this information, the webhook is useless. **Screenshots** **Additional Context:** **Additional context** ## I have.. - [ x] Reviewed the documentation found at https://developer.planning.center/docs - [ x] Searched for previous issues reporting this bug - [ x] Removed all private information from this issue (credentials, tokens, emails, phone numbers, etc.) - [x] Reviewed my issue for completeness Answers: username_1: @username_0 Sorry for the delayed response. The behaviour you are seeing in webhooks is actually the expected behaviour. When a resource is updated or destroyed, the payload will include information about the resource such as its id and type. Furthermore, when it is updated, it will include its new, updated attributes. However, the information we do provide is enough to track changes downstream. Other integrations or applications that use webhooks will simply keep a local copy of the attributes of the resource indexed by resource type and id. When it receives a webhook that a specific resource type and id have been changed or deleted, it updates its local copy of that information. In that way, it has access to any changes that may have occurred. Since this response also details what happens during a delete webhook, I'm going to copy and paste this same response in #864. I'm going to close both of these issues but please let me know if you have any further questions. Status: Issue closed
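A minimal sketch of the pattern described in that reply: keep a local copy keyed by resource type and id, so the previous email is still available when the update webhook arrives. The payload field names below are assumptions for illustration, not the exact Planning Center schema.

```python
# Local cache keyed by (type, id); the "old" value is whatever was stored
# before the update/delete webhook arrived. Field names are illustrative.
local_store = {}

def handle_webhook(action, resource):
    key = (resource["type"], resource["id"])
    if action == "destroyed":
        old = local_store.pop(key, None)
    else:  # created / updated
        old = local_store.get(key)
        local_store[key] = resource["attributes"]
    return old  # previous attributes, or None if the resource was never seen

handle_webhook("updated", {"type": "Email", "id": "1",
                           "attributes": {"address": "[email protected]"}})
old = handle_webhook("updated", {"type": "Email", "id": "1",
                                 "attributes": {"address": "[email protected]"}})
print(old)  # {'address': '[email protected]'} -- the pre-update email
```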
dart-lang/typed_data
337076330
Title: Fix dart2js tests on travis Question: username_0: https://travis-ci.org/dart-lang/typed_data/jobs/398049453 Seems there are a few constants that make dart2js sad... Answers: username_1: I'm probably not the relevant one for this. @jmesserly or @username_2, want to take a look? I haven't been closely following the changes around large 64-bit-ish integers. username_2: I'm not familiar with the logic of these tests, but they need to be updated to run without using literals that can't be represented in JS, or split so that those large literals are only used in VM tests. FWIW - all of these large literals are used as parameters for int32x4, which are anyway truncated to 32-bit ints. username_3: FYI, #17 splits tests across two Travis jobs, and the browser tests have been disabled for now. That way we can get the build passing again. Status: Issue closed
ros-planning/moveit
354253235
Title: non-blocking call to PlaceAction from Python returns with "Must specify group in motion plan request" Question: username_0: ### Description I'm using MoveIt for a pick and place scenario but I don't want to block my node until those actions are completed. I've found that https://github.com/mikeferguson/moveit_python (see #99 or #776) allows me to make the underlying action calls in a non-blocking fashion. I also subscribed to the `feedback` topics of the involved actions to get notified about their outcome. This works fine with pick but does not work for place if I use the action in a non-blocking manner, i.e. with `wait=False`. In that case the `feedback` subscriber receives a message containing a "Must specify group in motion plan request" error. In fact, the group is specified in the goal which is sent to the action server. Moreover, if I call the same code with `wait=True` then everything works fine. So the group appears both to be set and valid, since the blocking call can successfully execute the place. I really wonder why (not) waiting on the action result makes a difference for the action's outcome. (Since mikeferguson/moveit_python only sends goals to the MoveIt action, I suspect the error - if any - is located in MoveIt, not in Mike's code. Correct me if I'm wrong.) ### Your environment * ROS Distro: Kinetic * OS Version: Ubuntu 16.04 * Binary build * `ros-kinetic-moveit-core/xenial,now 0.9.12-1xenial-20180809-194610-0800`, and the latest of mikeferguson/moveit_python ### Steps to reproduce 1. register a subscriber for `/place/feedback`. For simplicity, the subscriber function may just output the feedback message and do nothing more. 2. add an object to the planning scene and attach it to the robot 3. construct place location(s) as required by MoveIt 4. invoke `PickPlaceInterface.place()` with `wait=False` and `True` to see the difference ### Expected behaviour The action is executed and the object that is attached to the robot is placed at the given location. ### Actual behaviour A "Must specify group in motion plan request" error is received. The place action is aborted (state = 4). Answers: username_0: Remark: Using `pick` with `wait=False` works like a charm. Status: Issue closed username_0: I just found out that the problem was in the application code. At some (non-obvious) place after the non-blocking action was started, the planning scene objects were "updated" by removing them all and adding new ones. This caused the problem. Leaving the planning scene objects untouched, or changing them carefully, solves the issue. Although I don't know the internal details, the error message appears to be confusing and only loosely related to the root cause. Nevertheless, sorry for the irritation and for opening the issue here.
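For reference, a rough sketch of the non-blocking setup from the steps above. `PickPlaceInterface.place(..., wait=False)` and the `/place/feedback` topic are taken from the issue itself; the feedback message type, the group/gripper names, and `place_locations` are assumptions standing in for the details built in steps 2 and 3.

```python
# Rough, hedged sketch of the reproduction steps; not a drop-in script.
import rospy
from moveit_python import PickPlaceInterface
from moveit_msgs.msg import PlaceActionFeedback  # assumed feedback message type

def on_place_feedback(msg):
    # With wait=False the outcome shows up here asynchronously.
    rospy.loginfo("place feedback state: %s", msg.feedback.state)

rospy.init_node("place_nonblocking_demo")
rospy.Subscriber("/place/feedback", PlaceActionFeedback, on_place_feedback)

interface = PickPlaceInterface("arm", "gripper")   # group names are examples
place_locations = []  # fill with the PlaceLocation entries from step 3

interface.place("attached_object", place_locations, wait=False)

# Per the resolution above: leave the planning scene objects alone while the
# goal is running; removing and re-adding them mid-execution caused the error.
rospy.spin()
```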
Azure/manage-azure-policy
719425044
Title: Policy Files Not Being Detected Question: username_0: Hi, I've created the following: ``` /root ../policies ../../check-storage-account-location ../../../assign.DEV.json ../../../policy.json ``` Yet the action reports: **_Warning: Did not find any policies to create/update. No policy files match the given patterns or no changes were detected. If you have policy definitions or policy initiatives, please ensure that the files are named 'policy.json' and 'policyset.json' respectively._** ... and no policy is created in my subscription. Can someone double check the dir/file setup above it correct ? Is there anything within the policy/assignment files themselves that might stop the action from working? I'm using the action as follows: ``` - name: Create or Update Azure Policies uses: azure/manage-azure-policy@v0 with: paths: | policies/** # path to directory where policy files were downloaded in runner ``` Answers: username_1: @username_0 Do the json files contain id, type, name and properties fields? Also, are you using checkout action before invoking the manage-azure-policy action? In case you are using it, are you checking out to some other path? - name: Checkout uses: actions/checkout@v2 username_0: @username_1 policy below ``` { "properties": { "displayName": "Check Storage Account Location", "policyType": "Custom", "mode": "All", "description": "Force resources to be deployed to UK South", "metadata": { "category": "Storage" } }, "parameters": { "allowedLocation": { "type": "string", "metadata": { "description": "The allowed location for resources.", "displayName": "Allowed location", "strongType": "location" } } }, "if": { "allOf": [ { "field": "type", "equals": "Microsoft.Storage/storageAccounts" }, { "not": { "field": "location", "equals": "[parameters('allowedLocation')]" } } ] }, "then": { "effect": "Audit" }, "id": "/subscriptions/__subscriptionId__/providers/Microsoft.Authorization/policyDefinitions/__policyName__", "type": "Microsoft.Authorization/policyDefinitions", "name": "Check Storage Account Location" } ``` First steps in pipeline are ``` steps: - uses: actions/checkout@v2 - uses: azure/[email protected] with: creds: ${{ secrets.AZURE_CREDENTIALS }} enable-AzPSSession: true - uses: cschleiden/replace-tokens@v1 with: tokenPrefix: '__' tokenSuffix: '__' files: '["**/*.json"]' env: subscriptionId: $env:subscription_id policyName: $env:policy_name ``` Thanks for the assistance username_1: @username_0 I believe the issue is with the comment in path provided " policies/** # path to directory where policy files were downloaded in runner" This will try to match the whole string Can you try with just "policies/**" username_0: Thankyou, removing the "# path to directory where policy files were downloaded in runner" resolved the issue. Status: Issue closed
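To make the root cause above concrete: the whole string, trailing comment included, was treated as the glob pattern, so nothing matched. A quick Python illustration of the idea (the action uses its own matcher; fnmatch is only for demonstration):

```python
# The pattern with the inline comment appended matches nothing, while the
# bare pattern matches the policy file; this mirrors the fix above.
from fnmatch import fnmatch

path = "policies/check-storage-account-location/policy.json"

print(fnmatch(path, "policies/**"))                  # True
print(fnmatch(path, "policies/**  # some comment"))  # False
```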
yeahdongcn/RSBarcodes_Swift
1085285751
Title: Cannot generate interleaved2of5 Question: username_0: ``` let gen = RSUnifiedCodeGenerator.shared gen.fillColor = UIColor.white gen.strokeColor = UIColor.black if let image = gen.generateCode("Test", machineReadableCodeObjectType: AVMetadataObject.ObjectType.code39.rawValue, targetSize: CGSize(width: 300, height: 150)) { print(image) return image } ``` is my code Answers: username_1: `interleaved2of5` can only encode digits into barcode, so `Test` is not a valid input. Status: Issue closed
aJean/daily-activities
1132582700
Title: Product Thinking Question: username_0: Essential thinking: first principles means reasoning from scratch, taking only the most basic facts as evidence and deriving conclusions layer by layer. Set aside how others do it and how it was done in the past to gain a different perspective (refusing to be influenced by the designs of similar products is not the same thing as not understanding those designs at all). Chained questioning is one technique: it untangles past reasoning and the key steps, helping you judge quickly and come up with new ideas. Relative thinking: sunlight and shadow. To make something look brighter you don't necessarily have to increase its brightness; you can dim its surroundings instead. This is a kind of reverse thinking. Success and failure, advantage and disadvantage, are all temporary, relative concepts. Two important angles for looking at a problem: relationships and time. Abstract thinking: the idiot and the god. Looking at a problem from a highly abstract viewpoint can conflict with looking at it at the level of the user's instincts. Being able to switch between different local views is an important skill. Concrete versus abstract is like a plane taking off, with individual points shrinking away one by one. Think more in terms of new elements (capabilities) rather than new features; elements can be assembled into features. Systems thinking: the role of feedback. The feedback-system model is a fundamental abstract model; in essence everything comes down to designing feedback. A common fallacy is assuming that all extreme and abnormal paths are low-probability events. Evolutionary thinking: bottom-up design. Minimalism is the foundation of evolution; a good framework highlights the essentials and can expand or contract as needed.
chintan9/plyr-react
874513941
Title: Would be nice to automatically acquire types when someone types something like: Question: username_0: Would be nice to automatically acquire types when someone types something like: ```ts import * as someLibrary from "someLibrary"; ``` See https://github.com/microsoft/TypeScript-Website/issues/134 __Originally posted by @dsherret in https://github.com/dsherret/ts-ast-viewer/issues/68__<issue_closed> Status: Issue closed
unfetter-discover/unfetter
330730723
Title: Organize Ansible Playbook to wait for core services to finish before bringing up other containers Question: username_0: The database and the processor loading the core configs are necessary before the API can be run. Reorganize the tasks to support the following: - [ ] Start the database, and don't run the Processor until the database is up - [ ] If the processor is not being upgraded, and the database has not been configured, run the processor with just the configuration to get the core data into the database - [ ] Once the database has been configured and the certs are created, start bringing up the other tasks. - [ ] Finally, start the gateway. Ansible needs to evaluate whether the mongo database has been started and whether the processor has already configured the database. Not sure yet how to get that to work.<issue_closed> Status: Issue closed
blue-yonder/tsfresh
342759125
Title: select_features returns empty list Question: username_0: Hi, The select features method does not return any features. I am using it as follow filtered_features = select_features(extracted_features, target) where target is['type1' 'type2'] and extracted_features are like below. id | cpu__abs_energy | cpu__absolute_sum_of_changes | cpu__agg_autocorrelation__f_agg_"mean" | cpu__agg_autocorrelation__f_agg_"median" | cpu__agg_autocorrelation__f_agg_"var" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_10__attr_"intercept" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_10__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_10__attr_"slope" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_10__attr_"stderr" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_50__attr_"intercept" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_50__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_50__attr_"slope" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_50__attr_"stderr" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_5__attr_"intercept" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_5__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_5__attr_"slope" | cpu__agg_linear_trend__f_agg_"max"__chunk_len_5__attr_"stderr" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_10__attr_"intercept" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_10__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_10__attr_"slope" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_10__attr_"stderr" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_50__attr_"intercept" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_50__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_50__attr_"slope" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_50__attr_"stderr" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_5__attr_"intercept" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_5__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_5__attr_"slope" | cpu__agg_linear_trend__f_agg_"mean"__chunk_len_5__attr_"stderr" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_10__attr_"intercept" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_10__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_10__attr_"slope" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_10__attr_"stderr" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_50__attr_"intercept" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_50__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_50__attr_"slope" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_50__attr_"stderr" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_5__attr_"intercept" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_5__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_5__attr_"slope" | cpu__agg_linear_trend__f_agg_"min"__chunk_len_5__attr_"stderr" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_10__attr_"intercept" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_10__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_10__attr_"slope" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_10__attr_"stderr" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_50__attr_"intercept" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_50__attr_"rvalue" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_50__attr_"slope" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_50__attr_"stderr" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_5__attr_"intercept" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_5__attr_"rvalue" | 
cpu__agg_linear_trend__f_agg_"var"__chunk_len_5__attr_"slope" | cpu__agg_linear_trend__f_agg_"var"__chunk_len_5__attr_"stderr" | cpu__ar_coefficient__k_10__coeff_0 | cpu__ar_coefficient__k_10__coeff_1 | cpu__ar_coefficient__k_10__coeff_2 | cpu__ar_coefficient__k_10__coeff_3 | cpu__ar_coefficient__k_10__coeff_4 | cpu__augmented_dickey_fuller__attr_"pvalue" | cpu__augmented_dickey_fuller__attr_"teststat" | cpu__augmented_dickey_fuller__attr_"usedlag" | cpu__autocorrelation__lag_0 | cpu__autocorrelation__lag_1 | cpu__autocorrelation__lag_2 | cpu__autocorrelation__lag_3 | cpu__autocorrelation__lag_4 | cpu__autocorrelation__lag_5 | cpu__autocorrelation__lag_6 | cpu__autocorrelation__lag_7 | cpu__autocorrelation__lag_8 | cpu__autocorrelation__lag_9 | cpu__binned_entropy__max_bins_10 | cpu__c3__lag_1 | cpu__c3__lag_2 | cpu__c3__lag_3 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.2__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.2__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.2__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.2__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.2__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.4__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.4__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.4__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.4__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.4__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.6__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.6__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.6__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.6__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.6__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.8__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.8__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.8__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.8__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_0.8__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_1.0__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_1.0__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_1.0__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_1.0__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_False__qh_1.0__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.2__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.2__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.2__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.2__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.2__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.4__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.4__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.4__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.4__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.4__ql_0.8 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.6__ql_0.0 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.6__ql_0.2 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.6__ql_0.4 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.6__ql_0.6 | cpu__change_quantiles__f_agg_"mean"__isabs_True__qh_0.6__ql_0.8 | 
[... the header row continues with several hundred more extracted-feature columns for the `cpu` series (change_quantiles, cwt_coefficients, energy_ratio_by_chunks, fft_aggregated, fft_coefficient, friedrich_coefficients, index_mass_quantile, kurtosis, large_standard_deviation, linear_trend, number_peaks, partial_autocorrelation, quantile, ratio_beyond_r_sigma, spkt_welch_density, symmetry_looking, time_reversal_asymmetry_statistic, value_count, variance, ...), followed by the markdown separator row and the numeric feature values for the two extracted rows; the full matrix is omitted here for readability.]
Could you please let me know what the problem with it is? Thanks, Shuja Answers: username_0: Yes, I have seen it and am following the same approach as explained in the example. Could you please let me know if there is any other platform (mailing list, community) where I can ask such questions? Thanks, Shuja Status: Issue closed username_1: Stack Overflow?
jamoma/JamomaMax
24951569
Title: create basic demo of buffer within Max implementation Question: username_0: I am calling it a demo for now, but it could be the start of a helpfile. It would be nice to have this working for my own testing. Some of the features I have working in Core seem to be missing or are not translating to the Max attributes. This may be due to some misunderstanding on my part about what needs to be in the code to make this happen auto-magically. <pre><code> ----------begin_max5_patcher---------- 624.3ocyV0zbaBCD8L7qPCSmIWnFIgAi6s7anyzKcxjQXDNxFDLBQpayj9au 5<KEY>EA<KEY> rel<KEY> Q<KEY> U<KEY>4<KEY> -----------end_max5_patcher----------- </code></pre> Status: Issue closed Answers: username_0: No use for this now. Note, however, that sending the "load" command in this patch crashes Max. Is there a way to prevent the j~ external from using more experimental/incomplete classes in the Core, like TTBuffer? Opening a separate issue #900 for that question and closing this one.
developit/microbundle
512700864
Title: Using modern format forces ALL outputted bundles to be modern code Question: username_0: I think this is a bug, unless I'm misunderstanding the purpose of specifying multiple output formats. If you run `microbundle --format modern,umd`, then both the `modern.js` and the `umd.js` output files skip polyfilling: code using `class`, `async/await`, etc. is left as is. If you run `microbundle --format umd`, then the `umd.js` file is polyfilled and modern code like `class` or `async/await` is removed, which is the safe default I would expect. If this is indeed buggy behavior, then I think it is a big problem, because `modern` is a default format, which means that if you don't specify a format, all of your output bundles will be expected to only run in modern browsers. Answers: username_1: This should be fixed by #570. Status: Issue closed
rancher/rancher
489419667
Title: Revise the way we store/share global resources Question: username_0: Should revise and generalize the way global resources are stored and shared in rancher: - [ ] NodeTemplates - [ ] CloudCredentials - [ ] ClusterTemplates Today some are stored in the user namespace, some are in the global one. Some are sharable, some are not. @cloudnautique wasn't sure if we have an item filed for it already; feel free to close if we do.
liaujianjie/react-image-snipper
374734490
Title: Use webpack externals for production build Question: username_0: Packages such as `react` shouldn't be bundled for production. Answers: username_0: https://webpack.js.org/guides/author-libraries/ username_1: This should be a good point of reference. https://itnext.io/how-to-package-your-react-component-for-distribution-via-npm-d32d4bf71b4f username_0: I'm debating if we should switch to Rollup.
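For reference, the standard shape of that change is small; the sketch below only illustrates the usual webpack pattern of declaring `react` and `react-dom` as externals so consumers provide their own copy. The filename and the surrounding options are assumptions, since the repo's actual build config isn't shown in the thread:

```js
// webpack.config.js (sketch; filename and surrounding options are assumptions)
module.exports = {
  // ...existing entry / output / loaders...
  output: {
    libraryTarget: 'umd', // so the external mapping below covers all consumer environments
  },
  // Do not bundle React; resolve it from the consuming application instead.
  externals: {
    react: {
      commonjs: 'react',
      commonjs2: 'react',
      amd: 'react',
      root: 'React',
    },
    'react-dom': {
      commonjs: 'react-dom',
      commonjs2: 'react-dom',
      amd: 'react-dom',
      root: 'ReactDOM',
    },
  },
};
```

Pairing this with `peerDependencies` entries for `react`/`react-dom` in package.json keeps the published bundle small, which is the point of the linked authoring-libraries guide.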
sigp/lighthouse
582686785
Title: BeaconState::get_active_validator_indices should check epoch Question: username_0: ## Description The function `BeaconState::get_active_validator_indices` takes an arbitrary epoch and attempts to compute (without the committee cache) the active validators at that epoch. The problem is, we only have limited lookahead because of new validators activating and old validators leaving. ## Steps to resolve 1. Work out the minimum lookahead on the active validator set by reasoning about the activation queue, finalization, and the exit queue. Probably the most relevant part of the spec is [`process_registry_updates`](https://github.com/ethereum/eth2.0-specs/blob/v0.10.1/specs/phase0/beacon-chain.md#registry-updates). Based on some back of the envelope math it looks like maybe the safe lookahead is 2 epochs, i.e. 3 epoch transitions from when a deposit is made: 1st adds validator to the activation queue with eligibility in the next epoch, 2nd justifies the eligible epoch (best case), 3rd finalizes the eligible epoch (best case) thus activating the validator. 2. Add a check on the epoch to `get_active_validator_indices`, returning an error when it is too high (or indeed too low). Alternatively, look into whether `get_active_validator_indices` actually needs to be accessible from the `BeaconState`. Answers: username_1: I will take a look at this. username_1: Correct me if I'm wrong but the active validator set is known for the current epoch and every past epoch correct? So I'm not sure what you mean by minimum lookahead / too low (other than a negative number). I've had a bunch of stuff come up recently so I haven't had much time to look at this. I believe your maximum of 3 epoch transitions is correct at least as far as the activation queue goes but I still need to work out the exit queue. username_0: @username_1 You're right! There's no lower limit! Thanks for looking into this :) username_1: Okay so I'll try to write this out in steps simplifying as I go: There are 4 events that could change the active validator set: 1. NEW DEPOSIT: ``` state_transition(): process_slots(): process_epoch(): // once per epoch when (state.slot + 1) % SLOTS_PER_EPOCH == 0 process_block(): process_operations(): process_deposit(): // leads to one of the following which can make validator eligible for activation queue 1: get_validator_from_deposit() 2: increase_balance() ``` 2. VALIDATOR SLASHED ``` state_transition() process_slots(): process_epoch(): // once per epoch when (state.slot + 1) % SLOTS_PER_EPOCH == 0 process_block(): process_operations(): process_proposer_slashing() / process_attester_slashing(): slash_validator(): initiate_validator_exit(): validator.exit_queue_epoch = compute_activation_exit_epoch(); // MINIMUM ``` 3. VOLUNTARY EXIT ``` state_transition() process_slots(): process_epoch(): // once per epoch when (state.slot + 1) % SLOTS_PER_EPOCH == 0 process_block(): process_operations(): process_voluntary_exit(): initiate_validator_exit(): validator.exit_queue_epoch = compute_activation_exit_epoch(); // MINIMUM ``` 4. PENALTIES FORCE VALIDATOR TO EXIT ``` state_transition(): process_slots(): process_epoch(): // once per epoch when (state.slot + 1) % SLOTS_PER_EPOCH == 0 ... 
process_rewards_and_penalties(): // expanded below ``` I need to expand on process_epoch(): ``` process_epoch(): // once per epoch when (state.slot + 1) % SLOTS_PER_EPOCH == 0 process_justification_and_finalization(state) // skimming this code I *think* (don't quote me) the smallest epoch increments are: JUSTIFICATION_GAP = 1 FINALIZATION_GAP = 2 process_rewards_and_penalties(state) process_registry_updates(state) foreach validator: if eligible_for_activation_queue(): validator.activation_eligibility_epoch = current_epoch+1 [Truncated] ``` 2-4. VALIDATOR SLASHED / VOLUNTARY EXIT / PENALTIES FORCE VALIDATOR EXIT: ``` validator.exit_queue_epoch = compute_activation_exit_epoch(current_epoch); // MINIMUM ``` simplifies to: 1. NEW DEPOSIT ``` validator.activation_epoch = current_epoch + FINALIZATION_GAP + 6 ``` 2-4. VALIDATOR SLASHED / VOLUNTARY EXIT / PENALTIES FORCE VALIDATOR EXIT: ``` validator.exit_queue_epoch = current_epoch + 5 ``` Since cases 2-4 have a minimum < case 1, I don't really need to precisely calculate the finalization gap. tl;dr I believe the maximum safe lookahead is 4. I'll make a PR :) username_0: Closed by #1300 Status: Issue closed
isl/x3ml
169578656
Title: Update X3MLEngineFactory class to support the injection of multiple X3ML mapping files Question: username_0: Update the X3MLEngineFactory class to support the injection of multiple X3ML mapping files in a single run. The activities will start after the functionality has been implemented and tested (issue #61). Answers: username_0: The factory class has been implemented to support multiple mapping files. Status: Issue closed
qoollo/pearl
755282209
Title: Add API method to get the memory occupied by the index Answers: username_1: [missed, wrote it into PR] 02.12 - 09.12: 9h (acquaintance with the project + some thoughts about solution realization) 09.12 - 16.12: 3h (first version) 16.12 - 23.12: 1.5h (restructure logic a bit (caching key in header and computing size of header with serialized_size)) username_1: 26.12 - 27.12: 1.5h (add test for this public method) username_1: 28.12: 4h (tests, fixes, description for tests and measurements logic) Status: Issue closed
sc-forks/solidity-coverage
574215450
Title: Unparse-able solc 0.6.0 language features Question: username_0: Tracking issue for new solc syntax that [parser-diligence][1] still has difficulty with: + file scoped structs & enums + contract types and enums as keys for mappings + payable as function call: payable(address) [1]: https://github.com/ConsenSys/solidity-parser-antlr/issues Answers: username_1: Hi @username_0. I created a fork of consensys's fork (yeah, I know...) and I'm actively working on it, since I need an up to date version of the parser for solhint and prettier-solidity. Check it out if you are interested: https://github.com/solidity-parser/parser Almost all the issues you listed will be solved soon. The third one, `payable as function call`, is the exception; I will open an issue for that. username_0: Oh great! Thanks @username_1. username_2: Hi I think the issue I created earlier today is probably also relevant here #493 username_1: Oh, yeah, it is! @username_2 could you please open an issue [here](https://github.com/solidity-parser/parser)? Thanks! username_2: Hi @username_1 . I will open an issue there. thanks! username_1: @username_0 `@solidity-parser/[email protected]` should have all these issues fixed, I think, in case you want to give it a try. username_1: (Except the one that @username_2 created, sorry, didn't notice it was added to this list) username_0: Ah sorry! I didn't get a chance yesterday to test that one out... @username_1 Fantastic. Thanks so much for doing all this, just amazing. 🎉 🎉 username_1: FYI I just ran the parser in all OpenZeppelin's contracts and, at least, there weren't any errors. username_0: `0.7.4` Thanks again @username_1! Status: Issue closed
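As a quick sanity check against the features listed above, one could try parsing a small 0.6-style source with the forked parser. This is only a sketch: it assumes the `parse` entry point of `@solidity-parser/parser`, and the contract embedded in the string is made up for illustration.

```js
// check-parse.js (illustrative)
const parser = require('@solidity-parser/parser');

const source = `
pragma solidity ^0.6.0;

enum Kind { A, B }              // file-scoped enum
struct Box { uint256 value; }   // file-scoped struct

contract C {
    mapping(Kind => uint256) public byKind;   // enum as a mapping key

    function beneficiary(address a) external pure returns (address payable) {
        return payable(a);                    // payable(...) as a function-style conversion
    }
}
`;

try {
  parser.parse(source);
  console.log('parsed OK');
} catch (err) {
  console.error('parser rejected the 0.6 syntax:', err.message);
}
```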
WarEmu/WarBugs
37796256
Title: [Runepriest] Rune of Immolation (video included) (720p/1080p) Question: username_0: Spell tooltip: 0:01-0:05 (video). Bug: the spell does lower damage than the tooltip says; you can see it in the combat log or in the yellow damage numbers: it does 19 damage per second for a total of 14 seconds, so 14x19 = 266 damage instead of 295. https://www.youtube.com/watch?v=VU833_TM2f4&feature=youtu.be Checked on a player too; same damage as on a mob (this means enemies don't have resistance). "Deals 295 Elemental damage over 15 seconds." ![runepriest_035](https://cloud.githubusercontent.com/assets/8081009/3572687/747e1222-0b6a-11e4-8c05-4d9e648fa0a4.jpg)<issue_closed> Status: Issue closed
eclipse/microprofile-metrics
330722181
Title: Should it be possible to annotate an object with both @HitCounted and @ParallelCounted? Question: username_0: See also https://github.com/eclipse/microprofile-metrics/pull/247#discussion_r192965218 Especially when we implement #251, I see a desire to say @HitCounted @ParallelCounted(keepMax = true) // attribute is pseudo-code void someMethod() { ... } to both count the absolute number of invocations as well as the maximum number of parallel invocations of `someMethod()`. Answers: username_1: I think developers should be able to annotate with any combination of `@HitCounted`, `@ParallelCounted`, `@Counted`, `@Timed`, and `@Metered` as long as each one has a unique name. I think the issue we need to avoid (and state in the spec that an exception is thrown) is a `@HitCounted(reusable=true)` with the same name as a `@ParallelCounted(reusable=true)`, since this would mess up the calculations for both, especially if the underlying implementation is a "Counter" type. username_2: That's a reasonable use case. username_0: We can internally have 2 different types and then, on exposing, add a label, e.g. `ctype=hit` or `ctype=parallel`. Or also `monotonic=true` and `monotonic=false`. The first may be more descriptive, though. username_0: We could potentially internally add _hit and _par to the name to differentiate the two. username_0: If we allow for the same naming, we probably need to change the following in `MetricRegistry`: public abstract Map<String, Metric> getMetrics(); public abstract Map<String, Metadata> getMetadata(); which both assume a (unique) name as key. Also, the `MetricFilter` probably needs to be updated to add a type to the `matches()` method. Status: Issue closed username_0: I think this is possible with the impl in #309. Closing this one.
ant-design/ant-design
451940162
Title: webkitRelativePath of each file is empty if a directory is dropped into the Dragger Upload component Question: username_0: - [ ] I have searched the [issues](https://github.com/ant-design/ant-design/issues) of this repository and believe that this is not a duplicate.

### Reproduction link

[https://github.com](https://github.com)

### Steps to reproduce

Just create a drag-and-drop style Upload component and log each file's path when a directory is added (a minimal sketch is shown after this report).

### What is expected?

It should return the relative path value.

### What is actually happening?

It is just an empty string when the directory is dropped in. It works fine when selecting the folder by clicking.

| Environment | Info |
|---|---|
| antd | 3.19.2 |
| React | 16.8.10 |
| System | Mac OS 10.13.6 |
| Browser | Chrome 74.0.3729.169 |

Answers: username_1: Fixed in #16426, could you confirm the version of antd you are using? username_0: Oh, it is 3.16.1. Thank you for the reply; I will close this issue. Status: Issue closed
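A minimal repro along the lines described in the report could look like the sketch below. It assumes antd 3.x's `Upload.Dragger` with the `directory` prop, and returns `false` from `beforeUpload` so nothing is actually uploaded while the paths are inspected:

```jsx
import React from 'react';
import { Upload } from 'antd';

// Sketch: log what webkitRelativePath contains for each file in the dropped/selected folder.
const DirectoryDrop = () => (
  <Upload.Dragger
    directory
    multiple
    beforeUpload={(file) => {
      // Expected something like "folder/sub/file.txt"; the report says it is "" on drop.
      console.log(file.name, file.webkitRelativePath);
      return false; // keep the files local, no upload request
    }}
  >
    <p>Click or drop a folder here</p>
  </Upload.Dragger>
);

export default DirectoryDrop;
```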
rathukugan/moments-for-youtube
127777343
Title: Format into timestamp format. Question: username_0: player.getCurrentTime() returns seconds; we need to format this into the timestamp format.
- Need to make a function to convert seconds to something like "#t=31m08s" (a sketch is shown below)
- Noted problem: getCurrentTime() is off by ~10 seconds<issue_closed> Status: Issue closed
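A rough sketch of the conversion mentioned above; the `#t=` fragment shape comes from the "#t=31m08s" example in the issue, while the zero-padding and hour handling are assumptions:

```js
// Convert a seconds value (e.g. from player.getCurrentTime()) into a "#t=31m08s"-style fragment.
function toTimestampFragment(seconds) {
  const total = Math.floor(seconds);
  const h = Math.floor(total / 3600);
  const m = Math.floor((total % 3600) / 60);
  const s = total % 60;
  const pad = (n) => String(n).padStart(2, '0');
  // Only emit the hours part when it is non-zero, matching the "31m08s" example.
  return h > 0 ? `#t=${h}h${pad(m)}m${pad(s)}s` : `#t=${m}m${pad(s)}s`;
}

console.log(toTimestampFragment(1868)); // "#t=31m08s"
```

The ~10-second offset noted in the issue is a separate problem and is not addressed by this formatting helper.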
MathMark/MathAnalyser
211876502
Title: Color of function does not change in list box Question: username_0: The color of a function does not change in the list box of functions on the main form; however, the curve itself does change its color. ![untitled](https://cloud.githubusercontent.com/assets/13971845/23578741/53ce3bf0-00e6-11e7-85bb-f6f55b072f0d.png)<issue_closed> Status: Issue closed
plotly/dash-table
486371124
Title: TableTooltip has shouldComponentUpdate but is a React.PureComponent Question: username_0: I've just started getting warnings like

```
Warning: t has a method called shouldComponentUpdate(). shouldComponentUpdate should not be used when extending React.PureComponent. Please extend React.Component if shouldComponentUpdate is used.
warningWithoutStack @ [email protected]?….1&m=1566991803:500
checkClassInstance @ [email protected]?…&m=1566991803:11401
mountClassInstance @ [email protected]?…&m=1566991803:11598
.....
```

Based on the warning message, I think I tracked it down to shouldComponentUpdate being defined in: https://github.com/plotly/dash-table/blob/master/src/dash-table/components/ControlledTable/fragments/TableTooltip.tsx#L26 Given that that code is 7 months old and I've not seen that warning before, I'm not sure if it's something I've done to upset the system or if I got a newer version of something when I rebuilt my app's container. Since I'm not sure, I thought I should open an issue rather than make a PR to change the `React.PureComponent` to a `React.Component` like the warning says to.<issue_closed> Status: Issue closed
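The change the warning asks for is essentially a one-liner: keep the hand-written `shouldComponentUpdate`, but extend the plain `Component` base class instead of `PureComponent`. The sketch below is illustrative only; the real `TableTooltip` props and render body in dash-table will differ.

```tsx
import React, { Component } from 'react';

// Placeholder props; not dash-table's actual interface.
interface TooltipProps {
  visible: boolean;
  content: string;
}

export default class TableTooltip extends Component<TooltipProps> {
  // A custom shouldComponentUpdate is fine on Component, but it conflicts with
  // PureComponent's built-in shallow comparison, which is what React warns about.
  shouldComponentUpdate(nextProps: TooltipProps) {
    return (
      nextProps.visible !== this.props.visible ||
      nextProps.content !== this.props.content
    );
  }

  render() {
    if (!this.props.visible) {
      return null;
    }
    return <div className="dash-tooltip">{this.props.content}</div>;
  }
}
```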
apache/hudi
868191558
Title: [SUPPORT] Does Hudi support the Flink DataStream API? Question: username_0: Besides the [flink-sql-client](https://hudi.apache.org/docs/flink-quick-start-guide.html#setup), does Hudi support the [Flink DataStream API](https://ci.apache.org/projects/flink/flink-docs-stable/dev/connectors/kafka.html#kafka-consumer)?

```
// Read a Kafka topic with the DataStream API
val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")
props.put("group.id", "flink-kafka-consumer")

val consumer = new FlinkKafkaConsumer[String]("device_logs", new SimpleStringSchema(), props)
val stream = senv.addSource(consumer)
```

Given the DataStream[String] above ^^^, is there a way to publish it to a Hudi table? If yes, is there any Scala/Java code snippet? Thanks for your help.

Answers: username_1: @username_2 @leesf @dannyhchen : can one of you folks take this up? username_2: Yes, you can change or rewrite your `HoodieFlinkStreamer` so that you can replace the source or do more things. Or, you can even reuse the `HoodieFlinkWriteClient` for more scalability if you want. username_3: @username_0 Does that help answer your question? Are you looking for any further help? Status: Issue closed username_3: @username_0 Closing this ticket due to inactivity. Please re-open if you need further assistance.
yarnpkg/berry
675496900
Title: [Bug] Nondeterministic build script execution for the case of monorepo workspace Question: username_0: **Describe the bug** Build scripts launching inside monorepo workspaces can be nondeterministic **To Reproduce** <details> <summary>Reproduction</summary> ```js repro await packageJson({ workspaces: [`packages/common`], dependencies: { [`common`]: `workspace:*` }, scripts: { postinstall: `echo monorepo` } }); await packageJson({ name: `common`, version: `1.0.0`, scripts: { postinstall: `echo common` } }, {cwd: `./packages/common`}); const outFresh = await yarn() expect(outFresh).toContain(`STDOUT common`) expect(outFresh).toContain(`STDOUT monorepo`) const outAdd = await yarn('workspace', 'common', 'add', 'lodash') expect(outAdd).toContain(`STDOUT common`) expect(outAdd).toContain(`STDOUT monorepo`) ``` </details> Answers: username_1: Fixed in master Status: Issue closed
Azure/bicep
1108649770
Title: Bicep VsCode ext constantly tries and fails to install .NET 6 Question: username_0: It keeps trying and failing even after I manually installed .NET 6 myself. The error is: An error occurred while installing .NET (6.0): .NET Acquisition Failed: Installation failed: Error: Command failed: powershell.exe -NoProfile -ExecutionPolicy unrestricted -Command "& { [Net.ServicePointManager]::SecurityProtocol = [Net.ServicePointManager]::SecurityProtocol -bor [Net.SecurityProtocolType]::Tls12 ; & 'c:\Users\<my-name>\.vscode\extensions\ms-dotnettools.vscode-dotnet-runtime-1.5.0\dist\install scripts\dotnet-install.ps1' -InstallDir 'c:\Users\<my-name>\AppData\Roaming\Code\User\globalStorage\ms-dotnettools.vscode-dotnet-runtime\.dotnet\6.0.1' -Version 6.0.1 -Runtime dotnet } There is not enough disk space on drive C: At C:\Users\<my-name>\.vscode\extensions\ms-dotnettools.vscode-dotnet-runtime-1.5.0\dist\install scripts\dotnet-install.ps1:1091 char:5 + throw "There is not enough disk space on drive ${installDrive}:" + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + CategoryInfo : OperationStopped: (There is not en...ace on drive C::String) [], RuntimeException + FullyQualifiedErrorId : There is not enough disk space on drive C: NOTE this: ![image](https://user-images.githubusercontent.com/5447190/150232709-2f8c8513-cdbd-4874-a844-b2c0cd14b5a4.png) Does this occur consistently? Yes Repro steps: Action: bicep.activate Error type: Error Error Message: Failed to install redacted:url runtime v6.0. Version: 0.4.1124 OS: win32 OS Release: 10.0.22000 Product: Visual Studio Code Product Version: 1.63.2 Language: en <details> <summary>Call Stack</summary> ``` logger client.ts:129:11 o logger.ts:112:10 ``` </details> Status: Issue closed Answers: username_1: This is the same issue encountered in #2483. Please follow the steps there to continue the troubleshooting.
tomwalder/php-appengine-pull-queue
136061655
Title: I am using the list task method, but it is not working. I am facing an error Question: username_0: I am using the list task method, but it is not working and I am facing an error.

We put Queue.php and Task.php under the AEQ folder and declared `namespace AEQ;` in both Queue.php and Task.php. Adding tasks and leasing & deleting tasks work well, but listing tasks does not. Please check the code below:

```php
include_once 'AEQ/Task.php';
include_once 'AEQ/Queue.php';

$obj_task = new AEQ\Task();
$obj_queue = new AEQ\Queue('queue_name');

// List Tasks
foreach($obj_queue->listTasks() as $obj_task) {
    echo $obj_task->getName();
}
```

Using this code I got the following error:

```
Fatal error: Uncaught exception 'RuntimeException' with message 'Failed to execute call [QueryTasks] with: 9 (PERMISSION_DENIED)' in /base/data/home/apps/s~project/1.390926642656759954/public/pull/AEQ/Queue.php:223
Stack trace:
#0 /base/data/home/apps/s~project/1.390926642656759954/public/pull/AEQ/Queue.php(197): AEQ\Queue->makeCall('QueryTasks', Object(google\appengine\TaskQueueQueryTasksRequest), Object(google\appengine\TaskQueueQueryTasksResponse))
#1 /base/data/home/apps/s~project/1.390926642656759954/public/pull/listtask.php(12): AEQ\Queue->listTasks()
#2 {main}
thrown in /base/data/home/apps/s~project/1.390926642656759954/public/pull/AEQ/Queue.php on line 223
```

Please give me a solution.
Answers: username_1: Is the queue a pull queue?
username_0: Yes, Tom...
username_1: I see the same error on AppEngine. Works locally. I may have to raise a query with the Google team.
pwndbg/pwndbg
424889336
Title: Enter does not repeat the last command Question: username_0: ### Description
When using GDB 8.2 on Ubuntu 16.04, pressing Enter does not repeat the last command.

### Steps to reproduce
1. install gdb 8.2 on Ubuntu 16.04
2. gdb any program.
3. enter `starti`
4. enter `si`
5. press Enter

The fifth step does not repeat `si`.

### My setup
```shell
pwndbg> version
Gdb: 8.2
Python: 3.5.2 (default, Nov 12 2018, 13:43:14) [GCC 5.4.0 20160609]
Pwndbg: 1.1.0 build: 0f4e31e
Capstone: 4.0.1024
Unicorn: 1.0.1
pwndbg> show version
GNU gdb (Ubuntu 8.2-0ubuntu1~16.04.1) 8.2
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law. Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>.
Find the GDB manual and other documentation resources online at:
<http://www.gnu.org/software/gdb/documentation/>.
For help, type "help".
Type "apropos word" to search for commands related to "word".
```
Answers: username_1: This problem has been reported before.
https://github.com/pwndbg/pwndbg/issues/511#issue-352141391
username_0: sorry..
Status: Issue closed
Jaliborc/Blitz
1124768780
Title: Lua Error - events.lua:115 Question: username_0: 137x Blitz\addons\main\events.lua:115: bad argument #1 to 'IsOnQuest' (outside of expected range -2147483648 to 2147483647 - Usage: local isOnQuest = C_QuestLog.IsOnQuest(questID)) [string "=[C]"]: in function `IsOnQuest' [string "@Blitz\addons\main\events.lua"]:115: in function `NumSkips' [string "@Blitz\addons\main\events.lua"]:53: in function `BestSkip' [string "@Blitz\addons\main\events.lua"]:40: in function `?' [string "@AtlasLootClassic\Libs\CallbackHandler-1.0\CallbackHandler-1.0-7.lua"]:119: in function <...sic\Libs\CallbackHandler-1.0\CallbackHandler-1.0.lua:119> [string "=[C]"]: ? [string "@AtlasLootClassic\Libs\CallbackHandler-1.0\CallbackHandler-1.0-7.lua"]:29: in function <...sic\Libs\CallbackHandler-1.0\CallbackHandler-1.0.lua:25> [string "@AtlasLootClassic\Libs\CallbackHandler-1.0\CallbackHandler-1.0-7.lua"]:64: in function `Fire' [string "@Attune\Libs\AceEvent-3.0\AceEvent-3.0-4.lua"]:120: in function <...aceAttune\Libs\AceEvent-3.0\AceEvent-3.0.lua:119> [220204-Blitz_LUA.txt](https://github.com/Jaliborc/Blitz/files/8007118/220204-Blitz_LUA.txt)
oppia/oppia
472585976
Title: Allow skill creators to add a rubric to the skill Question: username_0: When we create questions for a skill, we allow creators to set a difficulty of the question per skill. The difficulty levels are Easy, Medium, and Hard. Difficulty is quite a subjective parameter and question creators may not know how to classify the question difficulty for a particular skill. So we would like to have skill creators give a rubric. Skill creators would have to add rubrics for 'Easy', 'Medium', and 'Hard'. The rubric should give a clear explanation for each of the difficulty levels and give examples as well. Answers: username_1: Closing it as this is completed by #7389 Status: Issue closed
vuejs/vetur
283782434
Title: HTML syntax error reported in the style block Question: username_0: - [ done ] I have searched through existing issues
- [ done ] I have read through [docs](https://vuejs.github.io/vetur)
- [ done ] I have read [FAQ](https://github.com/vuejs/vetur/blob/master/docs/FAQ.md)

## Info
- Platform: <!-- Win/macOS/Linux -->
- Vetur version: 0.11.5
- VS Code version: 1.19.0
  Commit 816be6780ca8bd0ab80314e11478c48c70d09383
  Date 2017-12-14T12:06:43.492Z
  Shell 1.7.9
  Renderer 58.0.3029.110
  Node 7.9.0
  Architecture x64

## Problem
<!-- Include error message from Panel -> Output -> Vue Language Server -->
<!-- With screenshot / gif if possible -->
HTML error hints are displayed inside the `<style>` block.

## Reproducible Case
<!-- Important. Please provide clear steps for reproducing the problem. Otherwise we can't help you and your issue might be closed. For example, fork https://github.com/octref/veturpack and modify it to reproduce the error, then push your changes to GitHub and send us the link. -->
Inside the `<style>` block, an HTML error is reported.

![Uploading image.png…]()
Answers: username_0: ![image](https://user-images.githubusercontent.com/30047595/34242858-d2040632-e658-11e7-8179-6a7618541938.png)
username_1: Please provide a minimal reproduction.
username_1: Also, note the error comes from eslint-plugin-vue. Probably it is an issue there.
Status: Issue closed
flutter/flutter
446898331
Title: NetworkImage._loadAsync: The getter 'length' was called on null Question: username_0: ## Steps to Reproduce 1. Code as follow: ```dart Widget loadAvatar(url, name, {color = Colors.teal}) { NetworkImage image; try { image = new NetworkImage(url); return CircleAvatar( backgroundImage: image, ); } catch (Exception) { print("Load avatar error: $Exception"); } return new CircleAvatar( child: Text( name, textAlign: TextAlign.center, overflow: TextOverflow.fade, ), backgroundColor: color, ); } ``` ## Logs ``` NoSuchMethodError: NoSuchMethodError: The getter 'length' was called on null. Receiver: null Tried calling: length File "uri.dart", line 788, in Uri.parse File "uri.dart", line 2499, in _Uri.resolve File "image_provider.dart", line 489, in NetworkImage._loadAsync File "<asynchronous suspension>" File "image_provider.dart", line 475, in NetworkImage.load File "image_provider.dart", line 285, in ImageProvider.resolve.<fn>.<fn> File "image_cache.dart", line 157, in ImageCache.putIfAbsent File "image_provider.dart", line 285, in ImageProvider.resolve.<fn> File "synchronous_future.dart", line 38, in SynchronousFuture.then File "image_provider.dart", line 283, in ImageProvider.resolve File "decoration_image.dart", line 239, in DecorationImagePainter.paint File "box_decoration.dart", line 414, in _BoxDecorationPainter._paintBackgroundImage File "box_decoration.dart", line 432, in _BoxDecorationPainter.paint File "proxy_box.dart", line 1968, in RenderDecoratedBox.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild File "proxy_box.dart", line 123, in _RenderProxyBox&RenderBox&RenderObjectWithChildMixin&RenderProxyBoxMixin.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild File "list_tile.dart", line 1041, in _RenderListTile.paint.doPaint File "list_tile.dart", line 1044, in _RenderListTile.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild File "shifted_box.dart", line 70, in RenderShiftedBox.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild File "proxy_box.dart", line 123, in _RenderProxyBox&RenderBox&RenderObjectWithChildMixin&RenderProxyBoxMixin.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild [Truncated] File "proxy_box.dart", line 123, in _RenderProxyBox&RenderBox&RenderObjectWithChildMixin&RenderProxyBoxMixin.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 173, in PaintingContext.paintChild File "proxy_box.dart", line 123, in _RenderProxyBox&RenderBox&RenderObjectWithChildMixin&RenderProxyBoxMixin.paint File "object.dart", line 2092, in RenderObject._paintWithContext File "object.dart", line 128, in PaintingContext._repaintCompositedChild File "object.dart", line 96, in PaintingContext.repaintCompositedChild File "object.dart", line 853, in PipelineOwner.flushPaint File "binding.dart", line 331, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding&PaintingBinding&SemanticsBinding&RendererBinding.drawFrame File "binding.dart", line 701, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding&PaintingBinding&SemanticsBinding&RendererBinding&WidgetsBinding.drawFrame File "binding.dart", 
line 268, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding&PaintingBinding&SemanticsBinding&RendererBinding._handlePersistentFrameCallback File "binding.dart", line 988, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding._invokeFrameCallback File "binding.dart", line 928, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding.handleDrawFrame File "binding.dart", line 840, in _WidgetsFlutterBinding&BindingBase&GestureBinding&ServicesBinding&SchedulerBinding._handleDrawFrame File "zone.dart", line 1124, in _rootRun File "zone.dart", line 1021, in _CustomZone.run File "zone.dart", line 923, in _CustomZone.runGuarded File "hooks.dart", line 209, in _invoke File "hooks.dart", line 168, in _drawFrame ``` Answers: username_1: Are you sure that `url` is not Null? Does ```print(url);``` log out with a valid URL String? Status: Issue closed
moleculerjs/moleculer-db
624479731
Title: Support moleculer-db-adapter-typeorm Question: username_0: Hi, any chance of having [moleculer-db-adapter-typeorm](https://github.com/dkuida/moleculer-db-adapter-typeorm) incorporated into this repository? It has been inactive for a long time and it seems that it is not accepting pull requests. As this is the official repository, I think the community could contribute more easily.

Thanks
Answers: username_1: If you could help contribute to this repo, I will pull it in. But I have no capacity to learn about TypeORM and maintain this adapter as well.
username_0: Of course, I would be happy to help!
username_1: Great. In this case, I will try to reach @dkuida to transfer the repo into the moleculer organization. If he doesn't reply, I will fork it and place it there.
username_0: Hi, @username_1. Sorry for the inconvenience, but... any news about that?
username_1: @username_0 Here is the transferred repo: https://github.com/moleculerjs/moleculer-db-adapter-typeorm
Status: Issue closed
username_1: @username_0 do you still plan to maintain it?
username_0: Hi,

Yes, I think TypeORM is an interesting option for connecting to some relational databases, so I would like to help maintain it; unfortunately I don't have free time for this right now.

At the time the issue was opened I had a specific need that I ended up solving in another way, so we don't need to treat this as a priority. If there's something more interesting for moleculer that I can help with, I'll be happy to do it. Otherwise, I'll look at TypeORM again in a few months.
rust-lang/rust
233365245
Title: Lifetime arguments are completely ignored in method calls Question: username_0: This snippet compiles successfully even though `f` has no lifetime parameters and `'a`, `'b`, `'c` don't even exist.
```rust
struct S;

impl S {
    fn f(self) {}
}

fn main() {
    S.f::<'a, 'b, 'c>();
}
```
The parser parses the lifetime arguments, but doesn't store them anywhere, so they aren't present in the AST/HIR. The issue has existed since Rust 1.0.
Answers: username_0: Supposedly they should be stored and then treated exactly like arguments in the UFCS form `S::f::<'a, 'b, 'c>`.
Status: Issue closed
NiklasMerz/cordova-plugin-fingerprint-aio
533394328
Title: Add support for setConfirmationRequired for Android Question: username_0: Just a simple request to expose the BiometricPrompt.PromptInfo builder option setConfirmationRequired for Android. Thanks
Answers: username_1: I don't own a Pixel. How does the confirm work? @exxbrain and @greaterking please correct me if I am wrong. I guess this confirm is part of the Biometric prompt as implemented by Google for Pixel devices and we have no access to this?
username_2: Hello guys, I just opened a PR to add this feature #222
@username_1 [here](https://developer.android.com/training/sign-in/biometric-auth#no-explicit-user-action), [here](https://developer.android.com/reference/androidx/biometric/BiometricPrompt.PromptInfo.Builder.html#setConfirmationRequired(boolean)) and [here](https://android-developers.googleblog.com/2019/10/one-biometric-api-over-all-android.html) you can find great documentation on how `setConfirmationRequired()` works.
Status: Issue closed
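For reference, a minimal sketch of the underlying AndroidX API that exposes this option (plain Kotlin against `androidx.biometric`, outside the Cordova plugin; the titles and callback wiring here are illustrative, not the plugin's actual code):

```kotlin
import androidx.biometric.BiometricPrompt
import androidx.core.content.ContextCompat
import androidx.fragment.app.FragmentActivity

fun showBiometricPrompt(activity: FragmentActivity) {
    val promptInfo = BiometricPrompt.PromptInfo.Builder()
        .setTitle("Unlock")
        .setNegativeButtonText("Cancel")
        // Hint that no explicit "Confirm" tap is needed after a passive
        // biometric (e.g. face) succeeds; the system may ignore the hint.
        .setConfirmationRequired(false)
        .build()

    val biometricPrompt = BiometricPrompt(
        activity,
        ContextCompat.getMainExecutor(activity),
        object : BiometricPrompt.AuthenticationCallback() {
            override fun onAuthenticationSucceeded(result: BiometricPrompt.AuthenticationResult) {
                // continue with the unlocked flow
            }
        }
    )
    biometricPrompt.authenticate(promptInfo)
}
```

A plugin-level option would essentially forward a boolean from the JavaScript API down to this builder call.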
rapidsai/cudf
1083074838
Title: [BUG] 2D Array Assignment Question: username_0: **Describe the bug**
When I try to assign a 2D cupy array to some columns of a cudf DataFrame, it doesn't work correctly. I get the same value across the whole row, so it is as if one column of the 2D array is being assigned to all columns of the cudf DataFrame.

**Steps/Code to reproduce bug**
`df[[f"f_{i}" for i in range(n)]] = cp.random.randn(m, n)`

**Expected behavior**
It should correctly assign 2D arrays to columns, as can be done in pandas.

**Workaround**
```python
for i in range(n):
    df[f"f_{i}"] = cp.random.randn(m)
```

**Environment overview (please complete the following information)**
Fresh RAPIDS 21.12 conda installation on Ubuntu 20.04
Answers: username_1: This seems similar to another issue we resolved: https://github.com/rapidsai/cudf/issues/8672

```python
import pandas as pd
import numpy as np

m, n = 5, 3

df = pd.DataFrame({f"f_{i}":range(m) for i in range(n)})
df[[f"f_{i}" for i in range(n)]] = np.random.randn(m, n)
print(df)

        f_0       f_1       f_2
0 -1.145855 -1.925980 -0.888776
1 -1.197261 -0.259193  3.517007
2  0.233636 -0.435549 -1.525251
3 -0.322054 -0.423073 -0.802210
4 -1.017598 -0.881462 -0.382520
```

```python
import cudf
import numpy as np
import cupy as cp

m, n = 5, 3

df = cudf.DataFrame({f"f_{i}":range(m) for i in range(n)})
df[[f"f_{i}" for i in range(n)]] = cp.random.randn(m, n)
print(df)

        f_0       f_1       f_2
0 -0.274912 -0.274912 -0.274912
1 -0.889039 -0.889039 -0.889039
2  0.049632  0.049632  0.049632
3  0.605849  0.605849  0.605849
4  0.766410  0.766410  0.766410
```
kubernetes-sigs/kubespray
1025559412
Title: unable to upgrade due to 1.20.0 not >= 1.22.2 🕳️ Question: username_0: **Environment**: onprem - debian - **OS (`printf "$(uname -srm)\n$(cat /etc/os-release)\n"`):** ``` Linux 4.19.0-12-amd64 x86_64 PRETTY_NAME="Debian GNU/Linux 10 (buster)" NAME="Debian GNU/Linux" VERSION_ID="10" VERSION="10 (buster)" VERSION_CODENAME=buster ID=debian HOME_URL="https://www.debian.org/" SUPPORT_URL="https://www.debian.org/support" BUG_REPORT_URL="https://bugs.debian.org/" ``` - **Version of Ansible** (`ansible --version`): ``` ansible 2.9.15 config file = /KRN/kubespray/ansible.cfg configured module search path = [u'/KRN/kubespray/library'] ansible python module location = /usr/lib/python2.7/dist-packages/ansible executable location = /usr/bin/ansible python version = 2.7.16 (default, Oct 10 2019, 22:02:15) [GCC 8.3.0] ``` - **Version of Python** (`python --version`): ``` Python 2.7.16 ``` **Kubespray version (commit) (`git rev-parse --short HEAD`):** **Network plugin used**: **Command used to invoke ansible**: `ansible-playbook -i inventory/XX/hosts.yml upgrade-cluster.yml **Output of ansible run**: ``` fatal: [master1]: FAILED! => { "assertion": "kube_version is version(kube_version_min_required, '>=')", "changed": false, "evaluated_to": false, "msg": "The current release of Kubespray only support newer version of Kubernetes than v1.20.0 - You are trying to apply 1.22.2" } ``` the `roles/kubernetes/preinstall/tasks/0020-verify-settings.yml` ``` that: kube_version is version(kube_version_min_required, '>=') ``` looks legit, but somehow 1.22.2 is not `>=` 1.20.0 🥲 Answers: username_1: Mistakes happens, no problem man and thanks
desi-programmer/fluttertodo
1052559880
Title: Got an error like this when I try to add a card to the todo list Question: username_0:
```
TypeError: Cannot read properties of null (reading 'insert')
    at insert (http://localhost:64355/packages/todo/dbhelper.dart.lib.js:52:25)
    at insert.next (<anonymous>)
    at http://localhost:64355/dart_sdk.js:40646:33
    at _RootZone.runUnary (http://localhost:64355/dart_sdk.js:40503:58)
    at _FutureListener.thenAwait.handleValue (http://localhost:64355/dart_sdk.js:35432:29)
    at handleValueCallback (http://localhost:64355/dart_sdk.js:36017:49)
    at Function._propagateToListeners (http://localhost:64355/dart_sdk.js:36055:17)
    at _Future.new.[_completeWithValue] (http://localhost:64355/dart_sdk.js:35897:23)
    at async._AsyncCallbackEntry.new.callback (http://localhost:64355/dart_sdk.js:35920:35)
    at Object._microtaskLoop (http://localhost:64355/dart_sdk.js:40808:13)
    at _startMicrotaskLoop (http://localhost:64355/dart_sdk.js:40814:13)
    at http://localhost:64355/dart_sdk.js:36279:9
```
Answers: username_1: Can you post more details? Where did it occur? Flutter version?
thinreports/thinreports-generator
184488945
Title: Dynamically change Shape size Question: username_0: I'd like to change the size of my rectangle within my generator with something like: `row.item("my_shape").style(:height, "50px", :width, "30px")` Is it possible? Answers: username_1: No, it isn't. username_0: Ok... Will it be made available as an improvement? username_1: Hmm.., I'll give it a try for now. username_2: How can I change the height of an item of a list dynamically? username_1: Currently you can't.
mAAstro7/tiralabra
68478484
Title: Code review (Koodikatselmointi) Question: username_0: The logic and the user interface could be separated more, e.g. by moving most of the UI class's behavior into some kind of Game class that uses the UI class for printing output and reading input.

Separate classes for the different moves, used through some kind of Move interface, could be a good idea, among other reasons because it makes life easier if lizard and spock ever need to be added.

The occasional Finnish variable names could be refactored into English.

The RoundRemember class is probably not needed; a variable pointing to the most recently created Round serves the same purpose.

Test coverage is quite good.

Bonus points, because I won 50-48.<issue_closed> Status: Issue closed
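A rough sketch of the suggested Move-interface refactoring (hypothetical class names, assuming the project is Java; this is not code from the reviewed repository):

```java
// Each move knows which moves it beats, so adding lizard and spock
// only means adding new implementations, not editing the game logic.
interface Move {
    String name();
    boolean beats(Move other);
}

class Rock implements Move {
    public String name() { return "rock"; }
    public boolean beats(Move other) {
        return other.name().equals("scissors") || other.name().equals("lizard");
    }
}

class Paper implements Move {
    public String name() { return "paper"; }
    public boolean beats(Move other) {
        return other.name().equals("rock") || other.name().equals("spock");
    }
}

class Scissors implements Move {
    public String name() { return "scissors"; }
    public boolean beats(Move other) {
        return other.name().equals("paper") || other.name().equals("lizard");
    }
}
```

The Game class would then compare two Move objects via `beats` instead of branching on string or enum values.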
corona-warn-app/cwa-app-ios
641119887
Title: Move all AccessibilityIdentifiers to central file Question: username_0: As discussed with @i336616, it would be handy if we had all AccessibilityIdentifiers in one place. Currently, most of these are hard-coded.

## Current Implementation
Just an excerpt of how it is handled now, from `ImageTableViewCell.swift`:
![image](https://user-images.githubusercontent.com/15066374/85013433-da90f380-b164-11ea-8ffc-28eca543351a.png)

## Suggested Enhancement
Move the AccessibilityIdentifiers to a central file, similar to `AppStrings.swift`.
Answers: username_1: Hi @username_0 , Hi @i336616 My question is, shall we put everything in the AppStrings.swift file? The file is super fat now. Thanks.
username_0: Hi Hao, I meant "similar" to AppStrings.swift - I would make a separate file just for the Accessibility Identifiers. Sorry if I didn't explain it correctly
username_1: Hi @username_0 , sorry, my bad, I overlooked the "similar". :-)
Status: Issue closed
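A minimal sketch of what such a central file could look like (the identifier names and values below are purely illustrative and are not the naming actually adopted in the project):

```swift
// AccessibilityIdentifiers.swift: one central place instead of hard-coded strings.
enum AccessibilityIdentifiers {
    enum Home {
        static let activateCard = "AccessibilityIdentifiers.Home.activateCard"
        static let riskCard = "AccessibilityIdentifiers.Home.riskCard"
    }

    enum ExposureNotificationSetting {
        static let descriptionTitle = "AccessibilityIdentifiers.ExposureNotificationSetting.descriptionTitle"
        static let enableTracing = "AccessibilityIdentifiers.ExposureNotificationSetting.enableTracing"
    }
}

// Usage, e.g. inside a cell's configure(...) method:
// imageView.accessibilityIdentifier = AccessibilityIdentifiers.Home.riskCard
```

Caseless enums keep the identifiers namespaced and prevent accidental instantiation, mirroring the AppStrings.swift pattern, while UI tests can reference the same constants instead of duplicating string literals.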
jcansdale/gpr
658030430
Title: Publishing .snupkg fails as duplicate version Question: username_0: It looks like GitHub Packages doesn't currently support .snupkg files. The following silently fails to push symbols:

```
dotnet nuget push package.snupkg
```

This will fail with a duplicate version error:

```
gpr push package.*nupkg
```

https://github.com/username_0-test/nuget-symbols/runs/876992894?check_suite_focus=true#step:6:12