Dataset columns:
- repo_name: string (length 4–136)
- issue_id: string (length 5–10)
- text: string (length 37–4.84M)
jlippold/tweakCompatible
356644936
Title: `MinimaLS` working on iOS 11.3.1
Question: username_0:
```
{
  "packageId": "com.imuseum.minimals",
  "action": "working",
  "userInfo": {
    "arch32": false,
    "packageId": "com.imuseum.minimals",
    "deviceId": "iPhone7,2",
    "url": "http://cydia.saurik.com/package/com.imuseum.minimals/",
    "iOSVersion": "11.3.1",
    "packageVersionIndexed": false,
    "packageName": "MinimaLS",
    "category": "Widgets",
    "repository": "Packix",
    "name": "MinimaLS",
    "installed": "1.0",
    "packageIndexed": false,
    "packageStatusExplaination": "This tweak has not been reviewed. Please submit a review if you choose to install.",
    "id": "com.imuseum.minimals",
    "commercial": false,
    "packageInstalled": true,
    "tweakCompatVersion": "0.1.0",
    "shortDescription": "A Simple XenHTML Widget",
    "latest": "1.0",
    "author": "iMuseum",
    "packageStatus": "Unknown"
  },
  "base64": "<KEY>",
  "chosenStatus": "working",
  "notes": ""
}
```
arthurosipyan/JamaScript
354895124
Title: Installation: Missing Require Modules
Question: username_0:
**Describe the bug**
Program is missing required packages and modules

**To Reproduce**
Steps to reproduce the behavior:
1. Download and unzip program
2. Open in PyCharm
3. Run ```main.py```
4. See error

**Expected behavior**
Program should run.
Answers:
username_0: Seems to be an issue with IntelliJ not setting the source root.
- Solution: Right click "JamaScript" -> "Mark Directory as" -> "Sources Root"
Status: Issue closed
KubeOperator/KubeOperator
589566865
Title: ssh/scp connections are not released after being established
Question: username_0:
**username_1 version** Which version of username_1 are you using? 2.4.30

**Bug description** Briefly describe the problem you encountered: ssh/scp connections are not released after they are established.

**Steps to reproduce**
![image](https://user-images.githubusercontent.com/41712985/77822867-83fdb580-7131-11ea-9725-afd87b347b03.png)
Answers:
username_0: @Aaron3S

username_1: Fixed, pending release.

username_1: Resolved.
Status: Issue closed
ros2/rclpy
362393934
Title: Respect the use_sim_time parameter for rclpy nodes
Question: username_0:
There is support for Clocks that use ROS Time as described in [this design document](http://design.ros2.org/articles/clock_and_time.html). It was added in [this PR](https://github.com/ros2/rclpy/pull/210). Clocks can be set to have ROS time "active" (will use the value seen on the /clock topic as its time) by calling `time_source.ros_time_is_active = True`.

A node has a time source associated with it (added in https://github.com/ros2/rclpy/pull/210). At the time of that PR, parameters were not available in rclpy, so the node's time source does not respond to the `use_sim_time` parameter: https://github.com/ros2/rclpy/blob/62012d3650de790d89c725232451c4e998b396fb/rclpy/rclpy/node.py#L97

As of https://github.com/ros2/rclpy/pull/214, parameters are available in rclpy. Tasks to get rclpy nodes to respond to the `use_sim_time` parameter:
- [ ] Add a [parameter callback](https://github.com/ros2/rclpy/pull/228) that detects if `use_sim_time` has been set on the node.
- [ ] Add tests that the node's clock output matches what's expected based on the combination of `use_sim_time` parameter and the presence of a `/clock` publisher.
Answers:
username_1: @username_0 I'd like to work on this issue. Can you assign it to me?

username_2: @username_1 Thanks! I can't assign the issue to you (isaacs/github#100), but your contribution is still very welcome. When you make a pull request add the text `connects to ros2/rclpy#240` to the PR description. That will link the PR and issue on the [waffle board](https://waffle.io/ros2/ros2).

username_3: @username_1 - I'm in need of this functionality now for testing ROS2 Navigation, did you get this to work?

username_4: Resolved by #297
Status: Issue closed
aws-amplify/amplify-cli
598128939
Title: More than 200 resources not allowed in GraphQL Schema?
Question: username_0:
**Describe the bug**
I added an Aurora Serverless data source to a brand new GraphQL Api. The data in the db is from an established piece of software with a relational db consisting of 49 tables. There are 1700 or so users in this db and the tables hold various data about them. So it's not that big of a data set. After doing `amplify api add-graphql-datasource` I got the error `UnhandledPromiseRejectionWarning: InvalidMigrationError: The <redact>.json stack defines more than 200 resources.`

**Amplify CLI Version**
4.18.0

**To Reproduce**
Add an Aurora Serverless data source with a sizeable relational db.

**Expected behavior**
Should just work.
Answers:
username_0: Also, looking at the code that throws this exception, it's supposed to give me this message but it doesn't:
```
`The ${stackName} stack defines more than 200 resources.`,
'CloudFormation templates may contain at most 200 resources.',
'If the stack is a custom stack, break the stack up into multiple files in stacks/. ' +
  'If the stack was generated, you have hit a limit and can use the StackMapping argument in ' +
  `${TRANSFORM_CONFIG_FILE_NAME} to fine tune how resources are assigned to stacks.`
```

username_1: @username_0 Is the stack that is giving you the error a model generated stack?

username_0: Not sure what you mean by "model generated". Yes, it is an auto-generated stack based off of the schema in the relational db. I did not use @model transforms or anything like that. It is a brand new GraphQL API with only the relational db data source added. I guess this is an expected limitation, and I'm supposed to use the "StackMapping" attribute somehow. But I think I will just remove some of the resolver resources, as I only need read access to this database, so I will remove all mutation and delete resolvers.

username_2: Also running into this limitation in a medium sized project with 54 tables. Is Amplify only designed to be used in small projects?

username_3: @username_0 CloudFormation and the CLI have both increased the limit to 500 resources now. Are you still running into this limitation with your app?

username_0: Hey @username_3 - I went an alternate route and have no need to connect to this db anymore. So you can go ahead and close this ticket, and if I end up doing something similar in the future and running into the issue I'll open a new ticket. If you really want me to test I can, but I have no need currently to connect to a large db that might hit the resource limit.
Status: Issue closed
chakra-ui/chakra-ui
1049208246
Title: None
Question: username_0:
Hi @justnoobbot this is not a bug, it's working as designed by React. `useColorModeValue` uses the `useColorMode` hook which consumes `ColorModeContext` with the React `useContext` hook. The `useContext` hook is what causes the rerender. Get more explanation [here](https://github.com/facebook/react/issues/15156#issuecomment-474590693)
Status: Issue closed
Answers:
username_1: @username_0 I believe this is the color mode issue that was just fixed by #4989. If the system color mode is dark then the app is rendered with the initialColor mode, light. It then re-renders with dark mode. Using the CSB with the fix, the example that was provided only renders once for light mode.

username_0: cc @username_1

username_0:
### Description
When I run the app I expect console.log("log") to run once, but it got executed twice

### Link to Reproduction
https://codesandbox.io/s/chakra-ui-usecolormodevalue-j5y1j?file=/src/App.js

### Steps to reproduce
1. Visit sandbox
2. Click console
3. See `console.log("log");` got executed twice

When I remove `const bg = useColorModeValue("white", "gray");` it runs once as expected (I need that in my project)

### Chakra UI Version
1.6.12

### Browser
_No response_

### Operating System
- [ ] macOS
- [ ] Windows
- [ ] Linux

### Additional Information
_No response_

username_0: In the meantime you can use the `_dark` pseudo prop for mode styling instead of the `useColorModeValue` hook @justnoobbot

username_1: My apologies. I missed the notification on this. I looked at the CSB, what am I supposed to observe? I ran the app with and without the commented-out code. Each time the log, modified to `console.log('log', colorMode)`, shows a single message with the color mode. The text that says "There's nothing here!" displays appropriately regardless of the color mode. Uncommenting the code, `useColorModeValue`, and changing the log message to `console.log('log', bg)` also produced a single log message with the correct color mode value.

username_0: @username_1 try it in dark mode

username_1: I did. In light mode the text is black and the bg white. In dark mode the bg is dark gray and the text is white. What am I supposed to be seeing?

username_0: @username_1 Check the console in dark mode, the component is rendered twice.

username_1: @username_0 Thanks. I just finished triaging the issue. This is the white flash/double render issue caused by the ColorModeManager. I've tested a local fix with a React app and will follow up with one using Next. Once that's complete I'll submit a PR.
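As a quick illustration of the `_dark` suggestion above, here is a minimal sketch (editor's illustration, not code from the thread), assuming a Chakra UI version that supports the `_dark` pseudo prop; the component and color values are made up:

```tsx
import { Box, Text } from "@chakra-ui/react";

// Instead of `const bg = useColorModeValue("white", "gray")`, style the light
// value directly and override it under `_dark`. The dark styles are applied
// via CSS selectors for the active color mode, so the component does not need
// to read the color mode in JS (which is what triggers the extra render).
export function Panel() {
  return (
    <Box bg="white" _dark={{ bg: "gray.700" }} p={4}>
      <Text>Hello</Text>
    </Box>
  );
}
```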
hank-cp/sbp
816408970
Title: How plugins can read application.properties outside plugin jar
Question: username_0:
Hi there,
Trying to read `application.properties` outside the plugin jar using PropertySource:
```
@PropertySource(value = "file:${admin-ui.home}/config/plugin-name.properties")
```
But it shows the below message when running the project:
`java -jar hans-admin-ui-api-standalone.jar --admin-ui.home=D:\Projects\java\hans-admin-ui-api`

**Logs**
```
2021-02-25 18:02:58.338 INFO 11584 --- [ main] o.s.c.a.ConfigurationClassParser : Properties location [file:${admin-ui.home}/config/plugin-name.properties] not resolvable: Could not resolve placeholder 'admin-ui.home' in value "file:${admin-ui.home}/config/plugin-name.properties"
```
We have also used the `admin-ui.home` placeholder in the parent project. It is working there.
Answers:
username_1: Let's clarify your question first. Is `admin-ui.home` declared in your main app's `application.properties` and you want to use it in a plugin? If yes, you could use [spring.sbp.plugin-properties](https://github.com/username_1/sbp/blob/master/docs/configuration.md#springsbpplugin-properties) to pass configuration from the app to the plugin.
Status: Issue closed
yairEO/tagify
436311426
Title: Bug when changing letter to uppercase, tag becomes invalid
Question: username_0:
Hi,
On the demo page https://yaireo.github.io/tagify/ in the first example, if you change for example the `html` tag to `Html`, it's marked as invalid. The comparison with itself looks case sensitive and IMHO it shouldn't be.
Answers:
username_1: Yes, I see what you mean. I will change this line to be *case insensitive*: https://github.com/username_1/tagify/blob/master/src/tagify.js#L481

username_0: 👍
Status: Issue closed

username_0: I saw you fixed the issue but I can't install version 2.19.2 from NPM, it's stuck at 2.19.1

username_1: @username_0 - will be fixed in 6-7 hours after I get home from work and publish the version to NPM
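For reference, a minimal sketch of the kind of case-insensitive check discussed above (an illustration only, not Tagify's actual source; the helper name and the trimming behavior are assumptions):

```js
// Hypothetical helper: treat a value edited only in letter case (e.g. "html"
// changed to "Html") as the same tag rather than marking it invalid.
function isSameTagValue(a, b) {
  return String(a).trim().toLowerCase() === String(b).trim().toLowerCase();
}

console.log(isSameTagValue("html", "Html")); // true
console.log(isSameTagValue("html", "css"));  // false
```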
alcatrazEscapee/ore-veins
554627019
Title: can't configure
Question: username_0:
1. creating World.
2. copying the "`data\oreveins`" folder from the "oreVeins" ".jar" to "`saves\<<World>>\data`".
3. changing the "`data\oreveins\oreveins`" folder and content's names.
4. removing `"type": "oreveins:default_veins"` from the files, setting `enableDefaultVeins = false` in "`config\oreveins-comon.toml`".
5. loading the World and using "`/reload`".
6. no vein options shown at "`/findveins`".

please help
Status: Issue closed
Answers:
username_1: There's a number of things here
- You need to make a [Data Pack](https://minecraft.gamepedia.com/Data_pack) (link contains help on how exactly to do that, if not there's plenty of other resources on the format)
- You need to have a separate namespace for your data pack, otherwise you'll end up overriding the existing veins instead of adding new ones.
- The `data/oreveins/oreveins` is the path within your datapack (i.e. similar to `data/[namespace]/advancements` from vanilla). So yours would be `data/[your namespace]/oreveins`
- Don't use `null` as an entry for conditions, just omit the conditions.

username_0: @username_1. ty.
ant-design/ant-design-pro
391534490
Title: Module not found: Error: Can't resolve '@babel/runtime/helpers/esm/objectWithoutPropertiesLoose' in 'D:\bc\workspace\mgr\node_modules\react-redux\es\components'
Question: username_0:
In an existing ant-design-pro project, I ran npm install and then got this error. It seems like some package's version is broken. Asking for help.

Module not found: Error: Can't resolve '@babel/runtime/helpers/esm/objectWithoutPropertiesLoose' in 'D:\bc\workspace\mgr\node_modules\react-redux\es\components'
Answers:
username_1: Delete it and reinstall, then it works.
Status: Issue closed
backdrop/backdrop-issues
832867358
Title: Add a link to the BackdropCMS.org page on module list for each module
Question: username_0:
**Description of the need**
On the module list at the `admin/modules` page it would be nice to be able to see a link to the module's detail page on BackdropCMS.org and/or the repo on GH.

**Proposed solution**
Add a link under the "more/less" toggle.
Answers:
username_1: @username_0 Can I work on this issue?

username_2: @username_1 - You do not have to ask to work on an issue, unless you can see that someone else is already working on it. Just post a note saying that you are working on it and get started. If you have any questions, please feel free to ask here.

username_2: @username_1 - If you need help getting started, feel free to try our chat channel: https://backdrop.zulipchat.com/ If you have not read this yet, this is a good place to get started: https://backdropcms.org/news/openforce-2022-getting-started

username_1: @username_2 Can you share relevant links for this issue?

username_2: @username_1 is looking at Backdrop CMS for the first time, can anyone provide any tips or pointers on how to get started on this issue? I won't have time until tomorrow.

username_3:
1. If you open the `admin/modules` page, you can see "more" links in the descriptions of the modules. Open the first.
2. There is a line with a "Tags:..." string. It is generated by `core\modules\system\system.theme.inc`. Search for the `Tags:` string in this file (`function theme_system_modules_fieldset()`).
3. There is a `$requirements` variable. You should expand `$requirements` with some new lines containing the necessary links.
4. Links can be generated from `$key`, because it is the machine name of the module
- Module's detail page: `'https://backdropcms.org/project/' . $key`
- Module's repo on GitHub: `'https://github.com/backdrop-contrib/' . $key`

Except the core modules. Core modules haven't got a detail page and the 'repo' is only a subdirectory in [the core repo](https://github.com/backdrop/backdrop).

username_4: Thank you for working on the PR @username_1 🙏🏼 There are some test failures, but before you work further on the PR... I had a quick look, and suggested the following:
- The new `div` introduced to hold the links has a class of `admin-requirements` (same as the `div` that holds the module tags) - I suggested that we also add another class; perhaps `project-links`.
- Instead of "module's detail page" and "module's repository", I suggested using "project information" and "GitHub repository", since that gives some indication as to where these links would take you.

I was about to suggest that we set `target="_blank"` on the links, so that there is no disruption to the workflow when checking the details of modules (navigating away from the site, then having to return - repeat). But then I thought that it might be better altogether if we used the same UI pattern here as in the Project Browser, where we use a single link (the module name) to open a pop-up dialog with the project information (which we could improve to include a link to the GitHub project). Thoughts?

username_1: @username_4 Those changes are ok. As per the issue, we just need to add links for more information about modules. So why do we need to show project information?

username_4: @username_1 the information shown in the `https://backdropcms.org/project/MODULE_NAME` links is the same as the project info in the pop-up dialog. For example:
- `https://backdropcms.org/project/devel`: <img width=50% src="https://user-images.githubusercontent.com/2423362/156989723-2ae1d0a5-3e72-41ea-9f75-55bd86734417.png">
- and the respective pop-up with the details/info from the Project Browser: <img width=50% src="https://user-images.githubusercontent.com/2423362/156989858-33b5d23e-2bca-42b5-b9bb-77d0d5b329b4.png">

It was just a suggestion, so it's fine if others also think that it's not a good idea 👍🏼

username_2: I've reviewed the PR in sandbox and I like it. My initial reaction to the pop-up idea was that it's unnecessary and I don't want to derail the PR of our enthusiastic new contributor. But then I thought about the fact that this pop-up displays the README, which often has instructions on how to use a module. Wouldn't it be nice to have those instructions available right on the module page? The more I think about it, the more I like it. I'm curious what others think about this?

NOTE TO @username_1: It is not unusual in our issue queue that once we have a PR to look at, the scope of the discussion changes, sometimes dramatically. Sometimes the initial PR never gets accepted, but does lead to a completely new and better idea/feature - which is also a very important contribution. Thanks for giving us this PR to move this issue forward.

username_2: Also, I just noticed that the PR creates links for core modules that do not work - because they do not have project pages. I think that we can only have links for contrib modules, how much harder is that?

username_1: @username_2
```php
if ($module_data[$key]->info['project'] !== 'backdrop') {
  $requirements .= '<div class="admin-requirements">' . t('Links:') . ' ' . l(t("module's detail page"), 'https://backdropcms.org/project/' . $key) . ', ' . l(t("module's repository"), 'https://github.com/backdrop-contrib/' . $key) . '</div>';
}
```

username_3: I think it is a speciality of the sandbox. Try without the sandbox. There is a tag in every released module's `.info` file: `project = `. Only in the core modules: `project = backdrop`

username_2: @username_1 - Don't forget to restore this PR so that folks can review it and see your work.

username_2: I put this on the dev meeting agenda for tomorrow. We have a PR that needs testing, feedback, and review. https://forum.backdropcms.org/forum/mar-10th-weekly-meeting-agenda-items?page=0%2C0#comment-3875

username_5: @username_4 I disagree with having a popup showing the module's README. If a contrib module is able to be installed, it's already on the system, so someone's already put it there (and presumably knows what it does). The only exception to this is core modules (some people might not know what they do), but ironically they don't have READMEs to show anyway. So I think ditch the popup idea and just go with the basic links for now.

@username_1 Some general feedback on the direction of the PR:
- Checking the 'project' in the info data to differentiate core modules doesn't seem like a solid solution here. I know @username_3 pointed out why it doesn't work in the PR preview site, but it also doesn't work on my local development site. I think a better way to check for core modules is to check if the 'uri' starts with `core/`: ![image](https://user-images.githubusercontent.com/2385329/157638549-56e77fc3-2a1b-44a8-a876-e63436bccbde.png)
- I'm not entirely comfortable with the call to `system_rebuild_module_data()` in the theme function... Since it's only used to check for core modules in the `if` statement, I think a better solution might be to do that check where the rest of the `$variables` are fetched outside this function, then just add a simple 'core' boolean to the `$variables` array that you can check here. I haven't looked further into this to know where that should happen though...
- I wonder if a better wording for the links might be something like: `Project links: BackdropCMS.org, GitHub` If not, at least capitalise the 'P' in 'project' 🙂

Thanks for your contributions so far!

username_3: @username_5 Thank you for the better and more elegant way to check for core modules. I found a solution: insert this row into `core\modules\system\system.admin.inc` at line 789:
```php
$form['#core'] = $extra['core'];
```
And the fixed code in `core\modules\system\system.theme.inc`:
```php
if (!$module['#core']) {
  $requirements .= '<div class="admin-requirements project-links">' . t('Links:') . ' ' . l(t("BackdropCMS.org"), 'https://backdropcms.org/project/' . $key) . ', ' . l(t("GitHub"), 'https://github.com/backdrop-contrib/' . $key) . '</div>';
}
```
(Remove this line: `$module_data = system_rebuild_module_data();`)

username_6: I, too, don't think the module's README should be in the popup, for the reasons articulated by @username_5.

username_2: @username_1 - We discussed this at our meeting last week and the consensus was that we should get your last Pull Request working properly and treat the suggestion by @username_4 as a new issue. I think we should start with the links and then discuss if we want more later. Are you interested in updating your last Pull Request with the feedback that has been provided, but staying with the same idea?

username_2: @username_1 - Are you able to restore this PR?

username_1: I will open a new one. I can't restore.

username_2: @username_1 - Excellent work. I've tested this PR and it works for me. The feedback from @username_3 has been incorporated and the PR no longer shows links to core modules. @username_4 - There is opposition to the idea of using the same UI pattern as we have in the project installer, so I recommend that you save that for a new follow-up issue if you wish to pursue it. I believe that this issue can proceed as it is. @username_5, @username_6, @username_3 or @username_0 - does anyone else have any feedback? Can we get a code review?

username_3: I have tested and reviewed, it is correct. :+1:

username_2: @username_3 - Are you prepared to mark this RTBC or do you want to wait for another review?

username_5: It works, and the code looks good to me.

username_5: No advocate, but since this is RTBC now, adding to the minor milestone.

username_3: It's great, thank you! :fireworks:

username_2: @username_1 - Excellent work helping us make Backdrop easier to use and understand.
chipKIT32/chipKIT-importer
338118428
Title: Import issue of chipKIT-importer V1.0.6 with MPLAB X IDE
Question: username_0:
I have installed MPLAB X IDE V4.15, chipKIT Import plugin V1.0.6 and chipKIT-core V1.4.1 / V2.0.5. But when I launch chipKIT Import and import a Sketch, the import operation always fails. The import failure information is shown below:
```
java.lang.NullPointerException
    at com.microchip.mplab.nbide.embedded.chipkit.wizard.ImportWorker.importChipKitProjectFiles(ImportWorker.java:264)
    at com.microchip.mplab.nbide.embedded.chipkit.wizard.ImportWorker.createProjectFromChipKit(ImportWorker.java:189)
    at com.microchip.mplab.nbide.embedded.chipkit.wizard.ImportWorker.invokeImporterTasks(ImportWorker.java:144)
    at com.microchip.mplab.nbide.embedded.chipkit.wizard.ImportWorker.doInBackground(ImportWorker.java:94)
    at com.microchip.mplab.nbide.embedded.chipkit.wizard.ImportWorker.doInBackground(ImportWorker.java:72)
    at javax.swing.SwingWorker$1.call(SwingWorker.java:295)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at javax.swing.SwingWorker.run(SwingWorker.java:334)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
```
I heard that other people have used the chipKIT Import plugin V1.0.5 to import successfully. I also want to try chipKIT Import plugin V1.0.5. So, could you please provide the chipKIT Import plugin V1.0.5 install package? Thanks!
Answers:
username_1: I'm on it. Thanks for the bug report.

username_1: username_0, I'm sorry it took so long but there seems to be a more complicated problem here. You can get version 1.0.5 of the plugin here: https://github.com/chipKIT32/chipKIT-importer/blob/aa3ce30b3a7de263571e8d9f8b6a6d88eca1ad07/dist/com-microchip-mplab-nbide-embedded-chipkit.nbm However, frankly, I'm not convinced this will actually solve your problem. Please try and let me know.
nklayman/vue-cli-plugin-electron-builder
704102817
Title: ipcRenderer.on, this.$router undefined
Question: username_0:
In the ipcRenderer.on('channel', (event, data) => {}) callback function, this.$router is undefined:
```
mounted: function () {
  console.log(this.$router)
  ipcRenderer.removeAllListeners()
  ipcRenderer.on('close-project', (event, data) => {
    console.log(this.$router)
    ipcRenderer.send('win-set-screen-style', {
      maximize: false,
    })
    ipcRenderer.send('menu-enable-menu-item', {
      itemArray: [
        { id: 'closeProject', enabled: false },
        { id: 'openProject', enabled: false },
      ],
    })
  })
}
```
<img width="798" alt="截屏2020-09-18 下午1 41 09" src="https://user-images.githubusercontent.com/43738111/93560383-9f48cf80-f9b4-11ea-8130-7cfe0a50ae38.png">
Answers:
username_0:
```
mounted: function () {
  console.log(this.$router)
  ipcRenderer.removeAllListeners()
  ipcRenderer.on('close-project', (event, data) => {
    console.log(this.$router)
    this.$router.back()
    ipcRenderer.send('win-set-screen-style', {
      maximize: false,
    })
    ipcRenderer.send('menu-enable-menu-item', {
      itemArray: [
        { id: 'closeProject', enabled: false },
        { id: 'openProject', enabled: false },
      ],
    })
  })
}
```

username_1: I tried the code snippet you posted and it worked fine for me. Can you please create and share a full repo that demonstrates the issue?

username_0: After I created a new project, the problem disappeared. But there is a new problem.
<img width="1841" alt="截屏2020-09-26 下午9 25 58" src="https://user-images.githubusercontent.com/43738111/94341790-e121e780-003e-11eb-8987-61a48138f7a9.png">
I have to use `window.require('electron')` to solve this problem. Is that right? I'll upload my test code. Maybe I lost some files when I introduced the dependency. Can you provide your empty project? [electron-test.zip](https://github.com/username_1/vue-cli-plugin-electron-builder/files/5286749/electron-test.zip)

username_1: Did you make sure to enable node integration in the new app? See https://username_1.github.io/vue-cli-plugin-electron-builder/guide/security.html#node-integration for details.

username_0: You're right. I forgot to add this configuration to the test project. Thank you very much.

username_1: Since it works in the fresh project, the error likely isn't coming from this plugin, so I'm going to close this issue. If you have more info on the original project you can post that and I can try and help you figure out what the issue was there.
Status: Issue closed
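For context, a small sketch of the node-integration setting referenced in the linked security guide (hedged: this assumes the standard `vue.config.js` option exposed by vue-cli-plugin-electron-builder):

```js
// vue.config.js (sketch)
module.exports = {
  pluginOptions: {
    electronBuilder: {
      // With node integration enabled, the renderer can use
      // require('electron').ipcRenderer directly instead of window.require.
      nodeIntegration: true,
    },
  },
};
```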
jeremylong/DependencyCheck
445030366
Title: Junit report generation doesn't show skipped (suppressed) count
Question: username_0:
**Describe the bug**
Junit report doesn't show the count for suppressed vulnerabilities.

**Version of dependency-check used**
Maven plugin v5.0.0-M3

**To Reproduce**
Files can be found in [this gist](https://gist.github.com/username_0/868546812e67ad39cfa732f187ac9486). Use the attached POM file, which includes a false positive (on `jwks-rsa-java`), suppressed through the attached `dependency-checker-suppression.xml`. Run dependency check (`mvn clean verify`). Inspect the resulting junit report (`target/dependency-check/dependency-check-junit.xml`). See the missing `skipped` count in the report.

**Expected behavior**
The `skipped` attribute contains the suppressed count.
Answers:
username_1: Thanks for the bug report!
Status: Issue closed
ISISComputingGroup/IBEX
729776664
Title: Allow seeing matplotlib window from IDAaaS machine
Question: username_0:
As a reflectometry user, I would like to be able to view plots in matplotlib from a read-only client on the IDAaaS cloud.

#### Acceptance Criteria
- Matplotlib OPI can display plots on a remote instrument
Answers:
username_1: Assuming there are no firewall issues, the web browser OPI widget would just need pointing at the remote instrument's hostname as opposed to localhost. Is it acceptable that a remote instrument can modify (e.g. resize, zoom) the graphs? It is the same graph for all clients.

username_0: I have told POLREF that users can already see the plots by navigating to the right address in the browser (subject to the same issues you mentioned). They are happy with this for the moment. It would be easy enough to add a "remote host" macro to the OPI, so you don't have to look at it in a separate window, but this also makes it easier for IBEX users to stumble upon and mess with the plots from any instrument. Does anyone have strong feelings about this? I suppose long term we should aim for creating a separate plot for each client somehow? I am unsure exactly how hard this would be to do.

username_2: I would want to check, but this should work: `g.adv.open_plot_window(is_primary=True, host="NDXPOLREF")`
postmanlabs/newman
212598767
Title: Environment variables not substituting
Question: username_0:
Version/App Information:
1. Newman Version: v3.5.0
2. OS details: Windows 10
3. Are you using Newman as a library, or via the CLI? CLI
4. Did you encounter this recently, or has this bug always been there: Recently, may have always been there.
5. Expected behaviour: Parameters / variables substituted.
6. Console logs:
```
newman

SuperDiary - Build Verification Tests

□ Attachments - BVT
└ Obtain Leap Auth Token
  POST https://{{auth_url}}/oauth/token [errored]
  2⠄ ReferenceError in test-script
└ POST Attachments
  POST https://{{sd_baseURL}}/attachments [errored]
  4⠄ ReferenceError in test-script
└ POST Attachment Refresh
  POST https://{{sd_baseURL}}/attachments/refresh [errored]
  6⠄ ReferenceError in test-script
└ Create Appointments w Attachment
  POST https://{{sd_baseURL}}/appointments [errored]
  8⠄ ReferenceError in test-script
└ DELETE Appointment
  DELETE https://{{sd_baseURL}}/appointments/{{appointment_id}} [errored]
  10⠄ ReferenceError in test-script
```

| | executed | failed |
| --- | --- | --- |
| iterations | 1 | 0 |
| requests | 5 | 5 |
| test-scripts | 5 | 5 |
| prerequest-scripts | 1 | 0 |
| assertions | 0 | 0 |

total run duration: 262ms
total data received: 0B (approx)
average response time: 0ms

```
# failure detail

01. Error           Invalid URI "https:///%7B%7Bauth_url%7D%7D/oauth/token"
                    at request inside "Obtain Leap Auth Token" of "Attachments - BVT"

02. ReferenceError  responseBody is not defined
[Truncated]
                    inside "Create Appointments w Attachment" of "Attachments - BVT"

09. Error           Invalid URI "https:///%7B%7Bsd_baseURL%7D%7D/appointments/%7B%7Bappointment_id%7D%7D"
                    at request inside "DELETE Appointment" of "Attachments - BVT"

10. ReferenceError  responseBody is not defined
                    at test-script inside "DELETE Appointment" of "Attachments - BVT"
```

Steps to reproduce the problem: running this newman command:
```
newman run https://www.getpostman.com/collections/b05c7fb5abe123341f30 --environment https://api.getpostman.com/environments/1487933-4e1f6f9e-61e1-d895-370d-00276aed6f63 --iteration-data "data.json" -n 1 --folder "Attachments - BVT"
```
The collection to be run is this one: https://www.getpostman.com/collections/b05c7fb5abe123341f30
The console logs are showing that the substitution is not happening anywhere...
Answers:
username_0: This works in Postman via Collection Runner.

username_1: @username_0 The collection and environment urls must also include an `apiKey` query param, whose value should be your secret api key. You can manage your api key(s) here: https://app.getpostman.com/dashboard/integrations

username_0: Hi @username_1, how do you add the api-key to the newman command? I've done it via Postman but not through newman, as there is no argument option for it. Are you expecting people to not use the URL when an apikey is involved and just export them from Postman?

username_0: I just made a new collection that has the API-key inside the single request in the collection and I'm still not getting access to the environment variables. The reason I am doing this is I'm trying to tie this to a Continuous Integration job and I need this to work so I can pull the latest code cuts. If there is another way, please let me know.
```
newman run https://www.getpostman.com/collections/36c8147ae0280dad10d1 --export-environment $EnvFile
```

username_0: So I worked out how to do it. I did it through PowerShell: I used Invoke-RestMethod and outputted the information to a file, which I piped back into the newman request.

username_2: Great that you found a workaround. That's a genius idea by the way. 😊 To answer your previous question, refer to the api.getpostman.com documentation. You can provide the API key as a URL query parameter: https://www.getpostman.com/collections/36c8147ae0280dad10d1?apikey=<<your-key-here>>
Closing the issue for now.
Status: Issue closed
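A small sketch of the same approach using newman's Node.js library API instead of PowerShell (illustration only; the collection/environment UIDs and the API key variable are placeholders, not values from this thread):

```js
const newman = require('newman');

// Assumed: POSTMAN_API_KEY is provided by the CI environment.
const apiKey = process.env.POSTMAN_API_KEY;

newman.run(
  {
    // The Postman API accepts the key as an `apikey` query parameter.
    collection: `https://api.getpostman.com/collections/<collection-uid>?apikey=${apiKey}`,
    environment: `https://api.getpostman.com/environments/<environment-uid>?apikey=${apiKey}`,
    reporters: 'cli',
  },
  (err) => {
    if (err) throw err;
    console.log('collection run complete');
  }
);
```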
renggli/dart-xml
858244321
Title: Error thrown when parsing Doctype
Question: username_0:
Hello, I get the following error when trying to parse a doctype:
`XmlParserException (Closure: () => String from Function 'toString':.: Expected name at 2:4)`
I'm using version 5.1.0. Here is the xml:
```
<?xml version="1.0"?>
<!DOCTYPE TEI.2 PUBLIC "-//TEI P4//DTD Main DTD Driver File//EN" "http://www.tei-c.org/Guidelines/DTD/tei2.dtd"[
<!ENTITY % TEI.XML "INCLUDE">
<!ENTITY % PersProse PUBLIC "-//Perseus P4//DTD Perseus Prose//EN" "http://www.perseus.tufts.edu/DTD/1.0/PersProse.dtd">
%PersProse;
]>
<TEI.2>
</TEI.2>
```
Answers:
username_1: Thank you, I can reproduce the problem. This is also a very strange error message ... investigating.
Status: Issue closed

username_0: Thank you so much!
ajpowell021/chessAISolo
208694898
Title: Pawn score generator.
Question: username_0:
Add a number to the board score depending on:
- 15% of points if the pawn is on an edge.
+ points if the pawn is near a promotion. (No promotion if it would end in an immediate capture.)
+ board position table modifier.
Status: Issue closed
fannheyward/coc-marketplace
540354646
Title: Doesn't work behind proxy
Question: username_0:
Hi there. I use marketplace at home okay, but in the office all I get is this - it never expands and searches don't work:
![image](https://user-images.githubusercontent.com/6132708/71182016-908e8a00-2275-11ea-82f6-09622ed1e970.png)
The only difference is at work I have a proxy configured. I can get the same result at home when blocking `registry.npmjs.com` on the firewall.
Answers:
username_1: How is your proxy configured? The `fetch` used is not https://github.com/github/fetch, but the fetch model provided by coc.nvim: https://github.com/neoclide/coc.nvim/blob/master/src/model/fetch.ts

username_0: This I didn't expect. I have the same proxy configuration in both Gnome and environment variables. It usually works fine when applications support system-wide proxy settings.

Gnome:
![image](https://user-images.githubusercontent.com/6132708/71262480-1ecf4280-2340-11ea-9282-2305655743f4.png)

`~/.bashrc`:
```
export http_proxy=http://proxiac:8080
export https_proxy=http://proxiac:8080
export ftp_proxy=http://proxiac:8080
export rsync_proxy=http://proxiac:8080
export all_proxy=http://proxiac:8080
export no_proxy="localhost,cy,127.0.0.1,127.0.0.1/8,::1,10.0.0.0/16,*.intra"
export HTTP_PROXY=$http_proxy
export HTTPS_PROXY=$https_proxy
export FTP_PROXY=$ftp_proxy
export RSYNC_PROXY=$rsync_proxy
export ALL_PROXY=$all_proxy
export NO_PROXY=$no_proxy
```
I also tried to replace our proxy hostname with the IP but it didn't change anything. To make it even more interesting, `:CocInstall` works okay - my guess is it uses the same `fetch` implementation(?).

username_1: This has been fixed on coc.nvim master: https://github.com/neoclide/coc.nvim/commit/f4d9acae106a85b415449c7441174197623d29c0
Status: Issue closed

username_0: The issue was never related to lower or upper case, I have all environment variables declared in both. I'll post it upstream, though.

username_1: Hi there. I use marketplace at home okay but in the office all I get is this:
![image](https://user-images.githubusercontent.com/6132708/71182016-908e8a00-2275-11ea-82f6-09622ed1e970.png)
It never expands to the full list of extensions and searches don't work. The only difference is at work I have a proxy configured. I can get the same result at home when blocking `registry.npmjs.com` on the firewall. Currently you're using `fetch`:
```ts
const resp = (await fetch(uri)) as any;
```
After a quick search I'd propose to use `fetch-with-proxy`, which seems to respect system settings and environment variables (example from the website, I don't code in TypeScript):
```ts
import fetch from 'fetch-with-proxy';
// ...
fetch(uri)
  .then((response) => response.text())
  .then(console.log)
  .catch(console.error)
```

username_1: Yes, both `:CocInstall` and `coc-marketplace` are using the same `fetch` model. You can try to add `"http.proxy": "proxiac:8080"` to `coc-settings.json`.

username_1: Can you see `Using proxy from...` from `:CocOpenLog`?

username_0: I'm terribly sorry, it's a false alarm... a tricky one I would never expect, but still, not an issue in your code, I think. When I connect to http://registry.npmjs.com in the browser or with `curl` (through the proxy) all works fine, but when `coc.nvim::fetch` connects to that URL (through the proxy) it gets `403 Forbidden`. It must be something set up by our admins but no idea why or what exactly - will investigate. If by any chance it is anything that should be fixed on your end (as in: missing support for some rarely implemented nuance covered by the standard, but I doubt it) I'll let you know.

For completeness' sake - yes, I see the proxy in the log:
```
11 2020-01-03T16:21:15.095 INFO (pid:29407) [model-fetch] - fetch: http://registry.npmjs.com/-/v1/search?text=keywords:coc.nvim&size=200&from=0
12 2020-01-03T16:21:15.095 INFO (pid:29407) [model-fetch] - Using proxy from: proxiac:8080
```
And I can confirm it's taken from `coc-settings.json` (when set) or from the environment variable.

P.S. Sorry for the delay, winter holidays.

username_0: A few months passed and I still don't know why it doesn't work, but I don't think it's something that concerns other users so I'm closing the issue. Have an awesome week, cheers!
Status: Issue closed
processing/p5.js-website
449283145
Title: createCapture() video demo cut off on mobile screens
Question: username_0:
#### Nature of issue?
- [ ] Found a bug
- [x] Existing feature enhancement
- [ ] New feature request

#### Most appropriate sub-area of p5.js?
- [ ] Color
- [ ] Core/Environment/Rendering
- [ ] Data
- [ ] Events
- [ ] Image
- [ ] IO
- [ ] Math
- [ ] Typography
- [ ] Utilities
- [ ] WebGL
- [x] Other (specify if possible) demo CSS?

#### Which platform were you using when you encountered this?
- [x] Mobile/Tablet (touch devices)
- [ ] Desktop/Laptop
- [ ] Others (specify if possible)

#### Details about the bug:
- p5.js version: just the demo itself at https://p5js.org/examples/dom-video-capture.html
- Web browser and version: Chrome Android 73.0.3683.90
- Operating System: Android Pie
- Steps to reproduce this: visit https://p5js.org/examples/dom-video-capture.html

#### Feature enhancement details:
I'm just wondering if there's a simple CSS fix so that the video is visible and not cut off, which makes it seem like there is an error:
![Screenshot_20190416-131856](https://user-images.githubusercontent.com/24359/56230451-41d78300-604a-11e9-9f80-e76d16feaf0f.png)
Requesting the desktop site shows the video is actually working.
![Screenshot_20190416-131022](https://user-images.githubusercontent.com/24359/56230458-4439dd00-604a-11e9-8556-9f3bfa222eed.png)
Thanks, all! Love p5js!
Answers:
username_1: Seems like CSS can't fix the canvas issue, because if the canvas width is changed, the sketch still remains at the original position, i.e. the objects drawn in the sketch. A similar 'cut off' issue where the canvas is too large => [#284](https://github.com/processing/p5.js-website/issues/284)

username_1: The only working solution that I can see right now:
1. In setup(), add
```
let maxWidth = 390*2;
let scale = 1;
if (window.innerWidth < maxWidth) {
  scale = window.innerWidth / maxWidth;
}
```
This ensures that the whole sketch scales down to fit the mobile screen, utilizing the maximum available mobile screen space (excluding left and right margins), but still displays at the original size on larger screens.
2. Multiply all dimensions by `scale`, such as
```
createCanvas(390*scale, 240*scale);
capture.size(320*scale, 240*scale);
...
```
To keep the example code simple and easy to understand, dimensions could be pre-multiplied by `scale` while rendering the html to the client. Is there a way to do that @lmccart ?

username_1: I think dimensions could be pre-multiplied by `scale` in [src/assets/js/examples.js](https://github.com/processing/p5.js-website/blob/master/src/assets/js/examples.js). Any ideas to do it in a concise way?
SpeciesFileGroup/taxonworks
1002879729
Title: As a developer/user I want a pattern that provides clean REST with options to expand/extend the response
Question: username_0:
It's not RESTy, but at minimum updating our responses to be clean by default, with the option to return related data/metadata via a single param, will start to clean/speed our responses. Param values should be snake_cased names of the AR relation, e.g. `role`, `origin_citation`, `citations`.

Proposal (found in other RESTful discussions):
* Use `&expand` when the set of params is nested (e.g. `.... roles: { }, ...`)
* Use `&embed` when you need values at the root of the object (e.g. `new_attribute: 1`)
Status: Issue closed
dflemstr/rust-native-wasm-loader
296214046
Title: Error parsing JSON
Question: username_0:
I tried the loader with a Rust project that uses some other libraries and got a failure on `yarn run webpack` saying:
```
ERROR in ./src/lib.rs
Module build failed: SyntaxError: Unexpected token { in JSON at position 558
    at JSON.parse (<anonymous>)
    at handleCargo (<path>\node_modules\rust-native-wasm-loader\dist\index.js:142:23)
```
I got it to work by substituting `"\n"` instead of `_os2.default.EOL` in the index.js file at line 142. I assume it is a Windows problem where the normal newline is expected to be `\r\n` but the incoming data seems to just use `\n`. I am using yarn 1.3.2 with rustc 1.25.0-nightly (2018-02-08) on Windows 10.
Answers:
username_1: Curious how cargo still uses `\n` for its output on Windows. Feel free to send a PR with the expected behavior!
Status: Issue closed
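A minimal illustration of the line-ending-agnostic split described above (not the loader's actual code):

```js
// Split tool output into lines whether the producer emits LF or CRLF,
// instead of splitting on the host OS's os.EOL.
function splitLines(output) {
  return output.split(/\r?\n/).filter((line) => line.length > 0);
}

// Each non-empty line (e.g. one JSON object per line) can then be parsed on its own.
splitLines('{"reason":"a"}\n{"reason":"b"}\r\n').forEach((line) => {
  console.log(JSON.parse(line).reason);
});
```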
ironmansoftware/universal-dashboard-documentation
510775069
Title: Stacked Bar Charts example needs revision
Question: username_0:
Unless I am doing something wrong on my end, I don't think the current stacked bar chart example would work as intended. As it is, it will add the free space on top of the total size, causing the chart to extend past what your total drive size would be.
https://github.com/ironmansoftware/universal-dashboard-documentation/blob/master/components/data-visualizations/charts.md
Perhaps altering the example to include the Used space instead of the total size could clear it up:
`Size = [Math]::Round($_.Size / 1GB, 2);`
to
`Used = [Math]::Round(($_.size - $_.FreeSpace) / 1GB, 2)`
Then
`New-UdChartDataset -DataProperty "Size" -Label "Size" -BackgroundColor "#80962F23" -HoverBackgroundColor "#80962F23"`
to
`New-UdChartDataset -DataProperty "Used" -Label "Used" -BackgroundColor "#80962F23" -HoverBackgroundColor "#80962F23"`
DFRobot/DFRobot_Display
777526428
Title: Doesn't work on Arduino
Question: username_0:
Firstly, thank you for the good library. I have tried it on ESP32 / 8266 D1 mini and it worked. But on Arduino, I couldn't get it to work. When I looked at the SPI signals with a logic analyzer, the MOSI signal was always logic 1.
robotframework/robotframework
909341541
Title: Skipped test case gets fail status if teardown fails
Question: username_0:
Robot Framework 4.0.3 (Python 3.6.8 on win32)
```
*** Settings ***
Documentation
...
Library           SeleniumLibrary
Suite Setup       Suite Setup
Suite Teardown    Suite Teardown
Force Tags        example    noncritical

*** Variables ***

*** Test cases ***
Test Cases
    Wait Until Page Contains Element    css:img[alt="Google"]
    Page Should Contain Element    css:img[alt="Gogle"]    #non correct selector intentionally

*** Keywords ***
Suite Setup
    Open Browser    https://google.com    chrome

Suite Teardown
    No existed keyword    #non correct keyword intentionally
    Close All Browsers
```
command line: robot --skiponfailure noncritical example.robot

Expected result, something like this:
![image](https://user-images.githubusercontent.com/48047242/120463652-cc9fa700-c3a4-11eb-813d-0cfad7269618.png)
Actual result:
![image](https://user-images.githubusercontent.com/48047242/120462632-c78e2800-c3a3-11eb-9a59-c7ae2c07ff41.png)
Answers:
username_0: It may also be worth revising the logic for when tests are PASS and the teardown fails.

username_1: What to do with skipped tests if a suite teardown fails probably wasn't thought about at all when the skip functionality was designed. Failing test teardowns don't change the skip status to fail, so it would be logical to handle suite teardowns the same way. Passed tests being marked failed if the suite teardown fails is by design and won't change.

username_0: Sorry, what is positive in that design? In my point of view, a suite teardown should not impact the state of the test cases. What a teardown does is perform some steps to clean up the testing environment and set it back to its default state, and maybe create some custom report. Also, the teardown is executed after all test cases. Examples of uncomfortable situations when all test cases are marked fail because the suite teardown failed:
- the suite's test cases are PASS and the situation has no relation to them (e.g. in web testing, the suite teardown failed for some outside reason such as the browser being closed during it)
- when we try to find out the reason for the failure, one of all the failed test cases is the real reason, but all test cases look failed - it can hide a bug.

Thanks. In any case, this is my vision of the behavior.

username_1: If we'd leave tests passed after a suite teardown failure, there would be no indication in the report about that failure. This could mean that serious problems aren't detected. We obviously could enhance the report so that a suite teardown failure is reported separately, but I don't think that's worth the effort, and all tests passing could nevertheless mean that the problem is ignored. For wider discussion see #2135. As discussed in the aforementioned issue, you can wrap the teardown keyword with "Run Keyword And Ignore Error" or with "Run Keyword And Warn On Failure". The latter was actually added partly with this use case in mind (#2294).

username_0: Yep, I think it is needed :) Maybe add some tail to the suite progress line ![image](https://user-images.githubusercontent.com/48047242/120779982-5039cf00-c530-11eb-95c9-a59f73e8e68b.png) or paint the border red ![image](https://user-images.githubusercontent.com/48047242/120780124-6cd60700-c530-11eb-8bb5-0c604965d206.png) It would help to keep the real passed test case statistics. Just thoughts out loud :)

username_1: Failing suite teardowns are such a special case that I don't think handling them differently than now is worth the effort. Most importantly, it would be hard to come up with a mechanism that would leave tests passed but would make it _explicit_ for users that there is a problem to be fixed. Anyway, discussing that doesn't belong in this issue. If you think that the behavior should be changed, submit a new issue or start a discussion on our Slack.
Status: Issue closed
gchq/CyberChef
1000063523
Title: Feature request: RSAVerify handle CRLF line endings in the 'Message' field
Question: username_0:
**Is your feature request related to a problem? Please describe.**
When attempting to verify an email DKIM signature, the message to be verified typically contains **CRLF** line endings. However, when uploaded to the "Message" textarea, the line endings are replaced with **LF**, causing a different hash to be computed, which naturally always results in 'Verification failed'. I'm guessing this is probably an HTML textarea issue, rather than a bug in the project.

**Describe the solution you'd like**
A proposed solution could be to add an additional toggle button "Message Input", where a user can choose how a Message should be interpreted/decoded, i.e. Hex, UTF8, Latin1, Base64.

**Alternate solution**
A textarea allowing escaped characters. Example: 'Hello\r\n'

**Additional context**
![rsaverify](https://user-images.githubusercontent.com/48497634/133893930-6b6ae7f1-3dfa-479c-a520-c63c2c24ae69.png)
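A small sketch of the kind of pre-processing this request implies (editor's illustration, not CyberChef code): re-normalising a pasted message to CRLF line endings before hashing/verification, since signed email bodies use CRLF:

```js
// Restore CRLF line endings on a message whose \r\n pairs were collapsed to \n
// by a textarea, so the bytes being verified match the bytes that were signed.
function toCRLF(message) {
  // Normalise any existing CRLF to LF first, then expand every LF to CRLF,
  // so already-correct input is not doubled into \r\r\n.
  return message.replace(/\r\n/g, "\n").replace(/\n/g, "\r\n");
}

console.log(JSON.stringify(toCRLF("line1\nline2\r\nline3")));
// -> "line1\r\nline2\r\nline3"
```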
sinricpro/esp8266-esp32-sdk
627832340
Title: Using the Blinds Example with a Stepper Motor
Question: username_0:
So I have a general understanding of how stepper motors work, with their alternating sets of magnets, and each alternation is a step. So by having an input of a number of steps it will turn that many steps/degrees. How would I use that idea to make the blinds turn only a percentage of completely open blinds? Open is 100% and that equates to 100 steps, but how would it be done in code to close the blinds to 70%?
Answers:
username_1: Do you know how many steps you need to open or close the blinds fully?

username_0: No I do not. Does that matter? I'm not in a situation where I can know the number of steps, but I'm just trying to get the code to work and will test it out later.

username_1: To be honest, I have not implemented blinds with motor control, so I am just guessing here.
1. Try to get the code working (without the SinricPro integration) using a library like this: https://github.com/Stan-Reifel/FlexyStepper
2. Open or Close needs different calculations, and you need to keep track of the current position in order to calculate the difference and move.

Check out these examples. They might help you get an idea: https://github.com/intive/patronage20-embedded/blob/e6a358c3bb3491f4fc8ed22fdd78ff7f2f7987cd/libraries/BlindsController/blinds.h

username_2: Hi Dragontamer! Your question is very interesting, and much more complicated than it looks at first glance. The first thing you need to know in advance is how many steps the motor has to take for the entire distance. This depends on the installation and must be calculated in advance. The sketch must know the current position at all times. A stepper motor does not know this. Here you can use limit switches (end stops) on one or both sides. As soon as one of these switches is triggered, you will know the exact position (100% or 0%). If the position is unknown (e.g. when booting the ESP) you have to move to one of these positions until the limit switch is triggered (homing).

To implement the motor control I would proceed as follows:
1) Defining variables
- Define a global variable ``currentPosition`` (this keeps the current position in number of steps)
- Define a global variable ``stepsToDo``
2) Implement a function ``doStep`` to control the motor. This function checks if the variable ``stepsToDo`` is not equal to 0. If ``stepsToDo`` is greater than 0, a step is made in the positive direction and ``stepsToDo`` and ``currentPosition`` are adjusted by 1. If ``stepsToDo`` is less than 0, a step is made in the negative direction and ``stepsToDo`` and ``currentPosition`` are adjusted by 1 the other way.
3) Call ``doStep`` inside the ``loop`` function. Every time ``loop`` is called and there are steps to do, one step will be done.
4) SinricPro ``onSetPosition`` and ``onAdjustPosition``: in these functions, you must convert the percentages of the variable ``position`` / ``positionDelta`` into the number of steps required and enter them in the variable ``stepsToDo``. The rest is then processed by the function ``doStep``.

Extension: As soon as the function ``doStep`` has processed all steps, you can send the event ``sendPositionEvent`` with the current position (as a percentage value) to the SinricPro server.

username_0: Ok, thanks everyone for your help. I'm waiting on my stepper motors now, so I'll have to test your suggestions out when I get them.
Status: Issue closed

username_3: Guys, I really need your help, I am stuck :( Has anybody done it?
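To make the control flow described above concrete, here is a minimal sketch in TypeScript (illustration only; on the device this logic would be C++ inside the Arduino sketch's `loop()`). `stepsForFullTravel` and the commented-out hardware calls are assumptions, and `stepsToDo` is moved toward zero as steps are executed:

```ts
let currentPosition = 0;          // current position, in steps
let stepsToDo = 0;                // signed number of steps still to move
const stepsForFullTravel = 1000;  // assumed: measured steps for 0% -> 100%

// Called from onSetPosition with a target percentage (0..100).
function setTargetPercent(percent: number): void {
  const target = Math.round((percent / 100) * stepsForFullTravel);
  stepsToDo = target - currentPosition;
}

// Called repeatedly from the main loop: at most one motor step per call.
function doStep(): void {
  if (stepsToDo > 0) {
    // stepMotorForward();   // hardware call, omitted in this sketch
    stepsToDo--;
    currentPosition++;
  } else if (stepsToDo < 0) {
    // stepMotorBackward();  // hardware call, omitted in this sketch
    stepsToDo++;
    currentPosition--;
  }
}
```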
Icinga/icinga-powershell-framework
930002058
Title: Typo in output of Get-IcingaAgentInstallation and Get-IcingaAgentVersion - mayor vs. major
Question: username_0:
---
We've spent some time evaluating why some if-conditions regarding the installed Icinga Agent didn't work. It came down to a typo in the version variable "Mayor", which wasn't obvious at first (it should be "Major", with a "j"). Maybe one can correct the output, knowing that already implemented evaluations out there would break. Also irritating was the variable name "Fixes", which we expected to be "Build". Thanks for checking!

## Expected Behavior
When calling Get-IcingaAgentInstallation or Get-IcingaAgentVersion, return the following attributes as in other software projects:
* major
* minor
* build

## Current Behavior
When calling Get-IcingaAgentInstallation or Get-IcingaAgentVersion, the following attributes are returned:
```
PS C:\WINDOWS\system32> Get-IcingaAgentVersion

Name                           Value
----                           -----
Snapshot
Minor                          12
Full                           2.12.2
Mayor                          2
Fixes                          2
```
```
PS C:\WINDOWS\system32> (Get-IcingaAgentInstallation).version

Name                           Value
----                           -----
Snapshot
Minor                          12
Full                           2.12.2
Mayor                          2
Fixes                          2
```

## Possible Solution
Correct the typo from "mayor" to "major".

## Steps to Reproduce (for bugs)
1. Run the above mentioned commands Get-IcingaAgentInstallation or Get-IcingaAgentVersion

## Context
We're using the version of the installed Icinga2 Agent to evaluate whether an update of the Icinga2 Agent is necessary or not.

## Your Environment
* PowerShell Version used (`$PSVersionTable.PSVersion`):
  * 5.1.14409.1018
  * 5.1.14393.3866
* Operating System and version (`Get-IcingaWindowsInformation Win32_OperatingSystem | Select-Object Version, BuildNumber, Caption`):
  * 6.3.9600 / 9600 / Microsoft Windows Server 2012 R2 Standard
  * 10.0.14393 / 14393 / Microsoft Windows Server 2016 Datacenter
Answers:
username_1: Thank you for the issue. Yes, this is something we can resolve and fix for all occurrences in our code.
Status: Issue closed
TomHarrop/fa-variants
178285558
Title: Finish analysis
Question: username_0:
Got to the point of needing to extract exon/CDS sequences from the vcf for pairwise alignments. The sequences might be extracted using `transcriptsBy` in `GenomicFeatures`, which is described in section 2.13 of the `GenomicRanges` HOWTOs pdf.
square/leakcanary
466337974
Title: Next steps for hprof parser
Question: username_0:
* Add support for a sequence of object ids with metadata (type, class id for instance, etc)
* Can be filtered down to become a sequence of instances etc
* Can be used to compute total retained object size
* Add node inspectors that replace leak trace inspectors
* Use those inspectors once to identify leaking instances (discard the other info; a bit wasteful but simpler API)
* Then find paths to gc roots for all these instances.
* Then deduplicate by building a trie. We get a subset of the leaking instances, the true unique leaks
* Then reapply the node inspectors to get the full leaktrace information
* The merging of all the node inspector details into the final leaktraces can become an extension point, we can have leaktrace transformers. That way we can apply other tricks like flattening (array list + object array into one node), different conflict resolution strategies, or adding labels based on multiple chained nodes.
* Next step is to change the HprofGraph to have fewer reads. The graph wrappers can hold on to the in-memory index nodes and read the data from disk only when accessing fields.
* Further down we separate the hprof parsing code from the android extensions (but all stays jvm compatible)
Answers:
username_0: After #1460 is merged:
- [ ] Handle UI for cases where no instance is found (used to be a failure) => "no need leaks found". Still goes to analysis
- [ ] Keep in memory list of leaktraces to figure out which ones are new.

username_0: Closing, this is now an experiment in LeakCanary.
Status: Issue closed
MvvmCross/MvvmCross
363724657
Title: Xamarin.Android: KeyListener not working on real devices Question: username_0: ## 🐛 Bug Report In the fragment's OnCreate method, I get my EditText from the layout and call SetOnKeyListener: ``` Aaa = view.FindViewById<EditText>(Resource.Id.aaaText); Aaa.SetOnKeyListener(new LocationKeyListener()); ``` **Layout declaration:** ``` <EditText android:id="@+id/aaaText" android:layout_width="0dp" android:layout_height="wrap_content" android:layout_weight="1" android:imeOptions="actionNext" style="@style/LocationEditTextStyle" android:hint="AAA" app:MvxBind="Text Aaa.Value"/> ``` **EditText styles:** ``` <style name="LoginEditTextTheme" parent="Theme.AppCompat"> <item name="colorControlNormal">@color/accentColor</item> <item name="colorControlActivated">@color/secondaryTextColor</item> </style> <style name="LocationEditTextStyle" parent="Widget.AppCompat.EditText"> <item name="android:textAllCaps">true</item> <item name="android:inputType">textCapCharacters</item> <item name="android:maxLength">3</item> <item name="android:gravity">center_horizontal</item> <item name="android:textSize">@dimen/location_text_size</item> <item name="android:theme">@style/LoginEditTextTheme</item> </style> ``` My key listener: ![uqpyo](https://user-images.githubusercontent.com/30652592/46036594-921ee000-c10e-11e8-89dd-62d1e0058960.png) The problem is that the LocationKeyListener.OnKey method is not called when I type in the EditText. ### Expected behavior When I press a key on the Android keyboard on a real device, the OnKey / OnKeyPressed event is raised. ### Reproduction steps You can clone a basic sample to reproduce the issue: https://github.com/username_0/KeyPressIssue ### Configuration Visual Studio 2017 15.8.5 **Version:** 6.2.0 **Platform:** - [ ] :iphone: iOS - [ ] :robot: Android - [ ] :checkered_flag: WPF - [ ] :earth_americas: UWP - [ ] :apple: MacOS - [ ] :tv: tvOS - [ ] :monkey: Xamarin.Forms Answers: username_1: I don't see why this has anything to do with MvvmCross. `EditText` is a native Android class, so we cannot do anything about your events not being hit. Status: Issue closed
jamiekaren/field-app-v2
525831073
Title: Research: Mobile Responsive Question: username_0: We need a mobile-friendly app, as the trainers will have access to the app only via a smartphone! - [ ] Can we make our current slider mobile friendly? - [ ] If not, what else can we use? Bootstrap? Pure CSS? Another library?
RocketSurgeonsGuild/Testing
1106288674
Title: Support for "meta" library Question: username_0: Turn `Rocket.Surgery.Testing` into a meta library via a source generator. Features: * Detect XUnit being referenced and add the xunit extensions * Detect Moq/NSubstitute/FakeItEasy and emit base test classes * This should only emit once, and not emit if the classes exist.
manga-download/hakuneko
621216512
Title: mangadiyari Question: username_0: **Version:** [6.1.7@009a3b](https://github.com/manga-download/hakuneko/commits/009a3b) https://mangadiyari.com/ <details><summary>What has been done?</summary> ☑ Searched for Similar Issues: https://git.io/hakuneko-issuesearch </details> Answers: username_1: Template: WordPress Madara Status: Issue closed
newbloodteam/TL77
561850325
Title: Lineups Question: username_0: Notification of current lineups. Answers: username_1: ROUND 1. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-7954656 Have fun in **TL77**! username_1: ROUND 2. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-8040152 username_1: ROUND 3. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-8123867 username_1: ROUND 4. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-8204860 **Reminder:** Next **Sunday, March 8**, **server time will go forward 1 hour** (2:00 EST will become 3:00 EDT), so please take that into account when scheduling your games for Sunday or later. For those of you living in the US and Canada there'll be no difference. username_0: ROUND 5. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-8272898 **Reminder** for our mates who play for **Leuko**: Extensions are not allowed in the "Kasparov" Section in round 5, all games _to be completed_ within **Tue March 17, 22:00**. Please schedule your games accordingly. _Other Sections are not affected._ username_1: ROUND 6. Lineups: https://github.com/orgs/newbloodteam/projects/7#column-8362212
Project-OSRM/osrm-backend
867579467
Title: OSM node-id's returned by "match" call are suddenly different ?! Question: username_0: Hi. I'm using your map-matching service for an OSM road network that contains all of France. I downloaded the snapshot of the 5th of April of this year. I created the necessary files by following the tutorial: ``` mkdir car mv 210405_france-latest.osm.pbf ./car/ sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend osrm-extract -p /opt/car.lua /data/car/210405_france-latest.osm.pbf sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend osrm-partition /data/car/210405_france-latest.osrm sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend osrm-customize /data/car/210405_france-latest.osrm sudo docker run -t -i -p 5000:5000 -v "${PWD}/car/:/data/car" osrm/osrm-backend osrm-routed --algorithm mld /data/car/210405_france-latest.osrm ``` I saved all files into a zip for later use. In my project (that I run on an Amazon EMR cluster) I have a bootstrap script which does: ``` aws s3 cp s3://retailsonar-spark/crowd-monitoring/FR/osrm_fr.zip . sudo unzip osrm_fr.zip sudo service docker start sudo docker pull osrm/osrm-backend sudo docker run -d -p 5000:5000 -v "${PWD}/car/:/data/car" osrm/osrm-backend osrm-routed --algorithm mld /data/car/210405_france-latest.osrm ``` In my project, this worked fine on the 21st of April. However, today I see something really weird. The match calls return data and state that they are successful, but the returned node-id's seem scrambled up. Some of them just don't exist, others exist but are not part of a road. How could it be that the returned id's don't belong to roads or just don't exist? An example: ``` { "code": "Ok", "matchings": [{ "confidence": 0.904254, "geometry": { "coordinates": [ [2.386255, 48.766522], [2.386255, 48.766522], [2.386051, 48.766547], [2.385695, 48.766591], [2.385363, 48.76664], [2.385225, 48.766671], [2.385239, 48.76669], [2.385243, 48.766712], [2.385238, 48.766733], [2.385223, 48.766752], [2.3852, 48.766768], [2.385171, 48.766778], [2.385137, 48.766782], [2.385103, 48.766779], [2.385072, 48.766769], [2.385048, 48.766753], [2.385032, 48.766732], [2.385027, 48.76671], [2.384879, 48.766706], [2.384801, 48.766704], [2.383954, 48.766801], [2.383828, 48.766816], [2.383627, 48.766841], [2.382811, 48.766942], [2.382303, 48.767005], [2.382197, 48.767018], [2.381409, 48.767116], [2.381185, 48.767144], [Truncated] "name": "<NAME>", "distance": 13.750966, "hint": "B6bNgP___38YAAAAGAAAAAUAAAAAAAAAeVcVQgAAAAAmaudAAAAAABgAAAAYAAAABQAAAAAAAADGGwAAeLojANpl6ALhuSMAI2boAgEAnwA2-8d8" }, { "alternatives_count": 60, "waypoint_index": 7, "matchings_index": 0, "location": [2.341575, 48.78503], "name": "<NAME>", "distance": 31.485708, "hint": "CqbNgJumzQANAAAAFAAAAAAAAAAAAAAAGuqfQSEh8EEAAAAAAAAAAA0AAAAUAAAAAAAAAAAAAADGGwAAx7ojAIZm6AItuSMA2GboAgAADwE2-8d8" }] } ``` This contains the node-id: 10952294855 This node doesn't exist in my OSM source file, and doesn't exist right now on OSM: https://www.openstreetmap.org/node/10952294855. Has the code somehow been updated in the repo over the last few days ? Because a few days ago this worked fine... I am completely stuck by this weird issue, and I'm working on strict commercial deadlines, so any help is greatly appreciated !! 
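A quick way to confirm that the ids coming back from /match really are bogus is to cross-check a few of them against the public OSM API. A minimal sketch (not part of the original report; it assumes the `requests` package and the api.openstreetmap.org v0.6 endpoint, and uses placeholder coordinates taken from the trace above):

```python
import requests

OSRM = "http://127.0.0.1:5000"
OSM_API = "https://api.openstreetmap.org/api/0.6/node/{}"

# Placeholder lon,lat pairs in the same area as the trace in the report.
coords = "2.386255,48.766522;2.385225,48.766671;2.381185,48.767144"
resp = requests.get(
    f"{OSRM}/match/v1/car/{coords}",
    params={"annotations": "nodes", "overview": "false"},
).json()

# Collect every node id reported in the match annotations.
node_ids = {
    node_id
    for matching in resp.get("matchings", [])
    for leg in matching["legs"]
    for node_id in leg["annotation"]["nodes"]
}

# Spot-check a handful of ids against OSM itself; 404/410 means the id does not exist (or was deleted).
for node_id in sorted(node_ids)[:5]:
    status = requests.get(OSM_API.format(node_id)).status_code
    print(node_id, "exists" if status == 200 else f"missing (HTTP {status})")
```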
Answers: username_0: To reproduce: this is the url I called: ``` curl "http://127.0.0.1:5000/match/v1/car/2.387044,48.76641699998613;2.3870860000000005,48.76642299998612;2.3686750000000005,48.76873499998612;2.362188000000001,48.77108899998613;2.3396430000000006,48.77412999998613;2.3398250000000007,48.77470599998614;2.341345,48.784930999986166;2.3411651111166467,48.785111666655354?steps=true&annotations=true&overview=full&radiuses=50;50;48.0;50;50;50;50;50&geometries=geojson&timestamps=1591044822;1591044823;1591044978;1591045057;1591045289;1591045294;1591045566;1591045598&gaps=ignore" ``` username_1: Sounds like yet another case where the "real" OSM ids overflow the internal 33 bit size. See #6016 for context and pointer to a fix. username_0: What I did to fix it (for good I hope). On my pre-calc machine: pull the 5.24.0 docker image, and recalculate the necessary files + move them to S3. On my cluster, in my bootstrap script I also pull the 5.24.0 image. That seems to work again. Following this flow, I guess I'm safe ? Or #6016 is not fixed yet in 5.24.0 ? username_2: The fix is not included in 5.24.0. To get the fix, you need to either use the `latest` docker image, or wait for the 5.25 release, which is currently not planned. username_0: Hmm, but I ran into problems using the latest docker image. Am I correct that you cannot preprocess an OSM file with version X and then run the backend server with version Y ? username_2: @username_0 It would help if you said exactly what those problems were. But in general yes, data is not compatible between OSRM versions - in fact, the changelog for https://github.com/Project-OSRM/osrm-backend/issues/6016 _specifically_ calls out that it breaks the data format. You should re-process the data using the same docker image you intend to run it with. username_0: ok, might be worthwhile to mention this in the tutorial section which describes setting up a server ? username_2: When you mix up versions. If you're running the latest, _unreleased_ code, then you're on your own. username_0: Ok, fair enough, didn't catch that message as I deployed it in a EMR cluster. Thanks for the responses guys. Status: Issue closed
username_0: Reopening this. I preprocessed osm files with v5.24.0. Then I set up my service as such: ``` sudo docker pull osrm/osrm-backend:v5.24.0 sudo docker run -d -p 5000:5000 -v "${PWD}/car/:/data/car" osrm/osrm-backend:v5.24.0 osrm-routed --algorithm mld /data/car/210405_france-latest.osrm sudo docker run -d -p 5001:5000 -v "${PWD}/bicycle/:/data/bicycle" osrm/osrm-backend:v5.24.0 osrm-routed --algorithm mld /data/bicycle/210405_france-latest.osrm ``` So I fixed the version to make sure I didn't have weird node-id's anymore in my response. Somehow, my responses contain them again.
For example, a route in France: ``` curl "http://127.0.0.1:5000/match/v1/car/2.4293275000000003,48.917223349986465;2.4293275000000003,48.917223349986465;2.33948825,48.92566325998648;2.2880347599999995,48.936142819986486;2.2736557799999995,48.93377962998649;2.25333033,48.94202875998652;2.2527895299999994,48.94253954998651;2.251477930000001,48.94376162998651;2.2488684800000005,48.94635733998651;2.2453628300000004,48.94441776998653;2.2390706999999996,48.94491346998652;2.2389876512443387,48.944643695404494?steps=true&annotations=true&overview=full&radiuses=50;20.0;20.0;20.0;20.0;20.0;20.0;20.0;20.0;20.0;20.0;50&geometries=geojson&timestamps=1607299232;1607299232;1607299509;1607299661;1607299703;1607299897;1607299905;1607299919;1607300005;1607300052;1607300198;1607300244&gaps=ignore" {"code":"Ok","matchings":[{"confidence":0.11462,"geometry":{"coordinates":[[2.429407,48.917239],[2.42873,48.918681],[2.428527,48.919146],[2.428489,48.919234],[2.428218,48.919984],[2.427974,48.920793],[2.427752,48.921612],[2.427548,48.922616],[2.42747,48.922926],[2.427333,48.923279],[2.427181,48.923593],[2.426933,48.923961],[2.426671,48.924267],[2.426269,48.924622],[2.425799,48.92497],[2.425301,48.925254],[2.424715,48.925528],[2.424168,48.925728],[2.424064,48.925761],[2.422168,48.926354],[2.421433,48.926584],[2.420896,48.926776],[2.419748,48.927238],[2.419341,48.927429],[2.418959,48.927603],[2.418151,48.927965],[2.417831,48.928093],[2.417281,48.928261],[2.416864,48.928362],[2.416417,48.928457],[2.416011,48.928512],[2.415594,48.928549],[2.415133,48.928577],[2.414725,48.928577],[2.414364,48.928567],[2.414071,48.928543],[2.413614,48.928492],[2.412891,48.928379],[2.410164,48.927927],[2.409604,48.927826],[2.408496,48.927628],[2.407414,48.927408],[2.404288,48.926766],[2.403249,48.926557],[2.402209,48.926414],[2.399656,48.92606],[2.398707,48.925946],[2.398118,48.925863],[2.397825,48.925827],[2.397533,48.92579],[2.396977,48.925717],[2.396398,48.925658],[2.395869,48.925608],[2.395317,48.925546],[2.394768,48.925474],[2.394222,48.925393],[2.39368,48.925302],[2.38792,48.924324],[2.385312,48.923908],[2.385189,48.923879],[2.383231,48.92352],[2.382354,48.923357],[2.380472,48.922844],[2.380298,48.922795],[2.376217,48.921606],[2.376134,48.921581],[2.37574,48.921495],[2.37515,48.921361],[2.37467,48.921287],[2.374289,48.921227],[2.373809,48.921174],[2.373248,48.921125],[2.372765,48.9211],[2.372213,48.921098],[2.371824,48.9211],[2.371215,48.921128],[2.37074,48.921167],[2.370241,48.921222],[2.369732,48.921301],[2.36916,48.921394],[2.368616,48.921502],[2.366396,48.921909],[2.366109,48.921947],[2.36526,48.922011],[2.364803,48.922042],[2.364215,48.922068],[2.363216,48.922091],[2.362226,48.922095],[2.361361,48.922107],[2.360573,48.922145],[2.358593,48.922302],[2.357997,48.922342],[2.356456,48.922445],[2.354395,48.922618],[2.351766,48.922776],[2.350869,48.922803],[2.349617,48.922802],[2.346776,48.922786],[2.346009,48.922813],[2.345673,48.922836],[2.34537,48.922871],[2.345045,48.922922],[2.344867,48.922951],[2.34437,48.923053],[2.343835,48.923205],[2.343365,48.923388],[2.342643,48.923739],[2.342233,48.923967],[2.341764,48.924263],[2.340142,48.925302],[2.339494,48.925668],[2.339473,48.92568],[2.33885,48.925972],[2.338595,48.926086],[2.33842,48.926159],[2.338259,48.926223],[2.33793,48.926341],[2.337731,48.926405],[2.337278,48.926532],[2.336677,48.926683],[2.335613,48.926943],[2.335205,48.927039],[2.334937,48.9271],[2.33462,48.927168],[2.3341,48.927294],[2.333739,48.927385],[2.33354,48.927438],[2.333379,48.927488],[2.333223,48.927539],[2.333042,4
8.927602],[2.332812,48.927685],[2.332702,48.927729],[2.332634,48.927761],[2.332149,48.927998],[2.332009,48.928079],[2.331729,48.928254],[2.331515,48.928401],[2.331397,48.928476],[2.331326,48.928519],[2.331191,48.928597],[2.330704,48.92882],[2.330111,48.929058],[2.329565,48.929233],[2.328685,48.929487],[2.328057,48.929641],[2.326246,48.930075],[2.325698,48.930218],[2.325286,48.930346],[2.324858,48.930493],[2.324326,48.930712],[2.324143,48.930795],[2.32391,48.93091],[2.323707,48.93102],[2.323595,48.931089],[2.323335,48.931249],[2.323219,48.931327],[2.323105,48.931404],[2.322963,48.931508],[2.322702,48.931718],[2.322526,48.931883],[2.322371,48.932041],[2.322171,48.932272],[2.321661,48.932926],[2.320643,48.934359],[2.32024,48.934923],[2.319992,48.935284],[2.31973,48.935618],[2.31941,48.935978],[2.319062,48.936306],[2.318676,48.936589],[2.31831,48.936825],[2.31788,48.937057],[2.31742,48.937286],[2.316686,48.937589],[2.316128,48.937771],[2.315585,48.937926],[2.315105,48.938064],[2.314278,48.938228],[2.313749,48.938321],[2.313616,48.938349],[2.313304,48.938383],[2.312814,48.938443],[2.311128,48.938598],[2.30979,48.938728],[2.306808,48.939012],[2.306596,48.938996],[2.306217,48.939026],[2.305524,48.939075],[2.30477,48.939111],[2.304458,48.939119],[2.304116,48.939111],[2.303731,48.939095],[2.303167,48.939056],[2.302681,48.939004],[2.302052,48.938911],[2.30152,48.938818],[2.301048,48.938721],[2.300035,48.938517],[2.29917,48.938342],[2.298074,48.938127],[2.297116,48.937938],[2.29616,48.937744],[2.294401,48.937392],[2.292074,48.936923],[2.292021,48.936912],[2.291944,48.936897],[2.291862,48.93688],[2.290208,48.936553],[2.290053,48.936554],[2.288037,48.93614],[2.287071,48.935941],[2.284449,48.935423],[2.284238,48.935355],[2.280936,48.934698],[2.280508,48.934606],[2.278148,48.93413],[2.276648,48.933844],[2.276172,48.933767],[2.275701,48.93371],[2.275243,48.93368],[2.274807,48.933677],[2.274279,48.933701],[2.274234,48.933705],[2.273654,48.933773],[2.273508,48.93379],[2.27307,48.933867],[2.268642,48.93473],[2.268498,48.934818],[2.26725,48.935157],[2.265909,48.935475],[2.264479,48.935785],[2.263912,48.935889],[2.263344,48.935945],[2.262766,48.935965],[2.262442,48.935957],[2.262278,48.935938],[2.261853,48.935915],[2.261495,48.935866],[2.261176,48.9358],[2.261013,48.935771],[2.260998,48.935768],[2.260832,48.93576],[2.26055,48.935952],[2.260419,48.936036],[2.25936,48.936711],[2.259104,48.936886],[2.259025,48.936945],[2.258946,48.937004],[2.258867,48.937063],[2.258513,48.937341],[2.258482,48.937366],[2.258357,48.937463],[2.258254,48.937545],[2.258186,48.937599],[2.257989,48.93776],[2.257082,48.938497],[2.256326,48.93913],[2.256224,48.939221],[2.256176,48.939262],[2.255262,48.940128],[2.253752,48.941558],[2.253564,48.941751],[2.253308,48.941994],[2.253291,48.942008],[2.253277,48.942019],[2.253185,48.942116],[2.253102,48.94219],[2.253037,48.942252],[2.252791,48.942489],[2.252755,48.942524],[2.252451,48.942821],[2.252323,48.94295],[2.252264,48.943009],[2.252115,48.943113],[2.251472,48.94376],[2.251022,48.944213],[2.250951,48.944284],[2.250867,48.944372],[2.249764,48.945476],[2.249705,48.945538],[2.249641,48.945604],[2.2496,48.945646],[2.249359,48.945892],[2.249279,48.945973],[2.248898,48.946362],[2.248857,48.946461],[2.248839,48.946478],[2.248777,48.946536],[2.248711,48.946541],[2.248645,48.946545],[2.248251,48.946341],[2.248138,48.946247],[2.248019,48.946195],[2.24788,48.946128],[2.247751,48.946051],[2.247473,48.945861],[2.247239,48.945695],[2.247122,48.945611],[2.24665,48.945291],[2.246113,48.944929],[2.246019,48.
944868],[2.245916,48.944798],[2.245359,48.94442],[2.245185,48.944302],[2.245081,48.944227],[2.244955,48.944133],[2.24469,48.943935],[2.243077,48.942724],[2.242931,48.942815],[2.242513,48.943108],[2.242442,48.943161],[2.241626,48.943718],[2.241251,48.944006],[2.241079,48.944141],[2.240751,48.9444],[2.24057,48.944541],[2.239893,48.945048],[2.239771,48.944993],[2.239259,48.944745],[2.238997,48.944618],[2.23901,48.944624]],"type":"LineString"},"legs":[{"annotation":{"metadata":{"datasource_names":["lua profile"]},"nodes":[58878148,4305302905,2592874712,948031525,6482880512,7163624483,29439072,14719535,5769400607,7761037788,6933189561,5897766013,5278541683,1971343079,3951074767,973159326,8342769578,2114090815,6247394183,4368434885,4496617086,3427259774,7359767,1072979941,1879694588,1051455996,4027824633,5811619,3679883,7876789393,2149323589,6981907442,23246732,5769564447,2884780175,5373880294,5737355847,4295887266,20684729,2049999714,2448777102,3795758065,2157825998,4499892739,5906040017,2684584552,6450340527,7646486140,3643398312,5309735399,6949833971,2441354005,7769883769,8384921630,4127198721,2096230343,6411358709,6559042708,41374618,5771731990,683806090,5503041844,2684412058,7466584896,1214287727,20687308,6458962496,7637465500,3818732238,1308651597,3817764027,637548582,1908882009,5249408298,6452794598,1175481428,2152655475,6919671444,539456575,3459835721,4605352979,1036168576,1075805768,2688482448,7013106942,2298482185,4045308477,8446503722,1554497559,71536080,7030191658,504917420,8539731940,403685156,8556638152,515985,7617239680,4296260127,6194205072,6716851071,4563725959,4701925717,6451660510,5442111236,4176550157,7014974338,3506438593,2293913568,5859801225,630287702,4273721568],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[20,19.9,20.3,19.9,19.9,20.1,20.1,20.6,20.3,20.4,20.3,19.5,19.7,19.9,20.1,20.2,19.9,21.1,19.9,19.8,20.3,20.1,20.3,15.4,15.5,15.2,15.8,15.5,15.6,15.9,15.4,15.4,15.7,15.5,15.4,15.4,15.5,20,20.2,20,20.2,19.9,19.9,19.9,20.1,20.1,20,19.8,19.8,19.7,20.4,19.5,20.5,20.5,20.5,20.4,20,20,19.1,20.1,20.2,20.1,19.8,20,22.2,20.2,19.8,20,20.5,19.8,19.7,19.7,20.2,20.3,20.3,20.6,20.5,20.1,19.6,19.8,20.1,19.5,20.1,19.8,19.6,19.7,20.1,19.8,19.9,20,19.9,19.9,20,20.1,19.9,19.9,20,20,20.6,20.4,15.3,14.9,15.9,15.8,15.4,15.6,15.7,15.3,15.6,16],"weight":[8.4,2.7,0.5,4.3,4.6,4.6,5.6,1.7,2,1.8,2.2,2,2.5,2.6,2.4,2.6,2.3,0.4,7.7,3,2.2,4.9,1.8,2.2,4.6,1.8,2.8,2.1,2.2,1.9,2,2.2,1.9,1.7,1.4,2.2,3.5,10.3,2.1,4.2,4.1,12,4,3.9,9.5,3.5,2.2,1.1,1.1,2.1,2.1,2,2,2,2,2,21.7,9.8,0.5,7.4,3.3,7.4,0.7,16.3,0.3,1.5,2.3,1.8,1.4,1.8,2.1,1.8,2,1.4,2.2,1.7,1.8,1.9,2.2,2.1,8.4,1.1,3.1,1.7,2.2,3.7,3.6,3.2,2.9,7.3,2.2,5.7,7.6,9.6,3.3,4.6,10.4,2.8,1.2,1.1,1.6,0.9,2.4,2.7,2.6,4.2,2.5,3.1,10.6,3.9],"duration":[8.4,2.7,0.5,4.3,4.6,4.6,5.6,1.7,2,1.8,2.2,2,2.5,2.6,2.4,2.6,2.3,0.4,7.7,3,2.2,4.9,1.8,2.2,4.6,1.8,2.8,2.1,2.2,1.9,2,2.2,1.9,1.7,1.4,2.2,3.5,10.3,2.1,4.2,4.1,12,4,3.9,9.5,3.5,2.2,1.1,1.1,2.1,2.1,2,2,2,2,2,21.7,9.8,0.5,7.4,3.3,7.4,0.7,16.3,0.3,1.5,2.3,1.8,1.4,1.8,2.1,1.8,2,1.4,2.2,1.7,1.8,1.9,2.2,2.1,8.4,1.1,3.1,1.7,2.2,3.7,3.6,3.2,2.9,7.3,2.2,5.7,7.6,9.6,3.3,4.6,10.4,2.8,1.2,1.1,1.6,0.9,2.4,2.7,2.6,4.2,2.5,3.1,10.6,3.9],"distance":[167.848004,53.806367,10.17434,85.738962,91.732237,92.528024,112.662091,34.948209,40.519419,36.649167,44.76457,39.051576,49.215923,51.749464,48.190816,52.562065,45.747453,8.44019,153.456064,59.4937
81,44.677459,98.380309,36.550431,33.968027,71.46759,27.37743,44.323317,32.477442,34.331427,30.292868,30.749193,33.831511,29.814846,26.403733,21.576925,33.873931,54.3082,205.522672,42.436826,83.910707,82.769276,239.341513,79.407317,77.648282,190.680277,70.501928,44.022692,21.783498,21.732471,41.43565,42.819076,39.057235,40.92534,40.912407,40.906246,40.882297,434.775214,196.134136,9.550337,148.562752,66.608713,148.909093,13.835314,326.264236,6.673073,30.343019,45.623902,36.033994,28.63442,35.573312,41.362149,35.41014,40.344387,28.431537,44.618515,34.986035,36.979624,38.22458,43.06602,41.533856,168.447045,21.397087,62.456376,33.577436,43.07126,73.05683,72.355609,63.232682,57.745901,145.757933,43.785138,113.20463,151.851141,192.940392,65.625163,91.501303,207.639458,56.135859,24.689125,22.484023,24.420241,13.402851,38.053177,42.598284,39.927025,65.638355,39.254813,47.526142,165.548295,62.448314]},"steps":[{"intersections":[{"classes":["motorway"],"location":[2.429407,48.917239],"bearings":[343],"entry":[true],"out":0},{"classes":["tunnel","motorway"],"out":1,"location":[2.42873,48.918681],"bearings":[165,345],"entry":[false,true],"in":0},{"classes":["tunnel","motorway"],"out":2,"location":[2.428218,48.919984],"bearings":[158,167,349],"entry":[false,false,true],"in":1},{"classes":["motorway"],"out":1,"location":[2.426269,48.924622],"bearings":[150,315],"entry":[false,true],"in":0},{"classes":["tunnel","motorway"],"out":1,"location":[2.424168,48.925728],"bearings":[120,300],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.421433,48.926584],"bearings":[120,300],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.418959,48.927603],"bearings":[120,300,315],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.412891,48.928379],"bearings":[70,77,256],"entry":[false,false,true],"in":1},{"classes":["motorway"],"out":1,"location":[2.410164,48.927927],"bearings":[75,255,270],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.398707,48.925946],"bearings":[60,75,255],"entry":[false,false,true],"in":1},{"classes":["tunnel","motorway"],"out":1,"location":[2.397533,48.92579],"bearings":[75,255],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.396398,48.925658],"bearings":[75,255],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.380298,48.922795],"bearings":[60,240,255],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.366396,48.921909],"bearings":[105,285,300],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.354395,48.922618],"bearings":[75,90,270],"entry":[false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.429407,48.917239],[2.42873,48.918681],[2.428527,48.919146],[2.428489,48.919234],[2.428218,48.919984],[2.427974,48.920793],[2.427752,48.921612],[2.427548,48.922616],[2.42747,48.922926],[2.427333,48.923279],[2.427181,48.923593],[2.426933,48.923961],[2.426671,48.924267],[2.426269,48.924622],[2.425799,48.92497],[2.425301,48.925254],[2.424715,48.925528],[2.424168,48.925728],[2.424064,48.925761],[2.422168,48.926354],[2.421433,48.926584],[2.420896,48.926776],[2.419748,48.927238],[2.419341,48.927429],[2.418959,48.927603],[2.418151,48.927965],[2.417831,48.928093],[2.417281,48.928261],[2.416864,48.928362],[2.416417,48.928457],[2.416011,48.928512],[2.415594,48.928549],[2.415133,48.928577],[2.414725,48.928577],[2.414364,48.928567],[2.414071,48.928543],[2.413614,48.928492],[2.41
2891,48.928379],[2.410164,48.927927],[2.409604,48.927826],[2.408496,48.927628],[2.407414,48.927408],[2.404288,48.926766],[2.403249,48.926557],[2.402209,48.926414],[2.399656,48.92606],[2.398707,48.925946],[2.398118,48.925863],[2.397825,48.925827],[2.397533,48.92579],[2.396977,48.925717],[2.396398,48.925658],[2.395869,48.925608],[2.395317,48.925546],[2.394768,48.925474],[2.394222,48.925393],[2.39368,48.925302],[2.38792,48.924324],[2.385312,48.923908],[2.385189,48.923879],[2.383231,48.92352],[2.382354,48.923357],[2.380472,48.922844],[2.380298,48.922795],[2.376217,48.921606],[2.376134,48.921581],[2.37574,48.921495],[2.37515,48.921361],[2.37467,48.921287],[2.374289,48.921227],[2.373809,48.921174],[2.373248,48.921125],[2.372765,48.9211],[2.372213,48.921098],[2.371824,48.9211],[2.371215,48.921128],[2.37074,48.921167],[2.370241,48.921222],[2.369732,48.921301],[2.36916,48.921394],[2.368616,48.921502],[2.366396,48.921909],[2.366109,48.921947],[2.36526,48.922011],[2.364803,48.922042],[2.364215,48.922068],[2.363216,48.922091],[2.362226,48.922095],[2.361361,48.922107],[2.360573,48.922145],[2.358593,48.922302],[2.357997,48.922342],[2.356456,48.922445],[2.354395,48.922618],[2.351766,48.922776],[2.350869,48.922803],[2.349617,48.922802],[2.346776,48.922786],[2.346009,48.922813],[2.345673,48.922836],[2.34537,48.922871],[2.345045,48.922922],[2.344867,48.922951],[2.34437,48.923053],[2.343835,48.923205],[2.343365,48.923388],[2.342643,48.923739],[2.342233,48.923967],[2.341764,48.924263],[2.340142,48.925302],[2.339494,48.925668]],"type":"LineString"},"mode":"driving","duration":393.1,"maneuver":{"bearing_after":343,"type":"depart","modifier":"left","bearing_before":0,"location":[2.429407,48.917239]},"ref":"A 86","weight":393.1,"distance":7572.6,"name":""},{"intersections":[{"in":0,"entry":[true],"bearings":[131],"location":[2.339494,48.925668]}],"driving_side":"right","geometry":{"coordinates":[[2.339494,48.925668],[2.339494,48.925668]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.339494,48.925668],"bearing_before":311,"type":"arrive"},"ref":"A 86","weight":0,"distance":0,"name":""}],"distance":7572.6,"duration":393.1,"summary":"A 86, Viaduc du Canal de Saint-Denis","weight":393.1},{"annotation":{"metadata":{"datasource_names":["lua 
profile"]},"nodes":[630287702,4273721568,1752957152,8504959872,2730337259,8419993345,5660135909,8103607872,5171314800,4408934371,2025901808,2585460792,253237694,8580925125,7125035242,8571916171,2350962858,7857484913,6716608850,3356207273,1677055060,5132970538,8553899031,2043892458,455868687,2411728831,8175586216,4087789012,143072168,5460674583,1205602271,2331414702,4663796573,1292697628,82749238,646332430,6425302865,3212643240,123650,323100679,201573894,8512387722,8301746367,619553556,1627833693,4833384564,8539168069,4260703906,3777708147,8435102997,4722525081,165498478,161529859,5239390236,1888849977,4375730177,802949870,5972823464,6482831360,4303599670,34533593,138183524,7536382464,1322922881,2645845775,548516353,3768190976,4614623240,2994114651,1884095360,813916844,5425594897,4750054833,910175235,1323987888,7973797346,7969174026,6509590738,7348413973,2001969024,2375025170,4003938305,4408738057,1667498116,8007877122,7611908837,4801839112,7425820676,4832107206,6261708809,2775687213,3933491219,2371965172,2938124839,5310457095,2638451331,4988364811,7866990631,5352603768,3255667394],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[10.2,15.5,16.1,15.1,15.3,15.2,16.2,15.6,15.7,15.7,15.8,15.9,15.2,15.6,15.7,15.7,16.3,15.9,15,16,15.7,15.3,15.8,15.1,15.7,15.1,15,14.1,18.8,19.7,20.3,20.2,20.1,20.4,20.1,19.6,19.6,19.6,19.9,20.3,19.4,19.2,18.7,20,20.2,19.9,19.4,20.1,20.4,20.9,19.7,19.9,20,19.8,20,19.9,20.2,20.2,20.1,19.7,20.3,20.1,19.8,19.8,19.7,20.1,19.7,20,20.4,19.3,20.2,20.1,20.2,20,19.5,19.9,20.4,19.7,20.7,19.2,20.1,19.7,20,19.6,20.1,20.1,19.8,20,19.9,19.8,19.8,20.1,20,20.3,19.6,20.9,20,18.9,20],"weight":[0.2,3.6,1.4,1,0.9,1.8,1,2.3,3,5.3,2,1.3,1.6,2.6,1.8,1,0.8,0.8,1,1.2,0.6,0.4,2.8,0.9,1.8,1.5,0.8,0.5,0.7,2.2,2.5,2.2,3.5,2.4,7,2.2,1.7,1.8,2.3,0.8,1.1,1,0.6,1.3,0.6,0.6,0.8,1.5,1.1,1,1.5,4.1,8.8,3.5,2.2,2.1,2.3,2.2,2.1,1.9,2,2.1,3.2,2.3,2.2,1.9,3.2,2,0.5,1.2,1.8,6.2,4.9,11,0.8,1.4,2.5,2.8,1.1,1.3,1.4,2.1,1.8,2.4,2,1.8,3.9,3.3,4.2,3.7,3.7,6.7,8.9,0.2,0.3,0.3,6.3,0.6,7.7],"duration":[0.2,3.6,1.4,1,0.9,1.8,1,2.3,3,5.3,2,1.3,1.6,2.6,1.8,1,0.8,0.8,1,1.2,0.6,0.4,2.8,0.9,1.8,1.5,0.8,0.5,0.7,2.2,2.5,2.2,3.5,2.4,7,2.2,1.7,1.8,2.3,0.8,1.1,1,0.6,1.3,0.6,0.6,0.8,1.5,1.1,1,1.5,4.1,8.8,3.5,2.2,2.1,2.3,2.2,2.1,1.9,2,2.1,3.2,2.3,2.2,1.9,3.2,2,0.5,1.2,1.8,6.2,4.9,11,0.8,1.4,2.5,2.8,1.1,1.3,1.4,2.1,1.8,2.4,2,1.8,3.9,3.3,4.2,3.7,3.7,6.7,8.9,0.2,0.3,0.3,6.3,0.6,7.7],"distance":[2.033888,55.925664,22.539944,15.148647,13.751556,27.392018,16.191434,35.992417,47.0219,82.958836,31.670101,20.726795,24.369069,40.50225,28.25599,15.691778,13.013596,12.733364,14.968429,19.176149,9.411036,6.112396,44.170464,13.632137,28.240699,22.624904,11.997653,7.056469,13.137325,43.378543,50.779456,44.393547,70.237784,48.983079,140.865757,43.087309,33.302698,35.291211,45.875673,16.249398,21.295293,19.228349,11.219562,26.031798,12.129087,11.94742,15.539372,30.154581,22.409824,20.907272,29.558679,81.732002,175.889739,69.298761,44.052095,41.791777,46.367893,44.468613,42.263442,37.471714,40.656065,42.170216,63.338419,45.517884,43.2567,38.281599,63.115364,40.009774,10.204029,23.106679,36.4167,124.381516,98.818996,220.144591,15.590694,27.890192,50.923259,55.232733,22.812137,25.00242,28.184416,41.43367,35.975281,47.104789,40.221102,36.132978,77.410767,66.127515,83.569896,73.081863,73.104399,134.348167,177.840437,4.061046,5.868058,6.2825,126.201941,11.325455,154.327379]},"steps
":[{"intersections":[{"classes":["motorway"],"location":[2.339494,48.925668],"bearings":[311],"entry":[true],"out":0},{"classes":["motorway"],"out":2,"location":[2.339473,48.92568],"bearings":[120,135,300],"entry":[false,false,true],"in":1},{"classes":["motorway"],"out":1,"location":[2.332634,48.927761],"bearings":[120,300,330],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.328685,48.929487],"bearings":[105,120,285],"entry":[false,false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.323219,48.931327],"bearings":[133,315,322],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.31742,48.937286],"bearings":[120,135,300],"entry":[false,false,true],"in":0},{"classes":["tunnel","motorway"],"out":1,"location":[2.314278,48.938228],"bearings":[105,285],"entry":[false,true],"in":0},{"classes":["tunnel","motorway"],"out":1,"location":[2.313749,48.938321],"bearings":[105,287,288],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.313616,48.938349],"bearings":[105,285],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.306808,48.939012],"bearings":[105,270,285],"entry":[false,true,true],"in":0},{"classes":["tunnel","motorway"],"out":1,"location":[2.306217,48.939026],"bearings":[90,270],"entry":[false,true],"in":0},{"classes":["motorway"],"out":1,"location":[2.30477,48.939111],"bearings":[90,270],"entry":[false,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.29616,48.937744],"bearings":[68,73,253],"entry":[false,false,true],"in":1},{"classes":["motorway"],"out":2,"location":[2.290053,48.936554],"bearings":[60,75,255],"entry":[false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.339494,48.925668],[2.339473,48.92568],[2.33885,48.925972],[2.338595,48.926086],[2.33842,48.926159],[2.338259,48.926223],[2.33793,48.926341],[2.337731,48.926405],[2.337278,48.926532],[2.336677,48.926683],[2.335613,48.926943],[2.335205,48.927039],[2.334937,48.9271],[2.33462,48.927168],[2.3341,48.927294],[2.333739,48.927385],[2.33354,48.927438],[2.333379,48.927488],[2.333223,48.927539],[2.333042,48.927602],[2.332812,48.927685],[2.332702,48.927729],[2.332634,48.927761],[2.332149,48.927998],[2.332009,48.928079],[2.331729,48.928254],[2.331515,48.928401],[2.331397,48.928476],[2.331326,48.928519],[2.331191,48.928597],[2.330704,48.92882],[2.330111,48.929058],[2.329565,48.929233],[2.328685,48.929487],[2.328057,48.929641],[2.326246,48.930075],[2.325698,48.930218],[2.325286,48.930346],[2.324858,48.930493],[2.324326,48.930712],[2.324143,48.930795],[2.32391,48.93091],[2.323707,48.93102],[2.323595,48.931089],[2.323335,48.931249],[2.323219,48.931327],[2.323105,48.931404],[2.322963,48.931508],[2.322702,48.931718],[2.322526,48.931883],[2.322371,48.932041],[2.322171,48.932272],[2.321661,48.932926],[2.320643,48.934359],[2.32024,48.934923],[2.319992,48.935284],[2.31973,48.935618],[2.31941,48.935978],[2.319062,48.936306],[2.318676,48.936589],[2.31831,48.936825],[2.31788,48.937057],[2.31742,48.937286],[2.316686,48.937589],[2.316128,48.937771],[2.315585,48.937926],[2.315105,48.938064],[2.314278,48.938228],[2.313749,48.938321],[2.313616,48.938349],[2.313304,48.938383],[2.312814,48.938443],[2.311128,48.938598],[2.30979,48.938728],[2.306808,48.939012],[2.306596,48.938996],[2.306217,48.939026],[2.305524,48.939075],[2.30477,48.939111],[2.304458,48.939119],[2.304116,48.939111],[2.303731,48.939095],[2.303167,48.939056],[2.302681,48.939004],[2.302052,48.938911],[2.30152,48.938818],[2.301048,48.93872
1],[2.300035,48.938517],[2.29917,48.938342],[2.298074,48.938127],[2.297116,48.937938],[2.29616,48.937744],[2.294401,48.937392],[2.292074,48.936923],[2.292021,48.936912],[2.291944,48.936897],[2.291862,48.93688],[2.290208,48.936553],[2.290053,48.936554],[2.288037,48.93614]],"type":"LineString"},"mode":"driving","duration":229.2,"maneuver":{"bearing_after":311,"location":[2.339494,48.925668],"bearing_before":0,"type":"depart"},"ref":"A 86","weight":229.2,"distance":4382.1,"name":""},{"intersections":[{"in":0,"entry":[true],"bearings":[73],"location":[2.288037,48.93614]}],"driving_side":"right","geometry":{"coordinates":[[2.288037,48.93614],[2.288037,48.93614]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.288037,48.93614],"bearing_before":253,"type":"arrive"},"ref":"A 86","weight":0,"distance":0,"name":""}],"distance":4382.1,"duration":229.2,"summary":"A 86, Pont sur la Seine","weight":229.2},{"annotation":{"metadata":{"datasource_names":["lua profile"]},"nodes":[5352603768,3255667394,2997225320,1833215853,1515350224,4359925346,6719738773,7144079438,4885053925,5434480562,2859913653,1180566474,2279026565,4432735048,166217718],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[20,20,19.1,20,20.6,20,20,19.9,20.6,19.8,19.9,20.4,16.6,20.5],"weight":[3.7,10,0.9,12.6,1.6,9,5.7,1.8,1.7,1.7,1.6,1.9,0.2,2.1],"duration":[3.7,10,0.9,12.6,1.6,9,5.7,1.8,1.7,1.7,1.6,1.9,0.2,2.1],"distance":[73.969721,200.053112,17.172189,252.088477,32.904151,180.383054,114.124567,35.819293,34.994143,33.631138,31.859455,38.67221,3.318028,43.04909]},"steps":[{"intersections":[{"classes":["motorway"],"location":[2.288037,48.93614],"bearings":[253],"entry":[true],"out":0},{"classes":["motorway"],"out":1,"location":[2.284449,48.935423],"bearings":[75,240,270],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.276172,48.933767],"bearings":[60,75,255],"entry":[false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.288037,48.93614],[2.287071,48.935941],[2.284449,48.935423],[2.284238,48.935355],[2.280936,48.934698],[2.280508,48.934606],[2.278148,48.93413],[2.276648,48.933844],[2.276172,48.933767],[2.275701,48.93371],[2.275243,48.93368],[2.274807,48.933677],[2.274279,48.933701],[2.274234,48.933705],[2.273654,48.933773]],"type":"LineString"},"mode":"driving","duration":54.5,"maneuver":{"bearing_after":253,"location":[2.288037,48.93614],"bearing_before":0,"type":"depart"},"ref":"A 86","weight":54.5,"distance":1092,"name":""},{"intersections":[{"in":0,"entry":[true],"bearings":[100],"location":[2.273654,48.933773]}],"driving_side":"right","geometry":{"coordinates":[[2.273654,48.933773],[2.273654,48.933773]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.273654,48.933773],"bearing_before":280,"type":"arrive"},"ref":"A 86","weight":0,"distance":0,"name":""}],"distance":1092,"duration":54.5,"summary":"A 86","weight":54.5},{"annotation":{"metadata":{"datasource_names":["lua 
profile"]},"nodes":[4432735048,166217718,4942836845,4633277671,5988262762,887569290,6861977292,132298961,3897643431,5167574424,264597924,330730921,2768050336,4649022372,4522531921,1812323121,819360999,589769852,151478717,3014058081,5698590957,6885761545,8247087706,5614663479,8398762380,247916482,8215978777,6680532596,6069646511,1457834370,1942759938,6708642634,4535123842,5079074035,7231120001,1442870839,4542734346,3087249848,6241608561,3416726140,5078018544],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[18.1,19.5,20,11.1,11.1,11.1,11.1,11,11,11.1,11.3,11.1,12.5,12.7,12.2,12.3,11.5,12.2,15.6,14.9,15.2,15,14.6,14.6,14.6,15.5,17.9,15.7,14.8,15.6,15.3,15.3,15.2,15.7,14.4,15.2,11.1,11.1,14.9,19.9],"weight":[0.6,1.7,16.9,1.3,8.9,9.4,9.9,3.9,3.8,3.8,2.1,1.1,2.5,2.1,2,1,0.1,1,1.9,0.9,7.1,1.8,0.6,0.6,0.6,2.6,0.2,0.9,0.8,0.5,1.5,6.9,5.9,0.8,0.4,7.7,17.4,2.3,2.2,0.1],"duration":[0.6,1.7,16.9,1.3,8.9,9.4,9.9,3.9,3.8,3.8,2.1,1.1,2.5,2.1,2,1,0.1,1,1.9,0.9,7.1,1.8,0.6,0.6,0.6,2.6,0.2,0.9,0.8,0.5,1.5,6.9,5.9,0.8,0.4,7.7,17.4,2.3,2.2,0.1],"distance":[10.834203,33.129884,337.480338,14.370349,98.674844,104.169772,110.025686,43.012486,41.965654,42.290004,23.689711,12.167557,31.157758,26.718989,24.43644,12.338675,1.145646,12.1614,29.674788,13.375541,107.812598,26.994826,8.739596,8.739592,8.739587,40.312089,3.586368,14.135421,11.824416,7.794751,22.974821,105.408738,89.486399,12.569045,5.752768,117.204896,193.566274,25.484462,32.867781,1.991787]},"steps":[{"intersections":[{"classes":["motorway"],"location":[2.273654,48.933773],"bearings":[280],"entry":[true],"out":0}],"driving_side":"right","geometry":{"coordinates":[[2.273654,48.933773],[2.273508,48.93379],[2.27307,48.933867],[2.268642,48.93473]],"type":"LineString"},"mode":"driving","duration":19.2,"maneuver":{"bearing_after":280,"location":[2.273654,48.933773],"bearing_before":0,"type":"depart"},"ref":"A 86","weight":19.2,"distance":381.4,"name":""},{"name":"","distance":572.7,"maneuver":{"bearing_after":312,"type":"off ramp","modifier":"slight right","bearing_before":285,"location":[2.268642,48.93473]},"destinations":"Argenteuil Centre, Colombes Centre, Bois 
Colombes","exits":"4","intersections":[{"classes":["motorway"],"out":2,"location":[2.268642,48.93473],"bearings":[105,285,315],"entry":[false,true,true],"in":0},{"classes":["motorway"],"out":2,"location":[2.262278,48.935938],"bearings":[75,90,270],"entry":[false,false,true],"in":0}],"duration":52.8,"driving_side":"right","weight":52.8,"mode":"driving","geometry":{"coordinates":[[2.268642,48.93473],[2.268498,48.934818],[2.26725,48.935157],[2.265909,48.935475],[2.264479,48.935785],[2.263912,48.935889],[2.263344,48.935945],[2.262766,48.935965],[2.262442,48.935957],[2.262278,48.935938],[2.261853,48.935915],[2.261495,48.935866],[2.261176,48.9358]],"type":"LineString"}},{"intersections":[{"classes":["motorway"],"out":2,"location":[2.261176,48.9358],"bearings":[75,225,255],"entry":[false,true,true],"in":0},{"out":2,"location":[2.260832,48.93576],"bearings":[75,135,315],"entry":[false,false,true],"in":0},{"out":3,"location":[2.258867,48.937063],"bearings":[45,135,225,315],"entry":[true,false,true,true],"in":1},{"lanes":[{"valid":false,"indications":["left"]},{"valid":true,"indications":["none"]},{"valid":true,"indications":["none"]}],"out":2,"location":[2.258357,48.937463],"bearings":[135,255,315],"entry":[false,true,true],"in":0},{"lanes":[{"valid":false,"indications":["left"]},{"valid":true,"indications":["none"]},{"valid":true,"indications":["none"]}],"out":2,"location":[2.258186,48.937599],"bearings":[135,180,315],"entry":[false,false,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.261176,48.9358],[2.261013,48.935771],[2.260998,48.935768],[2.260832,48.93576],[2.26055,48.935952],[2.260419,48.936036],[2.25936,48.936711],[2.259104,48.936886],[2.259025,48.936945],[2.258946,48.937004],[2.258867,48.937063],[2.258513,48.937341],[2.258482,48.937366],[2.258357,48.937463],[2.258254,48.937545],[2.258186,48.937599],[2.257989,48.93776],[2.257082,48.938497],[2.256326,48.93913],[2.256224,48.939221],[2.256176,48.939262]],"type":"LineString"},"mode":"driving","duration":42.3,"maneuver":{"bearing_after":258,"type":"fork","modifier":"slight right","bearing_before":251,"location":[2.261176,48.9358]},"ref":"D 909","weight":42.3,"distance":543.6,"name":"Avenue d'Argenteuil"},{"intersections":[{"out":2,"location":[2.256176,48.939262],"bearings":[135,240,330],"entry":[false,true,true],"in":0},{"out":1,"location":[2.253752,48.941558],"bearings":[150,330],"entry":[false,true],"in":0},{"out":1,"location":[2.253564,48.941751],"bearings":[150,330,345],"entry":[false,true,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.256176,48.939262],[2.255262,48.940128],[2.253752,48.941558],[2.253564,48.941751],[2.253308,48.941994],[2.253291,48.942008]],"type":"LineString"},"mode":"driving","duration":31.7,"maneuver":{"bearing_after":324,"type":"new name","modifier":"straight","bearing_before":322,"location":[2.256176,48.939262]},"ref":"D 909","weight":31.7,"distance":371.1,"name":"<NAME>"},{"intersections":[{"in":0,"entry":[true],"bearings":[141],"location":[2.253291,48.942008]}],"driving_side":"right","geometry":{"coordinates":[[2.253291,48.942008],[2.253291,48.942008]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.253291,48.942008],"bearing_before":321,"type":"arrive"},"ref":"D 122","weight":0,"distance":0,"name":"Avenue <NAME>"}],"distance":1868.8,"duration":146,"summary":"Avenue d'Argenteuil","weight":146},{"annotation":{"metadata":{"datasource_names":["lua 
profile"]},"nodes":[3416726140,5078018544,6637987869,4686082107,2894130286,8225898766,7599310965],"datasources":[0,0,0,0,0,0],"speed":[15.9,15.9,14.6,16.7,15.2,14.3],"weight":[0.1,0.8,0.7,0.5,2.1,0.3],"duration":[0.1,0.8,0.7,0.5,2.1,0.3],"distance":[1.594682,12.711222,10.223147,8.372849,31.903958,4.298868]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[320],"location":[2.253291,48.942008]},{"out":3,"location":[2.253185,48.942116],"bearings":[60,150,240,330],"entry":[true,false,false,true],"in":1},{"out":3,"location":[2.253102,48.94219],"bearings":[60,150,240,330],"entry":[false,false,true,true],"in":1},{"out":2,"location":[2.252791,48.942489],"bearings":[135,150,330],"entry":[false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.253291,48.942008],[2.253277,48.942019],[2.253185,48.942116],[2.253102,48.94219],[2.253037,48.942252],[2.252791,48.942489],[2.252758,48.942521]],"type":"LineString"},"mode":"driving","duration":4.5,"maneuver":{"bearing_after":320,"location":[2.253291,48.942008],"bearing_before":0,"type":"depart"},"ref":"D 122","weight":4.5,"distance":69.5,"name":"<NAME>"},{"intersections":[{"in":0,"entry":[true],"bearings":[146],"location":[2.252755,48.942524]}],"driving_side":"right","geometry":{"coordinates":[[2.252758,48.942521],[2.252758,48.942521]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.252755,48.942524],"bearing_before":326,"type":"arrive"},"ref":"D 122","weight":0,"distance":0,"name":"<NAME>"}],"distance":69.5,"duration":4.5,"summary":"<NAME>","weight":4.5},{"annotation":{"metadata":{"datasource_names":["lua profile"]},"nodes":[7599310965,4098390025,6508507805,7243532267,6496418879,4878785492],"datasources":[0,0,0,0,0],"speed":[15.3,15.6,15.7,15.9,15.1],"weight":[2.6,1.1,0.5,1,5.7],"duration":[2.6,1.1,0.5,1,5.7],"distance":[39.805602,17.126347,7.851269,15.883747,85.937506]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[326],"location":[2.252755,48.942524]},{"out":3,"location":[2.252323,48.94295],"bearings":[60,150,210,330],"entry":[true,false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.252755,48.942524],[2.252451,48.942821],[2.252323,48.94295],[2.252264,48.943009],[2.252115,48.943113],[2.251472,48.94376]],"type":"LineString"},"mode":"driving","duration":10.9,"maneuver":{"bearing_after":326,"location":[2.252755,48.942524],"bearing_before":0,"type":"depart"},"ref":"D 122","weight":10.9,"distance":166.6,"name":"<NAME>"},{"intersections":[{"in":0,"entry":[true],"bearings":[147],"location":[2.251472,48.94376]}],"driving_side":"right","geometry":{"coordinates":[[2.251472,48.94376],[2.251472,48.94376]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.251472,48.94376],"bearing_before":327,"type":"arrive"},"ref":"D 122","weight":0,"distance":0,"name":"<NAME>"}],"distance":166.6,"duration":10.9,"summary":"Avenue Gabriel Péri","weight":10.9},{"annotation":{"metadata":{"datasource_names":["lua 
profile"]},"nodes":[6496418879,4878785492,3505431789,7470356017,7063348562,7010863580,2485687625,2588852821,2916500433,5024092144,2634636204],"datasources":[0,0,0,0,0,0,0,0,0,0],"speed":[15.4,15.7,14.4,15.3,16.3,14.5,13.9,15.5,15.3,15.5],"weight":[3.9,0.6,0.8,9.6,0.5,0.6,0.4,2.1,0.7,3.3],"duration":[3.9,0.6,0.8,9.6,0.5,0.6,0.4,2.1,0.7,3.3],"distance":[60.161378,9.448065,11.552439,146.87007,8.132117,8.703291,5.549184,32.536172,10.738741,51.125647]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[327],"location":[2.251472,48.94376]},{"out":3,"location":[2.250951,48.944284],"bearings":[45,150,240,330],"entry":[false,false,true,true],"in":1},{"out":3,"location":[2.249705,48.945538],"bearings":[60,150,240,330],"entry":[true,false,true,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.251472,48.94376],[2.251022,48.944213],[2.250951,48.944284],[2.250867,48.944372],[2.249764,48.945476],[2.249705,48.945538],[2.249641,48.945604],[2.2496,48.945646],[2.249359,48.945892],[2.249279,48.945973],[2.248899,48.946359]],"type":"LineString"},"mode":"driving","duration":24.5,"maneuver":{"bearing_after":327,"location":[2.251472,48.94376],"bearing_before":0,"type":"depart"},"ref":"D 122","weight":24.5,"distance":344.8,"name":"<NAME>"},{"intersections":[{"in":0,"entry":[true],"bearings":[150],"location":[2.248899,48.946359]}],"driving_side":"right","geometry":{"coordinates":[[2.248899,48.946359],[2.248899,48.946359]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.248899,48.946359],"bearing_before":326,"type":"arrive"},"ref":"D 122","weight":0,"distance":0.3,"name":"Avenue G<NAME>"}],"distance":345.2,"duration":24.5,"summary":"Avenue Gabriel Péri","weight":24.5},{"annotation":{"metadata":{"datasource_names":["lua profile"]},"nodes":[2634636204,314005710,6326237016,5830203612,6599453790,3714367506,6243577520,2512045956,2656390776,83623751,4959064989,8191411049,2831972963,3122875608,41811871,4106637638,628011448,4315873229,72530112],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[14.3,11.5,15.8,16.2,16.1,11.1,11.1,11.6,11.5,6.7,6.7,6.6,6.7,6.7,6.7,6.9,6.8,6.6],"weight":[0.8,0.2,0.5,0.3,0.3,3.3,1.2,0.9,1.1,1.9,4.4,3.8,1.9,7.4,8.4,1.4,1.6,8.8],"duration":[0.8,0.2,0.5,0.3,0.3,3.3,1.2,0.9,1.1,1.9,4.4,3.8,1.9,7.4,8.4,1.4,1.6,8.8],"distance":[11.411451,2.303093,7.882221,4.853224,4.841739,36.649931,13.321087,10.441211,12.59512,12.733841,29.308979,25.161503,12.66261,49.554943,56.21447,9.65336,10.827509,58.509305]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[345],"location":[2.248898,48.946362]}],"driving_side":"right","geometry":{"coordinates":[[2.248898,48.946362],[2.248857,48.946461],[2.248839,48.946478],[2.248777,48.946536]],"type":"LineString"},"mode":"driving","duration":5,"maneuver":{"bearing_after":345,"location":[2.248898,48.946362],"bearing_before":0,"type":"depart"},"ref":"D 122","weight":5,"distance":21.6,"name":"<NAME>"},{"intersections":[{"out":3,"location":[2.248777,48.946536],"bearings":[60,165,240,270],"entry":[true,false,false,true],"in":1},{"out":3,"location":[2.248711,48.946541],"bearings":[15,90,195,270],"entry":[false,false,true,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.248777,48.946536],[2.248711,48.946541],[2.248645,48.946545]],"type":"LineString"},"mode":"driving","duration":1,"maneuver":{"bearing_after":275,"type":"turn","modifier":"left","bearing_before":337,"location":[2.248777,48.946536]},"ref":"D 
122","weight":1,"distance":9.7,"name":"<NAME>"},{"intersections":[{"out":2,"location":[2.248645,48.946545],"bearings":[45,90,225],"entry":[false,false,true],"in":1}],"driving_side":"right","geometry":{"coordinates":[[2.248645,48.946545],[2.248251,48.946341],[2.248138,48.946247],[2.248019,48.946195],[2.24788,48.946128]],"type":"LineString"},"mode":"driving","duration":6.5,"maneuver":{"bearing_after":230,"type":"turn","modifier":"left","bearing_before":274,"location":[2.248645,48.946545]},"weight":6.5,"distance":73,"name":"<NAME>"},{"intersections":[{"out":2,"location":[2.24788,48.946128],"bearings":[60,135,225,330],"entry":[false,false,true,true],"in":0},{"out":1,"location":[2.247473,48.945861],"bearings":[45,225,255],"entry":[false,true,true],"in":0},{"out":1,"location":[2.24665,48.945291],"bearings":[45,225,315],"entry":[false,true,true],"in":0},{"out":2,"location":[2.246019,48.944868],"bearings":[45,135,225],"entry":[false,false,true],"in":0},{"out":1,"location":[2.245916,48.944798],"bearings":[45,225,315],"entry":[false,true,false],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.24788,48.946128],[2.247751,48.946051],[2.247473,48.945861],[2.247239,48.945695],[2.247122,48.945611],[2.24665,48.945291],[2.246113,48.944929],[2.246019,48.944868],[2.245916,48.944798],[2.245359,48.94442]],"type":"LineString"},"mode":"driving","duration":39.6,"maneuver":{"bearing_after":225,"type":"new name","modifier":"straight","bearing_before":234,"location":[2.24788,48.946128]},"weight":39.6,"distance":264.6,"name":"<NAME>"},{"intersections":[{"in":0,"entry":[true],"bearings":[44],"location":[2.245359,48.94442]}],"driving_side":"right","geometry":{"coordinates":[[2.245359,48.94442],[2.245359,48.94442]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.245359,48.94442],"bearing_before":224,"type":"arrive"},"weight":0,"distance":0,"name":"<NAME>"}],"distance":368.9,"duration":52.1,"summary":"<NAME>, <NAME>","weight":52.1},{"annotation":{"metadata":{"datasource_names":["lua profile"]},"nodes":[4315873229,72530112,6452903910,4322843744,7083297713,3226451954,207290300,346771582,1711439360,3624205312,3364884416,3053418886,4707442236,7516846080,8047913109,1662173063,4023694410],"datasources":[0,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0],"speed":[6.8,6.6,6.6,6.7,6.7,14.7,15.4,15.7,15.4,15.1,15.1,15,15.8,15.3,4.2,4.1],"weight":[2.7,1.7,2.1,4.4,26.8,1,2.9,0.5,5.6,2.8,1.3,2.5,1.3,4.9,2.6,11.2],"duration":[2.7,1.7,2.1,4.4,26.8,1,2.9,0.5,5.6,2.8,1.3,2.5,1.3,4.9,2.6,11.2],"distance":[18.271037,11.283179,13.929762,29.321945,178.963983,14.704092,44.660487,7.852033,85.975367,42.14983,19.579289,37.470263,20.513104,75.006232,10.809811,46.473979]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[224],"location":[2.245359,48.94442]}],"driving_side":"right","geometry":{"coordinates":[[2.245359,48.94442],[2.245185,48.944302],[2.245081,48.944227]],"type":"LineString"},"mode":"driving","duration":4.4,"maneuver":{"bearing_after":224,"location":[2.245359,48.94442],"bearing_before":0,"type":"depart"},"weight":4.4,"distance":29.6,"name":"<NAME>"},{"intersections":[{"out":2,"location":[2.245081,48.944227],"bearings":[45,135,225,315],"entry":[false,true,true,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.245081,48.944227],[2.244955,48.944133],[2.24469,48.943935],[2.243077,48.942724]],"type":"LineString"},"mode":"driving","duration":37.9,"maneuver":{"bearing_after":220,"type":"new 
name","modifier":"straight","bearing_before":223,"location":[2.245081,48.944227]},"weight":37.9,"distance":222.2,"name":"<NAME>"},{"intersections":[{"out":3,"location":[2.243077,48.942724],"bearings":[45,135,225,315],"entry":[false,true,true,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.243077,48.942724],[2.242931,48.942815],[2.242513,48.943108]],"type":"LineString"},"mode":"driving","duration":3.9,"maneuver":{"bearing_after":315,"type":"turn","modifier":"right","bearing_before":220,"location":[2.243077,48.942724]},"ref":"D 48","weight":3.9,"distance":59.4,"name":"<NAME>-<NAME>"},{"intersections":[{"out":3,"location":[2.242513,48.943108],"bearings":[45,135,240,315],"entry":[true,false,true,true],"in":1},{"out":2,"location":[2.241251,48.944006],"bearings":[135,240,315],"entry":[false,true,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.242513,48.943108],[2.242442,48.943161],[2.241626,48.943718],[2.241251,48.944006],[2.241079,48.944141],[2.240751,48.9444],[2.24057,48.944541],[2.239893,48.945048]],"type":"LineString"},"mode":"driving","duration":23.4,"maneuver":{"bearing_after":316,"type":"new name","modifier":"straight","bearing_before":316,"location":[2.242513,48.943108]},"ref":"D 48","weight":23.4,"distance":288.5,"name":"<NAME>-<NAME>"},{"intersections":[{"out":1,"location":[2.239893,48.945048],"bearings":[135,240,315],"entry":[false,true,true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.239893,48.945048],[2.239771,48.944993],[2.239259,48.944745]],"type":"LineString"},"mode":"driving","duration":13.8,"maneuver":{"bearing_after":234,"type":"turn","modifier":"left","bearing_before":317,"location":[2.239893,48.945048]},"weight":13.8,"distance":57.3,"name":""},{"intersections":[{"in":0,"entry":[true],"bearings":[54],"location":[2.239259,48.944745]}],"driving_side":"right","geometry":{"coordinates":[[2.239259,48.944745],[2.239259,48.944745]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"type":"arrive","modifier":"right","bearing_before":234,"location":[2.239259,48.944745]},"weight":0,"distance":0,"name":""}],"distance":657,"duration":83.4,"summary":"<NAME>, Rue du Lieutenant-<NAME>","weight":83.4},{"annotation":{"metadata":{"datasource_names":["lua 
profile"]},"nodes":[1662173063,4023694410,1662173063],"datasources":[0,0],"speed":[4.2,5.8],"weight":[5.7,0.2],"duration":[5.7,0.2],"distance":[23.787855,1.160713]},"steps":[{"intersections":[{"out":0,"entry":[true],"bearings":[234],"location":[2.239259,48.944745]}],"driving_side":"right","geometry":{"coordinates":[[2.239259,48.944745],[2.238997,48.944618]],"type":"LineString"},"mode":"driving","duration":33.1,"maneuver":{"bearing_after":234,"type":"depart","modifier":"right","bearing_before":0,"location":[2.239259,48.944745]},"weight":33.1,"distance":23.8,"name":""},{"intersections":[{"out":0,"location":[2.238997,48.944618],"bearings":[60],"entry":[true],"in":0}],"driving_side":"right","geometry":{"coordinates":[[2.238997,48.944618],[2.23901,48.944624]],"type":"LineString"},"mode":"driving","duration":0.2,"maneuver":{"bearing_after":53,"type":"continue","modifier":"uturn","bearing_before":233,"location":[2.238997,48.944618]},"weight":0.2,"distance":1.2,"name":""},{"intersections":[{"in":0,"entry":[true],"bearings":[235],"location":[2.23901,48.944624]}],"driving_side":"right","geometry":{"coordinates":[[2.23901,48.944624],[2.23901,48.944624]],"type":"LineString"},"mode":"driving","duration":0,"maneuver":{"bearing_after":0,"location":[2.23901,48.944624],"bearing_before":55,"type":"arrive"},"weight":0,"distance":0,"name":""}],"distance":24.9,"duration":33.3,"summary":"","weight":33.3}],"distance":16547.6,"duration":1031.5,"weight_name":"routability","weight":1031.5}],"tracepoints":[null,{"alternatives_count":0,"waypoint_index":0,"matchings_index":0,"location":[2.429407,48.917239],"name":"","distance":6.056694,"hint":"YXZjgP___38MAAAAYAAAAAAAAAAAAAAAvsPBQZbcJ0MAAAAAAAAAAAwAAABgAAAAAAAAAAAAAADGGwAA3xElAPdq6gKQESUA52rqAgAAbxU2-8d8"},{"alternatives_count":0,"waypoint_index":1,"matchings_index":0,"location":[2.339494,48.925668],"name":"","distance":0.708887,"hint":"d0QAgP___38nAAAAKQAAAHwCAAAAAAAASy16QuhdAkBsgI5EAAAAACcAAAApAAAAfAIAAAAAAADGGwAAprIjAOSL6gKgsiMA34vqAhAAbxM2-8d8"},{"alternatives_count":0,"waypoint_index":2,"matchings_index":0,"location":[2.288037,48.93614],"name":"","distance":0.3644,"hint":"ZfoIgP___39NAAAAcgAAAAAAAABkAAAAfL8aQ3FXlEIAAAAAiZlIQ00AAAByAAAAAAAAAGQAAADGGwAApekiAMy06gKj6SIAz7TqAgAAzw82-8d8"},{"alternatives_count":0,"waypoint_index":3,"matchings_index":0,"location":[2.273654,48.933773],"name":"","distance":0.792135,"hint":"SPoIgP___38VAAAAGwAAAEcAAAC6AAAASrAsQtnXLUH24w5D-c65QxUAAAAbAAAARwAAALoAAADGGwAAdrEiAI2r6gJ4sSIAlKvqAgUAjxE2-8d8"},{"alternatives_count":0,"waypoint_index":4,"matchings_index":0,"location":[2.253291,48.942008],"name":"<NAME>","distance":3.690875,"hint":"BO5QgP___38BAAAAAgAAACoAAAAIAAAArjv_P7ZczD_vlgNClYhLQQEAAAACAAAAKgAAAAgAAADGGwAA62EiALjL6gISYiIAzcvqAgIADxQ2-8d8"},{"alternatives_count":0,"waypoint_index":5,"matchings_index":0,"location":[2.252755,48.942524],"name":"<NAME>","distance":3.121687,"hint":"Qe5QgP___38AAAAAGgAAAAAAAAAAAAAA1ZHMPjJcH0IAAAAAAAAAAAAAAAAaAAAAAAAAAAAAAADGGwAA018iALzN6gL2XyIAzM3qAgAAbxQ2-8d8"},{"alternatives_count":0,"waypoint_index":6,"matchings_index":0,"location":[2.251472,48.94376],"name":"<NAME>","distance":0.492756,"hint":"-PsIADP7CIAnAAAAOQAAAAYAAAAAAAAAc9hwQnQErELbSxdBAAAAACcAAAA5AAAABgAAAAAAAADGGwAA0FoiAJDS6gLWWiIAktLqAgEALwk2-8d8"},{"alternatives_count":0,"waypoint_index":7,"matchings_index":0,"location":[2.248898,48.946362],"name":"<NAME>","distance":2.267745,"hint":"r_JQgP___38AAAAACAAAAAAAAAAbAAAAZ-OuPpeYNkEAAAAAqR4jQQAAAAAIAAAAAAAAABsAAADGGwAAwlAiALrc6gKkUCIAtdzqAgAAjxU2-8d8"},{"alternatives_count":2,"waypoint_index":
8,"matchings_index":0,"location":[2.245359,48.94442],"name":"<NAME>","distance":0.367964,"hint":"3vsIAHryUIAbAAAAWAAAABEAAAAAAAAAYmGSQYpgakI2xjRBAAAAABsAAABYAAAAEQAAAAAAAADGGwAA70IiACTV6gLzQiIAItXqAgEAzwI2-8d8"},{"alternatives_count":10,"waypoint_index":9,"matchings_index":0,"location":[2.239259,48.944745],"name":"","distance":23.213578,"hint":"tvBQALnwUIA5AAAAcAAAAAAAAAAaAAAA7a6-QatEOkIAAAAAflItQTkAAABwAAAAAAAAABoAAADGGwAAGysiAGnW6gJfKiIAEdfqAgAAXwM2-8d8"},{"alternatives_count":74,"waypoint_index":10,"matchings_index":0,"location":[2.23901,48.944624],"name":"","distance":2.747044,"hint":"tvBQgLnwUAACAAAApwAAAAAAAAAaAAAACuGUP7V6ikIAAAAAflItQQIAAACnAAAAAAAAABoAAADGGwAAIioiAPDV6gIMKiIABNbqAgAAXwM2-8d8"}]} ``` Some of the returned node-id's: 630287702: in the US 4273721568: doesn't exist 1752957152: Canada 8504959872: Mali ... I read about an overflow bug. But I thought fixing my version would keep my service stable. Seems like it doesn't. Hope there's an easy solution for this, because I can't end up having this issue on and on in a production environment... username_2: @username_0 The fix for correctly numbering node IDs only exists in v5.25.0: https://github.com/Project-OSRM/osrm-backend/master/CHANGELOG.md#L11 username_0: So: - preprocess the files with 5.25.0 - when setting up the service, specify 5.25.0 Can I rest assured that this won't happen again ? Because v5.24.0 used to work... username_2: @username_0 v5.24.0 still works if you use old OSM data - however OSM now has node IDs that are larger than OSRM 5.24.0 can handle, so you need to update to 5.25.0. The fix included in 5.25.0 doubled the space we allocate for node ID integers, so if you look at these charts, you can project when we'll hit the new limit and will need to bump this ID field size again: ![image](https://user-images.githubusercontent.com/1892250/119504229-9fa62100-bd20-11eb-87a2-6ca954fccdf5.png) ![image](https://user-images.githubusercontent.com/1892250/119504115-8604d980-bd20-11eb-9de0-3b72c30b3aac.png) username_0: To be sure, I'd do it like this: ``` docker pull osrm/osrm-backend:v5.25.0 mkdir car mv 210405_france-latest.osm.pbf ./car/ sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend:v5.25.0 osrm-extract -p /opt/car.lua /data/car/210405_france-latest.osm.pbf sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend:v5.25.0 osrm-partition /data/car/210405_france-latest.osrm sudo docker run -t -v "${PWD}/car/:/data/car" osrm/osrm-backend:v5.25.0 osrm-customize /data/car/210405_france-latest.osrm sudo docker run -t -i -p 5000:5000 -v "${PWD}/car/:/data/car" osrm/osrm-backend:v5.25.0 osrm-routed --algorithm mld /data/car/210405_france-latest.osrm ``` username_2: @username_0 Yes, that looks reasonable. username_0: ok, closing, thx for helping me out Status: Issue closed
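For anyone verifying the same fix: a quick way to confirm that the rebuilt v5.25.0 service returns real OSM node IDs is to request `annotations=nodes` and spot-check a few of the IDs on openstreetmap.org. Below is a minimal sketch in Python; the host, port, and coordinates are assumptions matching the `docker run` command and the French extract above.

```python
# Minimal sanity check against a locally running osrm-routed (port 5000 as in
# the docker command above). Coordinates are illustrative lon,lat pairs; the
# `annotations=nodes` parameter asks OSRM to include the OSM node IDs per leg.
import json
import urllib.request

BASE_URL = "http://localhost:5000"
COORDS = "2.2453,48.9444;2.2392,48.9447"  # lon,lat;lon,lat

url = f"{BASE_URL}/route/v1/driving/{COORDS}?annotations=nodes"
with urllib.request.urlopen(url) as response:
    data = json.load(response)

for route in data.get("routes", []):
    for leg in route.get("legs", []):
        nodes = leg.get("annotation", {}).get("nodes", [])
        # Spot-check a few of these on https://www.openstreetmap.org/node/<id>
        print(f"{len(nodes)} node IDs, first few: {nodes[:5]}")
```

With data processed by the older release the IDs printed here were the bogus ones shown at the top of this thread; after reprocessing with v5.25.0 they should resolve to nodes along the requested route.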
networknt/json-schema-validator
537906113
Title: Why can't a schema keyword value be a path? Question: username_0: I have recently used JSON Schema to help validate JSON data, but I found that it's hard to validate a cross-field (cascade) relation. For example:
``` json
"ageRange": {
  "minAge": 20,
  "maxAge": 30
}
```
I want to enforce 18 < minAge < maxAge < 50, but it's hard to express minAge < maxAge, because the *minimum* keyword's value must be a number. Why can't a schema value be a path? Like this:
``` json
"maxAge": {
    "minimum": "$.ageRange.minAge",
    "exclusiveMaximum": 50
}
```
Do you have any idea how to implement this?
Answers: username_1: @username_0 I think your case is a good one; however, the JSON Schema specification doesn't support it yet. Would you be able to open the same issue there to ask if they can adjust the specification? https://github.com/json-schema-org/json-schema-spec
username_0: I have implemented it in the minimum/maximum keywords; can you give it a review?
username_0: Sorry, I just remembered I wrote the code on my company computer, so I may not be able to push it right now. Sorry.
username_1: Yes. Please submit a PR. Meanwhile, we'd better open an issue in json-schema-spec so that it can become a standard. Thanks.
username_2: It's being discussed here with $data and relative JSON pointers. It hasn't made it into the spec yet, though. https://github.com/json-schema-org/json-schema-spec/issues/51
username_1: @username_2 Thanks a lot for the spec link. It is quite a read :) I think it is not hard to implement, and I am looking forward to it being merged into the spec. @username_0 If we are doing something, the above discussion might guide us in the right direction.
username_1: Given this has been discussed intensively, I am closing this issue. Thank you, everyone, for the inputs.
Status: Issue closed
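Until something like `$data` lands in the specification, the usual workaround is to validate the numeric bounds with ordinary schema keywords and enforce the cross-field relation in application code. Below is a minimal sketch of that second step, written in Python only for brevity (the same idea applies to the Java library this issue is about), with the bounds taken from the example above.

```python
# Post-validation check for the relation that JSON Schema cannot express today:
# 18 < minAge < maxAge < 50. The bounds come from the example above.
def check_age_range(document):
    errors = []
    age_range = document.get("ageRange", {})
    min_age = age_range.get("minAge")
    max_age = age_range.get("maxAge")
    if min_age is not None and min_age <= 18:
        errors.append("minAge must be greater than 18")
    if max_age is not None and max_age >= 50:
        errors.append("maxAge must be less than 50")
    if min_age is not None and max_age is not None and min_age >= max_age:
        errors.append("minAge must be less than maxAge")
    return errors


print(check_age_range({"ageRange": {"minAge": 20, "maxAge": 30}}))  # []
print(check_age_range({"ageRange": {"minAge": 40, "maxAge": 30}}))  # cross-field error
```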
aeonix/aeon
613448664
Title: Can't open old wallet file Question: username_0: I have a wallet file from Dec 16 2017 which I cannot open and I am certain it used to work. I guess the format has changed but I am not sure. The wallet exists as a single file in my home directory. The current version of Aeon cannot open it. I have built Aeon myself from source. This is the output when trying to open my wallet file... ``` Aeon 'Aletheia' (v0.13.1.0-4dd206ab) Logging to aeon-wallet-cli.log Specify wallet file name (e.g., MyWallet). If the wallet doesn't exist, it will be created. Wallet file name (or Ctrl-C to quit): john-aeon-wallet-x Error: Key file not found. Failed to open wallet: "john-aeon-wallet-x". Exiting. ```<issue_closed> Status: Issue closed
tidyverse/ggplot2
436558866
Title: Adding limits that contain all data to a histogram generates incorrect missing values warning (due to #2719) Question: username_0: This seems to have been introduced in 3.1.1 - it does not occur in 3.0.0. I haven't verified completely but I believe it's due to #2719. Adding scale_x_continuous/xlim to specify limits where all data are contained results in a warning that is incorrect (as no data are missing). The problem is the *transformed* data (i.e. the bars, xmin/xmax etc) are outside the specified limits, as they are computed using the limits (+/- width). This makes the error quite confusing! The plot is fine - no data are missing as the missing bars contain no data. ``` r library(ggplot2) #> Warning: package 'ggplot2' was built under R version 3.5.2 set.seed(2) d <- data.frame(x=runif(1000)) g <- ggplot(d, aes(x=x)) + geom_histogram(bins=10) g ``` ![](https://i.imgur.com/1hr4AIC.png) ``` r g + scale_x_continuous(limits=c(-1,2)) #> Warning: Removed 2 rows containing missing values (geom_bar). ``` ![](https://i.imgur.com/9gcudeq.png) Created on 2019-04-24 by the [reprex package](https://reprex.tidyverse.org) (v0.2.1) Answers: username_1: @username_2 this seem to be an unforeseen consequence of your fix. The only way I can see this being fixed is by having `StatBin` remove zero-count bins, and I think that will have a lot of unforeseen consequences... alternatively we'd need to subclass `GeomBar` for `geom_histogram()` to do some extra work... Thoughts..? username_2: I have a number of thoughts: 1. This is broadly related to #2887, which we just decided we won't fix at this time. 2. Not sure what the use case is here, but I assume setting limits via `coord()` would be more appropriate here. Also, it's generally a good idea to set `binwidth` and `center` manually for `geom_histogram()`. We may need to write a tutorial to explain how to get histograms behave nicely. 3. An argument could be made that if only the requested bin number is set (via `bins`) **and** explicit limits are set in the scale, then the bins should be chosen such that they fit exactly into the scale limits. This is a fairly easy change to `stat_bin()` that somebody could implement. Below I show two examples of how one could plot this example without generating a warning. The first is my preferred solution in this case, because it exactly matches the bins to the known data range. (And it works even if limits are set to 0 to 1). ``` r library(ggplot2) set.seed(2) d <- data.frame(x=runif(1000)) ggplot(d, aes(x=x)) + geom_histogram(binwidth = 0.2, center = 0.1) + scale_x_continuous(limits = c(-1, 2)) ``` ![](https://i.imgur.com/4UbyfW5.png) ``` r # this is not entirely appropriate, because the # bins extend beyond the known data range of [0, 1) ggplot(d, aes(x=x)) + geom_histogram(bins = 10) + coord_cartesian(xlim = c(-1, 2)) ``` ![](https://i.imgur.com/KB1PHaP.png) <sup>Created on 2019-04-24 by the [reprex package](https://reprex.tidyverse.org) (v0.2.1)</sup> Status: Issue closed username_1: Ok, as this is covered by #2887 I'm closing username_0: Thanks! Agreed that #2887 covers this -apologies for missing that one! I gave the student that spotted this the `coord` solution before posting. Agreed that clearly documenting how the limit filtering applies and how coord limiting may be more suited is a decent way to go, as it definitely feels like a change would break things 😀 I wonder if it’s possible to improve the warning? I’ll take a look and see if anything jumps out. 
Might be able to use Inf for out of scale and add to handle_na?
OpenNMT/OpenNMT-py
339316028
Title: TypeError: _compute_loss() missing 1 required positional argument: 'copy_attn' Question: username_0: Type error while using -copy_attn during training. Answers: username_1: Please post your command line username_0: python train.py -data data/finetune -save_model finetune -train_from original_model_same_vocab_acc_89.88_ppl_3.09_e11.pt -epochs 150 -layers 4 -encoder_type transformer -decoder_type transformer -adagrad_accumulator_init 0.1 -copy_attn -copy_loss_by_seqlength -bridge Traceback (most recent call last): File "train.py", line 141, in <module> main(opt) File "train.py", line 125, in main trainer.train(train_iter_fct, valid_iter_fct, opt.start_epoch, opt.epochs) File "/home/username_0-k/OpenNMT-py/onmt/trainer.py", line 132, in train train_stats = self.train_epoch(train_iter, epoch) File "/home/username_0-k/OpenNMT-py/onmt/trainer.py", line 190, in train_epoch report_stats, normalization) File "/home/username_0-k/OpenNMT-py/onmt/trainer.py", line 321, in _gradient_accumulation trunc_size, self.shard_size, normalization) File "/home/username_0-k/OpenNMT-py/onmt/utils/loss.py", line 144, in sharded_compute_loss loss, stats = self._compute_loss(batch, **shard) TypeError: _compute_loss() missing 1 required positional argument: 'copy_attn' username_1: you are not on master, git pull and try again. username_0: Did pull still getting the same error. `python train.py -data data/finetune -save_model finetune -train_from original_model_same_vocab_acc_89.88_ppl_3.09_e11.pt -train_steps 150000 -layers 4 -encoder_type transformer -decoder_type transformer -copy_attn` Traceback (most recent call last): File "train.py", line 40, in <module> main(opt) File "train.py", line 27, in main single_main(opt) File "/home/username_0-k/OpenNMT-py/onmt/train_single.py", line 128, in main opt.valid_steps) File "/home/username_0-k/OpenNMT-py/onmt/trainer.py", line 175, in train report_stats) File "/home/username_0-k/OpenNMT-py/onmt/trainer.py", line 287, in _gradient_accumulation trunc_size, self.shard_size, normalization) File "/home/username_0-k/OpenNMT-py/onmt/utils/loss.py", line 143, in sharded_compute_loss loss, stats = self._compute_loss(batch, **shard) TypeError: _compute_loss() missing 1 required positional argument: 'copy_attn' username_2: It only happens when you set `-train_from` right? Could you provide the command you used for the initial training so we can reproduce. Thx. username_0: `python train.py -data data/original_data -save_model original_model` for original training `python preprocess.py -train_src data/new_headline.train -train_tgt data/new_title.train -valid_src data/new_headline.test -valid_tgt data/title.test -share_vocab -dynamic_dict -save_data data/finetune -src_vocab data/vocab.txt` for preprocessing the data. username_1: you did not specify an encoder / decoder type to start with ? => it was a rnn by default. Status: Issue closed
aws-amplify/amplify-android
1076724387
Title: How to get list of items with IDs in a list by graphQL Query Question: username_0: ### Before opening, please confirm: - [X] I have [searched for duplicate or closed issues](https://github.com/aws-amplify/amplify-android/issues?q=is%3Aissue+) and [discussions](https://github.com/aws-amplify/amplify-android/discussions). ### Language and Async Model Kotlin ### Amplify Categories GraphQL API, DataStore ### Gradle script dependencies <details> ```groovy // Put output below this line ``` </details> ### Environment information <details> ``` # Put output below this line ``` </details> ### Please include any relevant guides or documentation you're referencing _No response_ ### Describe the bug How can I get all Todos where ID exists in an arraylist? I couldn't find a working example in the Doc or here. This is what I tried thus far but I get the error on the 'type.OR' in the Query.list() constructor. `Amplify.API.query( ModelQuery.list(InterviewDB::class.java,InterviewDB.ID.eq(QueryPredicateGroup.Type.OR,listOfIds), ModelPagination.limit(1000)),{ //do stuff },{ //handle exception }) I was inspired by this snippet of code from a similar issue in aws-IOS: let arrayOfUserIds = ["UserId1", "UserId2", "UserId3", "UserId4", "UserId5" ] [Truncated] ### amplifyconfiguration.json _No response_ ### GraphQL Schema <details> ```graphql // Put your schema below this line ``` </details> ### Additional information and screenshots _No response_ Answers: username_1: Please refer to issue: https://github.com/aws-amplify/amplify-android/issues/1420 username_0: Hi, the answer doesn't really satisfy my question. I'm looking for a way to easily perform something like `query(citiesRef, where('country', 'in', ['USA', 'Japan']));` in firebase . The answer only gives the possibility for two values. username_1: I @username_0 I beleive this should work: Amplify.API.query( ModelQuery.list(InterviewDB.class, InterviewDB.country.contains("USA").or(InterviewDB.country.contains("Japan")), response -> { Log.i(TAG, "Query response: " + response) }, error -> { Log.e(TAG, "Query failed.", error) } ); username_0: Thing is I don't necessarily know the arraysize of the values I want to compare. that's why this approach is not working. Isn't there a containsAny( arrayListOfValues) option? username_1: @username_0 unfortunately we don't have a convenient method to support it right now. As a work around to achieve this some users have chained the query predicate dynamically looping through the list of values. I will put in a feature request for it and hopefully we will implement it soon. username_0: I have done the same but I wanted something much more efficient and simpler. Thank you @username_1 , hope you guys work on the android SDK more, it's very much lackluster. Status: Issue closed
BAmercury/PotterMagicLantern
654431973
Title: change animation switch or button Question: username_0: Add either a toggle button or a three-way switch to the bottom of the lamp so I can change between the types of animations. Answers: username_0: I have it set so the user can strike the lamp to change through the animations (aside from the default fire animation, which can be toggled using a switch at the bottom of the lamp). It might be smarter to be able to cycle through all animations by striking, so I might change it to that. But first I need to change all the code to stop using the blocking delay() function and set the timer values using millis() instead.
username_0: Referencing this code to update the Adafruit animations: https://github.com/ndsh/neopixel-without-delay
username_0: Now you can cycle through all animations by striking the lamp; I removed the whole toggle-switch thing. Converted the animations to use millis() instead of delay(), but the fire animation will need to be rewritten to work like this.
username_0: The fire animation is rewritten to work with millis(), but the issue is the update rate. It's animating too fast, so I need to tune an appropriate time interval to make it look realistic and smooth.
username_0: lol I'm dumb - found the issue:
```c
// Read animation files from Colors.h and convert to RGB. Output to Neopixel strip
void animate_led()
{
  uint8_t h, s, l;
  hsl_values hsl;
  for (int i = 0; i < sizeof(colors) / 3; ++i)
  {
    // Read the next color from progmem
    h = pgm_read_byte_near(colors + (i * 3));
    s = pgm_read_byte_near(colors + (i * 3) + 1);
    l = pgm_read_byte_near(colors + (i * 3) + 2);

    hsl.h = (float)h / 255;
    hsl.s = (float)s / 255;
    hsl.l = (float)l / 255;

    rgb_values rgb;
    rgb = hsl2RGB(hsl);

    for (int j = 0; j <= NUM_LED; j++)
    {
      strip.setPixelColor(j, rgb.red, rgb.blue, rgb.green);
    }
    strip.show();
    // Updates every 33 ms (Can use this to control framerate of animation)
    //delay(33);
  }
}
```
The for loop is the issue; I need to rewrite this function to support millis().
username_0: Was able to fix the issue by removing the for loop and writing two new functions: one iterates through the colors array, and one keeps track of the index and updates the LEDs. Seems to be working now; I may need to adjust the time period value later, but it looks okay for now.
DeadlyBunny24/ExGarciaSum
205954649
Title: How is your programming style coming along? Question: username_0: Dear <NAME>,

We are doing well, but I am concerned about the programming style. I expect to see code styled according to the SUN coding conventions. So please send me the Checkstyle report for the code as it stands right now, so I can see where we are with the programming style. Please configure Checkstyle to report against the style proposed by SUN.
Answers: username_1: [reporte.txt](https://github.com/username_1/ExGarciaSum/files/758535/reporte.txt)
IlyaGrigo/colmaracademy_review
260119343
Title: Great job with code organization! Question: username_0: I really like the use of indentation and whitespace! Makes your code very easy to read! Keep up the great work. Your use of comments also makes it very clear and easy to navigate the appropriate sections quickly. https://github.com/IlyaGrigo/colmaracademy_review/blob/master/index.html#L1
smith-chem-wisc/MetaMorpheus
376075061
Title: cleavable crosslinker with DissociationType Question: username_0: We need to consider the case of a cleavable crosslinker combined with a DissociationType. For example, an HCD-cleavable DSSO crosslinker under ETD conditions. The function 'XlGetTheoreticalFragments' handles this case incorrectly. Answers: username_1: @username_0 is this implemented now? username_0: Fixed in pull request #1664 Status: Issue closed
paulo-herrera/PyEVTK
655848099
Title: time variable data Question: username_0: Hello community! I'm using the function "unstructuredGridToVTK" with relative success in steady-state models, where heads are unique for each node. My problem comes with transient models, because I haven't yet found the right form of the "pointData" array to make it vary in time. Can anybody help me, please? @username_1 ? Regards, Bruno
Answers: username_1: Hello Bruno: Thanks for posting your question here. You can look at the example: https://github.com/username_1/PyEVTK/blob/master/src/examples/group.py which shows how to link several .vtu files from a single group file that contains time information. I have used this option in the past without any problem. Good luck, Paulo
username_0: Thanks, professor! It worked wonderfully. I really appreciate your advice. Thanks! Bruno
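For readers with the same question: the pattern in group.py is to write one VTK file per time step (each carrying that step's point data) and then link them from a group file that stores the simulation times, which ParaView reads as a time series. Below is a minimal sketch; the array names and number of steps are made up, and the imports follow this repository's `evtk` layout (use `pyevtk.*` instead if you installed the PyPI package).

```python
# One VTK file per time step, linked by a .pvd group file with the time values.
import numpy as np
from evtk.hl import pointsToVTK   # unstructuredGridToVTK works the same way
from evtk.vtk import VtkGroup

group = VtkGroup("./transient_heads")        # will write transient_heads.pvd

npoints = 50
x = np.random.rand(npoints)
y = np.random.rand(npoints)
z = np.zeros(npoints)

for step in range(5):
    sim_time = step * 0.1                    # time of this snapshot
    heads = np.random.rand(npoints)          # stand-in for the per-node heads
    path = pointsToVTK(f"./heads_{step:04d}", x, y, z, data={"head": heads})
    group.addFile(filepath=path, sim_time=sim_time)

group.save()                                 # open the .pvd file in ParaView
```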
uon-coehm/IQCare
288591575
Title: Psychosocial Enrollment form: Add Patient assessment fields Question: username_0: Add the following patient assessment fields under Psychosocial Assessments:
1. A select field `Feeling sad, depressed or low?` with YES and NO options
2. A select field `Lacked pleasure or interest` with YES and NO options
3. A multi-select field `Complaints` with the following options
 - Restlessness
 - Worry
 - Forgetfulness
 - Memory loss
 - Concentration
 - Confusion
 - Disorientation
 - Hearing voices
 - Seeing things
 - Sleep problems
 - Eating problems
 - Suicidal
 - None
 - Other odd feelings (Specify)<issue_closed>
Status: Issue closed
Aidbox/Issues
973623926
Title: Display user roles in __debug=policy Question: username_0: ## Problem I need to solve It will improve the debugging experience. ## Solution I see Display the Role array along with the user/client.
Answers: username_1: @username_0 do you mean the Role resource or something else?
username_0: Yes, it's about the Role resource.
username_0: Just want to add a few details. If a User has several related Role resources, each policy is evaluated once for every Role. And right now __debug=policy shows:
```json
"policies": [
    {
      "eval-result": false,
      "errors": [
        {
          ...
        }
      ],
      "id": "patient",
      "policy-id": "patient"
    },
    {
      "eval-result": false,
      "errors": [
        {
          ...
        }
      ],
      "id": "patient",
      "policy-id": "patient"
    },
]
```
It's impossible to tell which Role resource each policy was evaluated for. My suggestion is to add `role` to each policy entry. Expected:
```json
"policies": [
    {
      "eval-result": false,
      "errors": [
        {
          ...
        }
      ],
      "id": "patient",
      "policy-id": "patient",
      "role": { "id": "role-1", ...role info }
    },
    {
      "eval-result": false,
      "errors": [
        {
          ...
        }
      ],
      "id": "patient",
      "policy-id": "patient",
      "role": { "id": "role-2", ...role info }
    },
]
```
NetzkyBot/test-repo
240809955
Title: Test issue 29 Question: username_0: Testing the errors!
```
Traceback (most recent call last):
  File "<pyshell#0>", line 1, in <module>
    print(potato)
NameError: name 'potato' is not defined
```
Status: Issue closed Answers: username_0: Thanks, bot! username_0: Testing the errors!
```
Traceback (most recent call last):
  File "<pyshell#0>", line 1, in <module>
    print(potato)
NameError: name 'potato' is not defined
```
username_0: I'm reopening it so that it is more visible.
giampaolo/psutil
34182836
Title: Support GNU/Hurd Question: username_0: _From [sandro.tosi](https://code.google.com/u/sandro.tosi/) on June 08, 2012 00:06:35_ ``` Hello, i'm reporting here the Debian bug: http://bugs.debian.org/676450 >>> Currently python-psutil fails to build on Debian GNU/Hurd. (Doesn't recognize gnu system). Attached is a patch for building on Hurd. <<< Cheers, Sandro ``` **Attachment:** [pypsutil-hurd.diff](http://code.google.com/p/psutil/issues/detail?id=276) _Original issue: http://code.google.com/p/psutil/issues/detail?id=276_ Answers: username_0: Closing out as outdated. Status: Issue closed username_1: hey @username_0 this still happens with 5.4.1: https://buildd.debian.org/status/fetch.php?pkg=python-psutil&arch=hurd-i386&ver=5.4.1-1&stamp=1511665962&raw=0 ``` for python in python2.7 python3.6; do \ $python setup.py build; \ $python-dbg setup.py build; \ done platform gnu0 is not supported /<<PKGBUILDDIR>>/debian/rules:10: recipe for target 'override_dh_auto_build' failed make[1]: *** [override_dh_auto_build] Error 1 make[1]: Leaving directory '/<<PKGBUILDDIR>>' /<<PKGBUILDDIR>>/debian/rules:7: recipe for target 'build-arch' failed make: *** [build-arch] Error 2 dpkg-buildpackage: error: debian/rules build-arch subprocess returned exit status 2 ``` should this issue be re-opened? username_2: Well, yes this is still happening :) and blocking the availability of a hundred packages in Debian. Hurd's /proc implementation is mostly following Linux' shape, so probably much of the Linux code can be just called as such. username_0: What happens if you fix setup.py and run tests?
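For context on what the fix involves: psutil's setup.py decides which platform sources to build from `sys.platform`, and GNU/Hurd shows up there as `gnu0` (visible in the Debian build log above), which falls through to the unsupported branch. Below is a rough sketch of the kind of switch the Hurd patch adds; the helper name is made up and this is not psutil's actual code.

```python
# Illustrative only -- not psutil's actual setup.py. It shows the shape of the
# platform switch being discussed: treat a "gnu*" platform (GNU/Hurd) like
# Linux, since Hurd's /proc mostly follows the Linux layout.
import sys


def platform_family(platform=sys.platform):
    if platform.startswith("linux"):
        return "linux"
    if platform.startswith("gnu"):   # GNU/Hurd reports "gnu0" in the log above
        return "hurd"                # candidate: build against the Linux sources
    if platform.startswith(("freebsd", "openbsd", "netbsd")):
        return "bsd"
    raise SystemExit("platform %s is not supported" % platform)


print(platform_family("gnu0"))   # -> "hurd"
print(platform_family("linux"))  # -> "linux"
```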
openssl/openssl
782234081
Title: AUTHORS is out of date Question: username_0: Many project members are missing. Answers: username_1: Has this list been maintained manually, or is there already some script to update the list from, say, the git log? username_2: @username_1, it's maintained manually username_1: So what exactly are the criteria for being mentioned in the AUTHORS file? There does not seem to be an official policy for it. I think we should discuss that in our next OTC meeting. For the moment, I believe that at least all (former and current) Committers should be in the list. I added the missing ones in #14029. username_0: Historically that had been the policy. Status: Issue closed
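On the question of generating the list from the git log: here is a sketch of what such a helper could look like, using `git shortlog` to summarize authors. This is only an illustration of the idea raised above; as the thread notes, the real AUTHORS file is maintained by hand.

```python
# Sketch: summarize contributors from the git history with `git shortlog`.
# Illustrative only; the AUTHORS file itself is maintained manually.
import subprocess


def authors_from_git(repo_path="."):
    # `git shortlog -sne HEAD` prints one "<count>\t<name> <email>" line per author.
    output = subprocess.run(
        ["git", "-C", repo_path, "shortlog", "-sne", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [line.split("\t", 1)[1] for line in output.splitlines() if "\t" in line]


if __name__ == "__main__":
    for author in authors_from_git():
        print(author)
```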
coopcycle/coopcycle-app
882126628
Title: When the customer makes a delivery request for a package or a parcel, we have the option of sending them a payment link by email. It does not work. Question: username_0: When the customer makes a delivery request for a package or a parcel, we have the option of sending them a payment link by email. It does not work. ![payment link bug](https://user-images.githubusercontent.com/83431363/117568680-fcb79b00-b0c1-11eb-89b4-7942e99f90ee.png)
fhamborg/Giveme5W1H
806620257
Title: Having trouble running examples/possible geopy issue Question: username_0: I am getting the following error message when running parse_single_from_code.py: Traceback (most recent call last): File "/Users/annaablove/Desktop/py_test/who.py", line 53, in <module> extractor = MasterExtractor() File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/Giveme5W1H/extractor/extractor.py", line 60, in __init__ environment_extractor.EnvironmentExtractor(), File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/Giveme5W1H/extractor/extractors/environment_extractor.py", line 55, in __init__ self.geocoder = Nominatim(domain=host, timeout=8) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/geocoders/nominatim.py", line 103, in __init__ Nominatim(user_agent="my-application") File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/geocoders/nominatim.py", line 103, in __init__ Nominatim(user_agent="my-application") File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/geocoders/nominatim.py", line 103, in __init__ Nominatim(user_agent="my-application") [Previous line repeated 487 more times] File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/geocoders/nominatim.py", line 92, in __init__ super().__init__( File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/geocoders/base.py", line 242, in __init__ self.adapter = adapter_factory( File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/geopy/adapters.py", line 335, in __init__ self.session = requests.Session() File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/requests/sessions.py", line 367, in __init__ self.headers = default_headers() File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/requests/utils.py", line 821, in default_headers return CaseInsensitiveDict({ File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/site-packages/requests/structures.py", line 46, in __init__ self.update(data, **kwargs) File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/_collections_abc.py", line 854, in update if isinstance(other, Mapping): File "/Library/Frameworks/Python.framework/Versions/3.9/lib/python3.9/abc.py", line 98, in __instancecheck__ return _abc_instancecheck(cls, instance) RecursionError: maximum recursion depth exceeded in comparison I think it has something to do with Nominatim.py, but I am inexperienced with geopy and am having trouble isolating the problem. Any help would be appreciated, thanks! **Versions (please complete the following information):** - OS: MacOS 10.15.6 - Python Version 3.9 Answers: username_1: TL;DR you can probably do ``` pip3 uninstall geopy pip3 install geopy==1.13.0 ``` For repo authors, I think `setup.py` needs to be updated - I'm seeing that `setup.py` will install `geopy>=1.11.0`, which gives me `2.1.0`. However, it seems like geopy versions > 2.0 introduces breaking changes. I saw that in `requirements.txt` that `geopy==1.13.0` is specified. I think `setup.py` needs to be updated accordingly.
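A small guard that makes the pin above explicit can save the next person the recursion trace: check the installed geopy version before importing Giveme5W1H and fail with a pointer to the known-good release. This is only a sketch (Python 3.8+ for importlib.metadata); the 1.13.0 pin is the one from the answer above, and the import path mirrors the traceback.

```python
# Fail early with a clear message if an incompatible geopy is installed.
# Requires Python 3.8+ for importlib.metadata; the 1.13.0 pin comes from the
# answer above (geopy 2.x introduced breaking changes).
from importlib.metadata import version

geopy_version = version("geopy")
if not geopy_version.startswith("1."):
    raise SystemExit(
        "Giveme5W1H expects geopy 1.x (e.g. 1.13.0), found %s; "
        "run: pip3 install 'geopy==1.13.0'" % geopy_version
    )

from Giveme5W1H.extractor.extractor import MasterExtractor  # noqa: E402

extractor = MasterExtractor()
```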
TabakoffLab/PhenogenCloud
549759399
Title: Uncaught TypeError: Cannot read property 'getAttribute' of undefined Question: username_0: View details in Rollbar: [https://rollbar.com/username_0/PhenogenCloud/items/411/](https://rollbar.com/username_0/PhenogenCloud/items/411/) ``` TypeError: Cannot read property 'getAttribute' of undefined File "https://cdnjs.cloudflare.com/ajax/libs/rollbar.js/2.8.1/rollbar.min.js", line 1, in [anonymous] ```<issue_closed> Status: Issue closed
rstudio/gt
382053890
Title: For all `fmt*()` fcns, have option to include/exclude processing of NA rows Question: username_0: There should be better `NA`-handling options for all the `fmt*()` fcns. Currently, this example: ```r sza %>% filter(latitude == 20) %>% filter(!is.na(sza)) %>% spread(key = "tst", value = sza) %>% gt(rowname_col = "month") %>% fmt_number(columns = TRUE, decimals = 2) %>% fmt_missing(columns = TRUE, missing_text = "") ``` will not replace the `NA` values because the `fmt_number()` will include all rows in the stored function (regardless of `NA` or not) and so `fmt_missing()` will have no rows to format.<issue_closed> Status: Issue closed
AllTheMods/ATM3-Expert
523258342
Title: Conjuration Catalyst halves osmium ingots instead of doubling Question: username_0: ![Screen Shot 2019-11-15 at 12 10 02 AM](https://user-images.githubusercontent.com/52458743/68918503-63156480-073c-11ea-8284-bca5f6acc95f.png) It should be doubling the osmium, but instead it wastes mana and halves the ingot<issue_closed> Status: Issue closed
flutter/flutter
615335876
Title: Draggables should have onDragUpdate feedback returning currrent drag offset Question: username_0: ## Use case Currently `Draggable` and `LongPressDraggable` widgets can provide `onDragStarted`, `onDragEnd`, `onDragComplete` and `onDragCancel` callbacks, which can trigger some actions on start or end of drag movement. Those are very great, but there's no option to track current `feedback`'s offset during drag, before the release of finger. ## Proposal `Draggable` and `LongPressDraggable` widgets should provide the ability to set `onDragUpdate` callback, which will receive `details` with `globalPosition` and `delta` included -- the same as `GestureDetector`'s `onPanUpdate`. Desired behavior: ``` LongPressDraggable( feedback: Icon(Icons.add), onDragEnd: (details) => print(details.offset.dx), onDragUpdate: (details) => print(details.offset.dy) // <-- this is impossible now ); ``` <!-- Briefly but precisely describe what you would like Flutter to be able to do. Consider attaching images showing what you are imagining. Does this have to be provided by Flutter directly, or can it be provided by a package on pub.dev/flutter? If so, maybe consider implementing and publishing such a package rather than filing a bug. --> Answers: username_1: duplicate of [#39490](https://github.com/flutter/flutter/issues/39490) Status: Issue closed
ziglang/zig
728364955
Title: Proposal: Introduce safety-checked UB for reinterpreting `invalid` bit patterns via `@bitCast`, pointers, and untagged unions Question: username_0: ### Status-quo The Zig type system allows for types that don't cleanly fit into bytes / registers. These have `invalid` bit patterns, which do not map to a value of the specified type. For example: - `*T` cannot be `0`, the bit pattern that `?*T` uses to represent `null`. - `enum {a, b, c}` needs more than 1 bit but less than 2 bits to be represented. The only way to produce invalid values of these types is via `reinterpretation operations`, which include: - accessing an inactive non-tagged-union field (without runtime safety) - `@bitCast`, `@ptrCast`, `@intToPtr` - dereferencing a pointer-to-non-exhaustively-valued-type (i.e. to a type which has `invalid` values), f.e. to an address that was initialized as a different type, or that was not yet initialized. As far as I could find, the documentation doesn't state how `invalid` values are handled in the general case. Instead every checked, value-preserving cast is individually documented to trigger safety-checked undefined behaviour if the input value is out-of-bounds for the target type. ### This proposal Formally specify that constructing an invalid value is _always immediately_ illegal (safety-checked undefined) behaviour. I wrote [a couple of example scenarios](https://gist.github.com/username_0/a5e28f6518fc75780493d48f4a4d959d). These conditions are currently not checked (I can create individual issues if the general idea is accepted): - [ ] Accessing an inactive field of an untagged union can create null values. - [ ] Dereferencing a pointer-to-non-exhaustively-valued-type can read an invalid value of that type. - [ ] `@bitCast`-ing to a `packed struct/union` with fields of non-exhaustively-valued types can lead to those field's values being invalid. (Essentially every potentially-new read of a `packed struct/union` field of such a type needs to be checked.) As a special case, stage1 currently always loads all integer types by exactly their bit width (masking all other bits off), even when they have byte alignment (i.e. outside of `packed` types) and even when runtime safety is disabled. This ensures `valid` values by hiding the actual invalid bit pattern, potentially hiding semantic errors. I feel like this is also the wrong default, and worth reconsidering. Answers: username_1: What are the reasons for not "doing something radical" and rejecting all of this at compile time (including untagged unions with sparse fields unless they share the same legal bit patterns)? I suppose I am missing some obvious C interop reasons... username_0: @username_1 I'm not personally opposed to these "radical" changes/limitations either. I wanted to clarify that I wouldn't blindly set them in stone up front though, since - they severely limit some forms of composition. This means if someone (besides me) has a use case that benefits from them, the program logic of their new workaround might get unreasonably complex, which can more easily lead to (or hide) mistakes. - I'm pretty most "solutions" only reduce some error surfaces, while not solving the issue. Maybe they're still valuable enough, but harder to justify, and might require assessment of the remaining risk even if implemented. - As an example, say `const E = enum(u8){a, b}`. 
If we outlaw `@ptrCast` from `*E` to `*u8` because `u8` has a valid bit pattern that is invalid for `E`, this only secures us against misuse for variables statically allocated as `E`. As soon as you allocate memory dynamically, you're originally dealing with addresses / pointers to bytes, which can by nature represent all bit patterns. So if our source is `*u8`, an error there can still write any bit pattern it wants, or give out the same address as different pointer types, etc. username_1: @username_0 , I had not thought about heap allocation... Looking at `mem.zig` and `Allocator.zig`: `Allocator.create(...)` is defined to return a pointer to undefined memory (line 300 https://github.com/ziglang/zig/blob/master/lib/std/mem/Allocator.zig). Using this before defining it is UB (though not tracked at compile-time). What if `@ptrCast(SparseType, valueOfLessSparseType)` somehow were defined to return a pointer to `undefined` memory _and_ :) the compiler ensured that `undefined`s are always defined before use? As soon as you cast a non-sparse to a sparse, you are really overpromising, so if such a cast were considered to `undefine` the cast memory (only as a compile-time flag, no runtime effect), that promise would go away. If the only use-cases for this were such that a compile-time-enforced define after such a cast were acceptable... MMIO (and optimal deserialization etc) will have to use non-sparse types (I suspect `extern struct` is enforcing this already, but `packed structs` are not, they accept sparse types as members and can take part in ptrCasts). username_0: Just to be clear, this proposal is (intentionally) _not_ about `undefined`, but the concept of `invalid` values from other sources. `undefined` could also be considered such a source, but currently has many implications beyond that. I'd like not to sidetrack too much, but to quickly address your ideas @username_1 : - `undefined` in general is a compile-time optimization concept we're currently adopting from LLVM. It's only "tracked" at compile-time and the compiler basically elides computations it can prove aren't deterministic anyway (because one of the inputs is undefined). As such it's similar to optimizing based on "undefined behaviour", but doesn't always imply it; #1947 gives a bit more details. - > What if [casting to a sparse type always] returned a pointer to undefined memory _and_ :) the compiler ensured that [it needs to be initialized] before use? That's an interesting idea. In order to check the validity of values, you already need to check with a `non-sparse` (`dense`) type.\ So in the usual scenario, instead of being able to go `I know I wrote a valid value as a u8` -> `cast the pointer` -> `read it as my enum`,\ you're instead required to always `check if the value is valid (whether you already know or not) and save the value` -> `cast the pointer` -> `write back the value through the casted pointer` -> `able to read it as my enum`. The only issue I really see with this approach are the ergonomics of `undefined` used for this job. Currently we put it into the LLVM IR and send that off to optimizations, without any feedback on how far LLVM analyzed/optimized it. This means if you misuse `undefined` even locally, and Zig's analysis doesn't catch you doing it, LLVM might optimize away some of your code without proper warning. So if we "solve" this use-case by referring to `undefined`, that means a normal `@ptrCast` that a developer forgot to validate may result in broken code if LLVM spotted it and Zig didn't. 
I'd really appreciate more feedback from more knowledgeable developers than us, but if this is deemed the right direction, then a type distinction `* undefined T` (as already proposed by @username_1 in https://github.com/ziglang/zig/issues/3328#issuecomment-715902461 ) might be the right avenue. But I think that's best discussed in its own issue, since `@ptrCast` is only the fourth bullet point of this more general proposal. username_2: Introducing a second concept of `invalid` seems interesting. I had not realized what LLVM does with `undefined`. At least to me, that makes use of `undefined` fairly undesirable since you do not really know what the LLVM machinery is going to do with it :frowning_face:. It sounds like `undefined` is closer to a synonym for unused. Would `invalid` actually need to have a keyword? Perhaps the compiler could assign every variable `invalid` status unless proven otherwise by a valid assignment/initialization? The challenge then becomes determining what a valid assignment/init is. Valid: - assignment from a literal since you can check that the literal is valid at compile time. - assignment from a function that returns that type (assuming the function itself makes a valid instance internally). - assignment from another valid variable (or part thereof such as an array/slice element). Possibly invalid: - anything involving a pointer cast. Unfortunately, I do not see how you will get valid data from an allocator. Maybe you also pass in an initialization function to the allocator? This would take an OO constructor and turn it inside out. Sorry for the incoherent rambling. Clearly I have not had enough caffeine yet. It does seems like this concept of valid vs. invalid/unproven valid is possibly very powerful and probably different from `undefined`.
firebase/quickstart-ios
232295087
Title: FCM notifications not working iOS 10.3 on Xcode 8.3 Swift 3 Question: username_0: I'm working on Xcode 8.3.2 for iOS 10.3.2 with Swift 3, my project use firebase cloud messaging, when my p12 certificates expired, I updated my certificates p12 to p8 as suggested Firebase's documentation, but the push notifications stopped coming, yesterday when I used the console firebase to test, it was working but today no, the logs print me this as normal: ``` 2017-05-30 10:13:23.932066-0400 lol[5576:1530669] WARNING: Firebase Analytics App Delegate Proxy is disabled. To log deep link campaigns manually, call the methods in FIRAnalytics+AppDelegate.h. 2017-05-30 10:13:23.949512-0400 lol[5576:1530669] Firebase automatic screen reporting is enabled. Call +[FIRAnalytics setScreenName:setScreenClass:] to set the screen name or override the default screen class name. To disable automatic screen reporting, set the flag FirebaseAutomaticScreenReportingEnabled to NO in the Info.plist 2017-05-30 10:13:24.368364-0400 lol[5576:1530669] [Crashlytics] Version 3.8.2 (118) 2017-05-30 10:13:24.397942-0400 lol[5576:1530669] [MC] System group container for systemgroup.com.apple.configurationprofiles path is /private/var/containers/Shared/SystemGroup/systemgroup.com.apple.configurationprofiles 2017-05-30 10:13:24.398433-0400 lol[5576:1530669] [MC] Reading from public effective user settings. initializeFCM Notification access accepted. 2017-05-30 10:13:24.679: <FIRInstanceID/WARNING> Failed to fetch APNS token Error Domain=com.firebase.iid Code=1001 "(null)" 2017-05-30 10:13:24.681: <FIRMessaging/INFO> FIRMessaging library version 1.2.0 2017-05-30 10:13:24.683213-0400 lol[5576:1530706] [Firebase/Crash][I-CRA000004] Successfully initialized 2017-05-30 10:13:24.683 lol[5576] <Notice> [Firebase/Crash][I-CRA000004] Successfully initialized 2017-05-30 10:13:24.685110-0400 lol[5576:1530706] <FIRAnalytics/INFO> Firebase Analytics v.3600000 started 2017-05-30 10:13:24.685 lol[5576:] <FIRAnalytics/INFO> Firebase Analytics v.3600000 started 2017-05-30 10:13:24.685438-0400 lol[5576:1530706] <FIRAnalytics/INFO> To enable debug logging set the following application argument: -FIRAnalyticsDebugEnabled 2017-05-30 10:13:24.685 lol[5576:] <FIRAnalytics/INFO> To enable debug logging set the following application argument: -FIRAnalyticsDebugEnabled "GCM TOKEN = Optional(\"it's working: PLEASE HELP GITHUB\")" "didRegisterForRemoteNotificationsWithDeviceToken: DATA" "*** deviceToken: <66666666 it's working: PLEASE HELP GITHUB 99999999>" 2017-05-30 10:13:24.837: <FIRInstanceID/WARNING> APNS Environment in profile: development "Firebase Token:" Optional("it's working: PLEASE HELP GITHUB") 2017-05-30 10:13:24.932076-0400 lol[5576:1530727] <FIRAnalytics/INFO> Firebase Analytics enabled 2017-05-30 10:13:24.932 lol[5576:] <FIRAnalytics/INFO> Firebase Analytics enabled "Connected to FCM." ``` the new strange log is: `2017-05-30 10:13:24.679: <FIRInstanceID/WARNING> Failed to fetch APNS token Error Domain=com.firebase.iid Code=1001 "(null)"` my code is: ``` import UIKit import FirebaseAnalytics import FirebaseInstanceID import FirebaseMessaging import UserNotifications @UIApplicationMain class AppDelegate: UIResponder, UIApplicationDelegate, UNUserNotificationCenterDelegate { var window: UIWindow? func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool { // Override point for customization after application launch. 
self.initializeFCM(application) let token = FIRInstanceID.instanceID().token() debugPrint("GCM TOKEN = \(String(describing: token))") return true } func application(_ application: UIApplication, didFailToRegisterForRemoteNotificationsWithError error: Error) { debugPrint("didFailToRegisterForRemoteNotificationsWithError: \(error)") } func applicationReceivedRemoteMessage(_ remoteMessage: FIRMessagingRemoteMessage) { debugPrint("remoteMessage:\(remoteMessage.appData)") [Truncated] func applicationDidEnterBackground(_ application: UIApplication) { // Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later. // If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits. } func applicationWillEnterForeground(_ application: UIApplication) { // Called as part of the transition from the background to the active state; here you can undo many of the changes made on entering the background. } func applicationDidBecomeActive(_ application: UIApplication) { // Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface. connectToFcm() } func applicationWillTerminate(_ application: UIApplication) { // Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:. } } ``` Answers: username_1: Hi Dennis, a couple of things to try: #### In your `initializeFCM()` function please call `FIRApp.configure()` first. Otherwise Messaging hasn't had a chance to initialize properly. #### Please update to the latest iOS SDK, it is much more Swift 3 compliant There are several changes to improve the integration experience for you. For example: * You no longer need to decide whether or not the app is in its release version, we will try and auto-detect it. You can simply write: `Messaging.messaging().apnsToken = deviceToken` * You do not need to manually call `connectToFCM()` or disconnect when the app goes to the background. You can simply write: `Messaging.message().shouldEstablishDirectChannel = true` after `FirebaseApp.configure()`. Once you've fixed push notifications, you should also do the following, to ensure that your Analytics are working: ##### Make sure you're calling `FIRMessaging.messaging().appDidReceiveMessage(userInfo)` in two places. You should call this in `application:didReceiveRemoteNotification:fetchCompletionHandler:` and also `userNotificationCenter:willPresentNotification:withCompletionHandler:` username_0: thanks for answer username_1: I updated my code on my AppDelegate, but when I updated the pod Firebase, my pod file is: ``` # Uncomment the next line to define a global platform for your project platform :ios, '9.0' target 'lol' do # Comment the next line if you're not using Swift and don't want to use dynamic frameworks use_frameworks! 
# Pods for lol pod 'Firebase' pod 'Firebase/Core' pod 'Firebase/Messaging' end ``` but now I have a problem with the files frameworks: **Showing Recent Messages** ``` ld: framework not found GoogleToolboxForMac ld: framework not found Protobuf ld: warning: directory not found for option '-F/Users/dmostajo/Library/Developer/Xcode/DerivedData/lol-bftvvxkmmxtmrwerxvfxlekfixlq/Build/Products/Debug-iphoneos/GoogleToolboxForMac' ld: warning: directory not found for option '-F/Users/dmostajo/Library/Developer/Xcode/DerivedData/lol-bftvvxkmmxtmrwerxvfxlekfixlq/Build/Products/Debug-iphoneos/Protobuf' ``` I'm still working on my Workspace, any help? username_0: Or how can I get to install Version 4.0.0 manually, in the page of firebase is only to version 3.12.0? username_0: Ok I did updated firebase to version 4.0, but now I get these messages, how should I implement them or fix them? ``` 2017-06-01 16:42:12.236196-0400 lol[8235:2204361] [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented 2017-06-01 16:42:12.236 lol[8235] <Warning> [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented ``` please any help? thanks, s lot!! username_1: You need to add another delegate method, in addition to `messaging:didReceiveRemoteMessage:`. [See the open-source Github sample for Messaging](https://github.com/firebase/firebase-ios-sdk/blob/master/Example/Messaging/App/AppDelegate.swift#L118-L126) username_0: thanks again username_1: I implemented as you recommend: ``` func messaging(_ messaging: Messaging, didReceive remoteMessage: MessagingRemoteMessage) { debugPrint("--->didReceive Remote Message") guard let data = try? JSONSerialization.data(withJSONObject: remoteMessage.appData, options: .prettyPrinted), let prettyPrinted = String(data: data, encoding: .utf8) else { return } print("Received direct channel message:\n\(prettyPrinted)") } ``` but still print me this logs on foreground and background way: ``` 2017-06-01 17:40:12.897916-0400 lol[8275:2217196] [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented 2017-06-01 17:40:12.898 lol[8275] <Warning> [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented "###> 1.2 AppDelegate DidEnterBackground" 2017-06-01 17:40:29.943006-0400 lol[8275:2217037] Could not signal service com.apple.WebKit.WebContent: 113: Could not find specified service 2017-06-01 17:40:29.944689-0400 lol[8275:2217037] Could not signal service com.apple.WebKit.Networking: 113: Could not find specified service 2017-06-01 17:40:30.000428-0400 lol[8275:2217203] dnssd_clientstub read_all(27) DEFUNCT "###> 1.3 AppDelegate DidBecomeActive" 2017-06-01 17:40:30.760941-0400 lol[8275:2217443] [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented 2017-06-01 17:40:30.761 lol[8275] <Warning> [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented ``` Anything else I'm missing? username_2: Looks like you've forgotten to set the FIRMessaging delegate. 
username_0: with the Delegate command I have this issues: ![captura de pantalla 2017-06-01 a la s 6 46 11 p m](https://cloud.githubusercontent.com/assets/13056866/26704040/bbb6c624-46fa-11e7-9d73-77aa0a9c01f5.png) ![captura de pantalla 2017-06-01 a la s 6 44 43 p m](https://cloud.githubusercontent.com/assets/13056866/26704039/bbb4c46e-46fa-11e7-8a1b-c6ca6a6056c9.png) ![captura de pantalla 2017-06-01 a la s 6 43 55 p m](https://cloud.githubusercontent.com/assets/13056866/26704038/bbb4ad80-46fa-11e7-9f82-0a6b5b0bd408.png) username_2: You should be explicitly declaring your type conformance to the delegate protocol. ```swift class AppDelegate: FIRMessagingDelegate /* ... */ { /* ... */ } ``` username_0: thanks again username_2 but I'm still this issues: ![captura de pantalla 2017-06-02 a la s 9 54 23 a m](https://cloud.githubusercontent.com/assets/13056866/26728881/b7de6e06-4779-11e7-8a01-3ee7a5dd0ae8.png) username_0: even if I tried this too ![captura de pantalla 2017-06-02 a la s 10 11 34 a m](https://cloud.githubusercontent.com/assets/13056866/26729573/07e29150-477c-11e7-99bd-201220693e64.png) username_2: You'll need to then implement all the required delegate methods. Take a look at the [MessagingDelegate documentation](https://firebase.google.com/docs/reference/ios/firebasemessaging/api/reference/Protocols/FIRMessagingDelegate). username_0: Thanks again username_2: I was able to resolve the delegate issues, I updated all my code ``` import UIKit import FirebaseAnalytics import FirebaseDatabase import FirebaseMessaging import FirebaseInstanceID import UserNotifications @UIApplicationMain class AppDelegate: UIResponder, UIApplicationDelegate, UNUserNotificationCenterDelegate, MessagingDelegate { /// This method will be called whenever FCM receives a new, default FCM token for your /// Firebase project's Sender ID. /// You can send this token to your application server to send notifications to this device. func messaging(_ messaging: Messaging, didRefreshRegistrationToken fcmToken: String) { debugPrint("--->messaging:\(messaging)") debugPrint("--->didRefreshRegistrationToken:\(fcmToken)") } @available(iOS 10.0, *) public func messaging(_ messaging: Messaging, didReceive remoteMessage: MessagingRemoteMessage) { debugPrint("--->messaging:\(messaging)") debugPrint("--->didReceive Remote Message:\(remoteMessage.appData)") guard let data = try? JSONSerialization.data(withJSONObject: remoteMessage.appData, options: .prettyPrinted), let prettyPrinted = String(data: data, encoding: .utf8) else { return } print("Received direct channel message:\n\(prettyPrinted)") } var window: UIWindow? func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) 
-> Bool { debugPrint("###> 1 AppDelegate DidFinishLaunchingWithOptions") self.initializeFCM(application) let token = InstanceID.instanceID().token() debugPrint("GCM TOKEN = \(String(describing: token))") return true } func applicationDidEnterBackground(_ application: UIApplication) { //FIRMessaging.messaging().disconnect() debugPrint("###> 1.2 AppDelegate DidEnterBackground") // self.doServiceTry() } func applicationDidBecomeActive(_ application: UIApplication) { // connectToFcm() application.applicationIconBadgeNumber = 0 debugPrint("###> 1.3 AppDelegate DidBecomeActive") [Truncated] DatabaseHelper.cleanup() } //------------------------------------------------------------------ } ``` but still showing the logs when I test from console: ``` 017-06-01 17:40:12.897916-0400 lol[8275:2217196] [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented 2017-06-01 17:40:12.898 lol[8275] <Warning> [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented "###> 1.2 AppDelegate DidEnterBackground" 2017-06-01 17:40:29.943006-0400 lol[8275:2217037] Could not signal service com.apple.WebKit.WebContent: 113: Could not find specified service 2017-06-01 17:40:29.944689-0400 lol[8275:2217037] Could not signal service com.apple.WebKit.Networking: 113: Could not find specified service 2017-06-01 17:40:30.000428-0400 lol[8275:2217203] dnssd_clientstub read_all(27) DEFUNCT "###> 1.3 AppDelegate DidBecomeActive" 2017-06-01 17:40:30.760941-0400 lol[8275:2217443] [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented 2017-06-01 17:40:30.761 lol[8275] <Warning> [Firebase/Messaging][I-FCM002019] FIRMessaging received data-message, but FIRMessagingDelegate's-messaging:didReceiveMessage: not implemented ``` I don't know why is not recognizing the methods, what else can be missing? username_2: Looks like you've forgotten to set the Messaging delegate. username_0: You're right !, many thanks! it worked! T_T ``` FirebaseApp.configure() Messaging.messaging().delegate = self Messaging.messaging().shouldEstablishDirectChannel = true ``` thank you so much dude!! username_2: You're welcome!! Status: Issue closed username_0: One last question to close the issue, if I want to show the animation bar when I get the notifications in the background, how is the new way to implement for iOS 10+? username_2: If by animation bar you mean notification banner, those should show up automatically assuming your app is correctly set up and has been granted the appropriate permission. Bear in mind that APNs delivers notifications on its own schedule and you may not necessarily see your notification appear right after push, especially if sending a large amount of notifications in a brief period. username_3: hi, i was facing this problem too, in IRAN, thanks to all google... **we are blocked** ... i was facing this problem about 5 days and try every tutorial in net... but nothing solve my problem, Domain=com.firebase.iid not responds and it doesnt give me a token... but when i connect to a **VPN** then connection between device and iid.firebase.com domain established and token generated, **try connecting to VPN and check again**.
jenkinsci/github-checks-plugin
792054250
Title: Skip status checks on Pipeline Libraries Question: username_0: Hello, thanks for this really useful plugin! It appears it is not currently possible to skip status checks on Pipeline Libraries; is anyone else having a similar issue? On a multibranch pipeline job or a folder, try to add a pipeline library with a GitHub repository: one cannot add "Status Checks Properties" in its Behavior section. Unfortunately, the library repository's commits are getting hit by every check. Jenkins version: 2.263.2 GitHub Checks plugin version: 1.0.9 Answers: username_1: Yes, the pipeline library is currently a problem. Maybe we should directly suppress the checks on the pipeline library instead of adding a behavior? 🧐 username_2: Any update on this? We're using two shared libraries per build and the in-progress publishing is triggered three times accordingly.
AASJournals/AASTeX60
1013632335
Title: Space underneath double horizontal lines in deluxetable Question: username_0: double horizontal lines. Made custom negative skip based on each style, tested, and supplied tabsamples.zip showing tableheads in each style. mentions "custom negative skip", but the skip seems to be fixed (not custom).
babel/babel
244114475
Title: Can't export a top-level async function Question: username_0: ### Bug Report (maybe) <!--- Provide a general summary of the issue in the title above --> ### Input Code <!--- If you're describing a bug, please let us know which sample code reproduces your problem --> <!--- If you have link to our REPL or a standalone repo please link that! --> ```js // src/file.js 'use strict'; const regeneratorRuntime = require('regenerator-runtime'); async function demo () { // pass } module.exports = demo; ``` ### Babel Configuration (.babelrc, package.json, cli command) <!--- If describing a bug, tell us what your babel configuration looks like --> ```js { "presets": [ "env" ], "plugins": [ "transform-regenerator" ] } ``` ### Expected Behavior Transpiled file should have the `regeneratorRuntime` and `_asyncToGenerator` variables declared and assigned before that of the exported function, which uses them. ### Current Behavior Declarations and assignments appear after declaration/assignment of exported function, leading to an error: `TypeError: Cannot read property 'mark' of undefined` when attempting to access the not-yet-declared `regeneratorRuntime` variable. ### Possible Solution Wrapping the async function in an IIFE produces the expected behavior. This leads me to believe it's a bug somewhere in the parsing procedure. ### Context Wrapping the function in an IIFE (`module.exports = (() => async function () {})()`) will work, but it seems a little hacky, and I'm just looking to learn if I'm doing something wrong or if in fact this is a bug in the transpilation. #### Reproducing You can clone a minimal repo I created to demonstrate the error ([link](https://github.com/username_0/babel-async-bug)). There are two source files using each strategy, with the hope that you can reproduce the error. ### Your Environment | software | version(s) | ---------------- | ------- | Babel | 6.24.1 | node | 7.2.0 | npm | 3.10.9 | Operating System | Mac 10.2.5 Answers: username_1: I think this is just that the function gets hoisted before the require to regeneratorRuntime (out of order) - you can solve this by requiring the runtime/polyfill before this file is loaded (a separate entry file). username_2: Why is this hoisted at all? It's in the same function scope. username_3: It's a function declaration, so we always hoist it. We could potentially do static analysis to see if it can skip hoisting, but we don't have that at the moment. username_1: Other wise this is similar to https://github.com/babel/babel/issues/5977 and https://github.com/babel/babel/issues/5085#issuecomment-277544677 where logan recommended ``` import 'babel-polyfill'; import './app'; ```` username_0: Thanks, folks. This leads me to another case where this should cause a problem: ```js // someFile.js 'use strict'; let someClosure = 0; function someEnclosingFn () { return someClosure++; } module.exports = somEnclosingFn; ``` This seems like a pretty reasonable thing to do (Node's util.debuglog does it), but the exported function would be hoisted above the closure it relies on, right? If this is is something that you all might want to pursue, I'd be happy to help in any way. username_3: Since that isn't async, we won't touch it, but if it was, that should still work, since the order of the declarations don't make a difference. Status: Issue closed username_0: D'oh, of course. No one will use it until after export. Thanks. :)
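A minimal sketch of the two workarounds discussed above, assuming the same `src/file.js` and `.babelrc` from the report (the `entry.js` file name is made up): either load the runtime from a separate entry file that runs before this module and rely on the global it installs, or export a function expression, which is not hoisted and therefore only touches `regeneratorRuntime` after the `require` has executed.

```js
// Option 1 (hypothetical entry.js): install the global runtime before loading the module.
// file.js can then drop its local `const regeneratorRuntime = ...` and use the global instead.
require('regenerator-runtime/runtime');
module.exports = require('./file');
```

```js
// Option 2 (src/file.js): a function expression is not hoisted, so the transpiled
// regeneratorRuntime.mark(...) call only runs after the require below has executed.
'use strict';

const regeneratorRuntime = require('regenerator-runtime');

const demo = async function () {
  // pass
};

module.exports = demo;
```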
jfc3/atehere
761763957
Title: Add <NAME> to the CHI JSON File Question: username_0: Need to add <NAME> to the CHI JSON file. <NAME> From Chicago Etaer website about <NAME>, '<NAME>, the Mexico City-style taqueria from chefs Sotero Gallegos (Le Sardine), <NAME> (Bar Lupo, Nomi), and <NAME> of Le Bouchon, debuted Friday on Western Avenue to a long line of excited patrons, according to an Instagram post. The trio feature seven tacos, including crispy pig head carnitas (“Mexican giardiniera,” arbol salsa), vegetarian al pastor (marinated portobello, celery root, pineapple), and costilla (short rib, pea tendrils, hazelnuts). They’re also offering a Que Rico Chingón Special — al pastor meat available by the half or full pound — plus pozole, ceviche, and quesadillas.' Titus Pullo recommends the Morcilla (Blood Sausage, Pickled Apple, Red Onion, Salsa Macha), Al Pastor, Green Chorizo, and Carne Asada. Looking at the menu I would like to try the Pig Head Carnitas which is Crispy Pig Head, Mexican Giardiniera, and Arbol Salsa. Address: 2234 N Western Ave, Chicago, IL 60647 Phone: (773) 687-9408 URL - Closed Monday <NAME>lo Answers: username_0: Added <NAME>ón to the CHI JSON file. Status: Issue closed
webdriverio/webdriverio
490728880
Title: Node v12 install throws errors even though installation succeeds Question: username_0: Currently all installations using Node v12 and up with NPM throw errors because NPM tries to install `fibers_node_v8`, which is not supposed to be installed and is successfully ignored by Yarn. This can confuse people into thinking that the installation has failed when it actually succeeded. See also #3895 Answers: username_0: Fixed the issue by pushing and releasing: https://github.com/webdriverio/node-fibers/commit/d0054568ce50814a2312611a4c50773c7d5fce46 .. the fibers installation is going to be ignored for Node v10 and up so that no errors are printed anymore. Status: Issue closed
couchbaselabs/cbft
66641317
Title: cbft should parse docs to interface{} instead of bleve Question: username_0: Related to https://github.com/blevesearch/bleve/issues/186 This issue is to track the idea that cbft should do the json.Unmarshal() parsing and pass the resulting interface{} to bleve. This way cbft can error out earlier on some specific doc, rather than see a whole batch get an error. Answers: username_0: commit 38ee4ee Status: Issue closed
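A hedged sketch of that idea against bleve's public batch API (the `indexDocs` helper, its arguments, and the logging are illustrative, not actual cbft code): unmarshal each document first so a single bad document can be rejected on its own, and hand only already-parsed values to the batch.

```go
package indexing

import (
	"encoding/json"
	"log"

	"github.com/blevesearch/bleve"
)

// indexDocs parses each raw JSON doc itself before handing it to bleve,
// so one malformed doc is skipped early instead of failing the whole batch.
func indexDocs(index bleve.Index, docs map[string][]byte) error {
	batch := index.NewBatch()
	for id, raw := range docs {
		var parsed interface{}
		if err := json.Unmarshal(raw, &parsed); err != nil {
			log.Printf("skipping doc %q: %v", id, err) // per-doc error, reported early
			continue
		}
		if err := batch.Index(id, parsed); err != nil {
			return err
		}
	}
	return index.Batch(batch)
}
```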
OpenEmu/OpenEmu
123832467
Title: PSP audio now choppy when upgrading to 2.0 Question: username_0: I've been playing Metal Gear Solid: Portable Ops and Portable Ops Plus recently with no problems. After updating today there is now choppiness in the audio, and only for the PSP core. I'm on an E2015 base model 13" with a PS4 controller connected over Bluetooth. I wanted to downgrade back to 1.04 experimental, but now when downloading the cores I guess it gets your new ones and just crashes before launching any game for any console. I used AppDelete to try and clear everything out sans save states/game library (backed them up and then moved them back into the newly generated one), to no avail. That being said, I'm mostly raising the point that 2.0's PSP core is acting funky now; if there's a way I can go back to 1.04 experimental and the old core, let me know. Answers: username_0: I just noticed you've added real-time rewinding. Perhaps the computer is not powerful enough to do that constantly at the same time. Is there a way to disable that and test? Status: Issue closed username_1: This would be a bug in PPSSPP itself, not related to OpenEmu or rewind (PPSSPP doesn't have rewind enabled). I'd suggest just using an older standalone version of PPSSPP to work around the issue. username_1: #1036
hyb1996-guest/AutoJsIssueReport
285197150
Title: [163]java.lang.IllegalArgumentException: wrong number of arguments; expected 9, got 8 Question: username_0: Description: --- java.lang.IllegalArgumentException: wrong number of arguments; expected 9, got 8 at java.lang.reflect.Method.invokeNative(Native Method) at java.lang.reflect.Method.invoke(Method.java:525) at com.jecelyin.editor.v2.core.text.SpannableStringBuilder.getTextRunAdvances(SpannableStringBuilder.java:1440) at com.jecelyin.editor.v2.core.text.TextPaintCompat.getTextRunAdvances(TextPaintCompat.java:129) at com.jecelyin.editor.v2.core.text.TextLine.handleText(TextLine.java:783) at com.jecelyin.editor.v2.core.text.TextLine.handleRun(TextLine.java:995) at com.jecelyin.editor.v2.core.text.TextLine.measureRun(TextLine.java:439) at com.jecelyin.editor.v2.core.text.TextLine.measure(TextLine.java:318) at com.jecelyin.editor.v2.core.text.TextLine.metrics(TextLine.java:292) at com.jecelyin.editor.v2.core.text.Layout.getLineExtent(Layout.java:1017) at com.jecelyin.editor.v2.core.text.Layout.getLineMax(Layout.java:973) at com.jecelyin.editor.v2.core.text.Layout.getLineRight(Layout.java:950) at com.jecelyin.editor.v2.core.widget.TextView.bringPointIntoView(TextView.java:7072) at com.jecelyin.editor.v2.core.widget.TextView.onPreDraw(TextView.java:4996) at android.view.ViewTreeObserver.dispatchOnPreDraw(ViewTreeObserver.java:680) at android.view.ViewRootImpl.performTraversals(ViewRootImpl.java:2190) at android.view.ViewRootImpl.doTraversal(ViewRootImpl.java:1196) at android.view.ViewRootImpl$TraversalRunnable.run(ViewRootImpl.java:4971) at android.view.Choreographer$CallbackRecord.run(Choreographer.java:776) at android.view.Choreographer.doCallbacks(Choreographer.java:579) at android.view.Choreographer.doFrame(Choreographer.java:548) at android.view.Choreographer$FrameDisplayEventReceiver.run(Choreographer.java:762) at android.os.Handler.handleCallback(Handler.java:800) at android.os.Handler.dispatchMessage(Handler.java:100) at android.os.Looper.loop(Looper.java:194) at android.app.ActivityThread.main(ActivityThread.java:5423) at java.lang.reflect.Method.invokeNative(Native Method) at java.lang.reflect.Method.invoke(Method.java:525) at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:844) at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:611) at dalvik.system.NativeStart.main(Native Method) Device info: --- <table> <tr><td>App version</td><td>2.0.16 Beta2.1</td></tr> <tr><td>App version code</td><td>163</td></tr> <tr><td>Android build version</td><td>eng.znsj.1407903466</td></tr> <tr><td>Android release version</td><td>4.2.2</td></tr> <tr><td>Android SDK version</td><td>17</td></tr> <tr><td>Android build ID</td><td>amigo2.0.10</td></tr> <tr><td>Device brand</td><td>GiONEE</td></tr> <tr><td>Device manufacturer</td><td>GiONEE</td></tr> <tr><td>Device name</td><td>GiONEE</td></tr> <tr><td>Device model</td><td>GN9000</td></tr> <tr><td>Device product name</td><td>GiONEE</td></tr> <tr><td>Device hardware name</td><td>mt6592</td></tr> <tr><td>ABIs</td><td>[armeabi-v7a, armeabi]</td></tr> <tr><td>ABIs (32bit)</td><td>null</td></tr> <tr><td>ABIs (64bit)</td><td>null</td></tr> </table>
ruthmoog/portfolio
970314537
Title: Add alternative navigation (site map or search) Question: username_0: Missing sitemap or search feature Multiple ways should be provided for users to locate information on your website. In order to meet this requirement each page should include a Search box or a link labeled "Search" that takes the user to a search feature. Another way to meet this requirement is to include a link labeled "Site Map" or "Sitemap" that provides links to different sections of the site. [WCAG 2.1 (Level AA) - 2.4.5 ](https://www.w3.org/WAI/WCAG21/quickref/#multiple-ways) https://www.alumnionlineservices.com/accessibility/scanner/
backend-br/vagas
414705264
Title: [Curitiba, PR] Serverless Developer at ONLi Question: username_0: Hello! ONLi is a startup specialized in insurance solutions. We are looking for someone to join our team on the back-end side. It is fully serverless and entirely in Node.js. We need to create more services that do not necessarily have to be in Node.js. We are looking for a developer with only ONE requirement: - Mastery of one of these languages: Python, JavaScript, TypeScript, or Java Nice to have: - A project using Serverless (https://serverless.com/) - A project using Apex (http://apex.run/) - A project using Up (https://github.com/apex/up) - A project using Zappa (https://github.com/Miserlou/Zappa) - A project using Firebase - Mastery of Node.js - Open source contributions What we use on the back-end: - Node.js - AWS DynamoDB - Aurora Serverless Database - AWS Lambda - API Gateway - REST API and WebSockets What we use on the front-end: - WebSockets - Vue.js - Nuxt.js Benefits: - Flexible hours - Office located in the Água Verde area. Salary: 3.5k Employment type: PJ Contact: <EMAIL> Answers: username_1: Do you accept remote work? username_2: @username_0 do you accept remote work? username_0: @username_1 @username_2 At the moment we are only looking for people in an on-site model Status: Issue closed
BEEmod/BEE2-items
317942341
Title: Instant Jam Button Question: username_0: Weighted or Cube or Ball. Will have orange active skin, when jammed with a cube, ball, or player will become blue unactive skin, you can get off anything on the button, but it still jammed in. And sends output. Answers: username_1: I think i have seen a button like this. so, if i may, I will revise this. This button looks like a normal button. But its a blue (Best choose a different shade of blue than the Portal ON/OFF button.) button. No cubes can activate the button. But when you step on it. It works. When you step off of the button it stays lowered. Causing the output to stay active. This would be interesting if added. Due to what you can create with this button in mind. However this is just a suggestion. And probably better if i made a different issue for it (I might still. We'll see what happens) username_1: And you really aren't wrong. However a Button takes 7 entities (how? idk) and Buttons take 2 entities, an SR Latch takes 3 entities and, while im not one of them, Some people are REALLY stingy with their entity costs. It would also make things slightly easier for the Map Maker to create. Just place one item and connect it. Rather than 4 items and have to connect all of them to something else. Im not saying that its the best alternative to those things you've mentioned But im saying that it could help in some cases. Like a... wow ok.. Im gonna say something really lame but a lot of people make them. Companion Cube Stages. One Companion Cube. Whole map. perhaps the exit door is in the middle. and half way across the map is a button for the cube. And on the other side, theres another button. but no portal surfaces. That test is impossible. So solution? Portable surfaces wouldn't help. you'd still deactivate the door. So you could use this one-use button to solve this test with ease. Maybe add a portal surface so they don't have to walk back. Or open a shortcut. Either one works. And like i said Hugo. This doesn't have to be added. And i wont go into deep depression if it isn't. I just think it would be an interesting concept. username_2: Well, like I mentioned, seems oddly case-specific username_1: Yeah. To be honest. I don't think it would have a ton of cases where it would be used. But i brought it up due to Nikita bringing up something similar. But then again, Case-specific items aren't new to BEEMod, Many of the fizzlers can only be used in specific situations where their properties would work. But hey. Im not a developer. You guys know much more then i do about this. username_3: @TeamSpen210 - Do not forget to apply the label after closing an issue; since although it may sounds unbelievable, there are some people who still checks inside the "closed issues" tab in order to see which issues were solved/closed and their reasons. So, having them shorted with the proper label after closing will help people to understand the real cause of its closure. Do not apply aditional labels such as "enhancement", "new item" etc, when the issue is invalid/duplicate. The "invalid" or "duplicate" labels are just enough to clarify a reason for it.
xmoravekm/myappsample
386588520
Title: Material tracking Question: username_0: We need to add functionality for tracking material. The suppliers we work with have a GPS locator installed in their transport vehicles; we have access to it, so we can determine the exact position of the material, and employees can check where the material currently is.
ocornut/imgui
813832344
Title: ImGui::GetIO().KeyCtrl is true even when not pressing down key on Mac when you use "Cmd + Tab" to go to another window and then come back - `imgui_impl_opengl3.h` + `imgui_impl_osx.h` Question: username_0: **Full info** ``` Dear ImGui 1.80 WIP (17911) -------------------------------- sizeof(size_t): 8, sizeof(ImDrawIdx): 2, sizeof(ImDrawVert): 20 define: __cplusplus=201703 define: __APPLE__ define: __GNUC__=4 define: __clang_version__=12.0.0 (clang-1200.0.32.28) define: IMGUI_HAS_VIEWPORT define: IMGUI_HAS_DOCK -------------------------------- io.BackendPlatformName: imgui_impl_osx io.BackendRendererName: imgui_impl_opengl3 io.ConfigFlags: 0x00000040 DockingEnable io.ConfigViewportsNoDecoration io.ConfigMacOSXBehaviors io.ConfigInputTextCursorBlink io.ConfigWindowsResizeFromEdges io.ConfigWindowsMoveFromTitleBarOnly io.ConfigMemoryCompactTimer = 60.0 io.BackendFlags: 0x0000100A HasMouseCursors RendererHasVtxOffset RendererHasViewports -------------------------------- ``` **Version/Branch of Dear ImGui:** Version: 1.80 WIP - commit `388ca563dbb0fa683d2a8fbf1daaf2ab77f98a21` Branch: docking **Back-end/Renderer/Compiler/OS** Back-ends: `imgui_impl_opengl3.h` + `imgui_impl_osx.h` Compiler: `clang-1200.0.32.28` Operating System: `OS X` **My Issue/Question:** When on a mac, you can use "Cmd + Tab" to quickly jump to another window. When you do that and then come back, `ImGui::GetIO().KeyCtrl` will be true even if you're not pressing down the `Cmd` key. **Standalone, minimal, complete and verifiable example:** _(see https://github.com/username_1/imgui/issues/2261)_ ``` void cmdRepro() { ImGuiIO &io = ImGui::GetIO(); bool IsWindowAppearing = ImGui::IsWindowAppearing(); bool IsWindowHovered = ImGui::IsWindowHovered(); bool IsItemHovered = ImGui::IsItemHovered(); bool RightDown = ImGui::IsMouseDown(ImGuiMouseButton_Right); bool KeySuper = io.KeySuper; bool KeyCtrl = io.KeyCtrl; bool KeyAlt = io.KeyAlt; bool KeyShift = io.KeyShift; ImGui::Text("IsWindowAppearing: %d IsWindowHovered: %d IsItemHovered: %d RightDown: %d KeySuper: %d KeyCtrl: %d KeyAlt: %d KeyShift: %d", IsWindowAppearing, IsWindowHovered, IsItemHovered, RightDown, KeySuper, KeyCtrl, KeyAlt, KeyShift ); } ``` **Workaround:** You can manually reset it with: ``` // If you cmd + tab to another window then come back io.KeySuper will be true. Reset it. if (ImGui::IsWindowAppearing()) { ImGui::GetIO().KeySuper = false; } ``` Answers: username_1: I don't know what Cmd+Tab is mapped to. It is switching to other OS window (OSX ui) or other Dear ImGui window? (dear imgui "ctrl+tab" window). Given your ConfigFlags I am assuming keyboard nav is disabled so it is the earlier, but it is bizarre you suggest using `IsWindowAppearing()` which is testing for Dear ImGui windows appearing whereas the suggestion relate to OS window gaining/losing focus. I believe this is the same as #3532 but specifically for the OSX backend. username_0: Good question. It's switching to other OS window (OSX ui). I also tested more and you're correct, using `IsWindowAppearing()` is not a proper fix for this, it still happens. username_1: I was referring to _back-end_ clearing all Keys[] and key modifiers when all dear imgui windows are unfocused. This is probably the better default solution to implement across all back-ends, which maybe an option to disable it. username_2: What if you alt-tab to another window, keep holding alt and then alt-tab back? No new "key-down" event would fire, alt would remain held down but down state would be cleared as window regained focus. 
username_1: That's technically a bug, but pragmatically it is less problematic and a MUCH better default than the current behavior, and many apps behave this way. I'll now be applying this clearing in the Win32 backend and would appreciate an equivalent PR for the native OSX backend. Status: Issue closed username_1: Merged fix by @username_2: 6d5388448794f76cc92576a74ad3cb17da2fca84
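A hedged sketch of that clearing, written against the ImGuiIO fields present in 1.80 (`KeysDown`, `KeyCtrl`, `KeyShift`, `KeyAlt`, `KeySuper`); the helper name and the call site (the backend's focus-lost handler, e.g. `WM_KILLFOCUS` on Win32 or the equivalent resign-key notification on OSX) are assumptions, not the merged fix itself.

```cpp
#include "imgui.h"
#include <cstring>

// Hypothetical helper: call from the platform backend when the OS window loses
// focus, so no key or modifier stays "stuck" after Cmd+Tab / Alt+Tab away.
static void ImGui_ImplPlatform_ClearKeyState()
{
    ImGuiIO& io = ImGui::GetIO();
    std::memset(io.KeysDown, 0, sizeof(io.KeysDown)); // release every tracked key
    io.KeyCtrl  = false;
    io.KeyShift = false;
    io.KeyAlt   = false;
    io.KeySuper = false; // the modifier reported stuck above
}
```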
logdna/logdna-agent
488277416
Title: Passing LDLOGSSL=yes as an env variable Question: username_0: Hi all! What happens if I pass `LDLOGSSL=yes` as an environment variable to the logdna agent? Haven't got access to a testing machine right now, but considering the line below, I would suppose SSL would be disabled, which is a bit of a usable security issue. ``` , LOGDNA_LOGSSL: isNaN(process.env.LDLOGSSL) ? true : +process.env.LDLOGSSL ``` Well, it's just a minor inconvenience I suppose. Thanks for all your great work! Answers: username_1: Defaults to `true`. Status: Issue closed
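A small sketch of how that ternary evaluates (plain Node.js semantics, not agent code): a non-numeric value such as `yes` falls through to the default `true`, so SSL stays on; only an explicit numeric `0` disables it.

```js
// Mirrors the config line: isNaN(process.env.LDLOGSSL) ? true : +process.env.LDLOGSSL
const logssl = (value) => (isNaN(value) ? true : +value);

console.log(logssl('yes'));     // true -> isNaN('yes') is true, default kept, SSL stays enabled
console.log(logssl('0'));       // 0    -> numeric, coerced, SSL disabled
console.log(logssl('1'));       // 1    -> numeric, truthy, SSL enabled
console.log(logssl(undefined)); // true -> an unset env var behaves like the default
```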
OHI-Science/ohi-science.github.io
185990353
Title: Why is the NP score in my tailored repository different than the NP score in the Global assessment for the same region? Question: username_0: Q from OHI - New Caledonia (CNC): _In the global OHI assessment, the NP score for CNC is 100. However, in the CNC repository, the score dropped to 46. We haven't changed the functions or data layers. I'm wondering why the scores are so drastically different. Thanks._ Answers: username_0: When creating an OHI+ repository, we copy the same functions and pull out data layers that correspond to the same regions. In most cases, the global scores and local scores do not differ much. However, **scores could differ noticeably between OHI-Global and OHI+ repository when a region's data was gapfilled in the global OHI assessment**. When there is no data for a data layer in a specific region, our core OHI engine (`ohicore`) is able to gapfill for this region based on a suite of criteria and formation (eg. neighboring regions data, etc). In CNC's case, two NP data layers ("cyanide" and "blasting") were gapfilled in the global study for this region. This process occurs in the status calculation process (in _functions.R_) without changing the raw data layer itself. Therefore, when CNC repository was created and data layers were pulled for this region, "cyanide" and "blasting" were left blank again as there was no regional data. A random value was used as a placeholder until local data replaces it, which would change NP score from what you see in OHI-Global. **To check which layers have these placeholder values** When you receive your repository, you can check in the [layers](https://github.com/OHI-Science/cnc/tree/master/eez2016/layers) folder. All "placeholder" values are labeled in the file name: ![image](https://cloud.githubusercontent.com/assets/11824840/19818163/78f60954-9d03-11e6-8699-1866319bf408.png)
bjansen/pebble-intellij
763030913
Title: New identifier pattern Question: username_0: Soon my pull request at https://github.com/PebbleTemplates/pebble/pull/544 will be merged so the identifier regex will need to be changed to `[\\p{IsLetter}_][\\p{IsLetter}\\p{IsDigit}_]*`. I hope that `\\p{...}` is usable in ANTLR. Answers: username_1: Yep, ANTLR supports Unicode properties: https://github.com/antlr/antlr4/blob/master/doc/lexer-rules.md#lexer-rule-elements Thanks for the PR, I'll wait for your PR on PebbleTemplates/pebble to be merged before merging this one. username_0: Finally, no more 50 syntax errors per file :) username_0: @username_1 It is merged now Status: Issue closed
urain39/stuff
526268110
Title: Resolving TS2532: Object is possibly 'undefined' under TypeScript strict mode Question: username_0: ```log src/parser.ts:60:11 - error TS2532: Object is possibly 'undefined'. 60 value = section[TokenMember.VALUE]; ~~~~~~~ ``` Solution: apply a type assertion to the object that triggers the error, so the compiler no longer performs the extra check on that object. ```ts value = (<Token>section)[TokenMember.VALUE]; ``` Answers: username_0: In some cases the `value as Type` assertion style may improve readability. username_0: In addition, `value as any` may solve the problem of not being able to set a property. username_1: Thanks, this helped me.
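For completeness, a small self-contained sketch of the assertion styles mentioned above (the `Token` shape here is a stand-in with a plain `value` property, not the real one from `parser.ts`); under `strictNullChecks` the non-null assertion `!` is often the lighter-weight option when you know the value cannot be `undefined` at that point.

```ts
interface Token { value: string }

function getValue(section: Token | undefined): string {
  // 1) angle-bracket assertion, as in the original fix above
  const a = (<Token>section).value;
  // 2) `as` assertion: same effect, usually more readable (and required in .tsx files)
  const b = (section as Token).value;
  // 3) non-null assertion: only removes `undefined`, keeps the rest of the type
  const c = section!.value;
  return a || b || c;
}
```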
kubernetes-sigs/cluster-api-provider-openstack
956053300
Title: Cluster templates using deprecated option "openstack" instead of "external" Question: username_0: /kind bug When using the templates cluster-template-without-lb.yaml and cluster-template.yaml, they will fail due to the deprecated option cloud-provider: openstack in the KubeadmControlPlane object. Based on the latest versions of kubelet, it should use cloud-provider: external instead. **Environment:** - Cluster API Provider OpenStack version (Or `git rev-parse HEAD` if manually built): - Cluster-API version: 0.3.22 - OpenStack version: stable/Victoria - Kubernetes version (use `kubectl version`): v1.19.11 - OS (e.g. from `/etc/os-release`): Ubuntu 20.04 Answers: username_0: Sorry about the kind/bug flag; this is my first time reporting an issue here and I forgot to remove the tag from the bug template. This issue was found using a cluster deployed via kubeadm. username_1: Have you checked https://github.com/kubernetes-sigs/cluster-api-provider-openstack/blob/master/docs/book/src/topics/external-cloud-provider.md ? username_1: I noticed there is a broken link, fixing it in https://github.com/kubernetes-sigs/cluster-api-provider-openstack/pull/957 username_0: @username_1 Hey, thanks for the link! I hadn't checked this link before; I'll give it a shot and let you know. Thanks for reviewing and for the current PR anyway! username_0: I believe this was fixed by https://github.com/kubernetes-sigs/cluster-api-provider-openstack/pull/957 I'm closing this issue for now. Status: Issue closed
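A hedged sketch of the requested change for a v1alpha3-era (v0.3.x) template; surrounding fields are trimmed and only the `kubeletExtraArgs` values are the point: switch the deprecated in-tree `openstack` value to `external` so the separately deployed external OpenStack cloud provider takes over.

```yaml
apiVersion: controlplane.cluster.x-k8s.io/v1alpha3
kind: KubeadmControlPlane
metadata:
  name: ${CLUSTER_NAME}-control-plane
spec:
  kubeadmConfigSpec:
    initConfiguration:
      nodeRegistration:
        kubeletExtraArgs:
          cloud-provider: external   # was: openstack (deprecated in-tree provider)
    joinConfiguration:
      nodeRegistration:
        kubeletExtraArgs:
          cloud-provider: external   # was: openstack
```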
ARM-software/ComputeLibrary
342159042
Title: result missmatch between CLConvolutionLayer and NEconvolutionLayer Question: username_0: hi, I have been working on accelerating a CNN network with Mali GPU. To validate the result, I build a one-layer conv network with CLConvolutionLayer and input tensor 40x40x3/output tensor 40x40x24/conv kernel 3x3/stride 1/padding 1. It is strange that the elements positioned in output tensor COL 39 and 40 as well as ROW 39 and 40 are all set to 0, while the other elements are correct. When I change the layer to NEConvolutionLayer, the results are all correct. arm_compute_version=v18.05 Build options: {'embed_kernels': '1', 'arch': 'arm64-v8a', 'opencl': '1', 'neon': '1', 'asserts': '1', 'debug': '0', 'os': 'linux', 'Werror': '1'} Git hash=b3a371bc429d2ba45e56baaf239d8200c2662a74 Platform: Mediatek - MT2712 OS: linux ```c++ // init data tensor CLTensor* data = new CLTensor(); const TensorShape input_shape(40, 40, 3); data->allocator() -> init(TensorInfo(input_shape, 1, DataType::F32)); data->allocator() -> allocate(); data->map(true); for (int i = 0; i < input_shape.total_size(); i ++) { (reinterpret_cast<float *>(data->buffer()))[i] = 1.0; } data->unmap(); // init weight tensor CLTensor* ConvNd_1_weight = new CLTensor(); const TensorShape ConvNd_1_weights_shape(3,3,3,24); ConvNd_1_weight->allocator()->init(TensorInfo(ConvNd_1_weights_shape,1,DataType::F32)); ConvNd_1_weight->allocator()->allocate(); ConvNd_1_weight->map(true); for (int i = 0; i < ConvNd_1_weights_shape.total_size(); i ++) { (reinterpret_cast<float *>(ConvNd_1_weight->buffer()))[i] = 1.0; } ConvNd_1_weight->unmap(); // init output tensor CLTensor* ConvNd_1 = new CLTensor(); const TensorShape ConvNd_1_shape(40,40,24); ConvNd_1->allocator()->init(TensorInfo(ConvNd_1_shape,1,DataType::F32)); ConvNd_1->allocator()->allocate(); // init conv layer CLConvolutionLayer* ConvNd_1_conv = new CLConvolutionLayer(); ConvNd_1_conv->configure(data,ConvNd_1_weight,nullptr,ConvNd_1,PadStrideInfo(1,1,1,1)); // run CLScheduler::get().sync(); ConvNd_1_conv->run(); CLScheduler::get().sync(); ``` results.txt [result.txt](https://github.com/ARM-software/ComputeLibrary/files/2204205/result.txt) Did I get something wrong? Answers: username_1: Hi @username_0, would it be possible to see the piece of code you use to print the output values? Thanks username_0: hi @username_1 , ```c++ ConvNd_1->map(block); ofstream fc1_txt; fc1_txt.open("result.txt"); float *output = reinterpret_cast<float*>(ConvNd_1->buffer()); for(int i = 0;i<24*40*40;i++){ fc1_txt<<output[i]<<"\n"; } ConvNd_1->unmap(); fc1_txt.close(); ``` thanks for your reply username_1: Hi @username_0, I think the problem is related to how you read the values from your output tensor because you are not taking into account the paddings. can you try to read the output values in the following manner? ```cpp printf("\n******Input Image 2 ******* \n"); ConvNd_1->map(true); ConvNd_1->print(std::cout); ConvNd_1->unmap(); ``` username_0: Hi @username_1 Sure, i just had a try. But nothing different. username_1: I noticed you are allocating the tensors before configuring CLConvolutionLayer. Could you try to move allocate() after the configuration of the function? 
```cpp // init data tensor CLTensor* data = new CLTensor(); const TensorShape input_shape(40, 40, 3); data->allocator() -> init(TensorInfo(input_shape, 1, DataType::F32)); // init weight tensor CLTensor* ConvNd_1_weight = new CLTensor(); const TensorShape ConvNd_1_weights_shape(3,3,3,24); ConvNd_1_weight->allocator()->init(TensorInfo(ConvNd_1_weights_shape,1,DataType::F32)); // init output tensor CLTensor* ConvNd_1 = new CLTensor(); const TensorShape ConvNd_1_shape(40,40,24); ConvNd_1->allocator()->init(TensorInfo(ConvNd_1_shape,1,DataType::F32)); // init conv layer CLConvolutionLayer* ConvNd_1_conv = new CLConvolutionLayer(); ConvNd_1_conv->configure(data,ConvNd_1_weight,nullptr,ConvNd_1,PadStrideInfo(1,1,1,1)); data->allocator() -> allocate(); data->map(true); for (int i = 0; i < input_shape.total_size(); i ++) { (reinterpret_cast<float *>(data->buffer()))[i] = 1.0; } data->unmap(); ConvNd_1_weight->allocator()->allocate(); ConvNd_1_weight->map(true); for (int i = 0; i < ConvNd_1_weights_shape.total_size(); i ++) { (reinterpret_cast<float *>(ConvNd_1_weight->buffer()))[i] = 1.0; } ConvNd_1_weight->unmap(); ConvNd_1->allocator()->allocate(); // run CLScheduler::get().sync(); ConvNd_1_conv->run(); CLScheduler::get().sync(); ``` username_0: hi @username_1 Thanks for your patience. As you suggested, allocate() should be after configure(). And I found that data->info() was modified after configuration. While its shape remained 40x40x3, the memory layout was resized to 42x42x3 with padding space included. So i had to fill in the input tensor carefully skipping the padding space. Now I can get the expected output tensor using the following code. However, it seems not convenient and maybe low performance. How can I do the job gracefully and efficiently ? ```c++ // fill in input tensor data->allocator() -> allocate(); data->map(true); const size_t first_element_offset = data->info()->offset_first_element_in_bytes() / sizeof(float); const size_t stride_x = data->info()->strides_in_bytes().x() / sizeof(float); const size_t stride_y = data->info()->strides_in_bytes().y() / sizeof(float); const size_t stride_z = data->info()->strides_in_bytes().z() / sizeof(float); for (int i = 0; i < 3; i ++) { float* offset = reinterpret_cast<float *>(data->buffer()) + stride_z * i + first_element_offset; for (int j = 0; j < 40; j ++) { for (int k = 0; k < 40; k ++) { offset[k] = 1.0; } offset += stride_y; } } data->unmap(); ``` username_1: Hi @username_0, happy to know you managed to find the issue. Try to have a look at [here](https://arm-software.github.io/ComputeLibrary/v17.09/architecture.xhtml#S4_6_4_working_with_objects). At that page you can find how to fill a tensor efficiently. For your specific case, I would suggest to copy your input data per row so this means you can use a memset or memcpy for each row Hope this can help you username_0: @username_1 Ok. Thank you ~ Status: Issue closed username_2: Resolved. Reopen if needed. username_3: Hi, i met a similar problem. However, I didn't observe data->info() change after configration. ` static void print(ITensorInfo *info) { size_t i; std::cout << "--> "; for(i=0; i<info->num_dimensions(); i++) std::cout << info->dimension(i) << ", "; std::cout << std::endl; }` I use the print funtion to get the data->info(). Is the behavour changed after last time? 
username_3: // for( unsigned int z = 0; z < batch; ++z) // { // for( unsigned int y = 0; y < height; ++y) // { // for( unsigned int x = 0; x < width; ++x) // { // *reinterpret_cast<float*>( input.buffer() + input.info()->offset_element_in_bytes(Coordinates(x,y,z))) = src_data[ z * (width*height) + y * width + x]; // } // } // } username_4: Hi @username_3 I think the way you copy the values into the tensor is wrong, so it's difficult to speculate about results. ![copyinput](https://user-images.githubusercontent.com/5577511/56728043-d11b1f80-6749-11e9-94b5-ba7455e8d92c.png) Please take a look at https://arm-software.github.io/ComputeLibrary/v19.02/architecture.xhtml#S4_6_images_tensors for an example on how to copy data into a CLTensor using `execute_window_loop` username_3: Hi, @username_4 Thank you for the tip. I successfully get the same result after following your advice. Great thanks to you~~~
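Since the thread ends by pointing at `execute_window_loop`, here is a hedged sketch of that per-element fill (the names `src_data`, `width`, `height` come from the snippets above; this illustrates the documented Window/Iterator pattern and is not code from the library or from the thread):

```cpp
#include "arm_compute/core/Helpers.h"
#include "arm_compute/core/Window.h"
#include "arm_compute/runtime/CL/CLTensor.h"

using namespace arm_compute;

// Fill a padded CLTensor from a dense row-major float buffer, letting the
// Iterator handle strides/padding instead of indexing the raw buffer by hand.
void fill_tensor(CLTensor &tensor, const float *src_data, unsigned int width, unsigned int height)
{
    tensor.map(true);

    Window window;
    window.use_tensor_dimensions(tensor.info()->tensor_shape());

    Iterator it(&tensor, window);
    execute_window_loop(window, [&](const Coordinates &id)
    {
        // id holds logical (unpadded) coordinates; it.ptr() already accounts for padding.
        *reinterpret_cast<float *>(it.ptr()) = src_data[(id.z() * height + id.y()) * width + id.x()];
    },
    it);

    tensor.unmap();
}
```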
petehunt/react-touch
55103803
Title: License Question: username_0: I've used this repo as inspiration for a project, but was wondering if you'd mind if I used some code for the frosted glass stuff? And if you'd like attribution in a certain way? Answers: username_1: +1. I'd also like to borrow some code from here :)
COVID19Tracking/issues
726160178
Title: [WI] Updating Current Hospital metrics and pending 10/17-10/29 Question: username_0: State: Wisconsin Problem: On 10/17 the caching issues on the Hospital page caused the values from 10/16 to be entered. The values from 10/17 were then entered on 10/18, and repeated on 10/19. Solution: Backfill the day's proper value from 10-17-10/19 for Pending, Current Hospitalization, and Current ICU. Sources: https://covid-tracking-project-data.s3.us-east-1.amazonaws.com/state_screenshots/WI/WI-tertiary-20201017-184144.png https://covid-tracking-project-data.s3.us-east-1.amazonaws.com/state_screenshots/WI/WI-secondary-20201019-004831.png https://covid-tracking-project-data.s3.us-east-1.amazonaws.com/state_screenshots/WI/WI-secondary-20201019-184707.png Answers: username_0: BEFORE: <img width="399" alt="Screen Shot 2020-10-20 at 10 55 41 PM" src="https://user-images.githubusercontent.com/66583275/96678979-9d618b80-1327-11eb-897d-76020c49e9f4.png"> AFTER: https://covid-tracking-project-data.s3.us-east-1.amazonaws.com/state_screenshots/WI/WI-secondary-20201019-184707.png Status: Issue closed
Teun/thenBy.js
624608101
Title: None Question: username_0: Thank you for signalling that. It seems that this feature (the custom compare function) was completely missed when we created the typescript bindings. The TypeScript compiler indicates that it doesn't expect the `cmp` property. Will fix this with some priority. Answers: username_0: I have just published version 1.3.3, which should fix this problem. Could you please test that it does? Status: Issue closed
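For reference, a hedged sketch of the kind of call the old typings rejected, passing a custom compare function through the `cmp` option (the sample data is made up; with 1.3.3+ this should typecheck):

```ts
import { firstBy } from "thenby";

interface Item { name: string }
const items: Item[] = [{ name: "b" }, { name: "a" }];

// `cmp` supplies the comparison used on the selected key,
// here a locale-aware compare instead of the default one.
items.sort(
  firstBy((item: Item) => item.name, { cmp: (a: string, b: string) => a.localeCompare(b) })
);
```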
kirill-konshin/next-redux-wrapper
315023656
Title: Is it possible to access store globally in the application after using withRedux Question: username_0: My use case is, I have a fetchApi method, which picks up the access tokens from my state. It will be a lot of repetition if I pass the access token from each component whenever I make API calls. Will it be possible to access the store object in my `fetchApi` routine, so that I can use `store.getState()` and then access my `access_token` using that. Another use case is, my `fetchApi` can trigger outside of the Component lifecycle, from some `setTimeout`. Example: to refresh my `access_token`. Over there also, I will have to get the latest `refresh_token` from my state. Please suggest. Thanks! Status: Issue closed Answers: username_1: You cannot use `window` or any other global object, module is also a no go because of server side rendering. You must use everything in context of React. I also suggest to put `setTimeout` in one of your React layouts and dispatch actions from there. I suggest to add API call method to Redux via thunk & promise middlewares: ```js import {applyMiddleware, createStore} from "redux"; import thunkMiddleware from "redux-thunk"; import promiseMiddleware from "redux-promise-middleware"; import {request} from "./lib"; import reducers from "./reducers"; export default (initialState, options) => { let store; const api = ({method = 'GET', url, data, query}) => { var res = request({ // this is some library that makes requests method, url, data, query, token: store.getState().token // here goes your selector }); // service action required to make sure reducers can capture HTTP errors store.dispatch({ type: 'HTTP', payload: res }); return res; }; store = createStore( reducers, undefined, applyMiddleware( thunkMiddleware.withExtraArgument(api), promiseMiddleware() ) ); return store; }; ``` Then you can dispatch a promise returned from API which becomes available in actions: ```js export const loadUser = (id) => (dispatch, getState, api) => dispatch({ type: 'LOAD_USER', payload: api({url: `/users/${id}`}) }); ``` In reducer you can now invalidate token on HTTP `401` errors: [Truncated] ); const token = (state = '', {type, payload}) => { switch (type) { case ('HTTP_REJECTED'): return (isHttpAuthError(payload)) ? null : state; case ('LOGIN_FULLFILLED'): return payload; case ('LOGIN_REJECTED'): case ('LOGOUT_FULLFILLED'): return null; default: return state; } }; export default combineReducers({ token }); ``` username_0: @username_1 - This worked like magic. Thank you so much for your time and suggestion! username_2: Hello @username_1 is there a way to use store in global? I need to simplify the use of my alert which needs to dispatch from global instead of calling `bindActionCreators`, `connect`, and `setAlert` action. I try to call my `store` from global but it will use diffefent store (create a new one) instead of active ones. username_1: Not possible and you should never do it. The reason is that for each request the context must be independent otherwise all your clients will share some state which is a very bad practice. username_1: My use case is, I have a fetchApi method, which picks up the access tokens from my state. It will be a lot of repetition if I pass the access token from each component whenever I make API calls. Will it be possible to access the store object in my `fetchApi` routine, so that I can use `store.getState()` and then access my `access_token` using that. 
Another use case is, my `fetchApi` can trigger outside of the Component lifecycle, from some `setTimeout`. Example: to refresh my `access_token`. Over there also, I will have to get the latest `refresh_token` from my state. Please suggest. Thanks! EDIT: If it helps, I am reusing the entire store and API logic between my React Native mobile application and NextJS application. So, not sure if I can use the global `window` object here to access the store. Status: Issue closed