names (stringlengths 1–98) | readmes (stringlengths 8–608k) | topics (stringlengths 0–442) | labels (stringclasses 6) | values
---|---|---|---|
LLMDet | LLMDet: A Third-Party Large Language Models Generated Text Detection Tool (paper: https://arxiv.org/abs/2305.15004). LLMDet is a text detection tool that can identify which source a text came from, e.g. a large language model or a human writer. The core idea of the detection algorithm is to use n-gram probabilities sampled from a specified language model to calculate a proxy perplexity for each large language model, and to use the proxy perplexity as a feature to train a text classifier. Features: we believe a practical LLM detection tool needs the following capabilities, which are also the goals of our LLMDet. 1. Specificity: our project aims to distinguish between different large language models and human-generated text; for example, LLMDet can tell you whether the text was generated by GPT-2, OPT, or a human, and give each a specific probability. 2. Safety: our project does not require running large language models locally; that is, we can act as a third-party authentication agent without maintaining large language models, which may be fixed assets or sensitive information for large companies. 3. Efficiency: our method detects very fast, because we don't need to run inference on large language models. 4. Extendibility: our project can easily adapt to newly proposed large language models. Installation notes: a package for the large language model generated text detection tool. Before you go ahead with the installation of this toolkit, please execute the following commands to make sure the transformers version is >= 4.29.0: `pip install git+https://github.com/Liadrinz/transformers-unilm` and `pip install git+https://github.com/huggingface/transformers`. The code is compatible with Python 3.8. Fully automatic installation: `pip install llmdet`. Semi-automatic installation: first download from http://pypi.python.org/pypi/llmdet, decompress, and run `python setup.py install`; see requirements.txt for dependent Python packages. Main functions: currently LLMDet offers two functions. 1. Detection: it can determine whether a given text comes from GPT-2, LLaMA, BART, OPT, UniLM, T5, BLOOM, GPT-Neo, or a human writer. 2. Extendibility: it allows model owners to extend the detection capability of LLMDet to novel models. Example (Python): `import llmdet; llmdet.load_probability(); result = llmdet.detect(text); print(result)`, where `text` is a string or a list of strings (the sample text in the repo is a news excerpt about an actress being honoured for her role in "The Reader" at a ceremony at the Royal Albert Hall). Detection results (JSON): {"opt": 0.5451331013247862, "gpt-2": 0.4393605735865629, "unilm": 0.012642800848279893, "t5": 0.0022592730436008556, "bloom": 0.00025873253035729044, "gpt_neo": 0.0002520776780109571, "llama": 6.0459794454546154e-05, "human_write": 1.9576671778802474e-05, "bart": 1.3404522168622544e-05}. TODO list: improve the algorithm's performance and efficiency. Citation: if you use LLMDet in your research, please use the following BibTeX entry: @misc{wu2023llmdet, title={LLMDet: A Third Party Large Language Models Generated Text Detection Tool}, author={Kangxi Wu and Liang Pang and Huawei Shen and Xueqi Cheng and Tat-Seng Chua}, year={2023}, eprint={2305.15004}, archivePrefix={arXiv}, primaryClass={cs.CL}}. | large-language-models llm generated-text-detection | ai |
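The proxy-perplexity idea behind LLMDet can be illustrated with a toy sketch (this is not LLMDet's actual implementation): estimate token probabilities from a precomputed n-gram table standing in for statistics sampled from a specific language model, then compute perplexity from them. The bigram table, fallback constant, and sample texts below are invented for illustration.

```python
import math

def bigram_proxy_perplexity(text, bigram_probs, fallback=1e-6):
    """Perplexity of `text` under a precomputed bigram probability table.

    `bigram_probs` maps (prev_word, word) -> P(word | prev_word); unseen
    bigrams fall back to a small constant probability.
    """
    words = text.lower().split()
    log_prob = 0.0
    for prev, cur in zip(words, words[1:]):
        log_prob += math.log(bigram_probs.get((prev, cur), fallback))
    n = max(len(words) - 1, 1)
    return math.exp(-log_prob / n)

# Toy table: pretend these probabilities were sampled from some LLM.
table = {("the", "cat"): 0.2, ("cat", "sat"): 0.5, ("sat", "down"): 0.4}
ppl_seen = bigram_proxy_perplexity("the cat sat down", table)
ppl_unseen = bigram_proxy_perplexity("quantum flux capacitor", table)
# Text matching the model's n-gram statistics gets lower proxy perplexity,
# which is the signal a classifier can then be trained on.
print(ppl_seen < ppl_unseen)  # → True
```

In LLMDet the per-model proxy perplexities become the feature vector fed to the text classifier; here the comparison alone shows the direction of the signal.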
NLP-LLMs | Large language models: BERT (Bidirectional Encoder Representations from Transformers); DistilBERT (a distilled version of BERT); Llama 2 (Large Language Model Meta AI), hosted on a Replicate server. | ai |
|
ironfish | Iron Fish (logo: https://user-images.githubusercontent.com/767083/113650890-d8414c80-9645-11eb-8f4d-2427fc322ce4.png; badges: Node CI, Rust CI, Node CI Regenerate Fixtures, and Codecov, via https://github.com/iron-fish/ironfish/actions and https://codecov.io/gh/iron-fish/ironfish). Iron Fish is a Layer 1 blockchain that provides the strongest privacy guarantees on every single transaction, leveraging zero-knowledge proofs (zk-SNARKs) and the highest industry standards for encryption. See https://ironfish.network. Developer install: the following steps should only be used if you are planning on contributing to the Iron Fish codebase; otherwise, we strongly recommend using the installation methods here: https://ironfish.network/use/get-started/installation. 1. Install Node.js 18.x (https://nodejs.org/en/download). 2. Install Rust (https://www.rust-lang.org/learn/get-started). 3. Install Yarn (https://classic.yarnpkg.com/en/docs/install). 4. On Windows: install the current version of Python from the Microsoft Store package (https://www.microsoft.com/en-us/p/python-310/9pjpw5ldxlz5), and install the Visual C++ build environment (Visual Studio Build Tools, https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=buildtools) using the "Visual C++ build tools" or "Desktop development with C++" workload; if the above steps didn't work for you, please visit Microsoft's Node.js guidelines for Windows (https://github.com/microsoft/nodejs-guidelines/blob/master/windows-environment.md#compiling-native-addon-modules) for additional tips. 5. Run `yarn install` from the root directory to install packages. Troubleshooting: if `yarn install` fails with an error that includes "failed to build cmake", you may need to first install CMake; for example, on macOS run `brew install cmake` (you'll need CMake version 3.15 or higher). If `yarn install` fails with an error that includes "could not find OpenSSL", you may need to first install OpenSSL and add an environment variable; for example, on macOS run `brew install openssl`, then `export OPENSSL_ROOT_DIR=$(brew --prefix openssl)`, then run `yarn install` again. If `yarn install` fails with an error that includes "error not found: make", "make: cc: command not found", or "make: g++: command not found", you may need to install a C/C++ compiler toolchain (https://github.com/nodejs/node-gyp#on-unix): on Ubuntu, `apt install build-essential`; on Amazon Linux, `sudo yum groupinstall "Development Tools"`. If `yarn install` fails with an error that includes "Error: Could not find any Python installation to use", you may need to install Python 3 (required by node-gyp): on macOS, run `brew install python`. Usage: once your environment is set up, you can run the CLI by following these directions: https://github.com/iron-fish/ironfish/tree/master/ironfish-cli. Running tests: to test the entire monorepo, run `yarn test` at the root of the repository; run `yarn test:slow` in ./ironfish to run slow tests; run `yarn test:coverage` at the root of the repository for tests and coverage. To test a specific project, run `yarn test` at the root of the project; run `yarn test:watch` in ./ironfish or ./ironfish-cli if you want the tests to run on change; run `yarn test:coverage:html` if you want to export the coverage in an easy-to-use format (open the index.html file in the coverage folder of the project). Running benchmarks and performance tests: Rust benchmarks (`cargo benchmark` is a cargo alias defined in .cargo/config.toml): `cargo benchmark` to run all benchmark tests, or `cargo benchmark simple` to run only benchmarks containing the text "simple" in the name; TypeScript benchmarks: `cd ironfish` then `yarn test:perf`. Structure of the repository: ironfish (ironfish/README.md), the library that contains the IronfishSdk and all Iron Fish code written in TypeScript; ironfish-cli (ironfish-cli/README.md), the main client for Iron Fish — as of today it is a command-line interface built on Node (more details in our documentation: https://ironfish.network/use/get-started/installation); ironfish-rust (ironfish-rust/README.md), the core API for interacting with the transactions and chain and using ZKP; ironfish-rust-nodejs (ironfish-rust-nodejs/README.md), a wrapper for ironfish-rust as a native Node.js addon. Contributing code: if you want to contribute code, you must first read our contributing guidelines (CONTRIBUTING.md) or risk having your pull request closed. Other repositories: iron-fish/homebrew-brew (https://github.com/iron-fish/homebrew-brew) contains the brew formula for installing via the brew (https://brew.sh) package manager; iron-fish/website (https://github.com/iron-fish/website), the repo that powers https://ironfish.network; iron-fish/website-testnet (https://github.com/iron-fish/website-testnet), the repo that powers https://testnet.ironfish.network; iron-fish/ironfish-api (https://github.com/iron-fish/ironfish-api), the repository that powers most Iron Fish API services; iron-fish/chain-explorer (https://github.com/iron-fish/chain-explorer), a visual tool to explore the blockchain and all of its forks. Licensing: this code base and any contributions will be under the MPL-2.0 (https://www.mozilla.org/en-US/MPL/2.0/) software license. | cryptocurrency blockchain p2p | blockchain |
photo-filter-dev | Udagram Image Filtering Microservice (http://image-filter-starter-codecopy12-dev.us-east-1.elasticbeanstalk.com, /filteredimage?image_url=...; screenshot: https://user-images.githubusercontent.com/42734825/138477408-d20823b2-2510-4f4b-96b4-bdb197ef3c7d.png). Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image-filtering microservice. The project is split into three parts: 1. The Simple Frontend (https://github.com/udacity/cloud-developer/tree/master/course-02/exercises/udacity-c2-frontend), a basic Ionic client web application which consumes the RestAPI backend (covered in the course). 2. The RestAPI Backend (https://github.com/udacity/cloud-developer/tree/master/course-02/exercises/udacity-c2-restapi), a Node/Express server which can be deployed to a cloud service (covered in the course). 3. The Image Filtering Microservice (https://github.com/udacity/cloud-developer/tree/master/course-02/project/image-filter-starter-code), the final project for the course: a Node/Express application which runs a simple script to process images. Your assignment tasks — set up the Node environment: you'll need to create a new Node server; open a new terminal within the project directory and run: 1. `npm i` to initialize a new project; 2. `npm run dev` to run the development server. Create a new endpoint in the server.ts file: the starter code has a task for you to complete an endpoint in src/server.ts which uses a query parameter to download an image from a public URL, filter the image, and return the result. We've included a few helper functions to handle some of these concepts, and we're importing them for you at the top of the src/server.ts file: `import { filterImageFromURL, deleteLocalFiles } from './util/util';` (TypeScript). Deploying your system: follow the process described in the course to `eb init` a new application and `eb create` a new environment to deploy your image-filter service; don't forget you can use `eb deploy` to push changes. Stand out (optional) — refactor the course RestAPI: if you're feeling up to it, refactor the course RestAPI to make a request to your newly provisioned image server. Authentication: prevent requests without valid authentication headers (note: if you choose to submit this, make sure to add the token to the Postman collection and export the Postman collection file with your submission so we can review it). Custom domain name: add your own domain name and have it point to the running services; try adding a subdomain name to point to the processing server (note: domain names are not included in the AWS free tier and will incur a cost). | typescript | cloud |
cloud-project | Cloud project: learning the basics of cloud engineering, mainly focusing on Docker, the GitHub CLI, Ansible, and Terraform, and using Python (and maybe Java) to gather information. I am currently using GitHub as a way to keep track of my learning of cloud engineering; the name is currently just a placeholder. Main project idea: use GitHub to create a Lynnode server by running a pipeline. Description: when the server spins up, it will pull down the Ansible playbook, then deploy a Docker container; use Terraform to deploy your Lynnode servers and move the workload to AWS. | cloud |
|
Project-airport | A journey of a thousand miles begins with a single step, and this is the first one. Main goal of the project: this project was created by me for my studies; I learned how to use relational databases in practice, create queries, and handle them through websites. What did I get out of the project? I found out how difficult IT projects are; I wrote detailed documentation for the project; I learned how to handle connections with the database; I know how important it is to design the database well at the beginning; I created queries to the database and triggers; I created a simple search engine for flights; and I know what SQL injections are and how to protect the database against them. You can see it at GitHub Pages: https://treodaio.github.io/Project-airport. | engineering-studies relational-databases | server |
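The SQL-injection protection mentioned above usually comes down to parameterized queries: user input is bound as a value, never spliced into the SQL string. A minimal sketch using Python's stdlib sqlite3 — the `flights` table and its columns are invented for illustration, not taken from the project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE flights (id INTEGER PRIMARY KEY, destination TEXT)")
conn.execute("INSERT INTO flights (destination) VALUES ('Paris'), ('Oslo')")

def search_flights(conn, destination):
    # The ? placeholder lets the driver bind the value safely;
    # a quote in the input cannot change the query's structure.
    cur = conn.execute(
        "SELECT id, destination FROM flights WHERE destination = ?",
        (destination,),
    )
    return cur.fetchall()

print(search_flights(conn, "Paris"))             # normal lookup
print(search_flights(conn, "Paris' OR '1'='1"))  # injection attempt matches nothing
```

The same placeholder idea applies to any driver used from a website backend; only the placeholder syntax (`?`, `%s`, `:name`) varies.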
mobile-app-development-ios | Mobile app development (iOS): code examples for mobile application development for iOS. I teach courses in mobile application development for undergraduates in computer science and software engineering; this repository includes the full code examples that are used to explain concepts in my lectures. | front_end |
|
aws-fis-experiment-templates | Chaos engineering with AWS Fault Injection Simulator (FIS). (Badges: GitHub issues, maintenance, and Twitter share, via https://img.shields.io and https://github.com/adhorn/aws-fis-experiment-templates/graphs/commit-activity.) A collection of FIS experiment templates (https://docs.aws.amazon.com/fis/latest/userguide/experiment-templates.html). These templates let you perform fault-injection experiments on resources (applications, network, and infrastructure) in the AWS Cloud (https://aws.amazon.com). Currently available: support for Network Access Control List fault injection using custom embedded scripts via SSM automation (https://github.com/adhorn/aws-fis-experiment-templates/tree/main/network-access-control-list); support for Network Access Control List fault injection using custom Lambda functions via SSM automation (same directory); support for stopping EC2 instances in a particular AZ (https://github.com/adhorn/aws-fis-experiment-templates/blob/main/stop-instances/fis-stop-instance-az.json); support for randomly stopping EC2 instances (stop-instances/fis-stop-instance-random.json); support for EC2 API throttling error (ec2-control-plane/fis-throttle-error.json); support for EC2 API internal error (ec2-control-plane/fis-internal-error.json); support for EC2 API unavailable error (ec2-control-plane/fis-unavailable-error.json); support for EC2 Spot interruption (spot-interruption/fis-spot-interruption.json); support for terminating instances in an Auto Scaling group in a particular AZ (https://github.com/adhorn/aws-fis-experiment-templates/tree/main/auto-scaling-group-faults). Prerequisites: What is AWS Fault Injection Simulator? (https://docs.aws.amazon.com/fis/latest/userguide/what-is.html); Experiment templates for AWS FIS (https://docs.aws.amazon.com/fis/latest/userguide/experiment-templates.html); How AWS Fault Injection Simulator works with IAM (https://docs.aws.amazon.com/fis/latest/userguide/security_iam_service-with-iam.html). Important: before using these templates, replace all occurrences of account id, instance id, IAM role, CloudWatch alarm, and IAM role (SSM) with your own particular ones. Upload an FIS experiment template to your AWS account (shell): `aws fis create-experiment-template --cli-input-json fileb://fis-template.json --query experimentTemplate.id` (returns e.g. EXTqGCzSc6CzPmha). Start an FIS experiment (shell): `aws fis start-experiment --experiment-template-id EXTqGCzSc6CzPmha --query experiment.id` (returns e.g. EXPnk1YNT3PrLCf9Ln). Stop an FIS experiment (shell): `aws fis stop-experiment --id EXPnk1YNT3PrLCf9Ln`. Some words of caution before you start using these templates: to begin with, do not use these fault-injection templates in production blindly; always review the FIS templates and the actions in them; make sure your first fault injections are done in a test environment and on test instances where no real, paying customer can be affected; test, test, and test some more. Remember that chaos engineering is about breaking things in a controlled environment, through well-planned experiments, to build confidence in your application and your own tools to withstand turbulent conditions. | chaos-engineering sre devops testing-tools fault-injection aws aws-fis | cloud |
Polyglot | Polyglot. (Badges: Travis CI build status at https://travis-ci.org/PolyMathOrg/Polyglot, AppVeyor build status at https://ci.appveyor.com/project/NikhilPinnaparaju/polyglot, Coveralls coverage status at https://coveralls.io/github/PolyMathOrg/Polyglot, MIT license, and Pharo 6.1/7.0/8.0 version badges via https://pharo.org/download.) This repository is marked as a public archive and it will be deleted in the future; it contains duplicated code that can be found in the other pharo-ai repositories. We encourage you to look into the other NLP repositories that we have inside pharo-ai. A library for natural language processing implemented in Pharo. To get more information, check out the Polyglot booklet (https://github.com/SquareBracketAssociates/Booklet-Polyglot). Installation: to install Polyglot, go to the Playground (Ctrl+O, W) in your fresh Pharo image and execute the following Metacello script (select it and press the Do-it button, or Ctrl+D): `Metacello new baseline: 'Polyglot'; repository: 'github://PolyMathOrg/Polyglot/src'; load.` List of supported features: tokenization; n-grams; term frequency / inverse document frequency (TF-IDF) scoring; n-gram language modelling; stemming; part-of-speech tagging; named entity recognizer; dependency parser; modified Atlas bridge; common vector metrics. Google Summer of Code 2019 report — author: Nikhil Pinnaparaju; organisation: Pharo (https://pharo.org); project: Polyglot (https://github.com/PolyMathOrg/Polyglot); mentors: Oleksandr Zaitsev, Alexandre Bergel. A library for natural language processing implemented in Pharo. Features implemented: tokenization, n-grams, TF-IDF scoring, n-gram language modelling, stemming, part-of-speech tagging, named entity recognizer, dependency parser, modified Atlas bridge, common vector metrics. Code contribution: commits to Polyglot (https://github.com/PolyMathOrg/Polyglot/commits/master); pull requests to Polyglot (https://github.com/PolyMathOrg/Polyglot/pulls?q=is%3Apr+author%3Anikhilpinnaparaju); issues raised in PolyMath (https://github.com/PolyMathOrg/PolyMath/issues?q=is%3Aissue+author%3Anikhilpinnaparaju); pull requests to PolyMath (https://github.com/PolyMathOrg/PolyMath/pulls?q=is%3Apr+author%3Anikhilpinnaparaju). Documentation — blog posts: "Representing documents as vectors and visualizing them using Polyglot in Pharo" (https://medium.com/nikhilpinnaparaju/representing-documents-as-vectors-and-visualizing-them-using-polyglot-in-pharo-73887e8bb418); "Stemming in Polyglot" (https://medium.com/nikhilpinnaparaju/stemming-in-polyglot-2672a349e15); "Working with the Atlas Pharo-Python bridge" (https://medium.com/nikhilpinnaparaju/working-with-the-atlas-pharo-python-bridge-1ad6ba356f7); "Polyglot for large corpora" (https://medium.com/nikhilpinnaparaju/polyglot-for-large-corpora-71267c525876); "Introducing Polyglot" (https://link.medium.com/xrrmmbsfpx); "Tokenization — GSoC with the Pharo Consortium" (https://link.medium.com/ylak5qtfpx); "Community bonding period — GSoC with the Pharo Consortium" (https://link.medium.com/wyjlwqwfpx); "Architecture design for an NLP library" (https://link.medium.com/az8fikxfpx); "PCA in Pharo using PolyMath, DataFrame and Roassal" (https://link.medium.com/qcrtm0yfpx); "My journey into Google Summer of Code 2019" (https://link.medium.com/pz6zd4zfpx). Booklets: the Polyglot booklet (https://github.com/SquareBracketAssociates/Booklet-Polyglot); documentation for Polyglot (https://github.com/nikhilpinnaparaju/Polyglot-Documentation). Project demonstration presentations: Polyglot ESUG presentation v1.0 (https://drive.google.com/file/d/18j2bgdrj6dhbaxg1-n3hc8-zd7y0p9yp/view?usp=sharing); Polyglot ESUG presentation v2.0 (https://drive.google.com/file/d/1pgns1xpwos1txeclhbzffvlbrbs54pgs/view?usp=sharing). | nlp natural-language-processing pharo | ai |
Beg-STM32-Devel-FreeRTOS-libopencm3-GCC | Apress source code. This repository accompanies "Beginning STM32: Developing with FreeRTOS, libopencm3 and GCC" by Warren Gay (Apress, 2018; https://www.apress.com/9781484236239). (Cover image: 9781484236239.jpg.) Download the files as a zip using the green button, or clone the repository to your machine using Git. Releases: release v1.0 corresponds to the code in the published book, without corrections or updates. Contributions: see the file Contributing.md for more information on how you can contribute to this repository. Bill of materials for the book: 1 ST-Link V2 programmer; 1 breadboard; Dupont jumper wires; 0.1 uF bypass caps for use on the breadboard; 1 USB-TTL serial adapter; 1 power supply (on eBay: MB102 solderless breadboard power supply module, 3.3 V and 5 V switchable); 1 small assortment of LEDs; 220-ohm resistors (or a SIP-9 resistor network at 220 ohms) for several LEDs; 3 STM32 units if you want to do the full CAN project (but you can skimp with two if necessary); 1 Winbond W25Q32/W25Q64 flash chip (DIP); 1 PCF8574 GPIO extender (DIP); 1 OLED using the SSD1306 controller (four-wire SPI only, eBay); RC servo(s) and/or a scope to control a servo; an RC servo controller or a 555 timer circuit to generate PWM to be read; 1 optional scope for examining clock signals (for one chapter). Notes: 1. When compiling, if you get the error message "getline.c:5:20: fatal error: memory.h: No such file or directory" at the line coded as `#include <memory.h>`, you may have to change that to `#include <string.h>` instead; the compiler folks have sometimes moved the header file. 2. It has been reported that Kubuntu 18.04 ships with arm-none-eabi-gcc 15:6.3.1+svn253039-1build1 (6.3.1 20170620); with this compiler the code does not work and creates problems for FreeRTOS. memcpy seems to be the problematic function call in the code (it is called by FreeRTOS when adding an element to the queue); details are in the FreeRTOS discussion on SourceForge. | os |
|
ESDM_Course | ESDM course: here you will find all the materials for the class of Embedded System Design and Modeling (ESDM), taught during the 2020-2021 academic year at ETTI, TUIASI. Course materials — lectures: all lecture slides are available in the lectures folder; the annotated lectures (written on during the classes) are available in the lectures_online folder, with files named chapter_annotated.pdf (there should be 7 files, from 01 to 07). Laboratory: the laboratory files are available in the labs folder. Projects: the project descriptions are available in the projects folder. Evaluation — exam: the written exam consists of a practical exercise (draw the implementation of an FSM from a few requirements, approx. 50% of the points) and a few theoretical questions (the rest of the points). For the years up to 2021 there was a fixed list of possible subjects; this is not provided for 2021, since the exam is online. The exercise and the theoretical questions may change from the previous years, though you can use them as examples. Grades: grades will be available only on the faculty's e-learning platform (edu.etti.tuiasi.ro). Have fun! This work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/). | os |
|
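The ESDM exam's FSM exercise is about turning requirements into states, transitions, and outputs. A minimal executable sketch of a Moore-style FSM in Python — the rising-edge-detector spec below is an invented example, not one of the course's actual subjects:

```python
# Moore FSM that flags a rising edge (0 -> 1) on a binary input:
# the output is 1 for exactly one step after the edge occurs.
TRANSITIONS = {
    # (state, input) -> next state
    ("LOW", 0): "LOW",
    ("LOW", 1): "EDGE",
    ("EDGE", 0): "LOW",
    ("EDGE", 1): "HIGH",
    ("HIGH", 0): "LOW",
    ("HIGH", 1): "HIGH",
}
OUTPUT = {"LOW": 0, "EDGE": 1, "HIGH": 0}  # Moore: output depends on state only

def run_fsm(inputs, state="LOW"):
    outputs = []
    for bit in inputs:
        state = TRANSITIONS[(state, bit)]
        outputs.append(OUTPUT[state])
    return outputs

print(run_fsm([0, 1, 1, 0, 1]))  # → [0, 1, 0, 0, 1]  (edges at steps 1 and 4)
```

The transition table maps one-to-one onto a state diagram, which is exactly the kind of drawing the exam exercise asks for.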
BathiyaHelloWorldLabs | Embedded system design, tutorial 1. | os |
|
Fire-Smoke-Detection | Fire and smoke detection: detecting fire and smoke using computer vision (OpenCV) and PyTorch. Early fire/smoke detection plays a very important role in protecting lives; property loss can also be reduced, and operational downtime minimized, through early detection. Therefore, in this project I have developed a computer vision / deep learning pipeline for fire and smoke detection. Demo output: https://github.com/imsaksham-c/Fire-Smoke-Detection/blob/master/utils/demo.gif. Download the dataset: https://github.com/DeepQuestAI/Fire-Smoke-Dataset/releases/download/v1/FIRE-SMOKE-DATASET.zip. Dataset folders: train (fire, neutral, smoke) and test (fire, neutral, smoke); the dataset contains 1000 images of each class. Model structure: for training the model I have used the transfer-learning technique; the architecture used here is ResNet50, pretrained on the ImageNet dataset. I have achieved a validation accuracy of 93% using ResNet; for more info about training and graphs, open training.ipynb. Training loss: utils/trainloss.png; model accuracy: utils/accuracy.png. Sample results: utils/fire.png, utils/smoke.png, utils/neutral.png. Steps: 1. clone/download the repo; 2. download the dataset; 3. for training, open training.ipynb; 4. for inference, open inference.ipynb. Requirements: Python 3, PyTorch, OpenCV, matplotlib, NumPy. Upcoming work: a REST API using Flask. References: 1. PyImageSearch (https://www.pyimagesearch.com/2019/11/18/fire-and-smoke-detection-with-keras-and-deep-learning/); 2. DeepQuestAI Fire-Smoke-Dataset (https://github.com/DeepQuestAI/Fire-Smoke-Dataset). | ai |
|
computer_vision | Computer vision: some computer vision tutorials for my articles — "Feature extraction and similar image search with OpenCV for newbies" (https://medium.com/machine-learning-world/feature-extraction-and-similar-image-search-with-opencv-for-newbies-3c59796bf774) and "Shape context descriptor and fast characters recognition" (https://medium.com/machine-learning-world/shape-context-descriptor-and-fast-characters-recognition-c031eac726f9). Online course: object detection with PyTorch — subscribe to my new online course at http://learnml.today. Subscribe to our machine learning blog (https://medium.com/machine-learning-world) and Telegram channel (https://t.me/ml_world). Support: besides work, I'm trying to help homeless animals, so if you like my work you can support me: https://www.patreon.com/uah. | ai |
|
tuya-iot-python-sdk | Tuya IoT Python SDK. (Badges: PyPI version, PyPI downloads, and PyPI Python version, via https://img.shields.io/pypi/v/tuya-iot-py-sdk and related shields.) A Python SDK for the Tuya Open API, which provides basic IoT capabilities like device management and asset management, as well as industry capabilities, helping you create IoT solutions with diversified devices and industries. The Tuya IoT Development Platform opens up basic IoT capabilities like device management, AI scenarios, and data analytics services, as well as industry capabilities, helping you create IoT solutions. Features — base APIs: TuyaOpenAPI (connect, is_connect, get, post, put, delete); TuyaOpenMQ (start, stop, add_message_listener, remove_message_listener). APIs — TuyaDeviceListener: update_device, add_device, remove_device. Device control — TuyaDeviceManager: update_device_list_in_smart_home, update_device_caches, update_device_function_cache, add_device_listener, remove_device_listener, get_device_info, get_device_list_info, remove_device, remove_device_list, get_factory_info, factory_reset, get_device_status, get_device_list_status, get_device_functions, get_category_functions, get_device_specification, send_commands. Home — TuyaHomeManager: update_device_cache, query_scenes, trigger_scene, query_infrared_devices, trigger_infrared_commands. Assets — TuyaAssetManager: get_device_list, get_asset_info, get_asset_list. Possible scenarios: the Home Assistant Tuya plugin (https://github.com/tuya/tuya-home-assistant); tuya-connector-python (https://github.com/tuya/tuya-connector-python); the FHEM Tuya plugin by fhempy (https://github.com/dominikkarall/fhempy/tree/master/FHEM/bindings/python/fhempy/lib/tuya_cloud). Prerequisite — registration: please check the Tuya IoT Platform Configuration Guide (https://developer.tuya.com/en/docs/iot/Platform_Configuration_smarthome?id=Kamcgamwoevrx) to register an account on the Tuya IoT Platform (https://iot.tuya.com?source=github) and get the required information. You need to create a cloud project and complete the configuration of asset, user, and application; then you will get the username, password, access_id, and access_secret. Usage — installation: `pip3 install tuya-iot-py-sdk`. Sample code: OpenAPI sample (https://github.com/tuya/tuya-iot-python-sdk/blob/master/example/device.py); Open IoT Hub sample (https://github.com/tuya/tuya-iot-python-sdk/blob/master/example/mq.py). Tuya Open API reference: Tuya opens up a variety of APIs covering business scenarios such as device pairing, smart home management, device control, and scene automation. You can call APIs according to the API integration documents to implement applications; for more information, see the documentation (https://developer.tuya.com/en/docs/cloud/?source=github) and the API reference (https://developer.tuya.com/docs/iot/open-api/api-reference/api-reference). Issue feedback: you can provide feedback on your issue via a GitHub issue or a technical ticket (https://service.console.tuya.com). License: tuya-iot-py-sdk is available under the MIT license; please see the LICENSE file for more info. | python iot sdk tuya | server |
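The access_id/access_secret pair mentioned above is used for HMAC-SHA256 request signing against the Tuya Open API. As a rough, self-contained illustration of that style of signing (a sketch only — take the exact string-to-sign from Tuya's signature documentation, and note the credential values here are placeholders):

```python
import hashlib
import hmac
import time

def sign_request(access_id: str, access_secret: str, t: str) -> str:
    """Toy token-request signature: HMAC-SHA256 over (access_id + timestamp),
    keyed by the secret, as an uppercase hex digest."""
    message = access_id + t
    return hmac.new(
        access_secret.encode(), message.encode(), hashlib.sha256
    ).hexdigest().upper()

t = str(int(time.time() * 1000))  # millisecond timestamp, as the API expects
sig = sign_request("my_access_id", "my_access_secret", t)
print(len(sig))  # → 64 (a SHA-256 hex digest)
```

In the real SDK this bookkeeping is hidden behind `TuyaOpenAPI.connect()`; the sketch only shows why both the id and the secret are needed.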
IUBH.TOR | Bachelor's thesis: this repository contains the source code (https://github.com/aspnetde/iubh-tor/tree/master/src) of IUBH TOR, the application built for my bachelor's thesis, along with the LaTeX source (https://github.com/aspnetde/iubh-tor/tree/master/thesis) of the thesis itself (https://github.com/aspnetde/iubh-tor/raw/master/compiled/thesis.pdf): "Suitability of Functional Programming for the Development of Mobile Applications: A Comparison of F# and C# Involving Xamarin". Development notes — hot reload for the C# app: HotReload (https://github.com/AndreiMisiukevich/HotReload) can be used by running `mono Xamarin.Forms.HotReload.Observer.exe -u=http://127.0.0.1:8000`; when testing on Android: `cd ~/Library/Developer/Xamarin/android-sdk-macosx` then `adb forward tcp:8000 tcp:8000`. Hot reload for the F# app: it's called Live Update in the speak of Fabulous (see detailed information here: https://fsprojects.github.io/Fabulous/Fabulous.XamarinForms/tools.html): `fabulous --watch --send`; when testing on Android over USB: `adb -d forward tcp:9867 tcp:9867`; on the emulator: `adb -e forward tcp:9867 tcp:9867`. Simulating background fetches on Android: `adb shell cmd jobscheduler run -f de.iubh.tor 69`. Setting up IUBH TOR Server: both apps are intended to work with the real CARE system; however, IUBH TOR Server provides a fast and reliable test environment that behaves exactly as the original CARE system does. This means it expects the credentials in the same shape as the original login webpage, and it returns an HTML file that contains the transcript of records in the same style as CARE does. To set it up, run `npm init`; to run it, use `node index.js`. There's also a Postman config located in /testdata. | fsharp xamarin csharp ios android functional-programming oop object-oriented-programming fabulous | front_end |
EE712_embedded | tiva ee712 embedded system design lab experiment | os |
|
lws | view on npm https badgen net npm v lws https www npmjs org package lws npm module downloads https badgen net npm dt lws https www npmjs org package lws gihub repo dependents https badgen net github dependents repo lwsjs lws https github com lwsjs lws network dependents dependent type repository gihub package dependents https badgen net github dependents pkg lwsjs lws https github com lwsjs lws network dependents dependent type package node js ci https github com lwsjs lws actions workflows node js yml badge svg https github com lwsjs lws actions workflows node js yml coverage status https coveralls io repos github lwsjs lws badge svg https coveralls io github lwsjs lws js standard style https img shields io badge code 20style standard brightgreen svg https github com feross standard lws a lean modular web server for rapid full stack development lws is an application core for quickly launching a local web server behaviour is added via plugins giving you full control over how requests are processed and responses created supports http https and http2 small and 100 personalisable load and use only the behaviour required by your project attach a custom view to personalise how activity is visualised programmatic and command line apis synopsis core usage launch an http server on the default port of 8000 lws listening at http mba4 local 8000 http 127 0 0 1 8000 http 192 168 0 200 8000 for https or http2 pass the https or http2 flags respectively lws http2 listening at https mba4 local 8000 https 127 0 0 1 8000 https 192 168 0 200 8000 now your server is running the next step is to attach some middleware to process requests using middleware plugins install and use some middleware lws static https github com lwsjs static and lws index https github com lwsjs index to serve static files and directory listings npm install save dev lws static lws index lws stack lws static lws index listening at http mba4 local 8000 http 127 0 0 1 8000 http 192 168 0 200 8000 the current 
directory will now be available to explore at http 127 0 0 1 8000 install and use logging middleware note the lws prefix is optional when supplying module names to stack npm install save dev lws log lws stack log static index log format combined listening at http mba4 local 8000 http 127 0 0 1 8000 http 192 168 0 200 8000 ffff 127 0 0 1 get lws config js http 1 1 200 52 8 259 ms ffff 127 0 0 1 get package json http 1 1 200 399 1 478 ms creating a custom middleware plugin lws uses koa https github com koajs koa as its middleware engine here is a trivial plugin example save the following code as example middleware js js class exampleplugin middleware return async ctx next ctx body hello from lws await next export default exampleplugin now launch an http server using this middleware lws stack example middleware js listening at http mba4 local 8000 http 127 0 0 1 8000 http 192 168 0 200 8000 curl http 127 0 0 1 8000 hello from lws install npm install save dev lws documentation api reference lws https github com lwsjs lws blob master doc lws md middleware plugin https github com lwsjs lws blob master doc middleware plugin md view plugin https github com lwsjs lws blob master doc view plugin md see also lws plugin list https npms io search q keywords 3alws middleware local web server https github com lwsjs local web server an lws distribution with the most common plugins already installed copy 2016 22 lloyd brookes 75pound gmail com tested by test runner https github com test runner js test runner documented by jsdoc to markdown https github com jsdoc2md jsdoc to markdown | lws backend webapp server http-server development-tools | front_end |
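The lws README above explains that middleware plugins form a koa-style chain, where each middleware receives a request context and a `next` callable that invokes the rest of the chain. lws itself is JavaScript; the sketch below is an illustrative Python analogue of that composition pattern (the `compose`, `logger`, and `hello` names are hypothetical, not lws APIs):

```python
import asyncio

# Illustrative sketch (not lws's actual code): koa-style middleware
# composition. Each middleware gets a context dict and a `next` callable
# that runs the remainder of the chain, so work can happen both before
# and after downstream middleware.
def compose(middlewares):
    async def run(ctx):
        async def dispatch(i):
            if i < len(middlewares):
                await middlewares[i](ctx, lambda: dispatch(i + 1))
        await dispatch(0)
    return run

async def logger(ctx, next):
    ctx.setdefault("log", []).append("request in")
    await next()
    ctx["log"].append("response out")

async def hello(ctx, next):
    ctx["body"] = "hello from lws"
    await next()

ctx = {}
asyncio.run(compose([logger, hello])(ctx))
print(ctx["body"])  # -> hello from lws
```

Note how `logger` observes the response *after* `hello` has run, which is exactly why middleware ordering matters in the `lws --stack log static index` example.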
nlp4kor | logo https github com bage79 nlp4kor raw master ipynb img nlp4kor png nlp4kor natural language processing for korean with deep learning http github com bage79 nlp4kor https www facebook com nlp4kor posts 143362066206127 nlp4kor gmail com install md https github com bage79 nlp4kor blob master install md license md https github com bage79 nlp4kor blob master license md https www youtube com playlist list ple ylep kqefhfsnh16hjknq6stig05fu season 1 nlp with tensorflow http github com bage79 nlp4kor tensorflow 1 cnn for mnist 2017 05 20 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb cnn for mnist ipynb https github com bage79 nlp4kor tensorflow blob master nlp4kor tensorflow cnn cnn mnist py 2 ffnn for 2017 06 03 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb ffnn for word spacing ipynb https github com bage79 nlp4kor tensorflow blob master nlp4kor tensorflow ffnn word spacing py 3 dae for 2017 07 01 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb dae for spelling error correction ipynb https github com bage79 nlp4kor tensorflow blob master nlp4kor tensorflow dae spelling error correction py 4 dl for nlp a to z 2017 07 15 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb dl for nlp a to z pdf 5 real tensorflow coding 2017 07 29 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb real tensorflow coding ipynb season 2 deep learning 6 hyper parameters 2017 10 08 https github com bage79 nlp4kor blob master nlp4kor skeletons learn math functions demo py https github com bage79 nlp4kor tensorflow blob master nlp4kor tensorflow skeletons learn math functions demo py 7 rnn 2017 12 24 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb rnn for beginners pdf 8 cnn 2017 12 27 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb cnn for beginners pdf 9 nn 2018 01 02 http nbviewer jupyter org github bage79 nlp4kor blob master ipynb nn for beginners pdf season 3 
nlp with pytorch http github com bage79 nlp4kor pytorch 10 word2vec 2018 05 08 https github com bage79 nlp4kor pytorch blob master nlp4kor pytorch word2vec readme md | pytorch korean word2vec natural-language-processing tensorflow deep-learning | ai |
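Season 3 above covers word2vec; the first step of skip-gram training is extracting (center, context) pairs from a token window. A minimal illustrative sketch (not the repo's code; the `skipgram_pairs` helper is hypothetical):

```python
# Hedged sketch of skip-gram (center, context) pair extraction, the
# first step of word2vec training. For each token, every other token
# within `window` positions becomes a context word.
def skipgram_pairs(tokens, window=2):
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["nlp", "for", "korean"], window=1))
# -> [('nlp', 'for'), ('for', 'nlp'), ('for', 'korean'), ('korean', 'for')]
```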
GoStack-Level-2 | development of backend configure a project from scratch using node js and typescript including tools like ts node dev eslint prettier editorconfig continue development of app gobarber and introduce concepts of authentication autorization and database using docker database postgres typeorm multer jwt json web token | server |
|
vision | vision this is a simple vehile detection project using computer vision here i used background subtractions methods of opencv library of python and some morphological transformation for accuracy but this is for only static cameras at the corner of the video you can see the count of the vehicles which gets recorded when they cross a predefined limit for the calculation of the object coordinates and object ids i defined a class called vehicles py for running it just download all files and run the main py thank you | ai |
|
1718v-public | 1718v public web application development spring 2018 samples https github com isel leic daw 1718v public tree master samples folder contains code samples typically prepared before classes demos https github com isel leic daw 1718v public tree master demos folder contains demos done during the classes see the wiki https github com isel leic daw 1718v public wiki for more information | front_end |
|
second-semester-exam-project | second semester exam project altschool of cloud engineering 2nd semester exam submission github repo | cloud |
|
nano-node | p style text align center img src images logo svg width 300px height auto alt logo p live artifacts https github com nanocurrency nano node workflows live badge svg https github com nanocurrency nano node actions query workflow 3alive beta artifacts https github com nanocurrency nano node workflows beta badge svg https github com nanocurrency nano node actions query workflow 3abeta github release latest by date https img shields io github v release nanocurrency nano node https github com nanocurrency nano node releases latest github tag latest by date https img shields io github v tag nanocurrency nano node color darkblue label beta https github com nanocurrency nano node tags coverage status https coveralls io repos github nanocurrency nano node badge svg branch develop https coveralls io github nanocurrency nano node branch develop tests https github com nanocurrency nano node workflows tests badge svg https github com nanocurrency nano node actions query workflow 3atests relwithdebug tests https github com nanocurrency nano node workflows release 20tests badge svg https github com nanocurrency nano node actions query workflow 3a 22release tests 22 discord https img shields io badge discord join 20chat orange svg https chat nano org what is nano nano is a digital payment protocol designed to be accessible and lightweight with a focus on removing inefficiencies present in other cryptocurrencies with ultrafast transactions and zero fees on a secure green and decentralized network this makes nano ideal for everyday transactions guides documentation whitepaper https nano org en whitepaper running a node https docs nano org running a node overview integration guides https docs nano org integration guides the basics command line interface https docs nano org commands command line interface rpc protocol https docs nano org commands rpc protocol other documentation details can be found at https docs nano org links resources nano website https nano org 
documentation https docs nano org discord chat https chat nano org reddit https reddit com r nanocurrency medium https medium com nanocurrency twitter https twitter com nano want to contribute please see the contributors guide https docs nano org node implementation contributing contact us we want to hear about any trouble success delight or pain you experience when using nano let us know by filing an issue https github com nanocurrency nano node issues joining us on reddit https reddit com r nanocurrency or joining us on discord https chat nano org | nano cryptocurrency cryptocurrencies nanocurrency blockchain | blockchain |
LUAnity | luanity luanity unity lua 3d luanity is a solution for mobile game development using lua in unity it has been verified in onlined 3d mobile games luanity luainterface unity unity editor lua luainterface monoluainterface https github com stevedonovan monoluainterface nlua https github com nlua nlua arm64 luajit luanity is based on luainterface and integrated into unity seamlessly it makes lua codes running perfectly in unity editor and mobile devices to control your games we refactored and optimized the luainterface codes and made a few references to monoluainterface https github com stevedonovan monoluainterface and nlua https github com nlua nlua and added some functions i e arm64 luajit support for mobile game development as well luanity luanity in the meantime luanity is also some principles for code design and writing it s highly recommended to follow these principles to achieve the best results features unity lua c coding in lua c seamlessly in unity to control anything you want c lua c importing c classes functions to lua and using them at any time with no c code generated best for code updating lua luajit running in raw lua vm with great performance or even in luajit lua call stack handling errors and tracking lua call stack android ios 32 64 supporting android and ios armv7 arm64 ugui ngui working perfectly with ugui ngui and etc requires lua 5 1 x or higher 5 1 5 by default or luajit 2 0 3 for android and 2 1 0 for ios unity 4 6 or higher 5 4 by default suported platforms android armv7 arm64 ios armv7 arm64 why do we prefer dynamic binding using reflection rather than static binding unity lua luainterface c unity mono lua lua unity mono c c c luanity unity lua 1 c 2 lua protocol buffer 3 lua c lua 1 upvalue metatable index luanity 3d 100 lua c c lua luanity enjoy and have fun unity lua the following are some unity lua solutions in the last few years fyi nlua https github com nlua nlua luainterface unity unity3d nlua https github com mervill 
unity3d nlua nlua for unity https www assetstore unity3d com cn content 17389 ulua https www assetstore unity3d com en content 13887 unity lua luainterface luajit ulua luainterface unity ulua org ulua http ulua org download html ulua http www ceeger com forum read php tid 16483 ulua v1 08 luainterface cstolua cstolua https github com topameng cstolua tolua https github com topameng tolua tolua unity c lua slua https github com pangweiwei slua unity c lua tolua p | front_end |
|
evaluating_bdl | evaluating bdl overview image evaluating bdl png official implementation pytorch of the paper evaluating scalable bayesian deep learning methods for robust computer vision cvpr workshops 2020 arxiv https arxiv org abs 1906 01620 project http www fregu856 com publication evaluating bdl fredrik k gustafsson http www fregu856 com martin danelljan https martin danelljan github io thomas b sch n http user it uu se thosc112 we propose a comprehensive evaluation framework for scalable epistemic uncertainty estimation methods in deep learning it is specifically designed to test the robustness required in real world computer vision applications we also apply our proposed framework to provide the first properly extensive and conclusive comparison of the two current state of the art scalable methods ensembling and mc dropout our comparison demonstrates that ensembling consistently provides more reliable and practically useful uncertainty estimates youtube video https youtu be cabpvqtzsoi with qualitative results demo video with qualitative results https img youtube com vi cabpvqtzsoi 0 jpg https youtu be cabpvqtzsoi if you find this work useful please consider citing inproceedings gustafsson2020evaluating title evaluating scalable bayesian deep learning methods for robust computer vision author gustafsson fredrik k and danelljan martin and sch o n thomas b booktitle proceedings of the ieee cvf conference on computer vision and pattern recognition cvpr workshops year 2020 acknowledgements the depthcompletion code is based on the implementation by fangchangma https github com fangchangma found here https github com fangchangma self supervised depth completion the segmentation code is based on the implementation by pkurainbow https github com pkurainbow found here https github com pkurainbow ocnet pytorch which in turn utilizes inplace abn https github com mapillary inplace abn by mapillary https github com mapillary index usage usage depthcompletion 
depthcompletion segmentation segmentation toyregression toyregression toyclassification toyclassification documentation documentation depthcompletion documentationdepthcompletion segmentation documentationsegmentation toyregression documentationtoyregression toyclassification documentationtoyclassification pretrained models pretrained models usage the code has been tested on ubuntu 16 04 docker images are provided see below depthcompletion depthcompletion segmentation segmentation toyregression toyregression toyclassification toyclassification depthcompletion sudo docker pull fregu856 evaluating bdl pytorch pytorch 0 4 cuda9 cudnn7 evaluating bdl create start docker image toyproblems depthcompletion sh containing my username on the server is fregu482 i e my home folder is home fregu482 you will have to modify this accordingly bin bash default values gpuids 0 name toyproblems depthcompletion gpu nv gpu gpuids nvidia docker run it rm shm size 12g p 5700 5700 name name 0 v home fregu482 root fregu856 evaluating bdl pytorch pytorch 0 4 cuda9 cudnn7 evaluating bdl bash inside the image root will now be mapped to home fregu482 i e cd takes you to the regular home folder to create more containers change the lines gpuids 0 name name 0 and p 5700 5700 general docker usage to start the image sudo sh start docker image toyproblems depthcompletion sh to commit changes to the image open a new terminal window sudo docker commit toyproblems depthcompletion gpu0 fregu856 evaluating bdl pytorch pytorch 0 4 cuda9 cudnn7 evaluating bdl to exit the image without killing running code ctrl p q to get back into a running image sudo docker attach toyproblems depthcompletion gpu0 download the kitti depth completion http www cvlibs net datasets kitti eval depth php benchmark depth completion dataset data depth annotated zip data depth selection zip and data depth velodyne zip and place it in root data kitti depth root data kitti depth should contain the folders train val and depth selection 
create root data kitti raw and download the kitti raw http www cvlibs net datasets kitti raw data php dataset using download kitti raw py https github com fregu856 evaluating bdl blob master depthcompletion utils download kitti raw py create root data kitti rgb for each folder in root data kitti depth train e g 2011 09 26 drive 0001 sync copy the corresponding folder in root data kitti raw and place it in root data kitti rgb train download the virtual kitti https europe naverlabs com research computer vision proxy virtual worlds dataset vkitti 1 3 1 depthgt tar and vkitti 1 3 1 rgb tar and place in root data virtualkitti root data virtualkitti should contain the folders vkitti 1 3 1 depthgt and vkitti 1 3 1 rgb example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl depthcompletion ensembling train virtual py segmentation sudo docker pull fregu856 evaluating bdl rainbowsecret pytorch04 20180905 evaluating bdl create start docker image segmentation sh containing my username on the server is fregu482 i e my home folder is home fregu482 you will have to modify this accordingly bin bash default values gpuids 0 1 name segmentation gpu nv gpu gpuids nvidia docker run it rm shm size 12g p 5900 5900 name name 01 v home fregu482 home fregu856 evaluating bdl rainbowsecret pytorch04 20180905 evaluating bdl bash inside the image home will now be mapped to home fregu482 i e cd home takes you to the regular home folder to create more containers change the lines gpuids 0 1 name name 01 and p 5900 5900 general docker usage to start the image sudo sh start docker image segmentation sh to commit changes to the image open a new terminal window sudo docker commit segmentation gpu01 fregu856 evaluating bdl rainbowsecret pytorch04 20180905 evaluating bdl to exit the image without killing running code ctrl p q to get back into a running image sudo docker attach segmentation gpu01 download resnet101 imagenet pth from here http sceneparsing csail 
mit edu model pretrained resnet resnet101 imagenet pth and place it in segmentation download the cityscapes https www cityscapes dataset com dataset and place it in home data cityscapes home data cityscapes should contain the folders leftimg8bit and gtfine download the synscapes https 7dlabs com synscapes overview dataset and place it in home data synscapes home data synscapes should contain the folder img which in turn should contain the folders rgb 2k and class run segmentation utils preprocess synscapes py this will among other things create home data synscapes meta train img ids pkl and home data synscapes meta val img ids pkl by randomly selecting subsets of examples the ones used in the paper are found in segmentation lists synscapes example usage sudo sh start docker image segmentation sh cd home root miniconda3 bin python evaluating bdl segmentation ensembling train syn py toyregression example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl toyregression ensemble adam train py toyclassification example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl toyclassification ensemble adam train py documentation depthcompletion documentationdepthcompletion segmentation documentationsegmentation toyregression documentationtoyregression toyclassification documentationtoyclassification documentation depthcompletion example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl depthcompletion ensembling train virtual py criterion py definitions of losses and metrics datasets py definitions of datasets for kitti depth completion kitti and virtualkitti model py definition of the cnn model mcdropout py definition of the cnn with inserted dropout layers ensembling train py code for training m model py models on kitti train ensembling train virtual py as above but on virtualkitti train ensembling eval py computes the loss and rmse for a trained ensemble on 
kitti val also creates visualization images of the input data ground truth prediction and the estimated uncertainty ensembling eval virtual py as above but on virtualkitti val ensembling eval auce py computes the auce mean std for m 1 2 4 8 16 32 on kitti val based on a total of 33 trained ensemble members also creates calibration plots ensembling eval auce virtual py as above but on virtualkitti val ensembling eval ause py computes the ause mean std for m 1 2 4 8 16 32 on kitti val based on a total of 33 trained ensemble members also creates sparsification plots and sparsification error curves ensembling eval ause virtual py as above but on virtualkitti val ensembling eval seq py creates visualization videos input data ground truth prediction and the estimated uncertainty for a trained ensemble on all sequences in kitti val ensembling eval seq virtual py as above but on all sequences in virtualkitti val mcdropout train py code for training m model mcdropout py models on kitti train mcdropout train virtual py as above but on virtualkitti train mcdropout eval py computes the loss and rmse for a trained mc dropout model with m forward passes on kitti val also creates visualization images of the input data ground truth prediction and the estimated uncertainty mcdropout eval virtual py as above but on virtualkitti val mcdropout eval auce py computes the auce mean std for m 1 2 4 8 16 32 forward passes on kitti val based on a total of 16 trained mc dropout models also creates calibration plots mcdropout eval auce virtual py as above but on virtualkitti val mcdropout eval ause py computes the ause mean std for m 1 2 4 8 16 32 forward passes on kitti val based on a total of 16 trained mc dropout models also creates sparsification plots and sparsification error curves mcdropout eval ause virtual py as above but on virtualkitti val mcdropout eval seq py creates visualization videos input data ground truth prediction and the estimated uncertainty for a trained mc dropout 
model with m forward passes on all sequences in kitti val mcdropout eval seq virtual py as above but on all sequences in virtualkitti val documentation segmentation example usage sudo sh start docker image segmentation sh cd home root miniconda3 bin python evaluating bdl segmentation ensembling train syn py models model py definition of the cnn model mcdropout py definition of the cnn with inserted dropout layers aspp py definition of the aspp module resnet block py definition of a resnet block utils criterion py definition of the cross entropy loss preprocess synscapes py creates the synscapes train val dataset by randomly selecting a subset of 2975 500 examples and resizes the labels to 1024 x 2048 utils py helper functions for evaluation and visualization datasets py definitions of datasets for cityscapes and synscapes ensembling train py code for training m model py models on cityscapes train ensembling train syn py as above but on synscapes train ensembling eval py computes the miou for a trained ensemble on cityscapes val also creates visualization images of the input image ground truth prediction and the estimated uncertainty ensembling eval syn py as above but on synscapes val ensembling eval ause ece py computes the ause mean std and ece mean std for m 1 2 4 8 16 on cityscapes val based on a total of 26 trained ensemble members also creates sparsification plots sparsification error curves and reliability diagrams ensembling eval ause ece syn py as above but on synscapes val ensembling eval seq py creates visualization videos input image prediction and the estimated uncertainty for a trained ensemble on the three demo sequences in cityscapes ensembling eval seq syn py creates a visualization video input image ground truth prediction and the estimated uncertainty for a trained ensemble showing the 30 first images in synscapes val mcdropout train py code for training m model mcdropout py models on cityscapes train mcdropout train syn py as above but on 
synscapes train mcdropout eval py computes the miou for a trained mc dropout model with m forward passes on cityscapes val also creates visualization images of the input image ground truth prediction and the estimated uncertainty mcdropout eval syn py as above but on synscapes val mcdropout eval ause ece py computes the ause mean std and ece mean std for m 1 2 4 8 16 forward passes on cityscapes val based on a total of 8 trained mc dropout models also creates sparsification plots sparsification error curves and reliability diagrams mcdropout eval ause ece syn py as above but on synscapes val mcdropout eval seq py creates visualization videos input image prediction and the estimated uncertainty for a trained mc dropout model with m forward passes on the three demo sequences in cityscapes mcdropout eval seq syn py creates a visualization video input image ground truth prediction and the estimated uncertainty for a trained mc dropout model with m forward passes showing the 30 first images in synscapes val documentation toyregression example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl toyregression ensemble adam train py ensemble adam ensembling by minimizing the mle objective using adam and random initialization datasets py definition of the training dataset model py definition of the feed forward neural network train py code for training m models eval py creates a plot of the obtained predictive distribution and the hmc ground truth predictive distribution for a set value of m also creates histograms for the model parameters eval plots py creates plots of the obtained predictive distributions for different values of m eval kl div py computes the kl divergence between the obtained predictive distribution and the hmc ground truth for different values of m ensemble map adam ensembling by minimizing the map objective using adam and random initialization ensemble map adam fixed ensembling by minimizing the map objective using 
adam and no random initialization ensemble map sgd ensembling by minimizing the map objective using sgd and random initialization ensemble map sgdmom ensembling by minimizing the map objective using sgdmom and random initialization mc dropout map 02 adam mc dropout by minimizing the map objective using adam p 0 2 mc dropout map 02 sgd mc dropout by minimizing the map objective using sgd p 0 2 mc dropout map 02 sgdmom mc dropout by minimizing the map objective using sgdmom p 0 2 sgld 256 implementation of sgld trained for 256 times longer than each member of an ensemble sgld 64 implementation of sgld trained for 64 times longer than each member of an ensemble sghmc 256 implementation of sghmc trained for 256 times longer than each member of an ensemble sghmc 64 implementation of sghmc trained for 64 times longer than each member of an ensemble hmc implementation of hmc using pyro http pyro ai deterministic implementation of a fully deterministic model i e direct regression documentation toyclassification example usage sudo sh start docker image toyproblems depthcompletion sh cd python evaluating bdl toyclassification ensemble adam train py ensemble adam ensembling by minimizing the mle objective using adam and random initialization datasets py definition of the training dataset model py definition of the feed forward neural network train py code for training m models eval py creates a plot of the obtained predictive distribution and the hmc ground truth predictive distribution for a set value of m also creates histograms for the model parameters eval plots py creates plots of the obtained predictive distributions for different values of m eval kl div py computes the kl divergence between the obtained predictive distribution and the hmc ground truth for different values of m ensemble adam fixed ensembling by minimizing the mle objective using adam and no random initialization ensemble map adam ensembling by minimizing the map objective using adam and random 
initialization ensemble map sgd ensembling by minimizing the map objective using sgd and random initialization ensemble map sgdmom ensembling by minimizing the map objective using sgdmom and random initialization mc dropout map 01 adam mc dropout by minimizing the map objective using adam p 0 1 mc dropout map 02 sgd mc dropout by minimizing the map objective using sgd p 0 2 mc dropout map 02 sgdmom mc dropout by minimizing the map objective using sgdmom p 0 2 sgld 256 implementation of sgld trained for 256 times longer than each member of an ensemble sgld 64 implementation of sgld trained for 64 times longer than each member of an ensemble sghmc 256 implementation of sghmc trained for 256 times longer than each member of an ensemble sghmc 64 implementation of sghmc trained for 64 times longer than each member of an ensemble hmc implementation of hmc using pyro http pyro ai pretrained models depthcompletion depthcompletion trained models ensembling virtual 0 checkpoint 40000 pth https drive google com open id 1dupl3nesxhrucgfs8r3vlvsy6j22nph obtained by running ensembling train virtual py depthcompletion trained models mcdropout virtual 0 checkpoint 40000 pth https drive google com open id 1qke3pw2jldxx4hn 4balyn2zyq1l2nzo obtained by running mcdropout train virtual py segmentation segmentation trained models ensembling 0 checkpoint 40000 pth https drive google com open id 1bg3xrsa26tcavrkmykbphvoierfn ybz obtained by running ensembling train py segmentation trained models ensembling syn 0 checkpoint 40000 pth https drive google com open id 1j8tibq8ycol qonodqiajw6he4yyajo obtained by running ensembling train syn py segmentation trained models mcdropout syn 0 checkpoint 60000 pth https drive google com open id 11jcmn62vliydwfnhik7pj52hna md0yl obtained by running mcdropout train syn py | bayesian-deep-learning uncertainty-estimation pytorch deep-learning computer-vision machine-learning autonomous-driving | ai |
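The ensembling approach evaluated throughout this README trains M models and combines their predictive distributions. For regression with Gaussian outputs N(mu_i, sigma_i^2), the standard recipe treats the ensemble as a uniform mixture; a minimal sketch with hypothetical values (the `ensemble_predict` helper is illustrative, not the repo's code):

```python
# Hedged sketch: combine M ensemble members' Gaussian predictions into a
# single predictive mean and variance, treating the ensemble as a
# uniform mixture. The total variance decomposes into the average
# per-model (aleatoric) variance plus the variance of the means
# (epistemic disagreement between members).
def ensemble_predict(mus, sigma2s):
    M = len(mus)
    mean = sum(mus) / M
    var = sum(sigma2s) / M + sum((m - mean) ** 2 for m in mus) / M
    return mean, var

mus = [1.0, 1.2, 0.8, 1.0]       # hypothetical per-model means
sigma2s = [0.1, 0.1, 0.2, 0.1]   # hypothetical per-model variances
mean, var = ensemble_predict(mus, sigma2s)
print(mean, var)  # mean -> 1.0
```

The epistemic term is what separates ensembling from a single deterministic model: when members disagree, the predicted variance grows even if each member is individually confident.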
lunar-lander | lunar lander embedded systems design game styled after the classic lunar lander https en wikipedia org wiki lunar lander 1979 video game usage hardware i o connections buttons pin jet button pe0 left button pe1 right button pe2 dac resistor value pin 32 r pb0 16 r pb1 8 r pb2 4 r pb3 2 r pb4 1 r pb5 st7735 screen pin backlight 3 3 v miso unconnected sck pa2 mosi pa5 tft cs pa3 ssi0fss card cs unconnected data command pa6 reset pa7 vcc 3 3 v gnd ground software create a makedefs file and declare twroot to be the path to tiva ware ti s useful bundle of resources and libraries for working with launchpad you may also specify a compiler prefix in this file by defining prefix twroot applications arm tivaware lines number 1576 1580 in the provided st7735 c file are unecessary and will prevent you from building you should delete them or comment them out run make make upload to build and flash the code to a connected board make silent will suppress warnings | os |
|
javafx-tasks | javafx task management sample a project to showcase custom list views using fxml files incorporating modern ui design systems sc1 jpg | os |
|
cleanvision | p align center img src https raw githubusercontent com cleanlab assets master cleanlab cleanvision logo open source transparent png width 50 height 50 p img width 1200 alt screen shot 2023 03 10 at 10 23 33 am src https user images githubusercontent com 10901697 224394144 bb0e1c85 6851 4828 bcd2 4ed234270a78 png cleanvision automatically detects potential issues in image datasets like images that are blurry under over exposed near duplicates etc this data centric ai package is a quick first step for any computer vision project to find problems in the dataset which you want to address before applying machine learning cleanvision is super simple run the same couple lines of python code to audit any image dataset read the docs https readthedocs org projects cleanvision badge version latest https cleanvision readthedocs io en latest pypi https img shields io pypi v cleanvision color blue https pypi org pypi cleanvision os https img shields io badge platform noarch lightgrey https pypi org pypi cleanvision py versions https img shields io badge python 3 7 2b blue https pypi org pypi cleanvision codecov https codecov io github cleanlab cleanvision branch main graph badge svg token y1n6mlun9h https codecov io gh cleanlab cleanvision slack community https img shields io static v1 logo slack style flat color white label slack message community https cleanlab ai slack twitter https img shields io twitter follow cleanlabai style social https twitter com cleanlabai cleanlab studio https raw githubusercontent com cleanlab assets master shields cl studio shield svg https cleanlab ai studio utm source github utm medium readme utm campaign clostostudio installation

```shell
pip install cleanvision
```

quickstart download an example dataset optional or just use any collection of image files you have

```shell
wget -nc https://cleanlab-public.s3.amazonaws.com/CleanVision/image_files.zip
```

1 run cleanvision to audit the images

```python
from cleanvision import Imagelab

# specify path to folder containing the image files in your dataset
imagelab = Imagelab(data_path="FOLDER_WITH_IMAGES")

# automatically check for a predefined list of issues within your dataset
imagelab.find_issues()

# produce a neat report of the issues found in your dataset
imagelab.report()
```

2 cleanvision diagnoses many types of issues but you can also check for only specific issues

```python
issue_types = {"dark": {}, "blurry": {}}
imagelab.find_issues(issue_types=issue_types)

# produce a report with only the specified issue types
imagelab.report(issue_types=issue_types)
```

more resources on how to use cleanvision tutorial https cleanvision readthedocs io en latest tutorials tutorial html run cleanvision on a huggingface dataset https cleanvision readthedocs io en latest tutorials huggingface dataset html run cleanvision on a torchvision dataset https cleanvision readthedocs io en latest tutorials torchvision dataset html example script https github com cleanlab cleanvision blob main docs source tutorials run py that can be run with python examples run py path folder with images additional example notebooks https github com cleanlab cleanvision examples documentation https cleanvision readthedocs io blog post https cleanlab ai blog cleanvision clean your data for better computer vision the quality of machine learning models hinges on the quality of the data used to train them but it is hard to manually identify all of the low quality data in a big dataset cleanvision helps you automatically identify common types of data issues lurking in image datasets this package currently detects issues in the raw images themselves making it a useful tool for any computer vision task such as classification segmentation object detection pose estimation keypoint detection generative modeling https openai com research dall e 2 pre training mitigations etc to detect issues in the labels of your image data you can instead use the cleanlab https github com cleanlab cleanlab package in any collection of image files most formats https
pillow readthedocs io en stable handbook image file formats html supported cleanvision can detect the following types of issues issue type description issue key example 1 exact duplicates images that are identical to each other exact duplicates https raw githubusercontent com cleanlab assets master cleanvision example issue images exact duplicates png 2 near duplicates images that are visually almost identical near duplicates https raw githubusercontent com cleanlab assets master cleanvision example issue images near duplicates png 3 blurry images where details are fuzzy out of focus blurry https raw githubusercontent com cleanlab assets master cleanvision example issue images blurry png 4 low information images lacking content little entropy in pixel values low information https raw githubusercontent com cleanlab assets master cleanvision example issue images low information png 5 dark irregularly dark images under exposed dark https raw githubusercontent com cleanlab assets master cleanvision example issue images dark jpg 6 light irregularly bright images over exposed light https raw githubusercontent com cleanlab assets master cleanvision example issue images light jpg 7 grayscale images lacking color grayscale https raw githubusercontent com cleanlab assets master cleanvision example issue images grayscale jpg 8 odd aspect ratio images with an unusual aspect ratio overly skinny wide odd aspect ratio https raw githubusercontent com cleanlab assets master cleanvision example issue images odd aspect ratio jpg 9 odd size images that are abnormally large or small odd size img src https raw githubusercontent com cleanlab assets master cleanvision example issue images odd size png width 20 height 20 cleanvision supports linux macos and windows and runs on python 3 7 join our community the best place to learn is our slack community https cleanlab ai slack join the discussion there to see how folks are using this library discuss upcoming features or ask for private 
support need professional help with cleanvision join our help slack channel https cleanlab ai slack and message us there or reach out via email team cleanlab ai interested in contributing see the contributing guide contributing md an easy starting point is to consider issues https github com cleanlab cleanvision labels good 20first 20issue marked good first issue or simply reach out in slack https cleanlab ai slack we welcome your help building a standard open source library for data centric computer vision ready to start adding your own code see the development guide development md have an issue search existing issues https github com cleanlab cleanvision issues q is 3aissue or submit a new issue https github com cleanlab cleanvision issues new choose have ideas for the future of data centric computer vision check out our active planned projects and what we could use your help with https github com cleanlab cleanvision projects license copyright c 2022 cleanlab inc cleanvision is free software you can redistribute it and or modify it under the terms of the gnu affero general public license as published by the free software foundation either version 3 of the license or at your option any later version cleanvision is distributed in the hope that it will be useful but without any warranty without even the implied warranty of merchantability or fitness for a particular purpose see gnu affero general public license https github com cleanlab cleanvision blob main license for details commercial licensing is available for enterprise teams that want to use cleanvision in production workflows but are unable to open source their code as is required by the current license https github com cleanlab cleanvision blob main license please email us team cleanlab ai issue https github com cleanlab cleanvision issues new | computer-vision data-centric-ai data-exploration data-quality data-validation deep-learning exploratory-data-analysis image-analysis image-classification 
image-generation image-quality image-segmentation data-profiling hacktoberfest | ai |
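several of the issue checks listed above (dark, light, low information) boil down to simple per image statistics. as a hedged illustration, here is a toy "dark image" check in plain python — the threshold, the pixel representation, and the function names are assumptions made for this sketch, not cleanvision's actual implementation.

```python
# Illustrative sketch only: flag images whose mean brightness falls below a
# threshold. Images are assumed to be 2-d lists of grayscale values in [0, 255].
# This is NOT cleanvision's real algorithm, just the underlying idea.

def mean_brightness(pixels):
    """Average pixel intensity over a 2-d grid of grayscale values."""
    values = [v for row in pixels for v in row]
    return sum(values) / len(values)

def find_dark_images(images, threshold=32.0):
    """Return names of images whose mean brightness is below the threshold."""
    return [name for name, px in images.items()
            if mean_brightness(px) < threshold]

images = {
    "night.png": [[5, 10], [8, 12]],        # very dark
    "normal.png": [[120, 130], [125, 128]],  # normally exposed
}
print(find_dark_images(images))  # -> ['night.png']
```

a real implementation would decode the files with an imaging library and normalize the statistic across image sizes; the point here is only that each issue type reduces to a score plus a cutoff.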
docker-iot-dashboard | dashboard example for internet of things iot this repository contains a complete example that grabs device data from iot network server stores it in a database and then displays the data using a web based dashboard you can set this up on a docker droplet from digital ocean https www digitalocean com or on a ubuntu vm from dreamcompute https www dreamhost com cloud computing or on a ubuntu docker vm from the microsoft azure store https portal azure com with minimal effort you should set up this service to run all the time to capture the data from your devices you then access the data at your convenience using a web browser table of contents markdownlint disable md033 markdownlint capture markdownlint disable toc depthfrom 2 updateonsave true introduction introduction definitions definitions security security user access user access to access mongodb externally to access mongodb externally assumptions assumptions composition and external ports composition and external ports data files data files reuse and removal of data files reuse and removal of data files node red and grafana examples node red and grafana examples connecting to influxdb from node red and grafana connecting to influxdb from node red and grafana logging in to grafana logging in to grafana data source settings in grafana data source settings in grafana mqtts examples mqtts examples integrating data normalization control dnc support integrating data normalization controldnc support what is dnc what is dnc advantages advantages application architecture application architecture dnc components dnc components plugins plugins dnc server architecture dnc server architecture setup instructions setup instructions influxdb backup and restore influxdb backup and restore release history release history meta meta toc markdownlint restore due to a bug in markdown toc the table is formatted incorrectly if tab indentation is set other than 4 due to another bug this comment must be after the 
toc entry introduction this setup md setup md explains the application server installation and its setup docker https docs docker com and docker compose https docs docker com compose are used to make the installation and setup easier this dashboard uses docker compose https docs docker com compose overview to set up a group of eight primary docker containers https www docker com backed by two auxiliary containers 1 an instance of nginx https www nginx com which proxies the other services handles access control gets ssl certificates from let s encrypt https letsencrypt org and faces the outside world 2 an instance of node red http nodered org which processes the data from the individual nodes and puts it into the database 3 an instance of influxdb https docs influxdata com influxdb which stores the data as time series measurements with tags and provides backup support for the databases 4 an instance of grafana http grafana org which gives a web based dashboard interface to the data 5 an instance of mqtt https mosquitto org which provides a lightweight method of carrying out messaging using a publishing subscribe model 6 an instance of apiserver https nodejs org en which runs the below apis dncserver it is a back end component of generic dnc developed on nodejs provides restful apis for generic dnc user interface grafana influx plugin and dnc standard plugin stores user data in mongodb and communicates with influxdb to get database name and measurement name dncgiplugin it is a backend component of generic dnc developed on nodejs provides restful api service for interfacing grafana ui with the generic dnc back end handles the influx query from grafana and replace the influx tags by the dnc tags server data then communicates with the influxdb finally send back the response to the grafana dncstdplugin it is a backend component of generic dnc developed on nodejs provides restful api service for customized excel plugin ui receives request from excel plugin and 
communicates with dnc server influxdb and send back response to the excel plugin ui component version it will list us version details of the system info and docker iot dashboard info 7 an instance of mongodb https www mongodb com mongodb is a document oriented nosql database used for high volume data storage instead of using tables and rows as in the traditional relational databases it makes use of collections and documents documents consist of key value pairs which are the basic unit of data in mongodb collection contains sets of documents and function which is the equivalent of relational database tables mongodb s data records are stored in json javascript object notation format dnc uses mongodb database system to store the data 8 an instance of expo https docs expo dev expo is a framework and a platform for universal react applications dnc user interface designed by using react native and native platforms expo runs dncui api which provides simple user interface to handle all dnc components the auxiliary containers are 1 postfix http www postfix org documentation html which if configured handles outbound mail services for the containers for now influxdb node red cron backup and grafana 2 cron backup cron backup which provides backup support for the nginx node red grafana mongodb and mqtts containers and pushed the backed up data to s3 compatible storage to make things more specific most of the description here assumes use of digital ocean however this was tested on ubuntu 20 04 with no issues apart from the additional complexity of setting up apt get to fetch docker and the need for a manual installation of docker compose on dream compute and on microsoft azure this will work on any linux or linux like platform that supports docker and docker compose note its likelihood of working with raspberry pi has not been tested as yet definitions the host system is the system that runs docker and docker compose a container is one of the virtual systems running under docker 
on the host system a file on the host is a file present on the host system typically not visible from within the container s a file in container x or a file in the x container is a file present in a file system associated with container x and typically not visible from the host system security all communication with the nginx server is encrypted using ssl with auto provisioned certificates from let s encrypt grafana is the primary point of access for most users and grafana s login is used for that purpose access to node red and influxdb is via special urls base node red and base influxdb 8086 where base is the url served by the nginx container these urls are protected via nginx htpasswd file entries these entries are files in the nginx container and must be manually edited by an administrator the initial administrator s login password for grafana must be initialized prior to starting it s stored in env when the grafana container is started for the first time it creates grafana db in the grafana container and stores the password at that time if grafana db already exists the password in grafana env is ignored note microsoft azure by default will not open any of the ports to the outside world so the user will need to open port 443 for ssl access to nginx for concreteness the following table assumes that base is dashboard example com user access to access open this link notes node red https dashboard example com node red port number is not needed and shouldn t be used note trailing after node red influxdb api queries https dashboard example com influxdb 8086 port number is needed also note trailing after influxdb grafana https dashboard example com port number is not needed and shouldn t be used mqtts wss dashboard example com mqtts mqtt client is needed to test it via mqtt web portal http tools emqx io apiserver dnc server https dashboard example com dncserver this api calls starts with the url dnc standard plugin https dashboard example com dncstdplugin this api 
calls starts with the url dnc grafana influx plugin https dashboard example com dncgiplugin this api calls starts with the url version info https dashboard example com version can view the version details on any web browser using the link expo https dashboard example com dncui port number is not needed and shouldn t be used to access mongodb externally via nginx ssl termination mode https docs nginx com nginx admin guide security controls terminating ssl tcp server dashboard example com port 27020 username iot dashboard mongo initdb root username password iot dashboard mongo initdb root password authentication db admin connect to database admin ssl support yes this can be visualized as shown in the figure below docker connection and user access connection architecture using ssh assets connection architecture png assumptions the host system must have docker compose version 1 9 or later for which https github com docker compose be aware that apt get normally doesn t grab this if configured at all it frequently gets an out of date version the environment variable iot dashboard data if set points to the common directory for the data if not set docker compose will quit at start up this is by design iot dashboard data node red will have the local node red data iot dashboard data influxdb will have the local influxdb data this should be backed up iot dashboard data grafana will have all the dashboards iot dashboard data docker nginx will have htpasswd credentials folder authdata and let s encrypt certs folder letsencrypt iot dashboard data mqtt credentials will have the user credentials iot dashboard data apiserver dncserver apiserver dncserver will have the source data required to run dncserver api iot dashboard data apiserver dncstdplugin apiserver dncstdplugin will have the source data required to run dncstdplugin api iot dashboard data apiserver dncgiplugin apiserver dncgiplugin will have the source data required to run dncgiplugin api iot dashboard data mongodb 
mongodb data will have local mongodb data this should be backed up iot dashboard data expo dncui expo dncui will have the source data required to run dncui api composition and external ports within the containers the individual programs use their usual ports but these are isolated from the outside world except as specified by docker compose yml file in docker compose yml the following ports on the docker host are connected to the individual programs nginx runs on 80 tcp and 443 tcp all connections to port 80 are redirected to 443 using ssl mqtts mosquitto runs on 443 tcp for mqtt over nginx proxy 8883 tcp for mqtt over tls ssl 8083 tcp for websockets over tls ssl 1883 tcp for mqtt over tcp protocol not secure disabled by default the below ports are exposed only for the inter container communication these ports can t be accessed by host system externally grafana runs on 3000 tcp influxdb runs on 8086 tcp node red runs on 1880 tcp postfix runs on 25 tcp apiserver runs on the below ports dnc server runs on 8891 tcp dnc std plugin runs on 8892 tcp dnc gi plugin runs on 8893 tcp expo runs on 19006 tcp mongodb runs on 27017 tcp remember if the server is running on a cloud platform like microsoft azure or aws one needs to check the firewall and confirm that the ports are open to the outside world data files when designing this collection of services there were two choices to store the data files we could keep them inside the docker containers or we could keep them in locations on the host system the advantage of the former is that everything is reset when the docker images are rebuilt the disadvantage of the former is that there is a possibility to lose all the data when it s rebuilt on the other hand there s another level of indirection when keeping things on the host as the files reside in different locations on the host and in the docker containers because iot data is generally persistent we decided that the extra level of indirection was required to help find things 
consult the following table data files are kept in the following locations by default component data file location on host location in container node red iot dashboard data node red data influxdb iot dashboard data influxdb var lib influxdb grafana iot dashboard data grafana var lib grafana mqtt iot dashboard data mqtt credentials etc mosquitto credentials nginx iot dashboard data docker nginx authdata etc nginx authdata let s encrypt certificates iot dashboard data docker nginx letsencrypt etc letsencrypt dncserver iot dashboard data apiserver dncserver apiserver dncserver dncgiplugin iot dashboard data apiserver dncgiplugin apiserver dncgiplugin dncstdplugin iot dashboard data apiserver dncstdplugin apiserver dncstdplugin mongodb iot dashboard data mongodb mongodb data data db expo iot dashboard data expo dncui expo dncui as shown one can easily change locations on the host e g for testing this can be done by setting the environment variable iot dashboard data to the absolute path with trailing slash to the containing directory prior to calling docker compose up the above paths are appended to the value of iot dashboard data directories are created as needed normally this is done by an appropriate setting in the env file consider the following example console grep iot dashboard data env iot dashboard data dashboard data docker compose up d in this case the data files are created in the following locations table data location examples component data file location node red dashboard data node red influxdb dashboard data influxdb grafana dashboard data grafana mqtt dashboard data mqtt credentials nginx dashboard data docker nginx authdata let s encrypt certificates dashboard data docker nginx letsencrypt dncserver dashboard data apiserver dncserver dncgiplugin dashboard data apiserver dncgiplugin dncstdplugin dashboard data apiserver dncstdplugin mongodb dashboard data mongodb mongodb data expo dashboard data expo dncui reuse and removal of data files since data 
files on the host are not removed between runs the data will be preserved sometimes this is inconvenient and it is necessary to remove some or all of the data for a variety of reasons the data files and directories are created owned by root so the sudo command must be used to remove the data files here s an example of how to do it

```bash
source .env
sudo rm -rf "${IOT_DASHBOARD_DATA}node-red"
sudo rm -rf "${IOT_DASHBOARD_DATA}influxdb"
sudo rm -rf "${IOT_DASHBOARD_DATA}grafana"
sudo rm -rf "${IOT_DASHBOARD_DATA}mqtt"
sudo rm -rf "${IOT_DASHBOARD_DATA}docker-nginx"
sudo rm -rf "${IOT_DASHBOARD_DATA}apiserver"
sudo rm -rf "${IOT_DASHBOARD_DATA}mongodb"
sudo rm -rf "${IOT_DASHBOARD_DATA}expo"
```

node red and grafana examples this version requires that you set up node red the influxdb database and the grafana dashboards manually but we hope to add a reasonable set of initial files in a future release connecting to influxdb from node red and grafana there is one point that is somewhat confusing about the connections from node red and grafana to influxdb even though influxdb is running on the same host it is logically running on its own virtual machine created by docker because of this node red and grafana cannot use localhost when connecting to influxdb a special name is provided by docker influxdb note that there s no dns suffix if this name is not used node red and grafana will not be able to connect logging in to grafana on the login screen the initial username is admin the initial password is given by the value of the variable iot dashboard grafana admin password in env note that if you change the password in env after the first time you launch the grafana container the admin password does not change if you somehow lose the previous value of the admin password and you don t have another admin login it s very hard to recover easiest is to remove grafana db and start over data source settings in grafana set the url under http settings to http influxdb 8086 select
the database leave the username and password blank click save test mqtts examples mqtts can be accessed in the following ways method hostname path port credentials mqtt over nginx proxy wss dashboard example com mqtts 443 username password come from mosquitto s configuration password file mqtt over tls ssl dashboard example com 8883 username password come from mosquitto s configuration password file websockets over tls ssl wss dashboard example com 8083 username password come from mosquitto s configuration password file mqtt over tcp protocol not secure dashboard example com 1883 username password come from mosquitto s configuration password file integrating data normalization control dnc support what is dnc dnc is a logical data server designed to achieve location based data measurement by provide customized tag mapping on top of a general database where sensor data is organized based on device ids hardware device id about dnc assets about dnc png the visibility of clients data server is controlled by the dnc server users can make data query based on the tags provided in the dnc device mapping dnc server removes the customized dnc tags and add required tags available in the mapping field send converted query to client s data server receives the response then remove the tags add the respective dnc tags and send the response to the user advantages location based data measurement device is considered as a logical unique object loosely coupled with hardware device changing of a location can easily be mapped no data loss due to device change replacement user can provide convenient naming for their device application architecture application architecture assets dnc arch png dnc components client in dnc client is like a profile each client can be created with a set of tags and it requires database credentials to query data from data server field description name name of the client tagslist list of tags customized tags example 1 tag1 country tag2 state tag3 city tag4 
street tag5 device name example 2 tag1 site tag2 pile tag3 location tag4 devicename dbdata database server credentials most cases influxdb 1 db url 2 username 3 password 4 database name a client can be created only by the master admin once a client profile is created admin and user account can be created for that client the client admin can add new devices under the client profile device registry this is the gateway for adding devices to the dnc server this record contains all devices entries irrespective of clients fields description client name device will be assigned to this client hardware id id of the hardware printed in the pcb deviceid id received from ttn sigfox this data should be existed as a tag in the influxdb database devid id received from ttn sigfox this data should be existed as a tag in the influxdb database deveui id received from ttn sigfox this data should be existed as a tag in the influxdb database measurement name where the device data gets logged in the database server in influxdb it is called as measurement name field data name name of the data which required by the client example twater rh vbat etc this data should be existed as a field tag in the influxdb database date of installation the date when the device installed in the field data of removal the date when device removed from the field this will not be asked while adding device required only remove replace device the master admin has the access to manage this record no other can do any changes in this record note the deviceid devid and deveui are optional but any one should be mandatory devices this is the gateway for adding devices under a client with the customized tag details all tag details are optional the admin can add device with or without tag fields fields description hardware id select from the master record direct entry not allowed tag 1 optional value for the tag tag 2 optional value for the tag n optional value for the tag latitude location coordinates required when 
showing data on world map longitude location coordinates required when showing data on world map installation date get it from the master record removal date enter the date when the device removed from the field this date automatically updated to master record plugins plugins are created to provide user interface for using dnc application which communicates with dnc engine by using the plugin api standard plugin api this api receives the request from excel sheet or google sheet plugin communicates with the dnc server influxdb and sends back the response to the plugin grafana influx plugin api this api receives requests from grafana application which are influxdb based queries in grafana ui all the requests are redirected to grafana influx plugin which communicates with dnc server influxdb and then sends the response back to grafana ui dnc server architecture dnc server arch assets dnc server arch png setup instructions please refer to setup md setup md for detailed set up instructions please create a new discussion on this repository https github com mcci catena docker iot dashboard discussions for getting support on dnc influxdb backup and restore please refer to influxdb readme md influxdb readme md release history v3 0 2 has the following changes updated dnc std plugin added token validation new status code v3 0 1 has the following changes updated dnc server modified login response and query response updated dnc std plugin added support for dnc mapping v3 0 0 has the following changes included apiserver and expo containers for dnc support documented the process behind the dnc support provided backup support for mongodb container s data v2 0 0 https github com mcci catena docker iot dashboard releases tag v2 0 0 includes the following changes included auxiliary backup container cron backup for providing backup support for nginx node red grafana and mqtts containers updated the base images used in all dockerfile from bionic to focal added mosquitto mqtt client 
ubuntu ppa repository to install the latest version and fixed ownership issue when accessing let s encrypt certs added tls ssl based smtp authentication support in postfix container some minor changes in the following files dockerfile docker compose yml setup md and shell scripts v1 0 0 https github com mcci catena docker iot dashboard releases tag v1 0 0 has the following changes influxdb 1 backup script is updated for backing up online live databases and to push the backup to amazon bucket 2 crontab was set for automatic backup 3 supports sending email for backup alerting nginx 1 the apache setup is migrated to nginx 2 proxy ing the services like influxdb grafana node red mqtts over proxy was updated node red 1 supports data flowing via mqtt channel and https endpoint 2 supports sending email mqtts 1 supports different connections as below 1 mqtt over nginx proxy 2 mqtt over tcp disabled by default 3 mqtt over tls ssl 4 mqtt over websockets wss postfix 1 configured to relay mails via external smtp auth tested with gmail and mailgun 2 mails generated from containers like grafana influxdb and node red will be relayed through postfix container meta | node-red grafana influxdb docker-container dashboard letsencrypt apache2 iot-cloud iot | server |
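the tag rewriting at the heart of dnc — replace the user's friendly dnc tags with the hardware tags the data server actually stores, query, then map the tags back for the response — can be sketched in a few lines. all names below (tag names, function names) are hypothetical illustrations, not the real dnc schema or api.

```python
# Minimal sketch of the DNC tag-rewriting idea. A user query uses friendly
# DNC tags (e.g. "device name"); the server swaps them for the hardware tags
# stored in the data server (e.g. "deveui") before querying, and reverses the
# mapping on the way back. Hypothetical names throughout.

def to_hardware_query(user_query, mapping):
    """Replace DNC tag names in a query dict with data-server tag names."""
    return {mapping.get(tag, tag): value for tag, value in user_query.items()}

def to_dnc_response(response, mapping):
    """Map hardware tag names in a response back to DNC tag names."""
    reverse = {hw: dnc for dnc, hw in mapping.items()}
    return {reverse.get(tag, tag): value for tag, value in response.items()}

mapping = {"device name": "deveui"}  # DNC tag -> hardware tag
query = {"device name": "pump-3", "field": "twater"}

hw_query = to_hardware_query(query, mapping)
print(hw_query)                             # hardware view sent to influxdb
print(to_dnc_response(hw_query, mapping))   # DNC view returned to the user
```

because the mapping lives in the dnc server rather than in the device itself, replacing a broken sensor only requires updating the mapping entry — which is the "no data loss due to device change" advantage described above.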
FramesSegmentationAlgorithms | point cloud segmentation algorithms repository contains files with two segmentation algorithms developed at the warsaw university of technology as part of an engineering thesis the files are used as a plugin in the frames software for point cloud processing frames is software that uses custom data structures and algorithms and builds on many open source libraries such as the eigen library the cpp file contains the two segmentation algorithms and a parameter calculating algorithm each is wrapped in a data structure called a method which is used to export the algorithm as a plugin in the frames ui the three methods are 1 parameter calculation local plane coefficient 2 segmentation by region growing 3 segmentation using a normal vector histogram in this implementation two custom functions were added as these operations were used frequently more info more information can be found in my engineering thesis entitled development of algorithm for cloud of point s segmentation based on orientation similarity where i take a deep dive into how i designed implemented and tested these algorithms the thesis is available in the warsaw university of technology database | cpp |
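to give a feel for segmentation by orientation similarity, here is a toy region growing sketch on a 2-d grid of normal vector angles (in degrees). the thesis implementation works on real point clouds in c++ with the eigen library; this is only an illustration of the idea, with assumed data layout and tolerance.

```python
# Toy region growing by orientation similarity: starting from a seed cell,
# absorb 4-neighbours whose normal-vector angle differs from the seed's by at
# most `tolerance` degrees. Grid of angles stands in for a point cloud.
from collections import deque

def region_growing(angles, seed, tolerance=10.0):
    """Return the set of (row, col) cells grown from `seed`."""
    rows, cols = len(angles), len(angles[0])
    seed_angle = angles[seed[0]][seed[1]]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                if abs(angles[nr][nc] - seed_angle) <= tolerance:
                    region.add((nr, nc))
                    queue.append((nr, nc))
    return region

# two "planes": left half ~0 degrees, right half ~90 degrees
angles = [[0, 2, 88, 90],
          [1, 3, 91, 89]]
print(sorted(region_growing(angles, (0, 0))))  # only the left plane grows
```

a production version compares each point against its local neighbourhood (found via a spatial index) rather than a fixed grid, and typically compares to the region's running statistics instead of just the seed.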
design-system | citizens advice design system continuous integration https github com citizensadvice design system actions workflows ci workflow yml badge svg https github com citizensadvice design system actions workflows ci workflow yml view the documentation site https citizens advice design system netlify app getting started you can check out the getting started guide https citizens advice design system netlify app getting started for a quick start build contributing for any dev related information including contributing and building locally see the contribution guide contributing md | os |
real-estate-react-app | real estate the idea behind this application is to create a place where buyers and sellers can meet in this application the user can register an account search for real estate listings based on his or her preferences and make an inquiry about a listing and the admin can manage resources including property listings realtors and contact inquiries in the admin area here we are creating the front end of the application that you can see live here https deploy real estate app lm r appspot com you can find the backend repository here https github com damyanbg real estate flask rest api representation of the project in youtube https www youtube com watch v dr8qx2cmpce t 25s getting started dependencies before starting you need to install the following tools on your computer img align center alt git height 25 width 35 src https raw githubusercontent com devicons devicon master icons git git original svg style max width 100 git https git scm com img img align center alt nodejs height 25 width 35 src https raw githubusercontent com devicons devicon master icons nodejs nodejs original svg style max width 100 node js https nodejs org en img it is also recommended to have a good code editor like the following img align center alt visualstudiocode height 25 width 35 src https raw githubusercontent com devicons devicon master icons visualstudio visualstudio plain svg style max width 100 vscode https code visualstudio com img installing on local machine

```bash
# clone the repository
git clone https://github.com/damyanbg/real-estate-react-app.git

# enter the project folder in the terminal
cd real-estate-react-app

# install all the dependencies
npm install

# execute the application with this command
npm start
```

the server will start on port 3000 go to http localhost 3000 run the project with docker

```bash
# clone the integration repo
git clone https://github.com/damyanbg/real-estate-integration

# add env files, then build the image
docker-compose build

# run the application
docker-compose up
```
integrate the backend since this is the front end of the application you will need the back end to run since i no longer host the project on azure to achieve this you have 3 options 1 to install python and postgresql then clone and run the back end 2 to install postgresql and docker and to use the container for the back end 3 to install docker and to use containers for both the back end and the database i recommend this way since a container will also be added for nextcloud which i am using to store the images contributing first off we would like to thank you for taking the time to contribute and make this a better project this is a perfect project for people without much experience in react as such pull requests are welcome for major changes please open an issue first to discuss what you would like to change for details about contributing you can access contributing https github com damyanbg real estate react app blob main contributing md and do not forget to enjoy and have fun authors a href https github com damyanbg img style border radius 50 src https avatars githubusercontent com u 93829069 v 4 width 100px alt a a href https github com tihomirtx88 img style border radius 50 src https avatars githubusercontent com u 88166066 v 4 width 100px alt a used technologies the following tools were used in the project development application reactjs https reactjs org sass https sass lang com reactdom https reactjs org docs react dom html reactrouter https reactrouter com en main reactcontext https reactjs org docs context html | css html javascript react reactjs scss | front_end |
protocol-tokens | protocol tokens design tokens for protocol mozilla s design system em javascript json css scss em information table tr td package td td mozilla protocol tokens td tr tr td description td td design tokens for protocol mozilla s design system td tr tr td version td td a href https github com mozilla protocol tokens blob master changelog md 5 0 5 a td tr table installation protocol design tokens are available as an npm package mozilla protocol tokens on npm https www npmjs com package mozilla protocol tokens the recommended way to use and install design tokens may vary depending on your project the most common are documented below javascript package installation using npm https www npmjs com npm install mozilla protocol tokens save using yarn https yarnpkg com en yarn add mozilla protocol tokens javascript in javascript design token names are formatted in lower camelcase http wiki c2 com camelcase js const tokens require mozilla protocol tokens dist index console log tokens colorbluelighter rgb 0 0 0 in json design token names are formatted in kebab case http wiki c2 com kebabcase js const tokens require mozilla protocol tokens dist index json console log tokens color black rgb 0 0 0 sass sass variables and map keys are formatted in kebab case http wiki c2 com kebabcase scss using variables import mozilla protocol tokens dist index a color color black sass with css custom properties custom properties are formatted in kebab case http wiki c2 com kebabcase scss omit css at the end of the file import mozilla protocol tokens dist colors colors custom properties a color var color black publishing to publish to the npmjs registry you ll need access to the mozilla protocol org on npmjs com first run gulp to compile the package locally you can check your local dist folder to verify it has the up to date tokens then run npm publish contributing code of conduct https www mozilla org en us about governance policies participation we have a code of conduct https 
www mozilla org en us about governance policies participation please follow it in all your interactions with the project contributing guide https github com mozilla protocol tokens blob master contributing md read the contributing guide https github com mozilla protocol tokens blob master contributing md to learn how to propose changes and understand our development process license https github com mozilla protocol tokens blob master license md the protocol tokens project is available under the mpl 2 0 https github com mozilla protocol tokens blob master license md | os |
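The protocol-tokens README above describes tokens exposed as a JSON object with kebab-case names (e.g. `color-black`) that can also be consumed as CSS custom properties. As a minimal sketch of that pattern, the helper below turns such a token object into a `:root` custom-properties block; `tokensToCustomProperties` and the sample token values are hypothetical illustrations, not part of the mozilla-protocol-tokens package (which ships prebuilt custom-properties files in `dist`):

```javascript
// Sketch: convert a design-token object with kebab-case keys
// (the shape described for dist/index.json) into CSS custom properties.
// Helper name and sample values are illustrative assumptions.
function tokensToCustomProperties(tokens, selector = ":root") {
  const lines = Object.entries(tokens).map(
    ([name, value]) => `  --${name}: ${value};`
  );
  return `${selector} {\n${lines.join("\n")}\n}`;
}

const sampleTokens = {
  "color-black": "rgb(0, 0, 0)",
  "color-white": "rgb(255, 255, 255)",
};

console.log(tokensToCustomProperties(sampleTokens));
// :root {
//   --color-black: rgb(0, 0, 0);
//   --color-white: rgb(255, 255, 255);
// }
```

The resulting variables can then be referenced exactly as in the README's Sass example, e.g. `color: var(--color-black)`.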