| Column  | Type   | Values              |
|---------|--------|---------------------|
| names   | string | lengths 1 to 98     |
| readmes | string | lengths 8 to 608k   |
| topics  | string | lengths 0 to 442    |
| labels  | string | 6 classes           |
emr-nlp-server
# emr-nlp-server

emr-nlp-server provides the backend service for the [emr-vis-web](https://github.com/nlpreviz/emr-vis-web) project.

## Getting Started

To get started, install the prerequisites, get the emr-nlp-server application, and then launch the service as described below.

### Prerequisites

1. You must have the Java Development Kit (JDK) 1.7 to build, or the Java Runtime (JRE) 1.7 to run, this project. To confirm that you have the right version of the JRE installed, run `java -version` and verify that the output is similar to:

    ```
    java version "1.7.0_51"
    Java(TM) SE Runtime Environment (build 1.7.0_51-b13)
    Java HotSpot(TM) 64-Bit Server VM (build 24.51-b03, mixed mode)
    ```

    If you don't have the JDK installed, or have an older one, you may get the latest version from the [Oracle Technology Network](http://www.oracle.com/technetwork/java/index.html).

2. We use the [Apache Tomcat](http://tomcat.apache.org) server to deploy the app. On a Mac with [Homebrew](http://brew.sh), you may use `brew install tomcat` to install the server on your machine.

### Building the project

1. Clone the emr-nlp-server repository using [Git](http://git-scm.com):

    ```bash
    git clone https://github.com/nlpreviz/emr-nlp-server.git
    cd emr-nlp-server
    ```

2. Our project depends on the following external dependencies, which can be downloaded using [Apache Ant](http://ant.apache.org):

    * [Java Jersey](http://jersey.java.net), which is [dual-licensed](https://jersey.java.net/license.html) under the Common Development and Distribution License and GPL-2
    * [Weka](http://www.cs.waikato.ac.nz/ml/weka/), licensed under GPL-3
    * [LIBSVM](http://www.csie.ntu.edu.tw/~cjlin/libsvm/), with a license compatible with GPL
    * [Stanford CoreNLP](http://nlp.stanford.edu/software/corenlp.shtml), licensed under the GNU General Public License v3 or later (Stanford NLP code is GPL v2, but the composite with external libraries is v3)

    To download and resolve these dependencies from their respective repositories, use:

    ```bash
    ant resolve
    ```

3. Specify the path to the `webapps` directory in the `CATALINA_HOME` environment variable and use `ant deploy` to build and deploy the backend app. For example, if your Tomcat `webapps` directory is accessible as `/usr/local/Cellar/tomcat/8.0.9/libexec/webapps`, then you may use:

    ```bash
    env CATALINA_HOME=/usr/local/Cellar/tomcat/8.0.9/libexec ant deploy
    ```

We recommend using the [Eclipse IDE for Java EE Developers](http://www.eclipse.org/downloads/) with the [EGit plugin](http://www.eclipse.org/egit/download/) installed for development. The repository contains appropriate project files to be imported into Eclipse.

### Running the server

We have included some [dummy data](https://github.com/nlpreviz/emr-nlp-server/releases/download/empirical-study/data.zip) with our release so that you can run the tool and play with the interface. These are not actual medical records, and your models will not be useful. Contact the devs if you need more information about real datasets.

1. Download and copy the [data](https://github.com/nlpreviz/emr-nlp-server/releases/download/empirical-study/data.zip) directory inside `CATALINA_BASE`. You should be able to figure out this path from the messages printed after launching the server. Example path: `/usr/local/Cellar/tomcat/8.0.9/libexec/data`.
2. You need to build LIBSVM before you run the server for the first time. To do that, run `make` inside the `data/libsvm` directory, or follow the instructions in the README file present there.
3. Start the Tomcat server, e.g. using `catalina run`, `service tomcat start`, etc.

Now follow the steps on [emr-vis-web](https://github.com/nlpreviz/emr-vis-web) to set up the front-end application.

## Using your own dataset and defining custom variables

The tool is currently configured to make predictions for 14 colonoscopy quality variables. It also does format parsing specific to the colonoscopy and pathology reports in the data provided with the release. We have a more generic version of the tool in the `general` branch of this repository; check out this experimental branch [here](https://github.com/nlpreviz/emr-nlp-server/blob/general/README.md).

You will still need to download the [sample data](https://github.com/nlpreviz/emr-nlp-server/releases/download/empirical-study/data.zip) directory and organize your documents in the same structure, defined as follows:

```
data
├── documentList
│   ├── initialIDList.xml
│   ├── testIDList.xml
│   ├── fullIDList.xml
│   └── feedbackIDList.xml
├── docs
│   ├── 0719
│   │   └── report.txt
│   ├── 0973
│   │   └── report.txt
│   ├── 0184
│   │   ├── report.txt
│   │   └── pathology.txt
│   └── 0726
│       ├── report.txt
│       └── pathology.txt
└── labels
    ├── class_appendiceal-orifice.csv
    ├── class_ileo-cecal-valve.csv
    ├── class_informed-consent.csv
    ├── class_proc-aborted.csv
    ├── class_asa.csv
    ├── class_prep-adequateYes.csv
    ├── class_any-adenoma.csv
    ├── class_cecum.csv
    ├── class_withdraw-time.csv
    ├── class_indication-type.csv
    ├── class_prep-adequateNot.csv
    ├── class_biopsy.csv
    ├── class_prep-adequateNo.csv
    └── class_nursing-report.csv
```

* `docs` contains the list of documents. Each patient or case is represented by a four-digit ID as sub-directories (the ID length is hard-coded in `ColonoscopyDS_SVMLightFormat.java`). These may contain at most 2 files, `report.txt` and `pathology.txt`, where `pathology.txt` is optional. If you have more than 2 files, you may concatenate them into one report, or extend our code to support them.
* The `documentList` directory has the following files with references to the documents described above:
    * `initialIDList.xml`: used to train the initial model. This is how we bootstrap the system.
    * `feedbackIDList.xml`: the list of documents you should be working on to give feedback on and improve the models; used to create the global feature vector.
    * `fullIDList.xml`: held-out test set. There is code to generate evaluation metrics, but it is not exposed to the front end at this point. Feel free to contribute!
    * `testIDList.xml`: list of all the IDs.
* The `labels` directory contains the gold-standard data used to train the initial models and run the evaluation metrics.

The rest of the files can be reset by pointing your browser to `<backend-url>/rest/server/resetDB`, for example `http://localhost:8080/emr-nlp-server/rest/server/resetDB`. Remember to update [emr-vis-web](https://github.com/nlpreviz/emr-vis-web) to its `general` branch.

The easiest way to configure the tool to use your own variables is to map them to the names of your choice in the front end. Remember to update [emr-vis-web](https://github.com/nlpreviz/emr-vis-web) as described in its README as well. This project will be updated to make this configuration easier in the near future.

## Login

The REST calls to the server are protected with [basic access HTTP authentication](https://en.wikipedia.org/wiki/Basic_access_authentication). The default login credentials are `username` and `password`. You are encouraged to change them in `UserAuthentication.java` (`src/io/github/nlpreviz/server/UserAuthentication.java`) when running the app on a publicly accessible server.

## License

This project is released under the GPL-3 license. Take a look at the [LICENSE](LICENSE.md) file in the source for more information.
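The reset endpoint and the default credentials above can be exercised from the shell to verify a deployment end to end. A minimal sketch, assuming the default port and the `resetDB` path shown above:

```bash
# Reset the backend state over HTTP basic auth (default credentials shown;
# change them in UserAuthentication.java before exposing the server publicly).
curl -u username:password http://localhost:8080/emr-nlp-server/rest/server/resetDB
```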
natural-language-processing interactive-visualizations interactive-learning clinical-notes clinical-research
ai
CortexTheseus
# CortexTheseus

## Related projects

* CVM Runtime (AI container): https://github.com/CortexFoundation/cvm-runtime
* File storage (stop your Cortex full node daemon when you do this test): https://github.com/CortexFoundation/torrentfs

    ```bash
    git clone https://github.com/CortexFoundation/torrentfs.git
    cd torrentfs
    make
    build/bin/torrent download 'infohash:6b75cc1354495ec763a6b295ee407ea864a0c292'
    build/bin/torrent download 'infohash:b2f5b0036877be22c6101bdfa5f2c7927fc35ef8'
    build/bin/torrent download 'infohash:5a49fed84aaf368cbf472cc06e42f93a93d92db5'
    build/bin/torrent download 'infohash:1f1706fa53ce0723ba1c577418b222acbfa5a200'
    build/bin/torrent download 'infohash:3f1f6c007e8da3e16f7c3378a20a746e70f1c2b0'
    ```

    Once all the torrents are downloaded, make sure you can download the files successfully. Accept inbound and outbound traffic in your firewall settings as far as possible for stable and fast downloading: at least ports 40401 to 40404 and 5008, both TCP and UDP, should be accepted.
* AI wrapper (fixed API for inference and file storage): https://github.com/CortexFoundation/inference
* PoW (Cortex Cuckoo Cycle): https://github.com/CortexFoundation/solution
* Rosetta: https://github.com/CortexFoundation/rosetta-cortex
* Docker: https://github.com/CortexFoundation/docker
* Robot: https://github.com/CortexFoundation/robot

## System requirements

x64 CPU with support for the following flags:

```
fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush
mmx fxsr sse sse2 ss ht syscall nx pdpe1gb rdtscp lm constant_tsc rep_good nopl
cpuid tsc_known_freq pni pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic
movbe popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor lahf_lm
abm invpcid_single pti ibrs ibpb stibp fsgsbase bmi1 avx2 smep bmi2 erms
invpcid xsaveopt
```

For example, `cat /proc/cpuinfo` (supported):

```
processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 63
model name      : Intel(R) Xeon(R) CPU E5-2680 v3 @ 2.50GHz
stepping        : 2
microcode       : 0x1
cpu MHz         : 2494.224
cache size      : 30720 KB
physical id     : 0
siblings        : 2
core id         : 0
cpu cores       : 1
apicid          : 0
initial apicid  : 0
fpu             : yes
fpu_exception   : yes
cpuid level     : 13
wp              : yes
flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov
                  pat pse36 clflush mmx fxsr sse sse2 ss ht syscall nx pdpe1gb
                  rdtscp lm constant_tsc rep_good nopl cpuid tsc_known_freq pni
                  pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe
                  popcnt tsc_deadline_timer aes xsave avx f16c rdrand hypervisor
                  lahf_lm abm invpcid_single pti ibrs ibpb stibp fsgsbase bmi1
                  avx2 smep bmi2 erms invpcid xsaveopt
bugs            : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf mds
                  swapgs itlb_multihit
bogomips        : 4988.44
clflush size    : 64
cache_alignment : 64
address sizes   : 46 bits physical, 48 bits virtual
```

Not supported:

```
Architecture:        x86_64
CPU op-mode(s):      32-bit, 64-bit
Byte Order:          Little Endian
CPU(s):              32
On-line CPU(s) list: 0-31
Thread(s) per core:  2
Core(s) per socket:  16
Socket(s):           1
NUMA node(s):        2
Vendor ID:           AuthenticAMD
CPU family:          23
Model:               1
Model name:          AMD EPYC 7571
Stepping:            2
CPU MHz:             2534.021
BogoMIPS:            4399.86
Hypervisor vendor:   KVM
Virtualization type: full
L1d cache:           32K
L1i cache:           64K
L2 cache:            512K
L3 cache:            8192K
NUMA node0 CPU(s):   0-7,16-23
NUMA node1 CPU(s):   8-15,24-31
Flags:               fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca
                     cmov pat pse36 clflush mmx fxsr sse sse2 ht syscall nx
                     mmxext fxsr_opt pdpe1gb rdtscp lm constant_tsc rep_good
                     nopl nonstop_tsc cpuid extd_apicid amd_dcm aperfmperf
                     tsc_known_freq pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2
                     movbe popcnt aes xsave avx f16c rdrand hypervisor lahf_lm
                     cmp_legacy cr8_legacy abm sse4a misalignsse 3dnowprefetch
                     topoext perfctr_core vmmcall fsgsbase bmi1 avx2 smep bmi2
                     rdseed adx smap clflushopt sha_ni xsaveopt xsavec xgetbv1
                     clzero xsaveerptr arat npt nrip_save
```

## Ubuntu

The Cortex node is developed in an Ubuntu 18.04 x64 + CUDA 9.2 + NVIDIA driver 396.37 environment, with CUDA Compute Capability 6.1. The latest Ubuntu distributions are also compatible, but not fully tested. Recommended:

* cmake 3.11.0

    ```bash
    wget https://cmake.org/files/v3.11/cmake-3.11.0-rc4-linux-x86_64.tar.gz
    tar zxvf cmake-3.11.0-rc4-linux-x86_64.tar.gz
    sudo mv cmake-3.11.0-rc4-linux-x86_64 /opt/cmake-3.11
    sudo ln -sf /opt/cmake-3.11/bin/* /usr/bin/
    sudo apt-get install make
    ```

* go 1.20

    ```bash
    wget https://go.dev/dl/go1.20.2.linux-amd64.tar.gz
    sudo tar -C /usr/local -xzf go1.20.2.linux-amd64.tar.gz
    echo 'export PATH=$PATH:/usr/local/go/bin' >> ~/.bashrc
    source ~/.bashrc
    ```

* gcc/g++ 5.4

    ```bash
    sudo apt install gcc
    sudo apt install g++
    ```

* CUDA 9.2 (if you have a GPU)

    ```bash
    export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/lib64/stubs:$LD_LIBRARY_PATH
    export LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/lib64/stubs:$LIBRARY_PATH
    ```

* NVIDIA driver 396.37. Reference: https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#major-components

## CentOS (not recommended)

Recommended:

* cmake 3.11.0: `yum install cmake3`
* go 1.20
* gcc/g++ 5.4. Reference: https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html#system-requirements

    ```bash
    sudo yum install centos-release-scl
    sudo yum install devtoolset-7-gcc*
    scl enable devtoolset-7 bash
    which gcc
    gcc --version
    ```

* CUDA 10.1 (if you have a GPU)

    ```bash
    export LD_LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/lib64/stubs:$LD_LIBRARY_PATH
    export LIBRARY_PATH=/usr/local/cuda/lib64:/usr/local/cuda/lib64/stubs:$LIBRARY_PATH
    ```

* NVIDIA driver 418.67 (CentOS 7.6)

## Cortex full node

### Compile source code (8G memory suggested)

1. `git clone --recursive https://github.com/CortexFoundation/CortexTheseus.git`
2. `cd CortexTheseus`
3. `make clean && make -j$(nproc)`

It is important to pass this check of `libcvm_runtime.so`:

```
ldd plugins/libcvm_runtime.so
        linux-vdso.so.1 (0x00007ffe107fa000)
        libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00007f250e6a8000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f250e3a6000)
        libgomp.so.1 => /lib64/libgomp.so.1 (0x00007f250e180000)
        libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00007f250df6a000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f250dd4e000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f250d980000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f250ed35000)
```

If it failed, run `rm -rf cvm-runtime && git submodule init && git submodule update` and try again.

### Running

Run the following commands to start a full node:

```bash
cd CortexTheseus
export LD_LIBRARY_PATH=$PWD:$PWD/plugins:$LD_LIBRARY_PATH
./build/bin/cortex
```

It is easy to view the help document by running `./build/bin/cortex help`.

### Running testnet for developers (Bernard)

```bash
cortex --bernard
```
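The firewall guidance above can be made concrete. A minimal sketch using `ufw` (the README names the ports but not a firewall tool, so `ufw` is an assumption; translate to firewalld, iptables, or your cloud security group as needed):

```bash
# Allow the Cortex networking ports listed above, TCP and UDP, in and out.
sudo ufw allow 40401:40404/tcp
sudo ufw allow 40401:40404/udp
sudo ufw allow 5008/tcp
sudo ufw allow 5008/udp
sudo ufw status verbose
```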
blockchain cortex ctxc ai machine-learning cvm
blockchain
llm-hosting-container
# LLM Hosting Container

Welcome to the LLM Hosting Container GitHub repository! This repository contains Dockerfiles and associated resources for building and hosting containers for large language models, such as the HuggingFace Text Generation Inference (TGI) container.

## Security

See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information.

## License

This project is licensed under the Apache-2.0 License.
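For orientation, hosting a model with TGI generally comes down to a single `docker run` invocation. A sketch against the upstream TGI image (the image tag, model id, and port mapping here are illustrative assumptions, not values from this repository; substitute the image you build from these Dockerfiles):

```bash
# Serve a small model with HuggingFace Text Generation Inference.
docker run --gpus all -p 8080:80 \
  -v "$PWD/data:/data" \
  ghcr.io/huggingface/text-generation-inference:latest \
  --model-id bigscience/bloom-560m

# Query the server once it is up.
curl http://localhost:8080/generate \
  -H 'Content-Type: application/json' \
  -d '{"inputs": "What is deep learning?", "parameters": {"max_new_tokens": 20}}'
```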
ai
Algorithm
# Algorithm

Interview algorithm & system design & DB questions.
os
udagram-microservice
# Udagram Microservice

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image-filtering microservice.

## Tasks

### Setup Docker Environment

You'll need to [install Docker](https://docs.docker.com/install/). Open a new terminal within the project directory and run:

1. Switch to the folder: `cd udacity-c3-deployment/docker`
2. Build the images: `docker-compose -f docker-compose-build.yaml build --parallel`
3. Push the images: `docker-compose -f docker-compose-build.yaml push`
4. Run the containers: `docker-compose up`

### Creating infrastructure with Terraform

1. Go to your terraform/aws folder.
2. See what commands will be run to create the infrastructure: `terraform plan`
3. Provision the infrastructure: `terraform apply` (say yes to confirm provisioning the infrastructure)
4. Create a Terraform state file to be parsed by KubeOne: `terraform output -json > tf.json`
5. Install Kubernetes using the configuration output from Terraform: `kubeone install config.yaml --tfjson tf.json`
6. Set up the `KUBECONFIG` variable that will be used by kubectl commands (run this in any terminal window that the kubectl command will be run in): `export KUBECONFIG=$PWD/udagrambox`

### Setup Kubernetes Environment

You will need to install the `kubectl` command. Open a new terminal within the project directory and run:

1. Generate encrypted values for the AWS credentials, database user name, and database password using bcrypt, and put the values into the `aws-secret.yaml` and `env-secret.yaml` files.
2. Load the secret files: `kubectl apply -f aws-secret.yaml` and `kubectl apply -f env-secret.yaml`
3. Load the config map: `kubectl apply -f env-configmap.yaml`
4. Apply the deployments: `kubectl apply -f backend-feed-deployment.yaml`, `kubectl apply -f frontend-deployment.yaml`, `kubectl apply -f backend-user-deployment.yaml`
5. Apply the services: `kubectl apply -f backend-feed-service.yaml`, `kubectl apply -f backend-user-service.yaml`, `kubectl apply -f frontend-service.yaml`
6. Deploy the reverse proxy: `kubectl apply -f reverseproxy-deployment.yaml`, `kubectl apply -f reverseproxy-service.yaml`
7. Perform port forwarding: `kubectl port-forward service/frontend 8100:8100` and `kubectl port-forward service/reverseproxy 8080:8080`

## Continuous Integration / Continuous Development

Travis CI is set up to monitor updates to the master branch.
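After step 7 it is worth confirming that the cluster state matches the manifests. A minimal sketch using standard kubectl commands (not part of the original steps; resource names follow the deployment and service files above):

```bash
# All pods should reach Running, and each service should list endpoints,
# before port-forwarding is attempted.
kubectl get pods
kubectl get services
kubectl logs deployment/reverseproxy --tail=20
```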
cloud
theta-protocol-ledger
# Theta Blockchain Ledger Protocol

The Theta Blockchain Ledger is a Proof-of-Stake decentralized ledger designed for the video streaming industry. It powers the Theta token economy, which incentivizes end users to share their redundant bandwidth and storage resources, and encourages them to engage more actively with video platforms and content creators. The ledger employs a novel [multi-level BFT consensus engine](docs/multi-level-bft-tech-report.pdf), which supports high transaction throughput and fast block confirmation, and allows mass participation in the consensus process. Off-chain payment support is built directly into the ledger through the resource-oriented micropayment pool, which is designed specifically to achieve the "pay-per-byte" granularity for streaming use cases. Moreover, the ledger storage system leverages the microservice architecture and reference-counting-based history pruning techniques, and is thus able to adapt to different computing environments, ranging from high-end data center server clusters to commodity PCs and laptops. The ledger also supports Turing-complete smart contracts, which enables rich user experiences for DApps built on top of the Theta Ledger. For more technical details, please refer to our [technical whitepaper](docs/theta-technical-whitepaper.pdf) and our 2019 IEEE ICBC paper ["Scalable BFT Consensus Mechanism Through Aggregated Signature Gossip"](https://arxiv.org/pdf/1911.04698.pdf). To learn more about the Theta Network in general, please visit the [Theta Documentation site](https://docs.thetatoken.org/docs/what-is-theta-network).

## Table of Contents

* [Setup](#setup)
* [Smart Contract and DApp Development on Theta](#smart-contract-and-dapp-development-on-theta)

## Setup

### Install Go

Install Go and set the environment variables `GOPATH`, `GOBIN`, and `PATH`. The current code base should compile with Go 1.14.2. On macOS, install Go with the following command:

```bash
brew install go@1.14.1
brew link go@1.14.1 --force
```

### Build and Install

Next, clone this repo into your `$GOPATH`. The path should look like this: `$GOPATH/src/github.com/thetatoken/theta`

```bash
git clone https://github.com/thetatoken/theta-protocol-ledger.git $GOPATH/src/github.com/thetatoken/theta
export THETA_HOME=$GOPATH/src/github.com/thetatoken/theta
cd $THETA_HOME
```

Now execute the following commands to build the Theta binaries under `$GOPATH/bin`. Two binaries, `theta` and `thetacli`, are generated. `theta` can be regarded as the launcher of the Theta Ledger node, and `thetacli` is a wallet with command-line tools to interact with the ledger.

```bash
export GO111MODULE=on
make install
```

#### Notes for Linux binary compilation

The build and install process on Linux is similar, but note that Ubuntu 18.04.4 LTS / CentOS 8 or a higher version is required for the compilation.

#### Notes for Windows binary compilation

The Windows binary can be cross-compiled from macOS. To cross-compile a Windows binary, first make sure `mingw64` is installed (`brew install mingw-w64`) on your macOS. Then you can cross-compile the Windows binary with the following command:

```bash
make exe
```

You'll also need to place three DLL files (`libgcc_s_seh-1.dll`, `libstdc++-6.dll`, `libwinpthread-1.dll`) under the same folder as `theta.exe` and `thetacli.exe`.

### Run Unit Tests

Run the unit tests with the command below:

```bash
make test_unit
```

## Smart Contract and DApp Development on Theta

Theta provides full support for Turing-complete smart contracts and is EVM-compatible. To start developing on the Theta blockchain, please check out the following links.

### Smart Contracts

* Smart contract and DApp development overview: [link here](https://docs.thetatoken.org/docs/turing-complete-smart-contract-support)
* Tutorials on how to interact with the Theta blockchain through [MetaMask](https://docs.thetatoken.org/docs/web3-stack-metamask), [Truffle](https://docs.thetatoken.org/docs/web3-stack-truffle), [Hardhat](https://docs.thetatoken.org/docs/web3-stack-hardhat), [web3.js](https://docs.thetatoken.org/docs/web3-stack-web3js), and [ethers.js](https://docs.thetatoken.org/docs/web3-stack-hardhat)
* TNT20 token (i.e. ERC20 on Theta) integration guide: [link here](https://docs.thetatoken.org/docs/theta-blockchain-tnt20-token-integration-guide)

### Local Test Environment Setup

* Launching a local privatenet: [link here](https://docs.thetatoken.org/docs/launch-a-local-privatenet)
* Command line tools: [link here](https://docs.thetatoken.org/docs/command-line-tool)
* Connect to the [Testnet](https://docs.thetatoken.org/docs/connect-to-the-testnet) and the [Mainnet](https://docs.thetatoken.org/docs/connect-to-the-mainnet)
* Node configuration: [link here](https://docs.thetatoken.org/docs/theta-blockchain-node-configuration)

### API References

* Native RPC API references: [link here](https://docs.thetatoken.org/docs/rpc-api-reference)
* Ethereum RPC API support: [link here](https://docs.thetatoken.org/docs/web3-stack-eth-rpc-support)
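For convenience, the macOS setup above can be collapsed into a single script. A sketch that only restates the documented commands (assuming Homebrew is installed and `GOPATH`, `GOBIN`, and `PATH` are already configured):

```bash
# Build the theta and thetacli binaries under $GOPATH/bin.
brew install go@1.14.1 && brew link go@1.14.1 --force
git clone https://github.com/thetatoken/theta-protocol-ledger.git \
  "$GOPATH/src/github.com/thetatoken/theta"
export THETA_HOME="$GOPATH/src/github.com/thetatoken/theta"
cd "$THETA_HOME"
export GO111MODULE=on
make install
make test_unit   # optional: run the unit tests
```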
blockchain-technology decentralized distributed-systems
blockchain
localhose
# localhose

**Deprecated:** please use [local-tld](https://github.com/hoodiehq/local-tld) instead.

localhose is a [node.js](http://nodejs.org) module that provides a simple API for dynamically adding hosts to the `/etc/hosts` file, to fool your web browser into thinking anydomain.com points to your local machine. This makes web development easier, since the local build and production build can now use identical URLs.

**Warning:** this software requires superuser access and will temporarily overwrite your hosts file. If you do not understand what that means, it's probably not a good tool for you.

## Requirements

* [node.js](http://nodejs.org), tested with 0.4.1
* OS X version 10.5 or later (soon: any OS with a hosts file)
* superuser access to your machine

## Install

```bash
npm install localhose
```

## Module API

### localhose = require("localhose")

Returns a global localhose object that keeps track of what domains are being rerouted.

### localhose.set(host1, host2, etc)

Adds one or more hosts to be routed to your local machine. The hosts are stored within the existing hosts file, like this:

```
#### localhose ####
# the following have been added temporarily by localhose
# for more information, see https://github.com/jed/localhose

127.0.0.1 yourdomain.com
127.0.0.1 yourdomain.net

#### localhose ####
```

The path of the hosts file can be specified using the `hostsPath` property of the localhose constructor prototype object.

### localhose.add(host1, host2, etc)

Adds additional domain(s) to the localhose section of your hosts file, but will not overwrite any existing hosts in the localhose section.

### localhose.unset(host1, host2, etc)

Removes some or all of the hosts rerouted to your machine. If no arguments are specified, all current hosts are unset. If no hosts remain after this is called, the localhose section of the current hosts file is removed, leaving your file system as pristine as it was before.

### localhose.domains()

Returns a list of domains currently being rerouted to 127.0.0.1.

## Command Line API

### sudo localhose set host1 host2 etc

Same as `localhose.set`, but for the command line.

### sudo localhose unset host1 host2 etc

Same as `localhose.unset`, but for the command line.

## Example

For an example rerouting nodejs.org to your machine, see [test.js](https://github.com/jed/localhose/blob/master/test.js). Otherwise, usage is basically like this:

```js
// sudo node somefile.js
localhose = require("localhose")

// resolve google.com and google.org to 127.0.0.1
// note that you will be unable to use google while this is set
localhose.set("google.com", "google.org")

localhose.domains() // ["google.com", "google.org"]

// remove all domains and revert the hosts file to its original state
localhose.unset("google.com", "google.org")
```

## TODO

Add support for non-OS X systems, if anyone's interested.

## Copyright

Copyright (c) 2011 Jed Schmidt. See LICENSE.txt for details.

Send any questions or comments [here](http://twitter.com/jedschmidt).
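The documented CLI makes it easy to see the effect end to end from a shell. A minimal sketch (example.com is an arbitrary stand-in domain; whatever server is already listening on your machine will answer):

```bash
# Point example.com at 127.0.0.1, verify, then revert the hosts file.
sudo localhose set example.com
ping -c 1 example.com         # should resolve to 127.0.0.1
curl -I http://example.com/   # hits whatever server runs locally on port 80
sudo localhose unset example.com
```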
front_end
Unity
# Unity

## Tools used

* Unity 2018

## Libraries / Features

* Mobile development
* 3D graphics experiments
* VR projects
* Augmented reality

## Preview

Not available.

## Contact

alifyz@outlook.com

## License

Copyright 2018

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
front_end
Udagram-Microservices-Refactor
# About the Project: Udagram Image Filtering Microservice

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image filtering microservice.

The following services are involved in this project:

* **user**: allows users to register and log into a web client
* **feed**: allows users to post photos, and to process photos using image filtering
* **frontend**: acts as an interface between the user and the backend services
* **reverseproxy**: for resolving multiple services running on the same port in separate containers

Correspondingly, the project is split into the following parts:

1. The RestAPI feed backend, a Node-Express feed microservice
2. The RestAPI user backend, a Node-Express user microservice
3. The simple frontend, a basic Ionic client web application which consumes the RestAPI backend
4. Nginx as a reverse proxy server: when different backend services are running on the same port, a reverse proxy server directs client requests to the appropriate backend server and retrieves resources on behalf of the client

## Clone the project GitHub repository

Create a project folder on your local computer and clone the following Git repository: https://github.com/udacity/nd9990-c3-microservices-v1

## Dependencies and getting set up

> Tip: this frontend is designed to work with the RestAPI backends. It is recommended you stand up the backend first, test it using Postman, and then integrate the frontend.

### 1. Installing Node and NPM

This project depends on Node.js and the Node Package Manager (NPM). Before continuing, you must download and install Node (NPM is included) from https://nodejs.org/en/download. Verify the installation of Node.js using the following command in your terminal/cmd:

```bash
node -v
```

Verify the installation of NPM and update it:

```bash
npm -v
npm update
```

#### How to install project dependencies using NPM

This project uses NPM to manage software dependencies. NPM relies on the `package.json` file located in the root of this repository. After cloning, open your terminal and run:

```bash
npm install
```

> Tip: `npm i` is shorthand for `npm install`.

### 2. Installing Ionic CLI

The Ionic command-line interface is required to serve and build the frontend. Instructions for installing the CLI can be found in the [Ionic Framework Docs](https://ionicframework.com/docs/installation/cli). Once the backend services are configured and started, the frontend server can be started using the following command in the terminal:

```bash
ionic serve
```

### 3. AWS RDS PostgreSQL instance, Postbird tool, and an S3 bucket

You'll need an AWS account to set up these resources. [Create the PostgreSQL instance on AWS](https://classroom.udacity.com/nanodegrees/nd9990/parts/5d4b2317-8333-47b3-a9ec-ea2cf0a3efbb/modules/ab95831d-3105-400e-9c49-01a9d85e5a65/lessons/a89390c2-0832-4de0-833f-2dcb929a665e/concepts/001c5798-6c84-448e-ad63-9281f0e2fabe) and install the [Postbird tool](https://github.com/paxa/postbird) to interact remotely with the database. Additionally, you'll need to [create an S3 filestore bucket in AWS](https://classroom.udacity.com/nanodegrees/nd9990/parts/5d4b2317-8333-47b3-a9ec-ea2cf0a3efbb/modules/ab95831d-3105-400e-9c49-01a9d85e5a65/lessons/a89390c2-0832-4de0-833f-2dcb929a665e/concepts/a04068a9-6267-4c37-9eeb-a413949a48f2).

### 4. Docker Desktop

Lesson 3 of this course requires you to install Docker Desktop to create containers for individual microservices. Refer to the following links for instructions:

* [macOS](https://docs.docker.com/docker-for-mac/install/)
* [Windows 10 64-bit: Pro, Enterprise, or Education](https://docs.docker.com/docker-for-windows/install/)
* [Windows 10 64-bit: Home](https://docs.docker.com/toolbox/toolbox_install_windows/)

You can find installation instructions for other operating systems at https://docs.docker.com/install/.

### 5. Kubernetes

Lesson 4 of this course requires you to install any one tool for creating a Kubernetes cluster (KubeOne / Minikube / kubectl) on top of Docker Desktop. Refer to [Creation of Kubernetes Cluster](https://classroom.udacity.com/nanodegrees/nd9990/parts/96fffeca-63e0-4bfc-92a6-a869b5b64b9e/modules/8c55d5a1-ae41-4313-ab37-86b1f35b9ada/lessons/e03717be-332d-4a2e-8576-69f7aae7726e/concepts/fac375ff-8a1c-461f-8e7c-6c9a844358ac).

### 6. Travis CI

Lesson 6 of this course requires you to use Travis CI. You will have to sign up on travis-ci.com using your GitHub account credentials and then create a `.travis.yml` for your project. Refer to [this tutorial to get started with Travis CI](https://docs.travis-ci.com/user/tutorial/).
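Steps 1 and 2 can be sanity-checked from the shell before touching AWS. A minimal sketch (`@ionic/cli` as the package name is an assumption taken from the current Ionic docs, since this README defers to them):

```bash
# Verify Node/NPM, install project dependencies, then install and run Ionic.
node -v && npm -v
npm install
npm install -g @ionic/cli   # per the Ionic installation docs
ionic serve                 # starts the frontend dev server
```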
cloud
awesome-project-ideas
<!-- markdownlint-disable MD033 -->

# Awesome Deep Learning Project Ideas [![Awesome](https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg)](https://github.com/sindresorhus/awesome)

A curated list of practical deep learning and machine learning project ideas: 30 ideas, relevant to both academia and industry, ranging from beginner-friendly to research projects.

## Contents

* [Hackathon Ideas](#hackathon-ideas): project ideas unlocked by the use of large language models, especially text-to-text. Note that a lot of the text-to-text ideas can also be built a lot better with LLMs now
* [Text](#text): with some topics about natural language processing
* [Forecasting](#forecasting): most of the topics in this section are about time series and similar forecasting challenges
* [Recommendation Systems](#recommendation-systems)
* [Vision](#vision): with topics about image and video processing
* [Music and Audio](#music): these topics are about combining ideas from language and audio to understand music
* [Conclusion](#conclusion)

## Hackathon Ideas

### Developer ideas

* **Text to cmd for terminal**: take user intent in the terminal, e.g.

    ```bash
    ask "how to list all files with details"
    # -> execute `ls -l`? [y/n]: y
    ls -l
    ```

* **Build and edit YAMLs using natural language**, e.g. Kubernetes and other forms of config files. See [Kor](https://eyurtsev.github.io/kor/) for ideas on how this is done for JSON. Can be use-case specific: build pipelines, kube
* **Mobile (Android/iOS) SDK for Stable Diffusion inference**: Apple has released a [CoreML Stable Diffusion inference](https://github.com/apple/ml-stable-diffusion)
* **Voice-powered experiences**: audio conversation with ChatGPT. Can combine with fast text-to-speech, e.g. [Eleven Labs](https://elevenlabs.io), to have a two-way conversation. Or a Telegram/WhatsApp bot to get audio and save it as text, with metadata, into Mem.ai, Roam Research, or Obsidian
* **Edit an image by giving instructions** of what you want to do. See [seechatgpt](https://github.com/nischaydnk/seechatgpt) and [playgroundai.com](https://playgroundai.com) as examples. The underlying mechanism which you can use is called [InstructPix2Pix](https://huggingface.co/spaces/timbrooks/instruct-pix2pix)
* **Semantic search over any media**: can build using CLIP or [BLIP-2](https://huggingface.co/docs/transformers/main/model_doc/blip-2) embeddings for images, and [CLAP](https://github.com/LAION-AI/CLAP) for all audio, including music and speech
* **Text-to-music generation**: see [MusicLM](https://google-research.github.io/seanet/musiclm/examples/) for reference
* **Knowledge-base QA, aka answer engines**: take any plaintext dataset, e.g. the State of the Union address, and build on top of that

    ![image](https://user-images.githubusercontent.com/3250749/223094577-8126570b-f7a4-48ad-9f77-ff86a8b21161.png)

    Can use this over video subtitles to search and QA over videos as well, by mapping back to the source
* **Guided summarisation/rewriting**: take specific questions which the user might have about a large text dataset, e.g. a novel or book, and include that in your summary of the piece. Pay attention to specific entities, and retell the events which happen in a story with attention to that character
* **ControlNet/Stable Diffusion for aesthetic control**: build tooling using [diffusers](https://github.com/huggingface/diffusers) which takes in a set of photos, finetunes a model (LoRA) on a person, detects the face, and moves it to a new aesthetic, e.g. futuristic, neon punk, grunge rock, Studio Ghibli. Can also add InstructPix2Pix to give the user more control
* **Text to code/SQL**: use code understanding to convert a user query to SQL or another executable programming language, including domain-specific languages. Here is an example of the same: [qabot](https://github.com/hardbyte/qabot)

## Text

* **Autonomous tagging of StackOverflow questions**: make a multi-label classification system that automatically assigns tags for questions posted on a forum such as StackOverflow or Quora. Dataset: [StackLite](https://www.kaggle.com/stackoverflow/stacklite) or [10% sample](https://www.kaggle.com/stackoverflow/stacksample)
* **Keyword/concept identification**: identify keywords from millions of questions. Dataset: [StackOverflow question samples by Facebook](https://www.kaggle.com/c/facebook-recruiting-iii-keyword-extraction/data)
* **Topic identification**: multi-label classification of printed media articles to topics. Dataset: [Greek Media Monitoring Multi-label Classification](https://www.kaggle.com/c/wise-2014/data)

### Natural Language Understanding

* **Sentence-to-sentence semantic similarity**: can you identify question pairs that have the same intent or meaning? Dataset: [Quora question pairs](https://www.kaggle.com/c/quora-question-pairs/data), with similar questions marked
* **Fight online abuse**: can you confidently and accurately tell whether a particular comment is abusive? Dataset: [Toxic comments on Kaggle](https://www.kaggle.com/c/jigsaw-toxic-comment-classification-challenge)
* **Open-domain question answering**: can you build a bot which answers questions according to the student's age or her curriculum? [Facebook's FAIR](https://github.com/facebookresearch/DrQA) is built in a similar way for Wikipedia. Datasets: [NCERT books](https://ncert.nic.in/textbook.php) for K-12 school students in India, [NarrativeQA by Google DeepMind](https://github.com/deepmind/narrativeqa), and [SQuAD by Stanford](https://rajpurkar.github.io/SQuAD-explorer/)
* **Automatic text summarization**: can you create a summary with the major points of the original document? Abstractive (write your own summary) and extractive (select pieces of text from the original) are two popular approaches. Dataset: [CNN and DailyMail news pieces](http://cs.nyu.edu/~kcho/DMQA/) by Google DeepMind
* **Copy-cat bot**: generate plausible new text which looks like some other text. Obama speeches, for instance: you can create a bot which writes some [new speeches in Obama's style](https://medium.com/@samim/obama-rnn-machine-generated-political-speeches-c8abd18a2ea0). A Trump bot, or a Twitter bot which mimics [@realDonaldTrump](http://www.twitter.com/realdonaldtrump). A Narendra Modi bot saying "doston": start by scraping his Hindi speeches from his [personal website](http://www.narendramodi.in). Example dataset: [English transcript of Modi speeches](https://github.com/mgupta1410/pm-modi-speeches-repo). Check the [MLM blog](http://machinelearningmastery.com/text-generation-lstm-recurrent-neural-networks-python-keras/) for some hints
* **Sentiment analysis**: do Twitter sentiment analysis on tweets sorted by geography and timestamp. Dataset: [Tweets sentiment tagged by humans](https://inclass.kaggle.com/c/si650winter11/data)

## Forecasting

* **Univariate time series forecasting**: how much will it rain this year? Dataset: [45 years of rainfall data](http://research.jisao.washington.edu/data_sets/widmann/)
* **Multi-variate time series forecasting**: how polluted will your town's air be? Pollution-level forecasting. Dataset: [Air Quality dataset](https://archive.ics.uci.edu/ml/datasets/Beijing+PM2.5+Data)
* **Demand/load forecasting**: find a short-term forecast on electricity consumption of a single home. Dataset: [Electricity consumption of a household](https://archive.ics.uci.edu/ml/datasets/Individual+household+electric+power+consumption)
* **Predict blood donation**: we're interested in predicting if a blood donor will donate within a given time window. More on the problem statement at [Driven Data](https://www.drivendata.org/competitions/2/warm-up-predict-blood-donations/page/7/). Dataset: [UCI ML Datasets Repo](https://archive.ics.uci.edu/ml/datasets/Blood+Transfusion+Service+Center)

## Recommendation Systems

* **Movie recommender**: can you predict the rating a user will give to a movie? Do this using the movies that user has rated in the past, as well as the ratings similar users have given similar movies. Dataset: [Netflix Prize](http://www.netflixprize.com/) and [MovieLens Datasets](https://grouplens.org/datasets/movielens/)
* **Search + recommendation system**: predict which Xbox game a visitor will be most interested in, based on their search query. Dataset: [BestBuy](https://www.kaggle.com/c/acm-sf-chapter-hackathon-small/data)
* **Can you predict influencers in the social network?** How can you predict social influencers? Dataset: [PeerIndex](https://www.kaggle.com/c/predict-who-is-more-influential-in-a-social-network/data)

## Vision

* **Image classification**: the object recognition or image classification task is how deep learning shot up to its present-day resurgence. Datasets:
    * [CIFAR-10](https://www.cs.toronto.edu/~kriz/cifar.html)
    * [ImageNet](http://www.image-net.org/)
    * [MS COCO](http://mscoco.org/) is the modern replacement to the ImageNet challenge
    * [MNIST Handwritten Digit Classification Challenge](http://yann.lecun.com/exdb/mnist/) is the classic entry point
    * [Character recognition (digits)](http://ai.stanford.edu/~btaskar/ocr/) is the good old optical character recognition problem
* **Bird species identification** from an image, using the [Caltech-UCSD Birds dataset](http://www.vision.caltech.edu/visipedia/CUB-200-2011.html)
* **Diagnosing and segmenting brain tumors and phenotypes using MRI scans**. Dataset: [MICCAI Machine Learning Challenge, aka MLC 2014](https://www.nmr.mgh.harvard.edu/lab/laboratory-computational-imaging-biomarkers/miccai-2014-machine-learning-challenge)
* **Identify endangered right whales in aerial photographs**. Dataset: [NOAA Right Whale](https://www.kaggle.com/c/noaa-right-whale-recognition)
* **Can computer vision spot distracted drivers?** Dataset: [State Farm Distracted Driver Detection](https://www.kaggle.com/c/state-farm-distracted-driver-detection/data) on Kaggle
* **Bone X-ray competition**: can you identify if a hand is broken from an X-ray radiograph automatically, with better-than-human performance? [Stanford's bone X-ray deep learning competition with the MURA dataset](https://stanfordmlgroup.github.io/competitions/mura/)
* **Image captioning**: can you caption/explain the photo the way a human would? Dataset: [MS COCO](http://mscoco.org/dataset/#captions-challenge2015)
* **Image segmentation/object detection**: can you extract an object of interest from an image? Datasets: [MS COCO](http://mscoco.org/dataset/#detections-challenge2017), [Carvana Image Masking Challenge](https://www.kaggle.com/c/carvana-image-masking-challenge/data) on Kaggle
* **Large-scale video understanding**: can you produce the best video tag predictions? Dataset: [YouTube-8M](https://research.google.com/youtube8m/index.html)
* **Video summarization**: can you select the semantically relevant/important parts from a video? Example: [Fast-Forward Video Based on Semantic Extraction](https://arxiv.org/abs/1708.04160). Dataset: unaware of any standard dataset or agreed-upon metrics; I think [YouTube-8M](https://research.google.com/youtube8m/index.html) might be a good starting point
* **Style transfer**: can you recompose images in the style of other images? Dataset: [fzliu on GitHub](https://github.com/fzliu/style-transfer/tree/master/images) shared target and source images with results
* **Chest X-ray**: can you detect if someone is sick from their chest X-ray, or guess their radiology report? Dataset: [MIMIC-CXR at PhysioNet](https://physionet.org/content/mimic-cxr/2.0.0/)
* **Clinical diagnostics (image identification, classification & segmentation)**: can you help build open-source software for lung cancer detection to help radiologists? Link: [Concept to Clinic](https://concepttoclinic.drivendata.org/) challenge on DrivenData
* **Satellite imagery processing for socioeconomic analysis**: can you estimate the standard of living or energy consumption of a place from nighttime satellite imagery? Reference for project details: [Stanford Poverty Estimation Project](http://sustain.stanford.edu/predicting-poverty)
* **Satellite imagery processing for automated tagging**: can you automatically tag satellite images with human features such as buildings, roads, waterways, and so on? Help free the manual effort in tagging satellite imagery: [Kaggle dataset by DSTL, UK](https://www.kaggle.com/c/dstl-satellite-imagery-feature-detection)

## Music

* **Music/audio recommendation systems**: can you tell if two songs are similar using their sound or lyrics? Dataset: [Million Songs Dataset](https://labrosa.ee.columbia.edu/millionsong/) and its 1% sample. Example: [Anusha et al.](https://cs224d.stanford.edu/reports/BalakrishnanDixit.pdf)
* **Music genre recognition using neural networks**: can you identify the musical genre from spectrograms or other sound information? Datasets: [FMA](https://github.com/mdeff/fma) or [GTZAN on Keras](https://github.com/Hguimaraes/gtzan.keras). Get started with [librosa](https://librosa.github.io/librosa/index.html) for feature extraction

## FAQ

1. **Can I use the ideas here for my thesis?** Yes, totally! I'd love to know how it went.
2. **Do you have any advice before I start my project?** [Advice for short-term machine learning projects](https://rockt.github.io/2018/08/29/msc-advice) by Tim R. is a pretty good starting point!
3. **How can I add my ideas here?** Just send a pull request and we'll discuss.
4. **Hey, something is wrong here!** Yikes, I am sorry. Please tell me by raising a [GitHub issue](https://github.com/NirantK/awesome-project-ideas/issues); I'll fix it as soon as possible.

## Acknowledgements

Problems are motivated by the ones shared at:

* [CMU Machine Learning](http://www.cs.cmu.edu/~10701/projects.html)
* [Stanford CS229 Machine Learning Projects](http://cs229.stanford.edu/)
* [swyx](https://github.com/sw-yx/ai-notes/blob/main/Resources/AI-hackathon-stack.md)

## Credit

Built with lots of keyboard smashing and copy-pasta love by [NirantK](http://www.twitter.com/NirantK). Find me on Twitter!

## License

This repository is licensed under the MIT License. Please see the [LICENSE file](LICENSE) for more details.
deep-learning forecasting machine-learning classification series-forecasting image-classification awesome-list awesome dataset multi-label-classification
ai
web-development-projects
The repository contains all project files and demo links for the Web Development Foundation Course (HTML5, CSS3, SASS, and Bootstrap 5).

**Project 1:** Profile Card

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/profile-card
* Demo link: https://profilecard-ul.netlify.app
* CodePen: https://codepen.io/shubham-ul/pen/zyzqmxz

**Project 2:** Login Page

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/login-page
* Demo link: https://loginpage-ul.netlify.app
* CodePen: https://codepen.io/shubham-ul/pen/wnpbyzx

**Project 3:** Spotify Clone (Flexbox website)

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/spotify-landing-page
* Demo link: https://spotifyclone-ul.netlify.app
* CodePen: https://codepen.io/shubham-ul/pen/eywmypp

**Project 4:** Testimonial Section (CSS Grid)

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/testimonial-section
* Demo link: https://testimonial-ul.netlify.app
* CodePen: https://codepen.io/shubham-ul/pen/ojmnead

**Project 5:** Portfolio Website (CSS framework: Bootstrap)

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/portfolio
* Demo link: https://portfolio-ul.netlify.app

**Project 6:** Ed-Tech Landing Page (all concepts)

* Repository link: https://github.com/shubhamsarda/web-development-projects/tree/master/edtech-landing-page
* Demo link: https://edtech-ul.netlify.app
html5 css3 bootstrap5 flexbox grid scss
front_end
Build-ML-pipelines-for-Computer-Vision-NLP-and-Graph-Neural-Networks-using-Nvidia-Triton-Server
# Build ML pipelines for Computer Vision, NLP and Graph Neural Networks using NVIDIA Triton Inference Server

In this workshop we are going to use [NVIDIA's Triton Inference Server](https://developer.nvidia.com/nvidia-triton-inference-server) (formerly known as TensorRT Inference Server), which simplifies the deployment of AI models at scale in production. It natively supports multiple framework backends like TensorFlow, PyTorch, ONNX Runtime, Python, and even custom backends. It supports different types of inference queries through advanced batching and scheduling algorithms, supports live model updates, and runs models on both CPUs and GPUs. Triton is also designed to increase inference performance by maximizing hardware utilization through concurrent model execution and dynamic batching: concurrent execution allows you to run multiple copies of a model, and multiple different models, in parallel on the same GPU, while through dynamic batching Triton can dynamically group inference requests on the server side to maximize performance.

Here we focus on hosting/deploying multiple trained models (TensorFlow, PyTorch) on Triton Inference Server to leverage its full potential. Once the models are deployed, we can make inference requests and get back the predictions.

<p align="center"><img width="797" alt="Triton Inference Server architecture" src="https://user-images.githubusercontent.com/40523048/124829341-cb9bff80-df78-11eb-99ef-9b650010b039.png"></p>

The above image represents the [Triton Inference Server architecture](https://developer.nvidia.com/blog/simplifying-ai-inference-in-production-with-triton/) with its various connected components.

## NVIDIA Triton Inference Server features

![NVIDIA Triton Inference Server features](https://user-images.githubusercontent.com/40523048/133061949-f49d636c-b2a4-4dc2-b80e-e32896d2ae64.png)

![Triton hosting heterogeneous frameworks](https://user-images.githubusercontent.com/40523048/120965914-c4a98380-c765-11eb-86f0-eb2ce2574e97.png)

The image above depicts the capability of [NVIDIA's Triton Inference Server](https://developer.nvidia.com/nvidia-triton-inference-server) to host multiple heterogeneous deep learning frameworks on a GPU or a CPU, depending upon the backend.

For setting up the Triton Inference Server, we generally need to pass two hurdles: 1) set up our own inference server, and 2) write a client-side Python script that can communicate with the inference server to send requests (in our case, text) and get back the predictions, or image/text feature embeddings.

## Part 1: Setting up Triton Inference Server on the machine

Let's start by setting up a Triton server locally on the computer, following the steps below.

### Quickstart with Docker

1. Install Docker.
2. `docker pull nvcr.io/nvidia/tritonserver:21.06.1-py3`
3. `git clone https://github.com/sachinsharma9780/AI-Enterprise-Workshop-Building-ML-Pipelines.git`
4. `cd AI-Enterprise-Workshop-Building-ML-Pipelines`
5. `docker run --rm -p8000:8000 -p8001:8001 -p8002:8002 -v $PWD/model_repository:/models nvcr.io/nvidia/tritonserver:21.06.1-py3 tritonserver --model-repository=/models`
6. `curl -v http://localhost:8000/v2/health/ready`
7. Continue to Part 2 below.

### Install Docker

[Docker](https://docs.docker.com/get-docker/)

### Pulling the Triton Server Docker image from NVIDIA NGC

1. [Download](https://ngc.nvidia.com/catalog/containers/nvidia:tritonserver) the Docker image, or use the command `docker pull nvcr.io/nvidia/tritonserver:21.06.1-py3`.
2. Image size: ~10.6 GB (10 to 15 mins to install).
3. To view the downloaded Docker image: `docker images`

### Create a model repository to add your models

1. Clone the [Triton Inference Server GitHub repository](https://github.com/triton-inference-server/server.git) if you need an example model repository. This will also download some pre-trained models, structured in the manner expected by Triton.
2. After cloning, you can find the trained models under `server/docs/examples/model_repository`.
3. Or you can clone this repo: in the `model_repository` folder, I have already stored some default trained models with their corresponding configuration files, which come along while cloning the above repository.
4. Instantiate the Triton server using the command:

    ```bash
    docker run --gpus=1 --rm -p8000:8000 -p8001:8001 -p8002:8002 -v /full/path/to/example/model/repository:/models <docker image> tritonserver --model-repository=/models
    ```

    Note: `<docker image>` is `nvcr.io/nvidia/tritonserver:<xx.yy>-py3` if you pulled the Triton container from NGC. The `-v` flag points to the path of your model repository where all your models are stored, and the `--gpus=1` flag means that 1 system GPU should be available to Triton for inference. For example:

    ```bash
    docker run --gpus=1 --rm -p8000:8000 -p8001:8001 -p8002:8002 -v /Users/sachin/Desktop/arangodb/scripts/triton/model_repository:/models nvcr.io/nvidia/tritonserver:21.06.1-py3 tritonserver --model-repository=/models
    ```

![Successful instantiation of the Triton server](https://user-images.githubusercontent.com/40523048/120992588-0ac11000-c783-11eb-8fdb-43404f52f97b.png)

The above image shows the successful instantiation of the Triton server.

### Verify Triton is running correctly

```bash
curl -v http://localhost:8000/v2/health/ready
```

The expected output should be (by default, Triton provides services on port 8000):

```
HTTP/1.1 200 OK
Content-Length: 0
Content-Type: text/plain
```

## Part 2: Setting up the Triton Inference Client

In this part we will download the libraries required to interact with the Triton server, i.e. to send inference requests (input data) to the deployed models and receive back the predictions. It is recommended to install the packages below in a separate [conda](https://docs.conda.io/projects/conda/en/latest/index.html) environment.

### Install required libraries

1. `cd` into the `scripts` folder.
2. `pip install -r requirements.txt`, or install as shown below.

Required libraries for applications 1 and 2:

1. `pip install nvidia-pyindex`
2. `pip install tritonclient[all]`
3. `pip install torch`
4. `pip install transformers`
5. `python -m pip install grpcio`
6. `python -m pip install grpcio-tools`
7. `pip install geventhttpclient`
8. `pip install attrdict`
9. `pip install pillow`

Required libraries for application 3:

* [ArangoDB](https://www.arangodb.com/docs/stable/getting-started-installation.html)
* `pip install nvidia-pyindex`
* `pip install tritonclient[all]`
* `python -m pip install grpcio`
* `python -m pip install grpcio-tools`
* `pip install geventhttpclient`
* [PyTorch Geometric](https://pytorch-geometric.readthedocs.io/en/latest/notes/installation.html) (depends on your machine)
* `pip install ogb`

## Order of execution

### Application 1: Deploying a Hugging Face transformer model on Triton Inference Server, with an application to zero-shot text classification

1. Start by creating a Triton-acceptable model using the notebook under the create-triton-acceptable-models folder: trace_pytorch_models.ipynb
2. Add this created model to a model repository.
3. Start the Triton server with this newly added model.
4. Run the application using the notebook triton_client_zero_shot_text_classification_application.ipynb

### Application 2: Movie recommendation with Triton Inference Server and ArangoDB

1. Start by creating a Triton-acceptable model using the notebook trace_sentence_repn_bert_model.ipynb
2. Add this created model to a model repository.
3. Start the Triton server with this newly added model (you can add multiple models to this repository, depending upon the memory).
4. Run the application using the notebook movie_recommendation_triton_client.ipynb

### Application 3: Graph ML, NVIDIA Triton and ArangoDB: Amazon Product Recommendation (APR) application

1. Train the GraphSage model on the APR dataset using the notebook comprehensive_graphsage_guide_with_pytorchgeometric.ipynb. Either you can choose your own checkpoints generated in this step, or use the ones I have already stored under the checkpoint folder, for both the GPU- and CPU-trained GraphSage models.
2. Create a trace of the GraphSage model using these checkpoints, using the notebook trace_obgn_product_graphsage_model.ipynb
3. Update your model repository for this traced model, as shown in the model_repository folder.
4. Graph embeddings: load the APR graph dataset into ArangoDB using the [dump](https://drive.google.com/drive/folders/1jf0gkammslrsmmnb9uzezdgmx8ngfwv4) and the [arangorestore](https://www.arangodb.com/docs/stable/programs-arangorestore.html) utility, e.g. `arangorestore --input-directory dump`
5. Start the Triton server with this newly added model (you can add multiple models to this repository, depending upon the memory).
6. Run the application using graph_ml_triton_arangodb_product_recommendation_app.ipynb

### Dump folder

This folder already contains movie embeddings for all the movie descriptions present inside the IMDB dataset. We did this to save time: in case you run the movie recommendation notebook on a CPU, it takes some time to generate the movie embeddings and then store them in ArangoDB. In order to restore the movie embeddings inside ArangoDB, we can use its [arangorestore](https://www.arangodb.com/docs/stable/programs-arangorestore.html) utility.

### Image classification example

Once the libraries are installed, we can start communicating with the Triton server using inference scripts, e.g.:

```bash
python image_client.py -c 3 -m inception_graphdef -s INCEPTION /path/to/example/image
```

### Slide deck

[Presentation](https://docs.google.com/presentation/d/1w0bnesjrn5tr1e7ahvzue70nnksiwhmroxieh5ria2w/edit#slide=id.ge266904e26_0_530)

### Workshop videos

* Workshop 1: NVIDIA Triton meets ArangoDB: [YouTube](https://www.youtube.com/watch?v=voim7hibgdo&t=1952s)
* Workshop 2: Machine Learning on Graphs with PyTorch Geometric, NVIDIA Triton and ArangoDB: [YouTube](https://www.youtube.com/watch?v=hvgrpuld5zm&t=262s)

### References

1. https://medium.com/nvidia-ai/how-to-deploy-almost-any-hugging-face-model-on-nvidia-triton-inference-server-with-an-8ee7ec0e6fc4
2. https://towardsdatascience.com/a-comprehensive-case-study-of-graphsage-algorithm-with-hands-on-experience-using-pytorchgeometric-6fc631ab1067
3. https://sachinsharma9780.medium.com/a-voyage-through-graph-machine-learning-universe-motivation-applications-datasets-graph-ml-e573a898b346
4. https://developer.nvidia.com/nvidia-triton-inference-server
5. https://github.com/triton-inference-server/server/blob/main/docs/quickstart.md
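Beyond the health check, the same HTTP port answers metadata and inference queries via Triton's KServe v2 REST API. A minimal sketch against the `simple` model from the example model repository (the model name, the INPUT0/INPUT1 names, and the [1,16] INT32 shape are assumptions based on that example repository; adjust for your own models):

```bash
# Inspect server metadata and the model's configuration.
curl http://localhost:8000/v2
curl http://localhost:8000/v2/models/simple/config

# Send an inference request with two INT32 [1,16] tensors.
curl -X POST http://localhost:8000/v2/models/simple/infer \
  -H 'Content-Type: application/json' \
  -d '{
        "inputs": [
          {"name": "INPUT0", "shape": [1, 16], "datatype": "INT32",
           "data": [[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16]]},
          {"name": "INPUT1", "shape": [1, 16], "datatype": "INT32",
           "data": [[1,1,1,1,1,1,1,1,1,1,1,1,1,1,1,1]]}
        ]
      }'
```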
pytorch deep-learning machine-learning graph-machine-learning arangodb
ai
GameEngineIntegrations
# GPUOpen Game Engine Integrations

We will be providing example integrations of various GPUOpen technologies into game engines. These example integrations can help jumpstart your own integration of these features into your software. Applicable software includes games, VR titles, and pro applications.

## Unity

We have created a repository with examples that demonstrate Unity® integrations for some of the libraries and SDKs available on GPUOpen:

* [UnityIntegrations](https://github.com/GPUOpen-LibrariesAndSDKs/UnityIntegrations)

## Unreal Engine

We have created a fork of Unreal® Engine to provide example UE4 integrations for some of the libraries and SDKs available on GPUOpen:

* [GPUOpen fork of Unreal Engine](https://github.com/GPUOpenSoftware/UnrealEngine)

### Example UE4 integrations

* [LiquidVR MGPU](https://github.com/GPUOpenSoftware/UnrealEngine/tree/LiquidVR-MGPU)

### UE4 plugins

There are branches of the AMFMedia plugin for UE 4.15 through 4.26:

* [AMFMedia-4.15](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.15)
* [AMFMedia-4.16](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.16)
* [AMFMedia-4.17](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.17)
* [AMFMedia-4.18](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.18)
* [AMFMedia-4.19](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.19)
* [AMFMedia-4.20](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.20)
* [AMFMedia-4.21](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.21)
* [AMFMedia-4.22](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.22)
* [AMFMedia-4.23](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.23)
* [AMFMedia-4.24](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.24)
* [AMFMedia-4.25](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.25)
* [AMFMedia-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMedia-4.26)

There are branches of the AMFStitchMedia plugin for UE 4.18 through 4.26:

* [AMFStitchMedia-4.18](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.18)
* [AMFStitchMedia-4.19](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.19)
* [AMFStitchMedia-4.20](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.20)
* [AMFStitchMedia-4.21](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.21)
* [AMFStitchMedia-4.22](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.22)
* [AMFStitchMedia-4.23](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.23)
* [AMFStitchMedia-4.24](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.24)
* [AMFStitchMedia-4.25](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.25)
* [AMFStitchMedia-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFStitchMedia-4.26)

There are two branches of the FemFX plugin:

* [FemFX-4.18](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FemFX-4.18)
* [FemFX-4.18 AlienPods sample](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FemFX-AlienPods)

There are branches of the AMFMediaChromaKey plugin for UE 4.22 through 4.26:

* [AMFMediaChromaKey-4.22](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMediaChromaKey-4.22)
* [AMFMediaChromaKey-4.23](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMediaChromaKey-4.23)
* [AMFMediaChromaKey-4.24](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMediaChromaKey-4.24)
* [AMFMediaChromaKey-4.25](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMediaChromaKey-4.25)
* [AMFMediaChromaKey-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/AMFMediaChromaKey-4.26)

There are branches of the TressFX plugin for UE 4.22 through 4.26:

* [TressFX-4.22](https://github.com/GPUOpenSoftware/UnrealEngine/tree/TressFX-4.22)
* [TressFX-4.23](https://github.com/GPUOpenSoftware/UnrealEngine/tree/TressFX-4.23)
* [TressFX-4.24](https://github.com/GPUOpenSoftware/UnrealEngine/tree/TressFX-4.24)
* [TressFX-4.25](https://github.com/GPUOpenSoftware/UnrealEngine/tree/TressFX-4.25)
* [TressFX-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/TressFX-4.26)

There are branches of the FidelityFX CAS plugin:

* [FidelityFXCAS-4.23](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXCAS-4.23)
* [FidelityFXCAS-4.24](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXCAS-4.24)
* [FidelityFXCAS-4.25.1](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXCAS-4.25.1)
* [FidelityFXCAS-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXCAS-4.26)

There are branches of the FidelityFX LPM plugin:

* [FidelityFXLPM-4.25.1](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXLPM-4.25.1)
* [FidelityFXLPM-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXLPM-4.26)

There are branches of the FidelityFX SPD plugin:

* [FidelityFXSPD-4.25.3](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXSPD-4.25.3)
* [FidelityFXSPD-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXSPD-4.26)

There is one branch of the FidelityFX VRS plugin:

* [FidelityFXVRS-4.26](https://github.com/GPUOpenSoftware/UnrealEngine/tree/FidelityFXVRS-4.26)

### Instructions

To access the fork and its various branches, you will first need access to [Epic's Unreal Engine repo](https://github.com/EpicGames/UnrealEngine). See [Epic's signup information](https://github.com/EpicGames/Signup) on GitHub for instructions. You will also need to be signed in to GitHub with the username you entered into your Epic Games account profile; otherwise, the links above to the GPUOpen fork, the example integrations, and Epic's repo will not work.

### Attribution

Unity is a trademark of Unity Technologies. Unreal is a trademark or registered trademark of Epic Games, Inc. in the United States of America and elsewhere.
server
convertigo
convertigo low code no code mobile web platform convertigo is an open source low code no code platform for mobile web application development and back end as a service circleci ci image ci url

img https uploads ssl webflow com 62d55bc018a5be3f0b91fcf3 62d55bc018a5be9efe91fd62 animation2 gif alt convertigo low code studio

convertigo community edition is an open source fullstack low code no code platform the platform is used to build and run complex cross platform enterprise mobile and web apps in a few days convertigo platform is composed of several components 1 convertigo server the back end server mbaas part handles back end connectors micro services execution offline data device synchronization and serves mobile and pwa web apps convertigo server can be downloaded from github https github com convertigo convertigo releases latest as a tomcat war file or run directly as containers from dockerhub https hub docker com convertigo 2 convertigo studio runs on a windows or a macos workstation eclipse based ide used to program mbaas micro services workflows and use the mobile builder to build mobile app uis in low code mode can be directly downloaded from github https github com convertigo convertigo releases latest 3 convertigo sdks can be used with third party mobile development tools such as xcode ios android studio android sdks are available on each platform s standard repository bintray for android cocoapods for ios and npm for angular react native and vue js 4 convertigo forms the no code app builder to build form based apps as pwas or web applications with a web based no code studio intended for non technical developers citizen developers convertigo community edition is brought to you by convertigo sa paris san francisco the platform is currently used by more than 150k developers worldwide building enterprise class web and mobile apps convertigo image convertigo url www convertigo com convertigo url license convertigo community edition is agpl https www gnu org licenses agpl 3 0 html based changelog see changelog md changelog md markdown link img dfn s ci image https circleci com gh convertigo convertigo svg style shield ci url https circleci com gh convertigo workflows convertigo convertigo image https www convertigo com wp content themes eightdegree images logo convertigo png convertigo url https www convertigo com
low-code-development-platform mobile-development microservices kubernetes ionic-framework angular convertigo no-code opensource
front_end
udagram
udagram image filtering microservice udagram is a simple cloud application developed alongside the udacity cloud engineering nanodegree it allows users to register and log into a web client post photos to the feed and process photos using an image filtering microservice the project is split into three parts 1 the simple frontend https github com udacity cloud developer tree master course 02 exercises udacity c2 frontend a basic ionic client web application which consumes the restapi backend covered in the course 2 the restapi backend https github com udacity cloud developer tree master course 02 exercises udacity c2 restapi a node express server which can be deployed to a cloud service covered in the course 3 the image filtering microservice https github com udacity cloud developer tree master course 02 project image filter starter code the final project for the course it is a node express application which runs a simple script to process images

your assignment tasks

setup node environment you ll need to create a new node server open a new terminal within the project directory and run 1 install the project dependencies npm i 2 run the development server with npm run dev

create a new endpoint in the server ts file the starter code has a task for you to complete an endpoint in src server ts which uses a query parameter to download an image from a public url filter the image and return the result we ve included a few helper functions to handle some of these concepts and we re importing them for you at the top of the src server ts file

import { filterImageFromURL, deleteLocalFiles } from "./util/util";

deploying your system follow the process described in the course to eb init a new application and eb create a new environment to deploy your image filter service don t forget you can use eb deploy to push changes

stand out optional refactor the course restapi if you re feeling up to it refactor the course restapi to make a request to your newly provisioned image server authentication prevent requests without valid authentication headers note if you choose to submit this make sure to add the token to the postman collection and export the postman collection file to your submission so we can review custom domain name add your own domain name and have it point to the running services try adding a subdomain name to point to the processing server note domain names are not included in aws free tier and will incur a cost
cloud
pyRTOS
pyrtos introduction pyrtos is a real time operating system rtos written in python the primary goal of pyrtos is to provide a pure python rtos that will work in circuitpython the secondary goal is to provide an educational tool for advanced circuitpython users who want to learn to use an rtos pyrtos should also work in micropython and it can be used in standard python as well pyrtos was modeled after freertos with some critical differences the biggest difference is that it uses a voluntary task preemption model where freertos generally enforces preemption through timer interrupts this means there is a greater onus on the user to ensure that all tasks are well behaved pyrtos also uses different naming conventions and tasks have built in message passing to the best of my knowledge aside from voluntary preemption the task scheduling is identical to that found in freertos tasks are assigned numerical priorities the lower the number the higher the priority and the highest priority ready task is given cpu time where ties favor the currently running task alternative scheduling algorithms may be added in the future

table of contents basic usage basic usage tasks tasks notifications notifications messages messages error handling error handling pyrtos api pyrtos api main api main api mutual exclusion synchronization mutual exclusion synchronization task api task api task block conditions task block conditions message api message api os api os api service routines service routines templates examples templates examples task template task template message handling example template message handling example template timeout delay examples timeout delay examples message passing examples message passing examples notification examples notification examples message queue examples message queue examples mutex examples mutex examples service routine examples service routine examples communication setup examples communication setup examples future additions future additions notes notes

basic usage pyrtos separates functionality into tasks a task is similar to a thread in a desktop operating system except that in pyrtos tasks cannot be migrated to other processors or cores this is due to limitations with circuitpython in theory though it should be possible to write a scheduler with thread migration for micropython which does support hardware multithreading a simple pyrtos program will define some task functions wrap them in task objects and then register them with the os using the add task api function once all tasks are added the start function is used to start the rtos once started the rtos will schedule time for tasks giving tasks cpu time based on a priority scheduling algorithm when the tasks are well behaved designed to work together and given the right priorities the operating system will orchestrate them so they work together to accomplish whatever goal the program was designed for see sample py for an example task and usage

tasks a pyrtos task is composed of a task object combined with a function containing the task code a task function takes a single argument a reference to the task object containing it task functions are python generators any code before the first yield is setup code anything returned by this yield will be ignored the main task loop should follow this yield this is the code that will be executed when the scheduler gives the task cpu time the main task loop is typically an infinite loop if the task needs to terminate a return call should be used and any teardown that is necessary should be done directly before returning typically though tasks never return
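putting the pieces above together a minimal sketch of a complete pyrtos program using only the api described in this readme add task start task and timeout the task body itself is illustrative

import pyRTOS

def hello_task(self):
    # setup code runs once before the first yield
    count = 0
    yield

    # main task loop
    while True:
        count += 1
        print("hello", count)
        # blocking delay so lower priority tasks can get cpu time
        yield [pyRTOS.timeout(1.0)]

pyRTOS.add_task(pyRTOS.Task(hello_task, priority=1, name="hello"))
pyRTOS.start()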
preemption in pyrtos is completely voluntary this means that all tasks must periodically yield control back to the os or no other task will get cpu time messages cannot be passed between tasks and other administrative duties of the os will never get done yields have two functions in pyrtos one is merely to pass control back to the os this allows the os to reevaluate task priorities and pass control to a higher priority ready task and it allows the os to take care of administration like message passing lock handling and such yields should be fairly frequent but not so frequent that the program spends more time in the os than in tasks for small tasks once per main loop may be sufficient for larger tasks yields should be placed between significant subsections if a task has a section of timing dependent code though do not place yields in places where they could interrupt timing critical processes there is no guarantee a yield will return within the required time yields are also used to make certain blocking api calls the most common will likely be delays higher priority processes need to be especially well behaved because even frequent yields will not give lower priority processes cpu time the default scheduler always gives the highest priority ready task the cpu time the only way lower priority tasks ever get time is if higher priority tasks block when they do not need the cpu time typically this means blocking delays which are accomplished in pyrtos by yielding with a timeout generator when the timeout generator expires the task will become ready again but until then lower priority tasks will be allowed to have cpu time tasks can also block when waiting for messages or mutual exclusion locks in the future more forgiving non real time schedulers may be available there are also some places tasks should always yield whenever a message is passed it is placed on a local queue messages in the local task outgoing queue are delivered when that task yields other places where yielding is necessary for an action to resolve will be noted with the documentation on those actions

notifications notifications are a lightweight message passing mechanic native to tasks when a task is created a number of notifications can be specified these notifications can be used by other tasks or by service routines to communicate with the task notifications have a state and a value the state is an 8 bit signed value used to communicate the state of the notification the meaning of the state is user defined but the default values for the notification functions assume 0 means the notification is not currently active and 1 means it is active notifications also have a 32 bit value also signed which can be used as a counter or to communicate a small amount of data a series of functions are provided to send read and otherwise interact with notifications a notification wait is provided as a task block condition allowing a task to wait for a notification to be set to a specific state this blocking wait can even be used on other tasks to wait for a notification to be set to a particular value for example a task may want to send a notification but only once that notification is inactive for the target task and thus it might block to wait for that notification state to be set to 0 before it sends notifications are designed for lightweight message passing both when full messages are not necessary and for service routines to communicate with tasks in a very
fast and lightweight manner to communicate via notification it is necessary to have a reference to the task you want to communicate with messages message passing mechanics are built directly into tasks in pyrtos in the form of mailboxes by default tasks are lightweight without mailboxes but a constructor argument can be used to give a task has its own incoming mailbox messages are delivered when the currently running task yields this message passing system is fairly simple each message has a single sender and a single recipient messages also have a type which can be pyrtos quit or a user defined type see sample py user defined types start with integer values of 128 and higher types below 128 are reserved for future use by the pyrtos api messages can also contain a message but this is not required if the type field is sufficient to convey the necessary information it is better to leave the message field empty to save memory the message field can contain anything including objects and lists if you need to pass arguments into a new task that has a mailbox one way to do this is to call deliver on the newly created task object with a list or tuple of arguments this will add the arguments to the task s mailbox allowing it to access the arguments during initialization checking messages is a critical part of any task that may receive messages unchecked mailboxes can accumulate so many messages that your system runs out of memory if your task may receive messages it is important to check the mailbox every loop also be careful not to send low priority tasks too many messages without periodically blocking all higher priority tasks so they can have time to process their messages if a task that is receiving messages never gets cpu time that is another way to run out of memory messages can be addressed with a reference to the target task object or with the name of the object names can be any sort of comparable data but numbers are the most efficient while strings are the most readable object reference addressing must target an object that actually exists otherwise the os will crash also note that keeping references of terminated tasks will prevent those tasks from being garbage collected creating a potential memory leak object references are the fastest message addressing method and they may provide some benefits when debugging but its up to the user to understand and avoid the associated hazards name addressing is much safer however messages addressed to names that are not among the existing tasks will silently fail to be delivered making certain bugs harder to find in addition because name addresses require finding the associated object name addressed messages will consume significantly more cpu time to deliver sample py has several examples of message passing error handling the error handling philosophy of pyrtos is write good code the os operates on the assumption that the user will write good code that does not cause issues for the os if this assumption is broken the os will crash when it comes across the broken elements and it probably will not give you very meaningful error messages for example attempting to send a notification to a task that does not have notifications will cause a crash with a message about the task object having no notifications attribute which is actually somewhat meaningful in this particular case pyrtos is designed to be used with circuitpython on devices that may have very limited resources adding os level error handling would require significantly more code using more 
flash and ram space as well as requiring more processing this is unacceptable as such we will not be adding os error handling code to gracefully handle os exceptions caused by incorrect use of the os we will also not add special os exceptions to throw when errors occur nor will we add preemptive error detection these are all expensive requiring significantly more code and processing time this means that errors that occur within the os may not produce high quality error messages users are encouraged to write good code so that errors in the os do not occur and barring that users can add error handling in their own code but note that we do not condone writing poor code and then covering up the errors with error handling please do not file issues for crashes caused by failures to use the apis provided correctly instead fix your own code that said if there is a bug in the os itself please do file an issue users should not have to work around bugs in pyrtos we apply the same standard write good code to ourselves and if we have failed to do that please let us know so we can fix it if you are having a crash and you are not sure where the error is occurring please do your best to check your own code first and if you cannot find the bug in your own code feel free to file an issue we will do our best to track down the issue as we have time at the time of writing this is a one man operation and i am not getting paid for this so it will likely not be immediate do not be offended if we find the error in your code and inform you of that if the error is on our end we will do our best to fix it in a timely manner but again one man team working for free so no promises this is open source so if it is urgent please consider fixing it yourself similarly if you find it difficult to correctly use the apis because the documentation is lacking or poorly written please do file an issue and we will try to improve it our philosophy of write good code also applies to our documentation if this sounds harsh we sincerely apologize we understand that this is not ideal unfortunately sacrifices must be made when working on systems with extremely limited resources limited flash means our code has to be very small limited ram means we are limited in what we can keep track of limited processing power means we have to weigh the value of every command we issue the purpose of an os is to facilitate the tasks the user deems important and the more resources the os uses the fewer resources are available for the user s tasks given such limited resources keeping the os as small and streamlined as possible takes precedence over error handling and debugging convenience if your application needs the error handling and you are confident your device has the resources you can always create a fork of pyrtos and add error handling yourself pyrtos is pretty small and it is not terribly difficult to understand if you are familiar with python so this should not be very hard pyrtos api main api add task task ul this adds a task to the scheduler tasks that have been created but not added will never run this can be useful if you want to create a task and then add it at some time in the future but in general tasks are created and then added to the scheduler before the scheduler is started ul ul task a task object ul ul note that add task will automatically initialize any task that has not previously been initialized this is important to keep in mind because initializing a task manually after adding it to the scheduler may cause serious problems if 
the initialization code cannot safely be run more than once ul start scheduler none ul this begins execution this function will only return when all tasks have terminated in most cases tasks will not terminate and this will never return note that this means no code after this will ever be run ul ul scheduler when this argument is left with its default value the default scheduler is used since no other schedulers currently exist this is really only useful if you want to write your own scheduler otherwise just call start without an argument this should be called only after you have added all tasks additional tasks can be added while the scheduler is running within running tasks but this should generally be avoided a better option if you need to have a task that is only activated once some condition is met is to create the task and then immediately suspend it this will not prevent the initialization code from running though if you need to prevent initialization code from running until the task is unsuspended you can place the first yield in the task before initialization instead of after ul mutual exclusion synchronization class mutex ul this is a simple mutex with priority inheritance ul ul mutex lock task ul this will attempt to acquire the lock on the mutex with a blocking call note that because this is a blocking call the returned generator must be passed to a yield in a list eg yield mutex lock self ul ul task the task requesting the lock ul ul ul mutex nb lock task ul this nonblocking lock will attempt to acquire the lock on the mutex it will return true if the lock is successfully acquired otherwise it will immediately return false ul ul task the task requesting the lock ul ul ul mutex unlock ul use this to release the lock on the mutex if the mutex is not locked this will have no effect note that there is no guard to prevent a mutex from being unlocked by some task other than the one that acquired it so it is up to the user to make sure a mutex locked in one task is not accidentally unlocked in some other task ul ul class binarysemaphore ul this is another simple mutex but unlike mutex it uses request order priority essentially this is a first come first served mutex ul ul binarysemaphore lock task ul this will attempt to acquire the lock on the mutex with a blocking call note that because this is a blocking call the returned generator must be passed to a yield in a list eg yield mutex lock task ul ul task the task requesting the lock ul ul ul binarysemaphore nb lock task ul this nonblocking lock will attempt to acquire the lock on the mutex it will return true if the lock is successfully acquired otherwise it will immediately return false ul ul task the task requesting the lock ul ul ul binarysemaphore unlock ul use this to release the lock on the mutex if the mutex is not locked this will have no effect note that there is no guard to prevent a binarysemaphore from being unlocked by some task other than the one that acquired it so it is up to the user to make sure a binary semaphore locked in one task is not accidentally unlocked in some other task when this is called if there are other tasks waiting for this lock the first of those to have requested it will acquire the lock ul ul task api class task func priority 255 name none notifications none mailbox false ul task functions must be wrapped in task objects that hold some context data this object keeps track of task state priority name blocking conditions and ingoing and outgoing message queues it also handles initialization 
transition to blocking state and message queues the task object also provides some utility functions for tasks ul ul func this is the actual task function this function must have the signature func name self and the function must be a generator the self argument is a reference to the task object wrapping the function and it will be passed in when the task is initialized see sample py for an example task function ul ul priority this is the task priority the lower the value the higher priority the task the range of possible values depends on the system but typically priority values are generally kept between 0 and 8 to 32 depending on the number of tasks the default of 255 is assumed to be far lower priority than any sane developer would ever use making the default the lowest possible priority normally each task should have a unique priority if multiple tasks have the same priority and no higher priority task is ready whichever is already running will be treated as the higher priority task so long as it remains the running task tasks may be given the same priority if this behavior is useful ul ul name naming tasks can make message passing easier see basic usage messages above for the pros and cons of using names if you do need to use names using integer values will use less memory and give better performance than strings but strings can be used for readability if memory and performance are not an issue ul ul notifications this sets the number of notifications a task has by default tasks are lightweight and have no notifications attempting to interact with notifications when this is not set will cause a crash and attempting to access notifications above the number that exist will also cause a crash ul ul mailbox when set to true the task is given a mailbox that can be accessed with task deliver task recv and task message count when set to false the default the task cannot receive messages and attempting to send the task messages will crash the os but it can still use task send to send messages to other tasks ul ul task initialize ul this will initialize the task function to obtain the generator and run any setup code code before the first yield note that this passes self into the task function to make the following methods of task available to the task this can be run explicitly if it is not it will be run when the task is added to the scheduler using add task in most cases it is not necessary to manually initialize tasks but if there are strict ordering and timing constraints between several tasks manual initialization can be used to guarantee that these constraints are met if a task is manually initialized add task will not attempt to initialize it again ul ul ul task notify set value index 0 state 1 value 0 ul this sets a notification state and value by default this sets notification 0 to state 1 and value 0 the main use case for this is when a notification needs to provide some data to the task for example an input sampled by an adc or states of an array of buttons or digital pins ul ul index the index of the notification to be set if the task only has one notification this argument can be omitted as the default index is 0 ul ul state the new state value of the notification a state of 0 means the notification is inactive a state of 1 means the notification is active and needs attention which is why 1 is the default value aside from default values the state value actually has no special meaning to pyrtos so if needed the state value can be treated as having whatever meaning is desired this 
is a signed byte type and thus it can take a value anywhere in the range -128 to 127

value the value to set the notification to the meaning of the value is purely user defined it can be used as a counter to keep track of how many times a given notification has been sent or as a data field to send integer data up to 32 bits this allows the sender to provide anything from temperature data to one pixel of 24 or 32 bit color data through a notification as with state this is a signed type

task notify inc value index 0 state 1 step 1

similar to task notify set value this increments the value instead of setting it if the value of this notification is being used to keep track of how many notifications of this type have been received use this to send the notification see task notify set value for more detailed information on the first two arguments

index the index of the notification to be set if the task only has one notification this argument can be omitted as the default index is 0

state the new state value of the notification

step the increment step for the value this can be set to a negative value to decrement the default is to increment the value by 1

task notify get value index 0

returns the value of a notification

index the index of the notification to retrieve the value of if the task only has one notification this argument can be omitted as the default index is 0

task notify set state index 0 state 1

in some cases it may be desirable to send a notification while retaining the current value or it may be necessary to change the state without changing the value this is especially useful in cases where the state value is used by the task to keep track of things like accesses for example a state of 2 might be used to indicate that a notification has not been read and after reading the notification the task might change the state to 1 to indicate that it has read the notification but is not ready for it to be overwritten with a new notification

index the index of the notification to be set if the task only has one notification this argument can be omitted as the default index is 0

state the new state value of the notification

task notify inc state index 0 step 1

instead of setting the state it may be desirable to increment it this can be used to make a task block until it has received a particular notification a specific number of times it is possible to create something like a lightweight semaphore using this mechanic using a service routine this mechanic can also be used to wake up a task when a button or other input has been activated a specific number of times it is even possible to build certain types of state machines around this mechanic

index the index of the notification to be set if the task only has one notification this argument can be omitted as the default index is 0

step the increment step for the value this can be set to a negative value to decrement the default is to increment the value by 1

task notify get state index 0

in some cases it may be necessary to check the state of a notification for example if a notification should only be sent if the notification is inactive this can be used to check if the state is 0 if a program is using a state of 2 to indicate an unread notification and a 1 to indicate a read one that should be preserved the task can use this to check the state

index the index of the notification to be checked if the task only has one notification
this argument can be omitted as the default index is 0

task send msg

put a message object in the outgoing message queue note that while it is possible to call this with any kind of data without an immediate exception the message passing code in the os will throw an exception if it cannot find a target member within the data and well behaved tasks will throw an exception if there is no type member also note that sent messages will remain in the outgoing message queue until the next yield unless there is some good reason not to it is probably a good idea to yield immediately after any message is sent the exception is if the task needs to send out messages to multiple targets before giving up the cpu send all of the messages then yield

task recv

this returns the incoming message queue and clears it this should be called regularly by any task that messages may be sent to to prevent the queue from accumulating so many messages that the device runs out of memory note that because messages are distributed by the os once a task has called this no new messages will be added to the incoming queue until a yield has allowed some other task to run this means that if this is the highest priority task and it issues a non blocking yield no other task will have a chance to send a message thus high priority tasks should issue blocking yields typically timeouts periodically to allow lower priority tasks some cpu time

task message count

this returns the number of messages in the incoming queue

task deliver msg

this adds a message to the incoming queue this should almost never be called directly the one exception is that this can be used to pass arguments into a task in the main thread before the scheduler is started once the scheduler is started messages should be passed exclusively through the os and this should never be called directly note also that a message passed this way does not need to be a message object if you are using this to pass in arguments use whatever sort of data structure you want but make sure that the task expects it if you deliver your arguments to the task before initialization you can use self recv in the initialization code to retrieve them

task suspend

puts the task into the suspended state suspended tasks do not run while they are suspended unlike blocked tasks there are no conditions for resuming a suspended task suspended tasks are only returned to a ready state when they are explicitly resumed note that suspension is cheaper than blocking because suspended tasks do not have conditions that need to be evaluated regularly also note that suspending a blocked task will clear all blocking conditions

task resume

resumes the task from a suspended state this can also be used to resume a blocked task note that using this on a blocked task will clear all blocking conditions resume should not be used on the running task doing so will change the state to ready telling the os that the task is not running when it is running under the default scheduler this is unlikely to cause serious problems but the behavior of a running task that is in the ready state is undefined and may cause issues with other schedulers

task block conditions

task block conditions are generators that yield true if their conditions are met or false if they are not when a block condition returns true the task blocked by it is unblocked and put into the ready state a task is blocked when a yield returns a list of block conditions when any condition in
that list returns true the task is unblocked this allows any blocking condition to be paired with a timeout condition to unblock it when the timeout expires even if the main condition is not met for example yield [wait_for_message(self), timeout(5)] will block until there is a message in the incoming message queue but it will time out after 5 seconds and return to the ready state even if no message arrives note that blocking conditions must be returned as lists even if there is only one condition thus for a one second blocking delay use yield [timeout(1)]

timeout seconds

by itself this blocks the current task for the specified amount of time this does not guarantee that the task will begin execution as soon as the time has elapsed but it does guarantee that it will not resume until that time has passed if this task is higher priority than the running task and all other ready tasks then this task will resume as soon as control is passed back to the scheduler and the os has completed its maintenance

when combined with other blocking conditions this will act as a timeout because only one condition must be met to unblock when this evaluates to true the task will unblock even if other blocking conditions are not met

seconds the number of seconds as a floating point value to delay

timeout ns nanoseconds

this is exactly like timeout except the argument specifies the delay in nanoseconds note that the precision of this condition is dependent on the clock speed of your cpu in addition to the limitations affecting timeout

nanoseconds the number of nanoseconds as an integer value to delay

delay cycles

this delay is based on os cycles rather than time this allows for delays that are guaranteed to allow a specific number of cycles for other tasks to run this can be especially useful in cases where it is known that a specific task will take priority during the delay and that task is doing something that will require a known number of cycles to complete note that a cycle lasts from one yield to the next rather than going through the full loop of a task

wait for message self

this blocks until a message is added to the incoming message queue for this task self should be the task object of the calling task

wait for notification task index 0 state 1

this blocks until the notification of number index for task is equal to state task does not necessarily have to be the caller this can be used to have the caller wait for another entity to send it a notification or this can be used to have another entity wait for a notification on a particular task to be set to a particular value for example wait for a notification to be inactive set to 0 before sending a new notification to the same index

ufunction

it is also possible to create your own blocking conditions user defined blocking conditions must follow the same pattern as api defined conditions blocking conditions are generator functions that yield true or false they must be infinite loops so they never throw a stopiteration exception the initial call to the function can take one or more arguments subsequent calls to the generator may take arguments using the generator send function but must not require arguments the scheduler will never pass arguments when testing blocking conditions in general it is probably better to use global variables or passed in objects for tracking and controlling state than it is to create conditions that can take arguments in the generator calls

user defined blocking conditions are used exactly like api blocking conditions they are passed into a yield in a list
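a minimal sketch of a user defined blocking condition following the pattern described above the flag object here is a hypothetical stand in not part of the pyrtos api

def wait_for_flag(flag):
    # flag is any object with a boolean "set" attribute (hypothetical)
    while True:
        # yield True to unblock the waiting task, False to stay blocked
        yield flag.set

# used like any api blocking condition, optionally paired with a timeout
# yield [wait_for_flag(my_flag), pyRTOS.timeout(5)]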
message api

class message type source target message none

the message object is merely a container with some header data and a message the message element is optional as in many cases the type can be used to convey everything necessary

type currently only one built in type exists quit types are used to convey what the message is about in many cases type may convey sufficient information to make the message element unnecessary type values from 0 to 127 are reserved for future use while higher values are available for user defined types note that type can also be used to communicate the format of the data passed in the message element

source this is the sender of the message it is essentially a from field this is critical in messages requesting data from another task so that task will know where to send that data when no response is expected and the target task does not need to know the source this is less important but it is probably good practice to be honest about the source anyway just in case it is eventually needed this can be set to self or self name

target this specifies the target task this is essentially the to field for the message this can be a direct object reference or the name of the target object see basic usage messages above for the pros and cons of using names versus objects

message this is the message to be passed by default this is none because in many cases type is sufficient to convey the desired information message can be any kind of data or data structure if type is not empty type may be used to communicate the structure or format of the data contained in message

class messagequeue capacity 10

the messagequeue object is a fifo queue for tasks to communicate with each other any task with a reference to a messagequeue can add messages to the queue and take messages from it both blocking and nonblocking calls are provided for these

capacity by default the maximum number of messages allowed on the queue is 10 if the queue is full and a task attempts to push another onto it it will block if the blocking call is used otherwise it will just fail this can be used to limit how much memory is being used keeping track of messages

messagequeue send msg

this is a blocking send if the queue is full this will block until the message can be added

msg the message can be any kind of data no destination or source needs to be specified but messages can contain that information if necessary

keep in mind that blocking functions return generators that must be passed into a yield in a list thus a message would be sent with yield [queue.send(msg)]

messagequeue nb send msg

this is a nonblocking send if the queue is full this will return false otherwise the message will be added to the queue and this will return true

msg the data to be put on the queue

messagequeue recv out buffer

this is a blocking receive if the queue is empty it will block until a message is added when a message is available it will append that message to out buffer

out buffer this should be a list or some list like data container with an append method when this method unblocks the message will be deposited in this buffer

messagequeue nb recv

this is the nonblocking receive it will return a message if there is one in the queue or it will return none otherwise

os api

the os api provides tools for extending pyrtos
some things just do not make sense to use tasks to do some things need higher reliability than tasks for the most part messing around inside the os is not a great idea while part of the pyrtos project policy is to not break userspace within a given major version this policy does not hold for the os api so when deciding whether to use the os api keep in mind that you may be creating a dependency on a specific release or even commit

service routines

service routines are os extensions that run every os loop an os loop occurs every time a task yields service routines have no priority mechanic and they run in the order they are registered registered service routines are intended to be permanent while it is possible to remove them this is part of the os implementation that may change without warning and there is no formal mechanic for removing a service routine likewise while service routines can technically be added from within tasks it is generally better practice to add them in the main initialization code before calling pyrtos start starting service routines outside of the main initialization code may make performance problems related to the service routine extremely difficult to debug

service routines are simple functions which take no arguments and return nothing because they run every os loop service routines should be small and fast much like isrs in rtoss that use real time preemption normally service routines should also be stateless service routines that need to communicate with tasks can be created with references to global messagequeue or task objects as os extensions it is appropriate for service routines to call task deliver to send tasks messages however note that creating message objects is expensive sending lighter messages in messagequeues is cheaper and future features may provide even better options

service routines that absolutely need internal state can be created by wrapping a generator in a lambda function note that this will produce much heavier service routines than normal so this should be used sparingly and only when necessary to do this first create a generator function the function itself can take arguments but the yield cannot ideally there should be a single yield within an infinite loop that takes no arguments and returns nothing each os loop the service routine will begin execution directly after the yield and it will end when it gets back to the yield the generator must never return or a stopiteration exception will be thrown crashing the os once the generator has been created by calling the function wrap it in a lambda function like this lambda: next(gen) this lambda function is your service routine which should be registered with add service routine
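a minimal sketch of the generator wrapping described above the read_pin function and the target_task variable are hypothetical stand ins for your own input source and handler task

def button_watch_gen():
    # internal state persists across os loops
    last = False
    while True:
        yield  # each os loop execution resumes here and ends on the next pass
        current = read_pin()  # hypothetical input read
        if current and not last:
            # hand the event to a task rather than handling it here
            target_task.notify_inc_value(index=0)
        last = current

gen = button_watch_gen()
pyRTOS.add_service_routine(lambda: next(gen))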
use cases for service routines start with the kind of things isrs are normally used for in circuitpython as of 6 3 0 there are no interrupts if you need to regularly check the state of a pin normally used as an interrupt source a service routine is a good place to do that just like with an isr you should not handle the program business in the service routine instead the service routine should notify a task that will handle the business associated with the interrupt service routines can also be used to handle things that multiple tasks care about to avoid the need for semaphores for example if multiple tasks need network communication generally avoid this if possible a service routine can handle routing traffic between the network and the tasks note though that putting a large network stack in a service routine is a terrible idea that will starve your tasks of cpu time if you need something bigger than a very slim traffic routing routine it should be put into a task rather than a service routine

no we will not wrap the service routine os code in a try except statement this would increase the size of the os and make it run more slowly instead write good code and follow the instructions in this document and no errors will ever get to the os attempting to start a service routine in the main initialization after pyrtos start will fail as this function does not return in normal usage and thus no code after it will ever run

add service routine service routine

this adds a service routine to the os to be called every os loop

service routine a simple function that takes no arguments and returns nothing if necessary this can also be a wrapped generator however stateful service routines like this will tie up memory and take a little longer to run and thus should be used sparingly

templates examples

task template

def task(self):
    # uncomment this to get the argument list passed in with task.deliver
    # if you do this it will crash if no arguments are passed in prior to
    # initialization
    # args = self.recv()[0]

    # setup code here
    # end setup code

    # pass control back to rtos
    yield

    # main task loop
    while True:
        # work code here
        # end work code

        yield  # do this at least once per loop

message handling example template

msgs = self.recv()
for msg in msgs:
    if msg.type == pyRTOS.QUIT:
        # if your task should never return remove this section
        # tear down code here
        # end tear down code
        return
    elif msg.type == TEMP:
        # TEMP is a user defined integer constant larger than 127
        # temperature data will be in msg.message
        # code here
        # end code
        pass
# this will silently throw away messages that are not one of the
# specified types unless you add an else

timeout delay examples

# delay for 0.5 seconds
yield [pyRTOS.timeout(0.5)]

# delay for 100 nanoseconds
yield [pyRTOS.timeout_ns(100)]

# delay for 10 os cycles other tasks must yield 10 times unless all
# other tasks are suspended or blocked
yield [pyRTOS.delay(10)]

message passing examples

send message

# send temperature of 45 degrees to the display task
# the TEMP constant is set to some value above 127
self.send(pyRTOS.Message(TEMP, self, "display", 45))
# this message will be delivered at the next yield

read message

# instruct the hum_read task to read the humidity sensor and send back
# the result then wait for a message to arrive
# the READ_HUM constant is set to some value above 127
self.send(pyRTOS.Message(READ_HUM, self, "hum_read"))
yield [pyRTOS.wait_for_message(self)]
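the quit type handled in the message handling example template above is sent the same way a short sketch assuming a mailbox equipped task named display exists

# ask the task named "display" to shut down the message is delivered
# at the next yield
self.send(pyRTOS.Message(pyRTOS.QUIT, self, "display"))
yield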
message queue examples

create messagequeue

# create a messagequeue and pass it into some newly created tasks so it
# can be retrieved during initialization of the tasks
display = pyRTOS.Task(display_task, priority=1, name="display")
tsensor = pyRTOS.Task(tsensor_task, priority=2, name="tsensor")

temp_queue = pyRTOS.MessageQueue(capacity=4)

display.deliver(temp_queue)
tsensor.deliver(temp_queue)

pyRTOS.add_task(display)
pyRTOS.add_task(tsensor)

write messagequeue

# write the temperature to a messagequeue if the queue is full this
# will block until it has room
yield [temp_queue.send(current_temp)]

read messagequeue

# read the temperature from a messagequeue if the queue is empty this
# will block until a message is added
temp_buffer = []
yield [temp_queue.recv(temp_buffer)]
temp = temp_buffer.pop()

notification examples

example task with notification

# task that runs one step each time it receives a notification at
# index 0 this task uses one notification
def task_w_notification(self):
    # no setup
    yield

    # main task loop
    while True:
        # block until notification 0 is set to state 1
        yield [pyRTOS.wait_for_notification(self, index=0, state=1)]
        # task code here
        value = self.notify_get_value(0)  # returns the value of notification 0

# create task instance
task = pyRTOS.Task(task_w_notification, notifications=1)

set notification with increment

# set notification 0 to a state of 1 and increment its value as a counter
task.notify_inc_value(index=0, step=1)

set notification to value

# set notification 0 to a state of 1 and a value of 27
task.notify_set_value(index=0, value=27)

mutex examples

create mutex

# create a mutex and pass it into some newly created tasks
temp_printer = pyRTOS.Task(temp_task, priority=3, name="temp_printer")
hum_printer = pyRTOS.Task(hum_task, priority=3, name="hum_printer")

print_mutex = pyRTOS.Mutex()

temp_printer.deliver(print_mutex)
hum_printer.deliver(print_mutex)

use mutex

# use a mutex to avoid collisions when printing multiple lines of data
# note that it should never be necessary to actually do this since no
# preemption occurs without a yield this should only be necessary when
# at least one task yields within the code that needs lock protection
yield [print_mutex.lock(self)]
print("the last five temperature readings were")
for temp in temps:
    print(temp, "c")
print_mutex.unlock()
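binarysemaphore described in the mutual exclusion section above is used the same way as mutex except that waiting tasks acquire the lock in request order a minimal sketch

log_sem = pyRTOS.BinarySemaphore()

# inside a task: blocking acquire, first come first served
yield [log_sem.lock(self)]
# critical section here
log_sem.unlock()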
service routine examples

scheduler delay simple service routine

when using pyrtos within an os instead of as the os of an embedded microcontroller it will likely use significantly more cpu time than expected this is because it assumes it is the only thing running and needs to run as fast as the hardware will allow while there are several ways to solve this the simplest is probably to just create a service routine that introduces a delay to the scheduler the delay probably does not need to be very long to reduce the cpu time consumed by the scheduler to almost nothing but note that if your tasks do a lot between yields they may still use a lot of cpu time service routines are simple functions that do not take any arguments or return anything if a service routine needs outside data or communication it will need to be done through global variables more complex service routines can be made with generators if internal state needs to be preserved

import time  # don't forget to import time

# scheduler delay in seconds 0.001 is 1 millisecond adjust as needed
scheduler_delay = 0.001

# service routine function
def delay_sr():
    global scheduler_delay
    time.sleep(scheduler_delay)

# register service routine to run every scheduler loop
pyRTOS.add_service_routine(delay_sr)

communication setup examples

before tasks can communicate with each other they have to know about each other giving tasks references to other tasks can be done in a variety of ways

global tasks

tasks are typically going to be global variables just to start with this makes them automatically available to anything that can access global scope for this to work though things need to be done in the correct order a task function cannot know about a task that does not exist yet and a task cannot be created until the associated task function is defined if things are done in the right order though this can still work

# we have to create the globals before we can define the task functions
task0 = None
task1 = None

def task0_fun(self):
    global task1  # give this task access to the task1 global variable
    # initialization code here
    yield
    while True:
        # task code here
        yield

def task1_fun(self):
    global task0  # give this task access to the task0 global variable
    # initialization code here
    yield
    while True:
        # task code here
        yield

task0 = pyRTOS.Task(task0_fun)
task1 = pyRTOS.Task(task1_fun)

# start tasks and then scheduler

deliver tasks using mailboxes

tasks can be delivered to other tasks using their mailboxes obviously this only works for tasks initialized with mailboxes order of events is less important here but the tasks must explicitly read their mailboxes to get the task references note that this is the accepted method for giving any arguments to tasks not just references to other tasks

def task_fun(self):
    target_task = self.recv()[0]
    yield
    while True:
        # code here including communication with target_task
        yield

task = pyRTOS.Task(task_fun, priority=3)
task.deliver(some_other_task)

module level globals

if the tasks exist within a separate module the global nature of modules can be leveraged to provide what are essentially global references to those tasks this can be done simply by making the tasks global variables at the module level and then referencing them as variables contained in the module this eliminates the need for using the global directive however that may make the code less readable because the global directive at the beginning of a task function is a clear indicator that the task is using that global

# excerpt from mod_tasks.py
task = pyRTOS.Task(task_fun)

# excerpt from external file
import mod_tasks

def task_fun(self):
    # initialization code
    yield
    while True:
        # task code using reference to task without needing to declare
        # it global
        mod_tasks.task  # etc...
        yield

future additions

mutual exclusion we currently have a mutex object with priority inheritance and a binary semaphore object essentially a first come first served mutex but this isn t really a complete set of mutual exclusion tools freertos has counting semaphores and recursive mutexes because this uses voluntary preemption these are not terribly high priority as tasks can just not yield during critical sections rather than needing to use mutual exclusion there are still cases where mutual exclusion is necessary though this includes things like locking external hardware that has time consuming i o where we might want to yield for some time to allow the i o to complete without allowing other tasks to tamper with that hardware while we are waiting in addition some processors have vector processing and or floating point units that are slow enough to warrant yielding while waiting without giving up exclusive access to those units the relevance of these is not clear in the context of python but we definitely want some kind of mutual exclusion

freertos we need to look through the freertos documentation to see what other things a fully featured rtos could have

size because this is intended for use on microcontrollers size is a serious concern the code is very well commented but this means that comments take up a very significant fraction of the space we are releasing in mpy format for circuitpython now which is cutting the size down to around 5kb maybe we should include a source version with comments stripped out in future releases

notes

this needs more extensive testing the mutex class has not been tested we also need more testing on block conditions sample py uses wait for message twice successfully timeout is also tested in sample py what we really need is a handful of example problems including some for actual circuitpython devices when the trinkey rp2040 comes out there will be plenty of room for some solid circuitpython rtos example programs i have a neokey trinkey and a rotary trinkey neither of these have much going on so they are really only suitable for very simple examples
os
nlp-recipes-ja
nlp recipes for japanese this repository contains sample code for natural language processing in japanese it s highly inspired by microsoft nlp recipes https github com microsoft nlp recipes

content the following is a summary of the commonly used nlp scenarios covered in the repository each scenario is demonstrated in one or more scripts or jupyter notebook examples that make use of the core code base of models and repository utilities

category methods
basic examples basic cleaning normalization stopwords sentence segmentation ruby
embeddings examples embeddings word2vec fasttext universal sentence encoder
feature engineering examples feature engineering bag of words tf idf bm25 swem scdv
morphological analysis examples morphological analysis konoha nagisa
sentence similarity examples sentence similarity cosine similarity
sentiment analysis sentiment analysis oseti
text classification examples text classification tf idf logistic regression tf idf lightgbm bert t5
visualization examples visualization visualization with japanese texts

environment

docker-compose up -d --build
docker exec -it nlp-recipes-ja bash
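a minimal sketch of the tf idf logistic regression baseline listed above assuming documents have already been tokenized into space separated tokens for example with one of the morphological analyzers in the table the data here is placeholder only

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# placeholder data: space-separated tokens and class labels
docs = ["猫 が 好き", "犬 は 走る"]
labels = [0, 1]

# split on whitespace instead of the default token pattern, which
# would drop single-character japanese tokens
vectorizer = TfidfVectorizer(tokenizer=str.split, token_pattern=None)
clf = make_pipeline(vectorizer, LogisticRegression())
clf.fit(docs, labels)
print(clf.predict(["猫 好き"]))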
ai
notepad-based-calculator
notepad based calculator this is an unfinished project but it is in a state where you can easily reuse it the concept making an app similar to numi app https numi app soulver app https soulver app and parsify app https parsify app cross platform using c current state the calculator engine is about 100 done but a few features that i wanted to add remain unfinished feature example support basic algebra using digits 2 3 basic algebra using plain words two plus three implicit algebra operations 20 50 nested grouped operation 12 3 1 2 3 2 1 2 3 binary operations 0 2 a fifth percentage calculation 10 25 comments 12 3 anything after is ignored headers title variables my income tax 25 conditions if my income tax 30 then 123 else 456 unit calculation 2km 25 should result 2 5 km br 2 usd 2 cad should result something like 3 05 usd br 3 mb 2 kb should result 2 998 mb br supports length mass currencies area angle volume speed temperature date and time percentage computer data function grammars see detail below percentage of what is numeric br i e 25 of what is 50 should result 200 multilanguage deux plus trois french for two plus three partial unit conversion what is 1 usd in eur br 1km in meters br shouldn t be too complicated to implement using function grammar please check the unit tests for a complete list of supported scenarios technical details overall architecture notepadbasedcalculator api br this is the public api notepadbasedcalculator core br this is the core engine it contains the lexer parser and generic interpreter notepadbasedcalculator builtinplugins br contains all the basic extensions which includes the components for parsing and interpreting units basic algebra and binary operation and built in functions grammars notepadbasedcalculator standaloneconsoletestapp this is a console app you can use for testing the calculator manually check out how this project is made to understand how to use the calculator screenshot screenshot png extensibility dependency injection this project uses mef https learn microsoft com en us dotnet framework mef as the primary dependency injection extensibility framework function grammars the calculation engine supports various kinds of input data percentage length numbers etc while this app library tries to be smart enough to understand input like 25 of what is 50 it isn t smart enough to understand such an input without the help of a pre defined grammar the full grammar for english language can be found here functiondefinition json https github com veler notepad based calculator blob main src app dev notepadbasedcalculator builtinplugins grammars en us functiondefinition json each function is associated with a mef https learn microsoft com en us dotnet framework mef extension the c implementation automatically binds the class with the function name from the grammar here is an example 1 grammar for 25 of what is 50 https github com veler notepad based calculator blob 84c8842b1f15572c9f27608f0c09c0e059a17017 src app dev notepadbasedcalculator builtinplugins grammars en us functiondefinition json l25 l33 2 implementation in c https github com veler notepad based calculator blob 84c8842b1f15572c9f27608f0c09c0e059a17017 src app dev notepadbasedcalculator builtinplugins functions percentage ispercentofwhatinterpreter cs l3 l51 a small python illustration of the percentage arithmetic appears at the end of this readme main third party dependencies for the calculator engine community toolkit https github com communitytoolkit dotnet microsoft recognizers text https github com microsoft recognizers text unitsnet https github com angularsen unitsnet newtonsoft json https www newtonsoft
com json how to build and run set up development environment visual studio 2022 with desktop development workload windows rider windows or macos vs code with c extension macos optional avalonia ui https docs avaloniaui net docs getting started set up repository 1 clone the repository 2 run init ps1 or sh to download all the required dependencies build debug 1 open src notepadbasedcalculator app sln in visual studio or rider 2 set notepadbasedcalculator standaloneconsoletestapp as startup project 3 f5
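the percentage grammar above is easiest to see as concrete arithmetic, so here is a minimal illustration in python (the project itself is c#; the function names here are made up for illustration, not taken from the codebase):

```python
def percent_of_what(percent, part):
    # "25 of what is 50": if percent% of x equals part, then x = part / (percent / 100)
    return part / (percent / 100.0)

def add_percent(value, percent):
    # "2km + 25% should result 2.5 km": value grows by percent of itself
    return value * (1 + percent / 100.0)

assert percent_of_what(25, 50) == 200.0  # matches the grammar example
assert add_percent(2, 25) == 2.5         # matches the unit calculation example
```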
calculator csharp dotnet mef natural-language-processing nlp
ai
phoenix-rtos-hostutils
phoenix rtos hostutils this repository contains the source for the phoenix server application it relies on the hidapi library if you have problems building it make sure it is installed in your environment https github com signal11 hidapi this work is licensed under a bsd license see the license file for details
os
chulacv2018
2110443 computer vision 2018 1 computer vision chulalongkorn university cv2018 assets make computer vision great again jpg anaconda download link official https www anaconda com download unofficial chula mirror https cgci cp eng chula ac th cv2018
ai
Web-Development
web development mini projects in a web development course description a set of web mini projects using xhtml html5 css3 java script bootstrap php ajax proj b1 a simple clean user interface using bootstrap alt text https github com raghavkishan web development blob master proj 20b1 purrfect match png projb2 a project that demonstrates responsive design to cater to a variety of screen sizes this project has been developed using bootstrap fontawesome and simple css3 computer screen alt text https github com raghavkishan web development blob master projb2 projb2 1 png tablet screen alt text https github com raghavkishan web development blob master projb2 projb2 2 png phone screen alt text https github com raghavkishan web development blob master projb2 projb2 3 png proj 1 a simple clean static web page following strict xhtml principles with css3 alt text https github com raghavkishan web development blob master proj1 proj1 png proj2 a responsive clean form built using bootstrap 4 with field validation in java script alt text https github com raghavkishan web development blob master proj2 proj2 png proj3 a responsive clean form built using bootstrap 4 with field validations being performed on the server side using php before inserting data into the mysql database alt text https github com raghavkishan web development blob master proj3 proj3 png
front_end
entropy
entropy test workflow https github com odpf entropy actions workflows test yml badge svg go report card https goreportcard com badge github com odpf entropy https goreportcard com report github com odpf entropy version https img shields io github v release odpf entropy logo semantic release version license https img shields io badge license apache 202 0 blue svg logo apache license entropy is an extensible infrastructure orchestration and application deployment tool entropy provides features required for deploying and managing complex applications like resource versioning config schema versioning rollbacks dry runs etc key features no dependency written in go it compiles into a single binary with no external dependency extensible entropy provides a framework to easily write and deploy applications to your choice of cloud runtime entropy can run inside vms or containers with minimal memory footprint refer docs docs for more on capabilities internals etc installation install entropy on macos windows linux openbsd freebsd and on any machine binary cross platform download the appropriate version for your platform from the releases https github com odpf entropy releases page once downloaded the binary can be run from anywhere you don t need to install it into a global location this works well for shared hosts and other systems where you don t have a privileged account ideally you should install it somewhere in your path for easy use usr local bin is the most probable location homebrew sh install entropy requires homebrew installed brew install odpf tap entropy check for installed entropy version entropy version usage entropy typically runs as a service and requires a postgres to store its state refer entropy yaml entropy yaml for sample configuration values you can override the configurations by directly editing the entropy yaml file or by setting environment variables the environment variable name is the uppercased version of the complete path in the yaml with each dot replaced by an underscore for example service.host can be overridden by setting SERVICE_HOST illustrated at the end of this readme it is also possible to create a copy of the sample configuration file with a different name and provide that path to entropy shell entropy serve config my config yaml development running locally sh clone the repo git clone https github com odpf entropy git build entropy binary file make build start a mongodb instance docker compose up run entropy on a recipe file dist entropy serve running tests sh running all unit tests excluding extractors make test contribute development of entropy happens in the open on github and we are grateful to the community for contributing bugfixes and improvements read below to learn how you can take part in improving entropy read our contributing guide https odpf github io entropy docs contribute contributing to learn about our development process how to propose bugfixes and improvements and how to build and test your changes to entropy to help you get your feet wet and get you familiar with our contribution process we have a list of good first issues https github com odpf entropy labels good 20first 20issue that contain bugs which have a relatively limited scope this is a great place to get started this project exists thanks to all the contributors https github com odpf entropy graphs contributors license entropy is licensed under the apache 2 0 license
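a minimal python sketch of the documented override rule above (entropy itself is written in go; the helper name here is illustrative, not part of entropy):

```python
def yaml_path_to_env(path: str) -> str:
    # uppercase the full yaml path and replace each dot with an underscore,
    # per the configuration override rule described in this readme
    return path.replace(".", "_").upper()

assert yaml_path_to_env("service.host") == "SERVICE_HOST"
```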
dataops
cloud
rs-llm-paper-list
recommender systems with large language models paper list this is an actively maintained curated paper list on recommender systems with large language models the adopted language models and the corresponding model size the first publication date the first affiliation of the authors are also presented overview related survey paper related survey paper lms as textual encoders lms as textual encoders llms as recommenders llms as recommenders related paper repo related paper repo related survey paper pre train prompt and recommendation a comprehensive survey of language modelling paradigm adaptations in recommender systems norwegian university of science and technology arxiv 2023 15 mar 2023 paper link https arxiv org pdf 2302 03735 pdf https arxiv org pdf 2302 03735 pdf a survey on large language models for recommendation university of science and technology of china arxiv 2023 1 jun 2023 paper link https arxiv org pdf 2305 19860 pdf https arxiv org pdf 2305 19860 pdf how can recommender systems benefit from large language models a survey shanghai jiao tong university huawei noah s ark lab arxiv 2023 12 jun 2023 paper link https arxiv org pdf 2306 05817 pdf https arxiv org pdf 2306 05817 pdf lms as textual encoders u bert pre training user representations for improved recommendation tencent aaai 2021 18 may 2021 paper link https ojs aaai org index php aaai article view 16557 https ojs aaai org index php aaai article view 16557 adopted language model bert 340 million zero shot recommender systems amazon arxiv 2021 12 oct 2021 paper link https arxiv org pdf 2105 08318 pdf https arxiv org pdf 2105 08318 pdf adopted language model bert 340 million towards universal sequence representation learning for recommender systems renmin university of china kdd 2022 14 aug 2022 paper link https dl acm org doi pdf 10 1145 3534678 3539381 casa token aj ili7oeaaaaaaa kn qnglabaxz8nucvtcq7aqm7q5gyqdzvwc8d8mkyfg0mmbvocmw5vqdnuvl3ah9wb9onpzyptw https dl acm org doi pdf 10 1145 3534678 3539381 casa token aj ili7oeaaaaaaa kn qnglabaxz8nucvtcq7aqm7q5gyqdzvwc8d8mkyfg0mmbvocmw5vqdnuvl3ah9wb9onpzyptw adopted language model bert 340 million learning vector quantized item representation for transferable sequential recommenders renmin university of china www 2023 30 apr 2023 paper link https dl acm org doi pdf 10 1145 3543507 3583434 casa token yetrrb bsf4aaaaa yuyif bgl6uyjwbscdn0zia7oifgez vhku5zetltyl8vyhp2eiw6z9iysbfvkpbk170lprm1ac https dl acm org doi pdf 10 1145 3543507 3583434 casa token yetrrb bsf4aaaaa yuyif bgl6uyjwbscdn0zia7oifgez vhku5zetltyl8vyhp2eiw6z9iysbfvkpbk170lprm1ac adopted language model bert 340 million llms as recommenders language models as recommender systems evaluations and limitations amazon icbinb neurips2021 19 oct 2021 paper link https openreview net pdf id hfx3fy7 m9b https openreview net pdf id hfx3fy7 m9b adopted language model bert 340 million and gpt 2 1 5 billion m6 rec generative pretrained language models are open ended recommender systems alibaba arxiv 2022 19 may 2022 paper link https arxiv org pdf 2205 08084 pdf https arxiv org pdf 2205 08084 pdf adopted language model m6 10 100 billion recommendation as language processing rlp a unified pretrain personalized prompt predict paradigm p5 rutgers university recsys 2022 18 sep 2022 paper link https dl acm org doi pdf 10 1145 3523227 3546767 casa token s3uwoj1ren0aaaaa
v4ii6fdxmjt8sarqcela79og0jtil4bpmyemcvqvg37k4giorpvlg2acngwll9mdmztfl7lct08 adopted language model t5 11 billion chat rec towards interactive and explainable llms augmented recommender system fudan university arxiv 2023 4 apr 2023 paper link https arxiv org pdf 2303 14524 pdf https arxiv org pdf 2303 14524 pdf adopted language model chatgpt zero shot next item recommendation using large pretrained language models singapore management university arxiv 2023 6 apr 2023 paper link https arxiv org pdf 2304 03153 pdf https arxiv org pdf 2304 03153 pdf adopted language model gpt 3 175 billion generative recommendation towards next generation recommender paradigm national university of singapore arxiv 2023 7 apr 2023 paper link https arxiv org pdf 2304 03516 pdf https arxiv org pdf 2304 03516 pdf adopted language model chatgpt tallrec an effective and efficient tuning framework to align large language model with recommendation university of science and technology of china arxiv 2023 30 apr 2023 paper link https arxiv org pdf 2305 00447 pdf https arxiv org pdf 2305 00447 pdf adopted language model llama 7 billion is chatgpt a good recommender a preliminary study alibaba arxiv 2023 20 apr 2023 paper link https arxiv org pdf 2304 10149 pdf https arxiv org pdf 2304 10149 pdf adopted language model chatgpt do llms understand user preferences evaluating llms on user rating prediction google research arxiv 2023 10 may 2023 paper link https arxiv org pdf 2305 06474 pdf https arxiv org pdf 2305 06474 pdf adopted language model chatgpt and flan u palm 540 billion sparks of artificial general recommender agr early experiments with chatgpt rutgers university arxiv 2023 8 may 2023 paper link https arxiv org pdf 2305 04518 pdf https arxiv org pdf 2305 04518 pdf adopted language model chatgpt uncovering chatgpt s capabilities in recommender systems renmin university of china arxiv 2023 11 may 2023 paper link https arxiv org pdf 2305 02182 pdf https arxiv org pdf 2305 02182 pdf adopted language model chatgpt and others a first look at llm powered generative news recommendation the hong kong polytechnic university arxiv 2023 11 may 2023 paper link https arxiv org pdf 2305 06566 pdf https arxiv org pdf 2305 06566 pdf adopted language model chatgpt recommendation as instruction following a large language model empowered recommendation approach renmin university of china arxiv 2023 11 may 2023 paper link https arxiv org pdf 2305 07001 pdf https arxiv org pdf 2305 07001 pdf adopted language model flan t5 xl 3 billion palr personalization aware llms for recommendation drexel university arxiv 2023 12 may 2023 paper link https arxiv org pdf 2305 07622 pdf https arxiv org pdf 2305 07622 pdf adopted language model llama 7 billion is chatgpt fair for recommendation evaluating fairness in large language model recommendation university of science and technology of china arxiv 2023 12 may 2023 paper link https arxiv org pdf 2305 07609 pdf https arxiv org pdf 2305 07609 pdf adopted language model chatgpt how to index item ids for recommendation foundation models rutgers university arxiv 2023 12 may 2023 paper link https arxiv org pdf 2305 06569 pdf https arxiv org pdf 2305 06569 pdf adopted language model p5 11 billion large language models are zero shot rankers for recommender systems renmin university of china arxiv 2023 15 may 2023 paper link https arxiv org pdf 2305 08845 pdf https arxiv org pdf 2305 08845 pdf adopted language model chatgpt leveraging large language models in conversational recommender systems google research
arxiv 2023 16 may 2023 paper link https arxiv org pdf 2305 07961 pdf https arxiv org pdf 2305 07961 pdf adopted language model lamda 137 billion rethinking the evaluation for conversational recommendation in the era of large language models renmin university of china arxiv 2023 22 may 2023 paper link https arxiv org pdf 2305 13112 pdf https arxiv org pdf 2305 13112 pdf adopted language model chatgpt bookgpt a general framework for book recommendation based on a large language model ai for science institute arxiv 2023 25 may 2023 paper link https arxiv org pdf 2305 15673 pdf https arxiv org pdf 2305 15673 pdf adopted language model chatgpt unitrec a unified text to text transformer and joint contrastive learning framework for text based recommendation the chinese university of hong kong acl 2023 short may 25 2023 paper link https arxiv org pdf 2305 15756 pdf https arxiv org pdf 2305 15756 pdf adopted language model bart 140 million text is all you need learning language representations for sequential recommendation university of california san diego kdd 2023 26 may 2023 paper link https arxiv org pdf 2305 13731 pdf https arxiv org pdf 2305 13731 pdf adopted language model longformer 102 million prompt tuning large language models on personalized aspect extraction for recommendations new york university google research arxiv 2023 2 jun 2023 paper link https arxiv org pdf 2306 01475 pdf https arxiv org pdf 2306 01475 pdf adopted language model gpt 2 1 5 billion large language model augmented narrative driven recommendations university of massachusetts arxiv 2023 4 jun 2023 paper link https arxiv org pdf 2306 02250 pdf https arxiv org pdf 2306 02250 pdf adopted language model instructgpt 175 billion ctrl connect tabular and language model for ctr prediction huawei noah s ark lab arxiv 2023 8 jun 2023 paper link https arxiv org pdf 2306 02841 pdf https arxiv org pdf 2306 02841 pdf adopted language model roberta 355 million a preliminary study of chatgpt on news recommendation personalization provider fairness fake news northwestern university arxiv 2023 19 jun 2023 paper link https arxiv org pdf 2306 10702 pdf https arxiv org pdf 2306 10702 pdf adopted language model chatgpt generative sequential recommendation with gptrec university of glasgow arxiv 2023 19 jun 2023 paper link https arxiv org pdf 2306 11114 pdf https arxiv org pdf 2306 11114 pdf adopted language model gpt 2 1 5 billion towards open world recommendation with knowledge augmentation from large language models shanghai jiao tong university huawei noah s ark lab arxiv 2023 19 jun 2023 paper link https arxiv org pdf 2306 10933 pdf https arxiv org pdf 2306 10933 pdf related paper repo recommender systems and pretrained models https github com archersama awesome recommend system pretraining papers by archersama recommender systems https github com creyesp awesome recsys by creyesp generative recommender systems https github com jihoo kim awesome generative recsys by jihoo kim
ai
llm_mesh
llm mesh a mesh system for adapting multiple large language models
ai
DevOps-Cloud-Infrastructure
devops cloud infrastructure these projects cover the engineering and design of it infrastructure focusing on cloud scale distributed systems and modern devops practices it infrastructure deployment practices are rapidly changing as organizations build infrastructure as code and adopt cloud computing platforms this course examines the theory behind these modern practices and the real world implementation challenges faced by it organizations we will primarily learn by doing students will gain hands on experience with several widely adopted it platforms including github aws and docker objectives understand how it organizations are deploying modern infrastructure and how to build infrastructure as code understand how to architect cloud scale distributed systems and the key design patterns used to enhance scalability and reliability within these systems develop specific skills related to devops practices including source control management package management and configuration management
devops-course cloud-infrastructure aws continuous-integration continuous-delivery infrastructure-as-code cloudformation ansible-playbook yml pipeline jenkins-pipeline containers
cloud
Team-106---AutoHood
team 106 autohood autonomous deployable hood for egr 314 embedded systems design ii asu project description the autohood is a low cost mechatronic solution that predicts rainy weather conditions the device includes various sensors to predict rainfall allowing it to warn the user in the form of a phone notification and deploy a sheltering hood device how to use the main project folder holds all the registers and code used for the pic18f mcu with the mplabx ide in order to use the project the user must move the project folder with x at the end of it into their respective mplabx projects directory default dir c users tyler mplabxprojects helpful git bash commands to remember git clone link clones a git repo to your local computer for use link refers to the link you can copy from the green code drop down git status status of the current local repo git add add files to push adds all files git branch branch name generate a new timeline to branch to git checkout branch name choose what branch to swap to and work with git commit m message adds a single line message to files being committed to the branch git push link git push sends the current updates to the github repo link is used to copy the destination for the push git pull link git pull sends the data from the git repo to your local repository
os
outserv
outserv blockchain search with graphql apis note dec 13 2022 0xfast com doesn t use outserv directly outserv is not being actively maintained outserv enables you to run production grade graphql search apis over any blockchain data 10x faster than existing mechanisms outserv is the only system which combines the power of a search engine a cache engine and a graphql layer thus replacing three different systems in production with a single executable binary outserv makes it trivial for anyone to bring up a production grade graphql tech stack which i consider to be an important step towards web3 decentralization img src static outserv jpeg width 500 demo an outserv server with the first 14 5m polygon blocks is running here https poly 0xfast com graphql you can point any graphql editor to this address to get the full schema curl xpost https poly 0xfast com graphql h content type application graphql d querylog filter address eq 0x5e1ddf2e5a0ecdd923692d4b4429d8603825a8c6 first 10 address topics data blocknumber latest release the latest release of outserv is v22 07 codenamed webb https webb nasa gov the latest release is in the release v22 07 branch important links follow the documentation at https docs outserv io https docs outserv io docs intro read the announcement blog post here https manishrjain com outserv graphql blockchain search see the product roadmap here https github com outcaste io outserv issues 61 join the outserv discord community https discord gg rmjnnd4xav consult with me to figure out if outserv would be a good solution for you via this calendly link https calendly com manishrjain consulting on outserv five reasons why you should choose outserv over cloud api services 1 having control over your own api server and blockchain node is the holy grail of decentralization the big leap that web3 has over web2 1 never run into rate limits ever again 1 not only is outserv highly performant having a dedicated system running closer to the edge just for your usage would always outperform far away cloud api servers where your queries have to compete for resources against all others 1 data is no one size fits all enrich the blockchain data by adding more context combining data from multiple sources to allow for better searches 1 for medium to heavy workloads outserv would be way cheaper than paying for various cloud api access img src static decentralization jpeg width 500 bugs feature requests to report bugs or request features please use github issues please answer the following questions 1 what is the problem you are trying to solve for 2 what did you do 3 what did you expect to see 4 what did you see instead license this project as a whole is licensed under the terms of the sustainable license v1 0 a copy of the sustainable license is available in license md license md you can see the reasoning behind the sustainable license and the faqs here https manishrjain com tagged license the monetization model mentioned in the sustainable license is explained here billing certain portions of this project are licensed by contributors or others under the terms of open source licenses such as the apache license v2 0
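the curl demo above, expressed as a hedged python sketch using the requests library; the endpoint and filter values come verbatim from this readme, while the graphql field casing (queryLog, blockNumber) is reconstructed from the flattened text and may differ from the actual schema:

```python
import requests

# raw graphql query body, matching the content-type application/graphql usage above
query = """
{
  queryLog(filter: {address: {eq: "0x5e1ddf2e5a0ecdd923692d4b4429d8603825a8c6"}}, first: 10) {
    address
    topics
    data
    blockNumber
  }
}
"""

resp = requests.post(
    "https://poly.0xfast.com/graphql",
    headers={"Content-Type": "application/graphql"},
    data=query,
)
print(resp.json())
```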
graphql serving system decentralized web3 search search-engine
blockchain
DeepFaceForgeryDetection
deepfaceforgery detection this repository contains code for deep face forgery detection in video frames this is a student project from advanced deep learning for computer vision https dvl in tum de teaching adl4cv ws18 course at tum https www tum de publication available on arxiv https arxiv org abs 2004 11804 faceforensics benchmark using transfer learning we were able to achieve a new state of the art performance on the faceforensics benchmark http kaldir vc in tum de faceforensics benchmark state of the art results on public benchmark diagrams ff benchmark png dataset and technologies for detecting video frame forgeries we use the faceforensics http www niessnerlab org projects roessler2019faceforensicspp html dataset of pristine and manipulated videos as a preprocessing step we extract faces from all the frames using mtcnn https github com ipazc mtcnn the total dataset is 507gb and contains 7 million frames dataset downloading and frame extraction code is located in the dataset dataset directory for model training we use the split from the faceforensics http www niessnerlab org projects roessler2019faceforensicspp html repository main technologies 1 python as the main programming language 2 pytorch as the deep learning library 3 pip for dependency management training evaluation all the training and evaluation code together with various models are stored in the src src directory all scripts are self documenting just run them with the help option they automatically use the gpu when available and will also work on the cpu though very slowly single frame model we got the best single frame classification accuracy using a version of the inception resnet v1 model src model py l109 pretrained on the vggface2 face recognition dataset inception resnet v1 diagram diagrams incep resnet v1 png window frame models we also evaluated how performance improves when incorporating temporal data the task in this case changes from single frame classification to frame sequence classification we used 2 different models for such an approach 3d convolutional and bi lstm 3d convolutional model 3d convolutional model diagram diagrams 3d conv png the temporal feature locality assumption that the 3d convolutional model makes seems reasonable in this case but it is very slow to train for large window sizes lstm with 2d cnn encoder lstm with 2d cnn encoder diagram diagrams cnn lstm png citation misc dogonadze2020deep title deep face forgery detection author nika dogonadze and jana obernosterer and ji hou year 2020 eprint 2004 11804 archiveprefix arxiv primaryclass cs cv model weights various model weights are available here models https drive google com file d 18 ki7vy0yt4fzq 5a5a5w11kxy9xxqaw view usp sharing
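a hedged sketch of the transfer learning setup described above: an inception resnet v1 pretrained on vggface2, with a binary head for real vs forged classification. this uses the facenet_pytorch package for the pretrained encoder; the repo's own model code (src/model.py) may differ in details:

```python
import torch
import torch.nn as nn
from facenet_pytorch import InceptionResnetV1

# pretrained face-recognition encoder; outputs a 512-d embedding per face
encoder = InceptionResnetV1(pretrained="vggface2")

# binary classification head on top of the embedding (assumed design, for illustration)
model = nn.Sequential(encoder, nn.Linear(512, 1))

faces = torch.randn(4, 3, 160, 160)  # stand-in for a batch of mtcnn-cropped faces
logits = model(faces)
probs = torch.sigmoid(logits)        # probability each frame is forged
print(probs.shape)                   # torch.Size([4, 1])
```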
ai
TurboCore
h2 align center a href target blank img height 150 alt turbocore logo src https docs turbocore org logo svg a br br turbocore a blazing fast baas written in rust h2 div align center a href https www rust lang org target blank img alt rust src https img shields io badge built with rust 100000 style for the badge logo rust logocolor ffffff labelcolor 9c1b10 color c72e20 a a href https github com turbo core turbocore actions workflows tests yml taget blank img alt github workflow status with event src https img shields io github actions workflow status turbo core turbocore tests yml event push style for the badge a div documentation documentation is being worked on along with the code you can find the current documentation here https docs turbocore org note that the documentation is not even close to being complete and more will be added as the project progresses feature requests we welcome all feature requests from the community with open arms if you have a feature request or a suggestion you can create an issue on our github repository when creating an issue please provide a clear and concise description of the feature or suggestion this will help us to better understand what you are looking for and to implement it in a more efficient manner why rust rust is a systems programming language that runs insanely fast prevents segfaults and guarantees thread safety it also has a very active community and is used by many large companies such as mozilla dropbox and twitter why baas backend as a service is a new trend in the industry it allows developers to focus on the frontend of their application and not have to worry about the backend it also allows developers to quickly prototype their ideas without having to worry about the backend what is the status of turbocore turbocore is currently in the very early stages of development a roadmap will be created soon how can i contribute you can contribute by reporting bugs suggesting features or even writing code if you want to write code please try to follow the rust style guide https doc rust lang org 1 0 0 style readme html if you have any questions feel free to open an issue
baas backend firebase rust
server
Python-Care
pythoncarebot problem solving in information technology psit
server
ForRapptr
forrapptr coding test for rapptr ios engineering apprenticeship
os
iot-identity-service
this repository contains the code of the azure iot identity service and related services together these services make a basic device runtime for azure iot devices documentation the contents of the docs directory are served at https azure github io iot identity service see docs dev readme md docs dev readme md for developer documentation and contributing contributing md for contribution guidelines license mit
server
text-to-sql-wizardcoder
open in hf spaces https huggingface co datasets huggingface badges raw main open in hf spaces lg dark svg https huggingface co spaces richardr1126 sql skeleton wizardcoder demo finetune datasets spider https huggingface co datasets spider richardr1126 spider context instruct https huggingface co datasets richardr1126 spider context instruct row 0 richardr1126 spider natsql skeleton context instruct https huggingface co datasets richardr1126 spider natsql skeleton context instruct richardr1126 spider skeleton context instruct https huggingface co datasets richardr1126 spider skeleton context instruct validation datasets spider https huggingface co datasets spider richardr1126 spider context validation https huggingface co datasets richardr1126 spider context validation richardr1126 spider natsql context validation https huggingface co datasets richardr1126 spider natsql context validation richardr1126 spider context validation ranked schema https huggingface co datasets richardr1126 spider context validation ranked schema local large language models richardr1126 sql guanaco 13b merged https huggingface co richardr1126 sql guanaco 13b merged richardr1126 spider natsql wizard coder merged https huggingface co richardr1126 spider natsql wizard coder merged richardr1126 spider skeleton wizard coder merged https huggingface co richardr1126 spider skeleton wizard coder merged summer 2023 approaches 1 sql guanaco 13b this was my first attempt at fine tuning an llm i used the guanaco 13b based on llama 13b model as the base model and i fine tuned it on a guanaco style spider dataset that was premade i never ran a full evaluation for this model because it was performing so poorly 2 spider wizard coder switched to wizardcoder 15b based on starcoder as the base model for fine tuning my text to sql model this model was fine tuned on the richardr1126 spider context instruct https huggingface co datasets richardr1126 spider context instruct row 0 dataset which includes the database context in the fine tuning data the results for this model were okay but sub par around 50 3 spider natsql skeleton wizardcoder natsql https arxiv org abs 2109 05153 is an intermediate representation for sql that simplifies the queries and reduces the mismatch between natural language and sql natsql preserves the core functionalities of sql but removes some clauses and keywords that are hard to infer from natural language descriptions natsql also makes schema linking easier by reducing the number of schema items to predict natsql can be easily converted to executable sql queries and can improve the performance of text to sql models this model was fine tuned on the richardr1126 spider natsql skeleton context instruct https huggingface co datasets richardr1126 spider natsql skeleton context instruct dataset it is the same as spider context instruct except it has the natsql output in the response instead of the normsql this dataset also used the skeleton formatting for better outputs skeleton formatting select count from where select count from head where age 56 theoretically if the model doesn t have to write as much it should do better natsql reduces the length of queries and simplifies joining tables a lot results 56 5 execution accuracy the results for this model were less than expected most likely because wizardcoder 15b already knows some sql so trying to fine tune it on a different sql language might have confused the model 4 spider skeleton wizard coder i stopped using natsql and went back to normsql however i kept the
skeleton formatting this model was fine tuned using the richardr1126 spider skeleton context instruct https huggingface co datasets richardr1126 spider skeleton context instruct dataset this is still the best performing dataset i have created for fine tunes skeleton formatting select count from where select count from head where age 56 results 61 execution accuracy beats chatgpt zero shot with a simple system prompt and no examples this was the best model that i fine tuned during the summer 2023 for text to sql tasks the model does very well for a local large language model at text to sql 5 chatgpt to compare my model against something i ran basic prediction for the spider dataset using chatgpt the validation dataset i used was the same one i used for spider skeleton wizard coder chatgpt was evaluated with the default hyperparameters and with the system message you are a sophisticated ai assistant capable of converting text into sql queries you can only output sql don t add any other text results 57 6 this is what the accuracy would be if you were using the chatgpt gui and only asking it to convert natural language to sql with database context chatgpt s capabilities vary so much depending on the input later on in my approaches i use a more complex chatgpt setup to achieve the highest accuracy yet fall 2023 approaches 1 spider skeleton wizard coder chatgpt ranked schema in this approach i use the same model spider skeleton wizard coder for predicting the sql queries chatgpt ranked schema i asked chatgpt to rank the validation dataset s database context by placing tables that are more relevant to the question higher in the database context string i also ask it to remove tables that it doesn t think it will need in the final prediction this created the spider context validation ranked schema https huggingface co datasets richardr1126 spider context validation ranked schema dataset the good thing about this approach was that i only needed to run the rankings for the schema through chatgpt one time then i had the file with the rankings forever i didn t change anything about the model i just reran the predictions using this newly ranked dataset results 63 7 this is now the best performing approach using local large language models this only provided a 2 7 increase in accuracy 2 spider skeleton wizard coder 5 beams chatgpt ranked schema in this approach i decided to make my local model spider skeleton wizard coder use 5 beams in its generation arguments instead of greedy decoding with 5 beams the model will go down 5 different paths when trying to predict the sql i then return 4 of the 5 beams as multiple sql queries with 4 returned sqls from the model for each dataset question i chose the correct query by choosing the first of the 4 that doesn t have an execution error in the sql a hedged decoding sketch using this setup appears after the citations at the end of this readme results 60 this approach actually brought the execution accuracy down which was not expected probably due to the fact that i was keeping basically correct queries from making it into the final result if they had an execution error and the fact that i took away greedy decoding 3 spider skeleton wizard coder 5 beams chatgpt choose best sql chatgpt ranked schema the only thing that changed with this approach was that i tried to ask chatgpt to choose the best sql for the question out of the 4 returned sqls from my local llm chatgpt chooses the best query out of 4 results 58 5 the results went down even further with this approach indicating that asking chatgpt to reason about sql is not going to work it is odd that chatgpt
made the accuracy worse because in the next few approaches i use asking chatgpt to do similar things on sql that chatgpt itself predicted works just fine and gives the best results i have ever gotten 4 chatgpt alignment clear context this approach does not use a local large language model at all it relies on gpt3 5 turbo 16k from the openai api alignment to align chatgpt to give better responses i use 5 predefined input sequences that load into chatgpt before the sql question i am trying to ask it is not multi shot as i am not giving it example queries i just give it tips and rules to follow and confirmation of those rules by chatgpt see below for the chatgpt alignment prompt format https github com cuplv text to sql wizardcoder chatgpt alignment prompt format clear context i reformatted the database context to be easier for the model to parse with each table on a different line and the columns in parentheses i also asked chatgpt to rank the tables by putting tables that are more relevant higher in the context i did the same for the columns of each table as well singer singer id name country age stadium capacity highest lowest average concert theme year concert id concert name singer in concert concert id singer id concert stadium id stadium stadium id singer in concert singer id singer singer id singer in concert concert id concert concert id results 68 2 5 chatgpt alignment clear context error correction i added another section to the chatgpt prediction script that looks for errors in the sql and tries to correct them using chatgpt error correction run the predicted sql query from chatgpt on an actual database corresponding to the question and if there is an execution error in the sqlparse library when executing the query ask chatgpt to fix it i am getting an error when executing that on a dummy database please try to fix it the error is results none i didn t do prediction for just the error correction 6 chatgpt alignment clear context error correction example driven correction in this approach i added example driven correction to the prediction script for chatgpt which comes after the error correction section example driven correction that is incorrect please try again the resulting table from the query is not what it should be the correct result table is below don t try to match exactly to the result table i give i want these to work for any content in a larger database please try to fix your original query as best you can with the new information results 72 or 75 5 on my simple evaluation tool this is currently the best approach i have taken for text to sql chatgpt alignment prompt format role system content you are now an excellent sql writer first i ll give you some tips and examples and i need you to remember the tips and do not make the same mistakes role user content tips 1 question which a has most number of b gold sql select a from b group by a order by count desc limit 1 notice that the gold sql doesn t select count because the question only wants to know the a and the number should be only used in order by clause there are many questions asked in this way and i need you to remember this in the following questions role assistant content thank you for the tip i ll keep in mind that when the question only asks for a certain field i should not include the count in the select statement but instead use it in the order by clause to sort the results based on the count of that field role user content tips 2 don t use in or left join as it might cause extra results use intersect or except
instead and remember to use distinct or limit when necessary for example question who are the a who have been nominated for both b award and c award gold sql should be select a from x where award b intersect select a from x where award c role assistant content thank you for the tip i ll remember to use intersect or except instead of in not in or left join when i want to find records that match or don t match across two tables additionally i ll make sure to use distinct or limit when necessary to avoid repetitive results or limit the number of results returned role user actual prompt content markdown complete sqlite sql query only and with no explanation and do not select extra columns that are not explicitly requested in the query sqlite sql tables with their properties singer singer id name country age stadium capacity highest lowest average concert theme year concert id concert name singer in concert concert id singer id concert stadium id stadium stadium id singer in concert singer id singer singer id singer in concert concert id concert concert id how many singers do we have select citations bibtex misc dong2023c3 title c3 zero shot text to sql with chatgpt author xuemei dong and chao zhang and yuhang ge and yuren mao and yunjun gao and lu chen and jinshu lin and dongfang lou year 2023 eprint 2307 07306 archiveprefix arxiv primaryclass cs cl bibtex misc luo2023wizardcoder title wizardcoder empowering code large language models with evol instruct author ziyang luo and can xu and pu zhao and qingfeng sun and xiubo geng and wenxiang hu and chongyang tao and jing ma and qingwei lin and daxin jiang year 2023 bibtex article yu2018spider title spider a large scale human labeled dataset for complex and cross domain semantic parsing and text to sql task author yu tao and zhang rui and yang kai and yasunaga michihiro and wang dongxu and li zifan and ma james and li irene and yao qingning and roman shanelle and others journal arxiv preprint arxiv 1809 08887 year 2018 bibtex inproceedings gan etal 2021 natural sql title natural sql making sql easier to infer from natural language specifications author gan yujian and chen xinyun and xie jinxia and purver matthew and woodward john r and drake john and zhang qiaofu booktitle findings of the association for computational linguistics emnlp 2021 month nov year 2021 address punta cana dominican republic publisher association for computational linguistics url https aclanthology org 2021 findings emnlp 174 doi 10 18653 v1 2021 findings emnlp 174 pages 2030 2042 bibtex article dettmers2023qlora title qlora efficient finetuning of quantized llms author dettmers tim and pagnoni artidoro and holtzman ari and zettlemoyer luke journal arxiv preprint arxiv 2305 14314 year 2023 bibtex inproceedings li2022resdsql author haoyang li and jing zhang and cuiping li and hong chen title resdsql decoupling schema linking and skeleton parsing for text to sql booktitle aaai year 2023
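a hedged sketch of the 5 beams decoding described in approach 2 of the fall 2023 section, using standard huggingface transformers generate arguments; the model id is the merged checkpoint listed at the top of this readme (hyphenation reconstructed from the flattened text), and the prompt format is left elided:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "richardr1126/spider-skeleton-wizard-coder-merged"  # reconstructed id
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "..."  # question + ranked database context, formatted as during fine-tuning
inputs = tok(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    num_beams=5,             # explore 5 decoding paths instead of greedy decoding
    num_return_sequences=4,  # keep 4 candidate sql queries out of the 5 beams
)
candidates = [tok.decode(o, skip_special_tokens=True) for o in outputs]
# the readme's heuristic: execute candidates in order and keep the first one
# that runs without an execution error
```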
ai
rusty-blockparser
rusty blockparser rusty blockparser is a bitcoin blockchain parser written in rust language it allows extraction of various data types blocks transactions scripts public keys hashes balances and utxo dumps from bitcoin based blockchains currently supported blockchains bitcoin namecoin litecoin dogecoin myriadcoin unobtanium and noteblockchain important it assumes a local unpruned copy of the blockchain with intact block index and blk files downloaded with bitcoin core https github com bitcoin bitcoin 0 15 1 or similar clients if you are not sure whether your local copy is valid you can apply verify to validate the chain and block merkle trees if something doesn t match the parser exits usage usage rusty blockparser options command commands unspentcsvdump dumps the unspent outputs to csv file csvdump dumps the whole blockchain into csv files simplestats shows various blockchain stats balances dumps all addresses with non zero balance to csv file opreturn shows embedded op return data that is representable as utf8 help print this message or the help of the given subcommand s options verify verifies merkle roots and block hashes v increases verbosity level info 0 debug 1 trace 2 default 0 c coin name specify blockchain coin default bitcoin possible values bitcoin testnet3 namecoin litecoin dogecoin myriadcoin unobtanium noteblockchain d blockchain dir blockchain dir sets blockchain directory which contains blk dat files default bitcoin blocks s start height specify starting block for parsing inclusive e end height specify last block for parsing inclusive default all known blocks h help print help v version print version example to make an unspentcsvdump of the bitcoin blockchain your command would look like this blockparser unspentcsvdump path to dump 6 02 53 info main starting rusty blockparser v0 7 0 6 02 53 info index reading index from bitcoin blocks index 6 02 54 info index got longest chain with 639626 blocks 6 02 54 info blkfile reading files from bitcoin blocks 6 02 54 info parser parsing bitcoin blockchain range 0 6 02 54 info callback using unspentcsvdump with dump folder path to dump 6 03 04 info parser status 130885 blocks processed left 508741 avg 13088 blocks sec 10 28 47 info parser status 639163 blocks processed left 463 avg 40 blocks sec 10 28 57 info parser status 639311 blocks processed left 315 avg 40 blocks sec 10 29 07 info parser status 639452 blocks processed left 174 avg 40 blocks sec 10 29 17 info parser status 639596 blocks processed left 30 avg 40 blocks sec 10 29 19 info parser done processed 639626 blocks in 266 43 minutes avg 40 blocks sec 10 32 01 info callback done dumped all 639626 blocks transactions 549390991 inputs 1347165535 outputs 1359449320 10 32 01 info main fin installing this tool should run on windows os x and linux all you need is rust and cargo latest release you can download the latest release from crates io bash cargo install rusty blockparser build from source bash git clone https github com gcarq rusty blockparser git cd rusty blockparser cargo build release cargo test release target release rusty blockparser help it is important to build with release otherwise you will get horrible performance tested on gentoo linux with rust stable 1 44 1 supported transaction types bitcoin and bitcoin testnet transactions are parsed using rust bitcoin https github com rust bitcoin rust bitcoin this includes transactions of type p2sh p2pkh p2pk p2wsh p2wpkh p2tr op return and segwit bitcoin forks e g dogecoin litecoin are evaluated via a custom script
implementation which includes p2pk p2pkh https en bitcoin it wiki transaction pay to pubkeyhash p2sh https github com bitcoin bips blob master bip 0016 mediawiki and some non standard transactions memory usage the required memory usage depends on the callback used simplestats 100mb csvdump 100m unspentcsvdump 18gb balances 18gb note those values are taken from parsing to block height 639631 17 07 2020 callbacks callbacks are built on top of the core parser they can be implemented to extract specific types of information balances dumps all addresses with a non zero balance the csv file is in the following format balances csv address balance unspentcsvdump dumps all utxos along with the address balance the csv file is in the following format unspent csv txid indexout height value address note the total size of the csv dump is at least 8 gib height 635000 opreturn shows transactions with embedded op return data that is representable as utf8 csvdump dumps all parsed data as csv files into the specified folder see usage usage for an example i chose csv dumps instead of an active db connection because load data infile is the most performant way for bulk inserts the files are in the following format blocks csv block hash height version blocksize hashprev hashmerkleroot ntime nbits nnonce transactions csv txid hashblock version locktime tx in csv txid hashprevout indexprevout scriptsig sequence tx out csv txid indexout height value scriptpubkey address if unclear what some of these fields are see the block https en bitcoin it wiki protocol documentation block and transaction https en bitcoin it wiki protocol documentation tx specifications if you want to insert the files into mysql see sql schema sql sql schema sql it contains all table structures and sql statements for bulk inserting also see sql views sql sql views sql for some query examples note the total size of the csv dump is at least 731 gib height 635000 simplestats prints some blockchain statistics like block count transaction count avg transactions per block largest transaction transaction types etc you can also define custom callbacks a callback gets called at startup on each block and at the end see src callbacks mod rs src callbacks mod rs for more information contributing use the issue tracker to report problems suggestions and questions you may also contribute by submitting pull requests if you find this project helpful please consider making a donation 1lfidbteg5joaqjw35ksebinkvm8azfm1k customizing the tool for your coin the tool can easily be customized to your coin this section outlines the changes that need to be made and is aimed at users who are beginners with both rust and blockchain this guide is made possible by reviewing the commits made by merlinmagic2018 during this example the coin name used is nocoinium the main change is src blockchain parser types rs add a new entry pub struct nocoinium above the line pub struct dash the case you use here is to be carried in all subsequent references except when noted you will then need to add an impl coin for nocoinium you could easily copy a previous block e g bitcoin the changes you need to do are highlighted below as comments rust the name here should be the same case as defined in the pub struct line impl coin for nocoinium fn name self string this is primarily for display use same case as before string from nocoinium fn magic self u32 magic bytes are a string of hex characters that prefix messages in the chain to find this value look for the fields pchmessagestart 0 3 in the file
chainparams cpp under cmainparams the value to be used here is 0x pchmessagestart 3 pchmessagestart 2 pchmessagestart 1 pchmessagestart 0 i e string the values in reverse 0xd9b4bef9 a small illustration of this byte reversal appears at the end of this readme fn version id self u8 version id is used to identify the address prefix for base58 encoding of the public address found this using the stackoverflow comment https bitcoin stackexchange com questions 62781 litecoin constants and prefixes again within chainparams cpp and cmainparams look for base58prefixes pubkey address convert the decimal value to hex and add it here 0x00 fn genesis self sha256d hash this is the genesis block hash get the value from consensus hashgenesisblock again found in chainparams cpp sha256d hash from str 000000000019d6689c085ae165831e934ff763ae46a2a6c172b3f1b60a8ce26f unwrap fn default folder self pathbuf this is the folder under the user s home folder where the blocks files are found note the case here it is not camelcase as most coin directories are lower case however use the actual folder name from your coin implementation path new nocoinium join blocks finally tie these changes together within impl fromstr for cointype under match coin the first part will be the case passed as an argument to the program see bullet point below and the name within from will be the name used above rust nocoinium ok cointype from nocoinium the next change is in src main rs under the fn parse args add your coin to the array of coins the case you use here will be the same value as you pass in the arguments when executing the parser using the c argument finally add your coin name in the readme md file so others know your coin is supported todo implement pay2multisig script evaluation
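the magic bytes rule above reduces to a small byte reversal computation. a hedged illustration in python (the project itself is rust; the helper name here is made up), using the bitcoin values cited in the guide:

```python
def magic_from_pch_message_start(b0, b1, b2, b3):
    # per the rule above: magic = 0x{b3}{b2}{b1}{b0}, i.e. the four
    # pchMessageStart bytes strung together in reverse order
    return (b3 << 24) | (b2 << 16) | (b1 << 8) | b0

# bitcoin's pchMessageStart bytes are 0xf9, 0xbe, 0xb4, 0xd9 -> magic 0xd9b4bef9
assert magic_from_pch_message_start(0xF9, 0xBE, 0xB4, 0xD9) == 0xD9B4BEF9
```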
blockchain bitcoin litecoin parser rust
blockchain
txt2onto
systematic tissue annotations of genomic samples by modeling unstructured metadata this repo is the home of txt2onto a python utility for classifying unstructured text to terms in a tissue ontology using nlp ml a combination of natural language processing nlp and machine learning ml also in this repo are our fully trained nlp ml models to perform the tissue classification on unstructured text we have included sample inputs and outline the use of nlp ml with a demo script the nlp ml method is described in this preprint https doi org 10 1101 2021 05 10 443525 biorxiv doi 10 1101 2021 05 10 443525 more info there are currently 1 3 million human omics samples that are publicly available however this valuable resource remains acutely underused because discovering samples say from a particular tissue of interest from this ever growing data collection is still a significant challenge the major impediment is that sample attributes such as tissue cell type of origin are routinely described using non standard varied terminologies written in unstructured natural language here we provide a natural language processing based machine learning approach nlp ml to infer tissue and cell type annotations for genomic samples based only on their free text metadata nlp ml works by creating numerical representations of sample text descriptions and using these representations as features in a supervised learning classifier that predicts tissue cell type terms in a structured ontology our approach significantly outperforms representative methods of existing state of the art approaches to addressing the sample annotation problem we have also demonstrated the biological interpretability of tissue nlp ml models using an analysis of their similarity to each other and an evaluation of their ability to classify tissue and disease associated biological processes based on their text descriptions alone previous studies have shown that the molecular profiles associated with genomic samples are highly predictive of a variety of sample attributes using transcriptome data we have shown that nlp ml models can be nearly as accurate as expression based models in predicting sample tissue annotations however the latter models based on genomic profiles need to be trained anew for each genomic experiment type on the other hand once trained using any text based gold standard approaches such as nlp ml can be used to classify sample descriptions irrespective of sample type we demonstrated this versatility by using nlp ml models trained on microarray sample descriptions to classify rna seq chip seq and methylation samples without retraining here we provide the fully trained models and a simple utility script for users to leverage the predictive power of nlp ml to annotate their text corpora of interest for 346 tissues and cell types from uberon https www ebi ac uk ols ontologies uberon these nlp ml models are trained using our full gold standard as a note in our manuscript we discussed the results of models that had sufficient training data in the gold standard for at least 3 fold cv the remaining models were not discussed or examined in detail in our work due to lack of sufficient labeled samples we have included trained models for all available tissues and cell types so as to provide users with the maximum amount of predictive capability however it should be noted that some models included in this repository have very little training data i e a small number of positively labeled examples and thus may provide inaccurate annotations the full
The full list of cross-validated models can be found [here](https://github.com/krishnanlab/txt2onto/blob/main/gold_standard/CrossValidatedModels.txt), and the full list of models presented in our paper can be found [here](https://github.com/krishnanlab/txt2onto/blob/main/gold_standard/ManuscriptModels.txt).

## Installation

Requirements can be installed by running:

```
pip install -r requirements.txt
```

These requirements have been verified for Python version 3.7.7. An update in library dependencies requires scikit-learn to be version 0.23.2, as listed in `requirements.txt`. The full models were trained using scikit-learn version 0.23.1, so when loading the models the following warning message will be shown:

```
UserWarning: Trying to unpickle estimator LogisticRegression from version 0.23.1 when using version <your current version>. This might lead to breaking code or invalid results. Use at your own risk.
```

In our testing with newer versions of scikit-learn we have encountered no problems. If a problem does arise, please post a git issue and we will work to resolve it.

### Docker installation

Since getting the correct versions and dependencies can sometimes be challenging, we make a Dockerfile available for building locally or for extending as needed:

```
docker build .
```

The resulting Docker image will contain the checked-out version of the repo. After `docker run`, continue through the usage sections as below.

## Usage

### Use case 1: Making predictions on unstructured text using NLP-ML

**Input:** The input should be a plain text file with one description per line. An example is provided [here](https://github.com/krishnanlab/txt2onto/blob/main/data/example_input.txt), with a small excerpt below:

```
na colon homo sapiens colonoscopy male adenocarcinoma extract total rna le biotin norm specified colonoscopy male adenocarcinoma specified homo sapiens adult
allele multiple sclerosis clinically isolated syndrome none peripheral blood mononuclear human pbmc non treated sampling wholeorganism homo sapiens
prostate prostate patient id sample type ttumor biopsy ctrlautopsy sample percentage tumor prostate patient normal adult patient normal adult patient age gender male
skeletal muscle homo sapiens unknown extract total rna le biotin norm unknown
medium lp stimulation blood lp homo sapiens myeloid monocytic cell medium lp stimulation extract total rna le biotin norm medium lp stimulation
```

The input text will be preprocessed during the execution of `src/txt2onto.py`. For more information on the preprocessing pipeline, see the `preprocess` function in `src/utils.py`.

**Output:** The prediction task can then be performed by running the following:

```
python txt2onto.py -file path/to/text/file.txt -out path/to/write/embeddings/to.txt -predict
```

This will read in the input text from `path/to/text/file.txt`, create a word embedding for each line of text and write it to `path/to/write/embeddings/to.txt`, and then make a prediction for each line of text for each of our models; the output path for the predicted probabilities is generated automatically when the `-predict` flag is passed. The (i, j) entry of the output dataframe is the predicted probability assigned by model *i* for text snippet *j* from the input file. If you only want embeddings for your input text, omit the `-predict` flag.

Alternatively, a single text snippet can be read from the command line:

```
python txt2onto.py -text "some sample description or piece of text" -out path/to/write/embeddings/to.txt -predict
```

which will write a single word embedding to the output path and write the predictions to the automatically generated predictions path. If the user only wants word embeddings, the `-predict` flag can be omitted; word embeddings are always generated and written to file, whether predictions are made or not.
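Once the predictions file exists, ranking the most likely tissue terms per input line is a few lines of pandas. This is a minimal sketch assuming the layout described above (rows = models, columns = snippets, tab-separated) and assuming `data/UBERONCL.txt` is a two-column ID-to-name mapping — check your actual files before relying on it:

```python
# Sketch: rank ontology terms for each input line from the predictions file.
# File names, separator, and row/column orientation are assumptions based on
# the README text above, not guaranteed by the repo.
import pandas as pd

preds = pd.read_csv("out/predictions_example_output.txt", sep="\t", index_col=0)

# Map ontology identifiers to plain-text names (see data/UBERONCL.txt).
names = pd.read_csv("data/UBERONCL.txt", sep="\t", index_col=0,
                    header=None).squeeze("columns")

for snippet in preds.columns:
    top = preds[snippet].sort_values(ascending=False).head(3)
    print(f"{snippet}:")
    for term, prob in top.items():
        print(f"  {names.get(term, term)} ({term}): p={prob:.3f}")
```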
## Demo

For an example, run `sh demo.sh` in the `src` directory. The script will execute the full predictive pipeline of NLP-ML: each line of input text will be turned into a numerical representation by taking a weighted average of the individual word embeddings from the text, and each embedding will be run through each of our trained NLP-ML models.

```bash
cd src
sh demo.sh
```

This will read in the example input file from `data/example_input.txt`, write embeddings to `out/example_output.txt`, and write predictions to `out/predictions_example_output.txt`.

### Use case 2: Training new NLP-ML models

Given labeled text snippets, new NLP-ML models can be trained for additional tissues, cell types, or potentially other binary classification problems. In order to train a new model, an input file of the following form is required:

```
id1	1	<text>
id2	-1	<text>
id3	1	<text>
id4	-1	<text>
```

The input file is a three-column, tab-separated file: column 1 is the ID corresponding to the sample, input text, etc.; column 2 is the true label (either 1 or -1); and the final column is the text associated with the sample. An example training input can be found at `training_inputs/uberon_0002113_nlp_ml_input_subset.txt`, which is a subset of our gold standard for UBERON:0002113 (kidney). A full training input for any tissue or cell type in our gold standard can be created by running the following from the `src` directory:

```
python input.py -ont uberon_0002113 -out training_inputs
```

When trying to make a training input from our gold standard, if an ontology term is specified that we do not have true labels for, a `ValueError` will be raised.

Once an input of the specified format is created, a model can be trained from the given input using the following:

```
python train.py -input training_inputs/labeled_input.txt -out user_trained_models
```

This will train a model from the embeddings created from the text in the input file and save the model as a pickled object to the specified output directory. The output filename is automatically generated from the training input. For example, an input of `training_inputs/uberon_0002113_nlp_ml_input_subset.txt` used to train a model with a specified output of `user_trained_models` will be saved as `user_trained_models/model_uberon_0002113_nlp_ml_input_subset.p`.

To use newly trained models to make predictions (as opposed to our trained NLP-ML models), a new model directory can be specified when running `txt2onto.py`. For example, models in the `user_trained_models` directory can be used instead of the `bin` directory by specifying it with the `-models` flag:

```
python txt2onto.py -file path/to/text/file.txt -out path/to/write/embeddings/to.txt -predict -models user_trained_models
```

This will allow you to make predictions on new unstructured text using your own trained models.
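To make the training flow above concrete, here is a rough sketch of the kind of pipeline `train.py` implements — an IDF-weighted average of word embeddings per description, fed to a logistic-regression classifier. The helper names, file handling, and defaults here are illustrative assumptions, not the repo's actual code:

```python
# Illustrative sketch of the NLP-ML training idea: IDF-weighted average word
# embeddings + logistic regression. Not the repo's actual train.py.
import numpy as np
from sklearn.linear_model import LogisticRegression

def embed(text, word_vecs, idf, dim=300):
    """Average the word vectors of `text`, weighting each word by its IDF."""
    vecs, weights = [], []
    for w in text.lower().split():
        if w in word_vecs:
            vecs.append(word_vecs[w])
            weights.append(idf.get(w, 1.0))
    if not vecs:
        return np.zeros(dim)
    return np.average(np.array(vecs), axis=0, weights=weights)

# word_vecs: dict word -> np.ndarray; idf: dict word -> float (e.g. derived
# from data/pubmed_weights.txt.gz); both assumed loaded elsewhere.
def train(rows, word_vecs, idf):
    """rows: iterable of (sample_id, label in {1, -1}, text) tuples."""
    x = np.stack([embed(t, word_vecs, idf) for _, _, t in rows])
    y = np.array([lab for _, lab, _ in rows])
    return LogisticRegression(max_iter=1000).fit(x, y)
```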
## Overview of repository

Here we list the files we have included as part of this repository:

- `bin/` — the fully trained logistic regression models, stored as pickle (`.p`) files
- `data/` — example input file and files needed for making embeddings and output predictions
- `data/UBERONCL.txt` — a text file that maps the model ontology identifiers to plain text
- `data/pubmed_weights.txt.gz` — IDF weights for every unique word across PubMed, used to make a weighted average embedding for each piece of text
- `gold_standard/` — raw datafiles from our manuscript
- `gold_standard/AnatomicalSystemsPerModel.json` — mapping of every term in UBERON to a high-level anatomical system
- `gold_standard/CrossValidatedModels.txt` — a list of models for which we had sufficient positively labeled training data to perform cross-validation
- `gold_standard/GoldStandardLabelMatrix_plaintext.csv` — our manually annotated gold standard, in plain text
- `gold_standard/GoldStandardLabelMatrix.csv` — our manually annotated gold standard, with ontology identifiers
- `gold_standard/GoldStandard_Propagated.txt` — our manually annotated gold standard, with a list of annotations for each sample (not in matrix form)
- `gold_standard/GoldStandard_Sample_Descriptions.txt` — sample descriptions for the samples in our gold standard
- `gold_standard/GoldStandard_Sample_IDs.txt` — sample and experiment labels corresponding to `GoldStandard_Sample_Descriptions.txt`
- `gold_standard/GoldStandard_Unpropagated.txt` — the original gold standard manual annotations [1]
- `gold_standard/ManuscriptModels.txt` — a list of the models we evaluated and showed results for in our manuscript; a subset of `CrossValidatedModels.txt`
- `gold_standard/ModelsPerAnatomicalSystem.json` — mapping that lists the tissues and cell types that belong to each high-level anatomical system
- `src/` — main source directory
- `src/demo.sh` — runs an example of the pipeline
- `src/txt2onto.py` — primary file for making predictions on input text
- `src/utils.py` — utility file containing tools for making predictions on input text
- `out/` — example directory to send outputs to
- `paper_results/` — files with raw values used to create figures from our publication
- `training_inputs/` — directory containing an example input file for training new NLP-ML models
- `user_trained_models/` — directory where user-trained models can be stored using the included demo

## Additional information

**Support:** For support, please contact [Nat Hawkins](http://www.nathawkins.info/) at hawki235@msu.edu.

**Inquiry:** All general inquiries should be directed to [Arjun Krishnan](https://www.thekrishnanlab.org) at arjun@msu.edu.

## License

This repository and all its contents are released under the [Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International license](https://creativecommons.org/licenses/by-nc-sa/4.0/legalcode); see [LICENSE.md](https://github.com/krishnanlab/txt2onto/blob/main/LICENSE.md).

## Citation

If you use this work, please cite: "Systematic tissue annotations of genomic samples by modeling unstructured metadata." Nathaniel T. Hawkins, Marc Maldaver, Anna Yannakopoulos, Lindsay A. Guare, Arjun Krishnan. bioRxiv 2021.05.10.443525; DOI: https://doi.org/10.1101/2021.05.10.443525

## Funding

This work was primarily supported by US National Institutes of Health (NIH) grant R35 GM128765 to AK, and in part by MSU start-up funds to AK and an MSU Rasmussen Doctoral Recruitment Award and Engineering Distinguished Fellowship to NTH.

## Acknowledgements

The authors would like to thank [Kayla Johnson](https://sites.google.com/view/kaylajohnson/home) for their support and feedback on the manuscript, and all members of the [Krishnan Lab](https://www.thekrishnanlab.org) for valuable discussions and feedback on the project.

## References

[1] "Ontology-aware classification of tissue and cell-type signals in gene expression profiles across platforms and technologies." Lee Y, Krishnan A, Zhu Q, Troyanskaya OG. Bioinformatics 2013, 29:3036–3044.
nlp omics-samples annotations
ai
nlp2015
# Natural Language Processing 2015

For our Natural Language Processing course, we implemented an LDA and an MG-LDA algorithm using Gibbs sampling (a minimal sketch of a collapsed Gibbs sampler for LDA appears below). The programs also come with a preprocessing part that can read and process product reviews from the Multi-Domain Sentiment Dataset [1].

## Retrieving the dataset

The preprocessing part for the LDA and MG-LDA models was written to read and process product reviews from the Multi-Domain Sentiment Dataset [1].

### Known issue

Due to a bug in the preprocessing part, not all of the electronics dataset from the Multi-Domain Sentiment Dataset can be loaded. All other datasets are, as far as we know, processed correctly.

## Running the LDA / MG-LDA program

The LDA / MG-LDA program can be run with the following commands:

```
python lda.py <preprocessing boolean> <path to dataset directory>
python mglda.py <preprocessing boolean> <path to dataset directory>
```

For example:

```
python lda.py true data/electronics
python mglda.py true data/electronics
```

In the LDA / MG-LDA program, parameters such as the number of topics, the number of Gibbs iterations, and the model hyperparameters can be tuned by configuring global variables in the Python script itself.

## References

[1] Multi-Domain Sentiment Dataset, https://www.cs.jhu.edu/~mdredze/datasets/sentiment/
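For readers unfamiliar with the technique, here is a minimal, self-contained collapsed Gibbs sampler for plain LDA. It illustrates the algorithm this repo implements but is not the repo's own `lda.py`:

```python
# Minimal collapsed Gibbs sampler for LDA (illustrative sketch only).
import numpy as np

def lda_gibbs(docs, V, K=10, alpha=0.1, beta=0.01, iters=200, seed=0):
    """docs: list of lists of word ids in [0, V). Returns topic-word counts."""
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), K))        # document-topic counts
    nkw = np.zeros((K, V))                # topic-word counts
    nk = np.zeros(K)                      # words assigned to each topic
    z = [rng.integers(K, size=len(d)) for d in docs]  # random topic init
    for d, doc in enumerate(docs):        # tally the initial assignments
        for i, w in enumerate(doc):
            ndk[d, z[d][i]] += 1; nkw[z[d][i], w] += 1; nk[z[d][i]] += 1
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]               # remove the current assignment
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # full conditional: p(z=k | rest)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k               # record the new assignment
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1
    return nkw

# Example: three tiny "documents" over a 5-word vocabulary, two topics.
topics = lda_gibbs([[0, 1, 1, 2], [2, 3, 4], [0, 0, 1]], V=5, K=2, iters=50)
```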
ai
SNN_CV_Applications_Resources
# SNN_CV_Applications_Resources

Paper list for SNN- or event-camera-based computer vision tasks. If you are interested in this topic, specifically in object detection and visual tracking based on event cameras, please contact me via wangxiaocvpr@foxmail.com or my WeChat: wangxiao5791509.

![RGBT_Car10](https://github.com/wangxiao5791509/SNN_CV_Applications_Resources/blob/master/Screenshot%20from%202020-08-25%2019-47-07.png)

## Related resources

- Event-based vision, event cameras, event camera SLAM ([ETH page](http://rpg.ifi.uzh.ch/research_dvs.html))
- The event camera dataset and simulator: event-based data for pose estimation, visual odometry, and SLAM ([ETH page](http://rpg.ifi.uzh.ch/davis_data.html))
- Event-based vision resources ([GitHub](https://github.com/uzh-rpg/event-based_vision_resources))
- DVS benchmark datasets for object tracking, action recognition, and object recognition ([Project](https://dgyblog.com/projects-term/dvs-dataset.html), [Paper](https://www.frontiersin.org/articles/10.3389/fnins.2016.00405/full))

## Surveys / reviews

- Survey ([Paper](https://drive.google.com/file/d/1d7igubirewxmui7xq75p6h-i4h7ui3fa/view?usp=sharing))
- "Spiking neural networks and online learning: an overview and perspectives." Jesus L. Lobo, Javier Del Ser, Albert Bifet, Nikola Kasabov. Neural Networks 121 (2020): 88–100. [Paper](https://arxiv.org/pdf/1908.08019.pdf)
- "Supervised learning in spiking neural networks: a review of algorithms and evaluations." Wang Xiangwen, Xianghong Lin, Xiaochao Dang. Neural Networks (2020). [Paper](https://sci-hub.st/https://www.sciencedirect.com/science/article/pii/S0893608020300563)

## Datasets

- CED: Color Event Camera Dataset. [Paper](https://openaccess.thecvf.com/content_CVPRW_2019/papers/EventVision/Scheerlinck_CED_Color_Event_Camera_Dataset_CVPRW_2019_paper.pdf), [GitHub](https://github.com/uzh-rpg/rpg_esim), [Dataset](http://rpg.ifi.uzh.ch/CED.html)

## Deep feature learning for event cameras

- Gehrig, Daniel, et al. "End-to-end learning of representations for asynchronous event-based data." ICCV 2019. [Paper](https://openaccess.thecvf.com/content_ICCV_2019/papers/Gehrig_End-to-End_Learning_of_Representations_for_Asynchronous_Event-Based_Data_ICCV_2019_paper.pdf), [Code](https://github.com/uzh-rpg/rpg_event_representation_learning)

## Tools / packages

- **SpikingJelly**, an open-source deep learning framework for spiking neural networks (SNN) based on PyTorch. [Documentation](https://spikingjelly.readthedocs.io/zh_CN/latest/#index-en), [GitHub](https://github.com/fangwei123456/spikingjelly)
- **SNN toolbox**: [Documentation](https://snntoolbox.readthedocs.io/en/latest/), [GitHub](https://github.com/NeuromorphicProcessorProject/snn_toolbox)
- **Norse**: [Documentation](https://norse.github.io/norse/about.html), [GitHub](https://github.com/norse), [Home](https://norse.ai)
- **v2e** simulator: from video frames to realistic DVS event camera streams. [Home](https://sites.google.com/view/video2events/home), [GitHub](https://github.com/SensorsINI/v2e), [Paper](https://arxiv.org/pdf/2006.07722.pdf)
- **ESIM**: an open event camera simulator. [GitHub](https://github.com/uzh-rpg/rpg_esim)
- **SLAYER PyTorch**: [Documentation](https://bamsumit.github.io/slayerPytorch/build/html/index.html)
- **[BindsNET](https://github.com/bindsnet/bindsnet)** also builds on PyTorch and is explicitly targeted at machine learning tasks. It implements a network abstraction with the typical node and connection notions common in spiking neural network simulators like NEST.
- **[cuSNN](https://github.com/tudelft/cuSNN)** is a C++ GPU-accelerated simulator for large-scale networks. The library focuses on CUDA and includes spike-time-dependent plasticity (STDP) learning rules.
- **[DECOLLE](https://github.com/nmi-lab/decolle-public)** implements an online learning
algorithm described in the paper ["Synaptic plasticity dynamics for deep continuous local learning (DECOLLE)"](https://arxiv.org/abs/1811.10766) by J. Kaiser, M. Mostafa, and E. Neftci.
- **[Long short-term memory spiking neural networks (LSNN)](https://github.com/IGITUGraz/LSNN-official)** is a tool from the University of Graz for modelling LSNN cells in [TensorFlow](https://www.tensorflow.org). The library focuses on a single neuron and gradient model.
- **[Nengo](https://www.nengo.ai/nengo-dl/introduction.html)** is a neuron simulator, and Nengo-DL is a deep learning network simulator that optimises spike-based neural networks based on an approximation method suggested by [Hunsberger and Eliasmith (2016)](https://arxiv.org/abs/1611.05141). This approach maps to, but does not build on, the deep learning framework TensorFlow, which is fundamentally different from incorporating the spiking constructs into the framework itself. In turn, this requires manual translations into each individual backend, which influences portability.
- **[Neuron Simulation Toolkit (NEST)](https://nest-simulator.org)** constructs and evaluates highly detailed simulations of spiking neural networks. This is useful in a medical/biological sense, but maps poorly to large datasets and deep learning.
- **[PyNN](http://neuralensemble.org/docs/PyNN/)** is a Python interface that allows you to define and simulate spiking neural network models on different backends (both software simulators and neuromorphic hardware). It does not currently provide mechanisms for optimisation or arbitrary synaptic plasticity.
- **[PySNN](https://github.com/BasBuller/PySNN)** is a PyTorch extension similar to Norse. Its approach to model building is slightly different than Norse in that the neurons are stateful.
- **[Rockpool](https://gitlab.com/aiCTX/rockpool)** is a Python package developed by SynSense for training, simulating, and deploying spiking neural networks. It offers both JAX and PyTorch primitives.
- **[slayerPytorch](https://github.com/bamsumit/slayerPytorch)** is a **S**pike **LAY**er **E**rror **R**eassignment library that focuses on solutions for the temporal credit problem of spiking neurons and a probabilistic approach to backpropagation errors. It includes support for the [Loihi chip](https://en.wikichip.org/wiki/intel/loihi).
- **[SNN toolbox](https://snntoolbox.readthedocs.io/en/latest/guide/intro.html)** automates the conversion of pre-trained analog to spiking neural networks. The tool is solely for already trained networks, and omits the (possibly platform-specific) training.
- **[SpyTorch](https://github.com/fzenke/spytorch)** presents a set of tutorials for training SNNs with the surrogate gradient approach SuperSpike by [F. Zenke and S. Ganguli (2017)](https://arxiv.org/abs/1705.11146). Norse [implements SuperSpike](https://github.com/norse/norse/blob/master/norse/torch/functional/superspike.py), but allows for other surrogate gradients and training approaches.
- **[s2net](https://github.com/romainzimmer/s2net)** is based on the implementation presented in [SpyTorch](https://github.com/fzenke/spytorch), but implements convolutional layers as well. It also contains a demonstration of how to use those primitives to train a model on the [Google Speech Commands dataset](https://arxiv.org/abs/1804.03209).

## Hardware

Neuromorphic processors, such as the IBM TrueNorth ([Paper](http://paulmerolla.com/merolla_main_som.pdf)) and Intel Loihi ([Paper](https://sci-hub.st/https://ieeexplore.ieee.org/abstract/document/8259423)).
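Most of the simulators above model some variant of the leaky integrate-and-fire (LIF) neuron. For orientation, here is a minimal discrete-time LIF sketch in plain NumPy, with assumed constants and independent of any particular library's API:

```python
# Discrete-time leaky integrate-and-fire (LIF) neuron -- illustrative sketch.
import numpy as np

def lif(input_current, dt=1e-3, tau=2e-2, v_th=1.0, v_reset=0.0):
    """Simulate one LIF neuron; returns (membrane trace, spike train)."""
    v, vs, spikes = v_reset, [], []
    decay = np.exp(-dt / tau)              # membrane leak per time step
    for i in input_current:
        v = decay * v + (1.0 - decay) * i  # leaky integration of input
        fired = v >= v_th
        spikes.append(float(fired))
        if fired:
            v = v_reset                    # hard reset after a spike
        vs.append(v)
    return np.array(vs), np.array(spikes)

# Constant drive slightly above threshold produces a regular spike train.
trace, train = lif(np.full(200, 1.2))
print(int(train.sum()), "spikes in 200 steps")
```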
## SNN papers

### Year 2023

- "ESL-SNNs: an evolutionary structure learning strategy for spiking neural networks." Jiangrong Shen, Qi Xu, Jian K. Liu, Yueming Wang, Gang Pan, Huajin Tang. [Paper](https://arxiv.org/pdf/2306.03693.pdf)
- "Temporal contrastive learning for spiking neural networks." Haonan Qiu, Zeyin Song, Yanqi Chen, Munan Ning, Wei Fang, Tao Sun, Zhengyu Ma, Li Yuan, Yonghong Tian. [Paper](https://arxiv.org/pdf/2305.13909.pdf)
- "Auto-Spikformer: Spikformer architecture search." Kaiwei Che, Zhaokun Zhou, Zhengyu Ma, Wei Fang, Yanqi Chen, Shuaijie Shen, Li Yuan, Yonghong Tian. [Paper](https://arxiv.org/pdf/2306.00807.pdf)
- "A graph is worth 1-bit spikes: when graph contrastive learning meets spiking neural networks." Jintang Li, Huizhe Zhang, Ruofan Wu, Zulun Zhu, Liang Chen, Zibin Zheng, Baokun Wang, Changhua Meng. [Paper](https://arxiv.org/pdf/2305.19306.pdf), [Code](https://github.com/EdisonLeeeee/SpikeGCL)
- "Sharing leaky-integrate-and-fire neurons for memory-efficient spiking neural networks." Youngeun Kim, Yuhang Li, Abhishek Moitra, Ruokai Yin, Priyadarshini Panda. [Paper](https://arxiv.org/pdf/2305.18360.pdf)
- "Fast-SNN: fast spiking neural network by converting quantized ANN." Yangfan Hu, Qian Zheng, Xudong Jiang, Gang Pan. IEEE TPAMI. [Paper](https://arxiv.org/pdf/2305.19868.pdf), [Code](https://github.com/yangfan-hu/Fast-SNN)
- "Joint A-SNN: joint training of artificial and spiking neural networks via self-distillation and weight factorization." Yufei Guo, Weihang Peng, Yuanpei Chen, Liwen Zhang, Xiaode Liu, Xuhui Huang, Zhe Ma. Pattern Recognition. [Paper](https://arxiv.org/pdf/2305.02099.pdf)
- "Training full spike neural networks via auxiliary accumulation pathway." Guangyao Chen, Peixi Peng, Guoqi Li, Yonghong Tian. [Paper](https://arxiv.org/abs/2301.11929), [Code](https://github.com/iCGY96/AAP)

### Before 2023

- "GLIF: a unified gated leaky integrate-and-fire neuron for spiking neural networks." [Paper](https://arxiv.org/pdf/2210.13768.pdf), [Code](https://github.com/Ikarosy/Gated-LIF)
- "Optimized potential initialization for low-latency spiking neural networks." Tong Bu, Jianhao Ding, Zhaofei Yu, Tiejun Huang. AAAI 2022. [Paper](https://arxiv.org/pdf/2202.01440.pdf)
- Büchel J, Zendrikov D, Solinas S, et al. "Supervised training of spiking neural networks for robust deployment on mixed-signal neuromorphic processors." arXiv preprint arXiv:2102.06408 (2021). [Paper](https://www.nature.com/articles/s41598-021-02779-x), [WeChat](https://mp.weixin.qq.com/s/xi59paqmqjd5hm3bh6yssq)
- "Training spiking neural networks using lessons from deep learning." [Paper](https://arxiv.org/pdf/2109.12894.pdf)
- "Training feedback spiking neural networks by implicit differentiation on the equilibrium state." Mingqing Xiao, Qingyan Meng, Zongpeng Zhang, Yisen Wang, Zhouchen Lin. NeurIPS 2021. [Paper](https://arxiv.org/pdf/2109.14247.pdf), [Code](https://github.com/pkuxmq/IDE-FSNN)
- "StereoSpike: depth learning with a spiking neural network." Ulysse Rançon, Javier Cuadrado-Anibarro, Benoît R. Cottereau, Timothée Masquelier. [Paper](https://arxiv.org/abs/2109.13751), [Code](https://github.com/urancon/StereoSpike)
- "SiamEvent: event-based object tracking via edge-aware similarity learning with Siamese networks." [Paper](https://arxiv.org/abs/2109.13456), [GitHub](https://github.com/yujeong-star/SiamEvent)
- "Deep spiking neural network: energy efficiency through time-based coding." Bing Han, Kaushik Roy. ECCV 2020. [Paper](https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123550392.pdf)
- "Inherent adversarial robustness of deep spiking neural networks: effects of discrete input encoding and non-linear activations." Saima Sharmin, Nitin Rathi, Priyadarshini Panda, Kaushik Roy. ECCV 2020. [Paper](https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123740392.pdf), [Code](https://github.com/ssharmin/spikingNN-adversarial-attack)
- "Spike-FlowNet: event-based optical flow estimation with energy-efficient hybrid neural networks." Chankyu Lee, Adarsh Kumar Kosta, Alex Zihao Zhu, Kenneth Chaney, Kostas Daniilidis, Kaushik Roy. ECCV 2020. [Paper](https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123740358.pdf), [Code](https://github.com/chan8972/Spike-FlowNet)
- "Surrogate gradient learning in spiking neural networks." Neftci, Emre O., Hesham Mostafa, Friedemann Zenke. IEEE Signal Processing Magazine 36 (2019): 61–63. [Paper](https://sci-hub.st/https://ieeexplore.ieee.org/abstract/document/8891809)
- "Long short-term memory and learning-to-learn in networks of spiking neurons." Bellec, Guillaume, et al. NeurIPS 2018. [Paper](https://papers.nips.cc/paper/7359-long-short-term-memory-and-learning-to-learn-in-networks-of-spiking-neurons.pdf), [Code](https://github.com/surrogate-gradient-learning)
- "SLAYER: spike layer error reassignment in time." Shrestha, Sumit Bam, Garrick Orchard. NeurIPS 2018. [Paper](http://papers.nips.cc/paper/7415-slayer-spike-layer-error-reassignment-in-time.pdf), [Official code](https://bitbucket.org/bamsumit/slayer/src/master/), [PyTorch version](https://github.com/bamsumit/slayerPytorch), [Video](https://www.youtube.com/watch?v=JGdatqqci5o)
- "RMP-SNN: residual membrane potential neuron for enabling deeper high-accuracy and low-latency spiking neural network." CVPR 2020. [Paper](https://openaccess.thecvf.com/content_CVPR_2020/papers/Han_RMP-SNN_Residual_Membrane_Potential_Neuron_for_Enabling_Deeper_High-Accuracy_and_CVPR_2020_paper.pdf)
- "Retina-like visual image reconstruction via spiking neural model." Lin Zhu, Siwei Dong, Jianing Li, Tiejun Huang, Yonghong Tian. CVPR 2020. [Paper](https://openaccess.thecvf.com/content_CVPR_2020/papers/Zhu_Retina-Like_Visual_Image_Reconstruction_via_Spiking_Neural_Model_CVPR_2020_paper.pdf)
- "Biologically inspired alternatives to backpropagation through time for learning in recurrent neural nets." Bellec G, Scherr F, Hajek E, Salaj D, Legenstein R, Maass W. arXiv preprint arXiv:1901.09049 (2019). [Paper](https://arxiv.org/pdf/1901.09049.pdf)
- "Unsupervised learning of a hierarchical spiking neural network for optical flow estimation: from events to global motion perception." Paredes-Vallés, Federico, Kirk Yannick Willehm Scheper, Guido Cornelis Henricus Eugene de Croon. IEEE TPAMI. [Paper](https://arxiv.org/pdf/1807.10936.pdf)
- "Deep neural networks with weighted spikes." Kim, Jaehyun, et al. Neurocomputing 311 (2018): 373–386. [Paper](https://sci-hub.st/https://www.sciencedirect.com/science/article/pii/S0925231218306726)
- "Spiking deep residual network." Hu, Yangfan, et al. arXiv preprint arXiv:1805.01352 (2018). [Paper](https://arxiv.org/pdf/1805.01352.pdf)
- "Towards artificial general intelligence with hybrid Tianjic chip architecture." Pei J, Deng L, Song S, Zhao M, Zhang Y, Wu S, Chen F, et al. Nature 572(7767): 106–111 (2019). [Paper](http://cacs.usc.edu/education/cs653/Pei-ArtificialGeneralIntelligenceChip-Nature19.pdf)
- "Training spiking deep networks for neuromorphic hardware." [Paper](https://arxiv.org/pdf/1611.05141.pdf)
- "Direct training for spiking neural networks: faster, larger, better." Wu, Yujie, et al. AAAI 2019. [Paper](https://www.aaai.org/ojs/index.php/AAAI/article/view/3929/3807)

## Optical flow estimation and motion segmentation

- "Spike-FlowNet: event-based optical flow estimation with energy-efficient hybrid neural networks." Lee, Chankyu, Kosta, Adarsh, Zhu, Alex Zihao, Chaney, Kenneth, Daniilidis, Kostas, Roy, Kaushik. ECCV 2020. [Paper](https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123740358.pdf), [Code](https://github.com/chan8972/Spike-FlowNet)
- "EV-FlowNet: self-supervised optical flow estimation for event-based cameras." Zhu, Alex Zihao, et al. arXiv preprint arXiv:1802.06898 (2018). [Paper](https://arxiv.org/pdf/1802.06898.pdf), [Code](https://github.com/daniilidis-group/EV-FlowNet)
- Stoffregen, Timo, et al. "Event-based motion segmentation by motion compensation." ICCV 2019. [Paper](https://openaccess.thecvf.com/content_ICCV_2019/papers/Stoffregen_Event-Based_Motion_Segmentation_by_Motion_Compensation_ICCV_2019_paper.pdf)
- Bisulco A, Ojeda F C, Isler V, et al. "Fast motion understanding with spatiotemporal neural networks and dynamic vision sensors." arXiv preprint arXiv:2011.09427 (2020). [Paper](https://arxiv.org/pdf/2011.09427.pdf)
- "Event-based motion segmentation with spatio-temporal graph cuts." [Paper](https://arxiv.org/pdf/2012.08730.pdf), [Code](https://github.com/HKUST-Aerial-Robotics/EMSGC)

## Object recognition

- "TactileSGNet: a spiking graph neural network for event-based tactile object recognition." Fuqiang Gu, Weicong Sng, Tasbolat Taunyazov, Harold Soh. [Paper](https://arxiv.org/pdf/2008.08046.pdf), [Code](https://github.com/clear-nus/TactileSGNet)

## Object detection

- "Spiking-YOLO: spiking neural network for real-time object detection." Kim, Seijoon, et al. AAAI 2020. [Paper](https://arxiv.org/pdf/1903.06530.pdf), [Code](https://github.com/cwq159/PyTorch-Spiking-YOLOv3)
- "A large scale event-based detection dataset for automotive." de Tournemire, Pierre, et al. arXiv (2020). [Paper](https://arxiv.org/pdf/2001.08499.pdf), [Dataset](https://www.prophesee.ai/2020/01/24/prophesee-gen1-automotive-detection-dataset/)
- "Event-based asynchronous sparse convolutional networks." Messikommer, Nico, et al. arXiv preprint arXiv:2003.09148 (2020). [Paper](http://rpg.ifi.uzh.ch/docs/ECCV20_Messikommer.pdf), [YouTube](https://www.youtube.com/watch?v=VD7Beh_-7eU), [Code](https://github.com/uzh-rpg/rpg_asynet)
- "Structure-aware network for lane marker extraction with dynamic vision sensor." Wensheng Cheng, Hao Luo, Wen Yang (Senior Member, IEEE), Lei Yu (Member, IEEE), Wei Li. CVPR Workshop. [Paper](https://arxiv.org/pdf/2008.06204.pdf), [Dataset](https://spritea.github.io/DET/)

## Object tracking

- "Multi-domain collaborative feature representation for robust visual object tracking." Jiqing Zhang, Kai Zhao, Bo Dong, Yingkai Fu, Yuxin Wang, Xin Yang, Baocai Yin. CGI 2021. [Paper](https://arxiv.org/pdf/2108.04521.pdf)
- Deng, Lei, et al. "Fast object tracking on a many-core neural network chip." Frontiers in Neuroscience 12 (2018): 841. [Paper](https://www.frontiersin.org/articles/10.3389/fnins.2018.00841/full)
- Jiang, Rui, et al. "Object tracking on event cameras with offline-online learning." CAAI Transactions on Intelligence Technology (2020). [Paper](https://www.researchgate.net/profile/rui-jiang31/publication/341045469)
- "Retinal slip estimation and object tracking with an active event camera." AICAS 2020. [Paper](https://sci-hub.st/https://ieeexplore.ieee.org/abstract/document/9073922)
- Zhang, Y. (2019). "Real-time object tracking for event cameras." Master's thesis, Nanyang Technological University, Singapore. [Thesis](https://dr.ntu.edu.sg/bitstream/10356/137297/2/Thesis_ZhangYexin.pdf)
- Yang, Zheyu, et al. "DashNet: a hybrid artificial and spiking neural network for high-speed object tracking." arXiv preprint arXiv:1909.12942 (2019). [Paper](https://arxiv.org/pdf/1909.12942.pdf)
- "End-to-end learning of object motion estimation from retinal events for event-based object tracking." AAAI 2020. [Paper](https://arxiv.org/pdf/2002.05911.pdf)
- "HASTE: multi-hypothesis asynchronous speeded-up tracking of events." BMVC 2020. [Paper](https://www.bmvc2020-conference.com/assets/papers/0744.pdf)
- "High-speed event camera tracking." BMVC 2020. [Paper](https://www.bmvc2020-conference.com/assets/papers/0366.pdf)
- "A hybrid neuromorphic object tracking and classification framework for real-time systems." [Paper](https://arxiv.org/pdf/2007.11404.pdf), [Code](https://github.com/nusneuromorphic/ceot), [Demo](https://drive.google.com/file/d/1grb1ec2rdm0zmfhpzq2mfyq-aulbjxzj/preview)
- "Long-term object tracking with a moving event camera." Ramesh, Bharath, et al. BMVC 2018. [Paper](http://bmvc2018.org/contents/papers/0814.pdf)
- "e-TLD: event-based framework for dynamic object tracking." [Paper](https://arxiv.org/pdf/2009.00855.pdf), [Dataset](https://github.com/nusneuromorphic/Object_Annotations)
- "Spiking neural network-based target tracking control for autonomous mobile robots." Cao, Zhiqiang, et al. Neural Computing and Applications 26.8 (2015): 1839–1847. [Paper](https://sci-hub.st/https://link.springer.com/article/10.1007/s00521-015-1848-5)
- "Asynchronous tracking-by-detection on adaptive time surfaces for event-based object tracking." Chen, Haosheng, et al. Proceedings of the 27th ACM International Conference on Multimedia (2019). [Paper](https://arxiv.org/pdf/2002.05583.pdf)
- "High-speed object tracking with dynamic vision sensor." Wu J, Zhang K, Zhang Y, Xie X, Shi G. China High-Resolution Earth Observation Conference (October 2018), pp. 164–174. Springer, Singapore. [Paper](https://sci-hub.st/https://link.springer.com/chapter/10.1007/978-981-13-6553-9_18)
- "High-speed object tracking with its application in golf playing." Lyu C, Liu Y, Jiang X, Li P, Chen H. International Journal of Social Robotics 9(3): 449–461 (2017). [Paper](https://sci-hub.tw/10.1007/s12369-017-0404-0)
- "A spiking neural network architecture for object tracking." Luo, Yihao, et al. International Conference on Image and Graphics. Springer, Cham, 2019. [Paper](https://sci-hub.st/10.1007/978-3-030-34120-6)
- "SiamSNN: spike-based Siamese network for energy-efficient and real-time object tracking." Yihao Luo, Min Xu, Caihong Yuan, Xiang Cao, Liangqi Zhang, Yan Xu, Tianjiang Wang, Qi Feng. [Paper](https://arxiv.org/pdf/2003.07584.pdf)
- "Event-guided structured output tracking of fast-moving objects using a CeleX sensor." Huang, Jing, et al. IEEE Transactions on Circuits and Systems for Video Technology 28.9 (2018): 2413–2417. [Paper](https://sci-hub.st/https://ieeexplore.ieee.org/abstract/document/8368143)
- "EKLT: asynchronous photometric feature tracking using events and frames." Gehrig, Daniel, et al. International Journal of Computer Vision 128.3 (2020): 601–618. [Paper](https://sci-hub.st/https://link.springer.com/article/10.1007/s11263-019-01209-w), [Code](https://github.com/uzh-rpg/rpg_eklt), [Video](https://www.youtube.com/watch?v=ZyD1YPW1h4U)
- "Spatiotemporal multiple persons tracking using dynamic vision sensor." Piątkowska, Ewa, et al. IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (2012). [Paper](https://publik.tuwien.ac.at/files/PubDat_209369.pdf)
- "Event-driven stereo visual tracking algorithm to solve object occlusion." IEEE TNNLS. [Paper](https://sci-hub.st/https://ieeexplore.ieee.org/abstract/document/8088365)
- Ni, Zhenjiang, et al. "Asynchronous event-based high speed vision for microparticle tracking." Journal of Microscopy 245.3 (2012): 236–244.
## High-quality image recovery

- "Event enhanced high-quality image recovery." Bishan Wang, Jingwei He, Lei Yu, Gui-Song Xia, Wen Yang. ECCV 2020. [Paper](https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123580154.pdf), [Code](https://github.com/ShinyWang33/eSL-Net)

## Binocular vision

- "U2Eyes: a binocular dataset for eye tracking and gaze estimation." ICCV 2019 Workshop. [Paper](https://openaccess.thecvf.com/content_ICCVW_2019/papers/OpenEDS/Porta_U2Eyes_A_Binocular_Dataset_for_Eye_Tracking_and_Gaze_Estimation_ICCVW_2019_paper.pdf)
- "Robust object tracking via multi-cue fusion." Hu, Mengjie, et al. Signal Processing 139 (2017): 86–95. [Paper](https://sci-hub.st/https://www.sciencedirect.com/science/article/abs/pii/S0165168417301366)
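Several of the training papers listed above (SuperSpike, SpyTorch, "Surrogate gradient learning in spiking neural networks") share one core trick: a hard spike in the forward pass with a smooth substitute derivative in the backward pass. The following is a minimal PyTorch sketch of that idea — illustrative only, see the listed libraries for production implementations:

```python
# Surrogate-gradient spike function: step forward, smooth gradient backward.
import torch

class SurrGradSpike(torch.autograd.Function):
    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v > 0).float()            # hard threshold: spike if v > 0

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # fast-sigmoid surrogate derivative (in the style of Zenke & Ganguli)
        return grad_output / (1.0 + 10.0 * v.abs()) ** 2

spike_fn = SurrGradSpike.apply
v = torch.randn(5, requires_grad=True)   # membrane potentials minus threshold
s = spike_fn(v)
s.sum().backward()                        # gradients flow despite the step
print(s, v.grad)
```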
ai
ReAInvent
<p align="center"><img src="https://media.giphy.com/media/oepoujo00ampeto6ye/giphy.gif" width="600" height="350" alt="animated"></p>

# ReAInvent

This was our submission for CruzHacks 2023, and it is soon to become a full-fledged project (maybe). We created a question-answer chatbot for YouTube videos using semantic search, large language models, and a REST API. The name is pronounced the same way as "reinvent". The project is hosted live at https://reainvent.com.

## What is it for?

We created this project to increase the efficiency of learning, allowing long-form videos to be parsed down to text in a matter of seconds. Quickly re-referencing sections of lectures where professors go over specific types of problems, or finding where the professor talks about a specific quiz grading policy, can be quite the pain for videos that last over an hour. Being able to find this information quickly, with assurance that it is accurate, is an incredibly convenient and useful product for any student. Beyond university students, this program can be used with any YouTube video so long as it has an accurate transcription — nearly all YouTube videos have one thanks to auto-transcription. In any case where you find yourself scrubbing through a video to find a piece of information, ReAInvent is there to do it faster for you.

## How does it work?

We use pytube and youtube-transcript-api to scrape a given YouTube URL for its transcription. From there, we run a semantic search with the query being the question asked by the client and the document being the YouTube transcription. To perform the semantic search, we use OpenAI's embedding models and sort the transcription by cosine similarity with respect to the query, in order to find the most relevant parts of the video for the question (a sketch of this ranking step appears at the end of this README). We then feed the transcriptions as context to GPT davinci (OpenAI's largest LLM), together with the original question, to achieve the accurate and digestible responses you see on the site. We also use prompt engineering to prevent hallucinations/misinformation by GPT, and we include relevant timestamps so you can quickly watch back the sections you are looking for.

## Running in development mode

### Create a virtual environment

1. Start by navigating to the project directory.
2. Create the virtual environment:

```console
python3 -m venv venv
```

3. Activate the virtual environment:

```console
venv\Scripts\activate.bat
```

### Install dependencies

```console
pip install -r requirements.txt
```

### Start the API

```console
python3 backend/server.py
```

Note: you need to provide an OpenAI API key in a `.env` file within the backend directory. OpenAI APIs are not free to use and will require you to provide payment info to generate tokens past the free trial. The `.env` file should have the line `OPENAI_KEY=<api key>`.

### Start the website

When running in development mode, make sure to set this line at the top of `App.js` so the site can communicate with the API (by default it is set to `/api`):

```console
const API_ENDPOINT = ...
```

Now start the website:

```console
cd frontend
npm start
```

Navigate to `localhost:3000` to view the webserver.

Note: these instructions are for running the project in development mode. You can build the API and frontend for production in a variety of ways. Currently, ReAInvent is hosted on a Google Cloud VM, using gunicorn for hosting the backend and nginx for the frontend.
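Here is the promised sketch of the semantic-search step described under "How does it work?": rank transcript chunks by cosine similarity to the question. The `embed` callable stands in for a call to an OpenAI embedding model and is an assumption, not this repo's actual code:

```python
# Rank transcript chunks by cosine similarity to the user's question.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def top_chunks(question, chunks, embed, k=3):
    """Return the k transcript chunks most relevant to the question."""
    q = np.asarray(embed(question))
    scored = [(cosine(q, np.asarray(embed(c))), c) for c in chunks]
    scored.sort(key=lambda t: t[0], reverse=True)
    return [c for _, c in scored[:k]]

# The winning chunks are then pasted into the GPT prompt as context,
# together with the original question.
```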
ai
all-classifiers-2019
# Peter Moss Acute Myeloid & Lymphoblastic Leukemia AI Research Project — Acute Lymphoblastic Leukemia Classifiers 2019

![Peter Moss Acute Myeloid & Lymphoblastic Leukemia Research Project](Media/Images/Peter-Moss-Acute-Myeloid-Lymphoblastic-Leukemia-Research-Project.png)

[![Current release](https://img.shields.io/badge/current%20release-0.1.0-blue.svg)](https://github.com/AMLResearchProject/ALL-Classifiers-2019/tree/0.1.0) [![Upcoming release](https://img.shields.io/badge/upcoming%20release-0.2.0-blue.svg)](https://github.com/AMLResearchProject/ALL-Classifiers-2019/tree/0.2.0) [![Contributions welcome](https://img.shields.io/badge/contributions-welcome-lightgrey.svg)](CONTRIBUTING.md) [![Issues](https://img.shields.io/badge/issues-welcome-lightgrey.svg)](issues) [![License](https://img.shields.io/badge/license-MIT-blue.svg)](LICENSE)

## Table of contents

- [Introduction](#introduction)
- [Disclaimer](#disclaimer)
- [Projects](#projects)
- [Data augmentation](#data-augmentation)
- [Contributing](#contributing)
- [Contributors](#contributors)
- [Student contributors](#student-contributors)
- [Versioning](#versioning)
- [License](#license)
- [Bugs/issues](#bugsissues)

## Introduction

The Peter Moss Acute Lymphoblastic Leukemia classifiers are a collection of projects that use computer vision to classify Acute Lymphoblastic Leukemia (ALL) in unseen images. This repository includes classifier projects made with TensorFlow, Caffe, Keras, FastAI and Intel Movidius NCS.

## Disclaimer

These projects should be used for research purposes only. The purpose of the projects is to show the potential of artificial intelligence for medical support systems, such as diagnosis systems. Although the classifiers are accurate and show good results both on paper and in real-world testing, they are not meant to be an alternative to professional medical diagnosis. The developers that have contributed to this repository have experience in using artificial intelligence for detecting certain types of cancer; they are not doctors or medical/cancer experts. Salvatore Raieli is a bioinformatician and researcher with a PhD in immunology, but does not work in medical diagnosis. Dr Amita Kapoor is Associate Professor at SRCASW, University of Delhi, and teaches neural networks, artificial intelligence, operating systems, embedded systems, and computer communication and networking. Please use these systems responsibly.

## Projects

This repository hosts a collection of classifiers that have been developed by the team using the Python programming language. These classifiers include Caffe, FastAI, Movidius NCS1 and Keras classifiers; each project may have multiple classifiers.

| Project | Description | Status | Author |
| --- | --- | --- | --- |
| [Data augmentation](projects/augmentation) | Data augmentation applies filters to datasets and increases the amount of training/test data (see the sketch at the end of this README) | Complete | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |
| [ALLCNN Caffe classifier](projects/caffe/allcnn) | Acute Lymphoblastic Leukemia classifier created using the Caffe framework | Ongoing | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |
| [Movidius NCS classifier](projects/ncs1) | Acute Lymphoblastic Leukemia classifier created using the Intel Movidius NCS | Complete | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |
| [FastAI ResNet50 classifier](projects/fastai/resnet50/ALL-FastAI-Resnet-50.ipynb) | Acute Lymphoblastic Leukemia classifier created using FastAI & ResNet50 | Complete | [Salvatore Raieli](https://github.com/salvatorera) |
| [FastAI ResNet50(a) classifier](projects/fastai/resnet50/ALL-FastAI-Resnet-50-a.ipynb) | Acute Lymphoblastic Leukemia classifier created using FastAI & ResNet50 | Complete | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |
| [FastAI ResNet34 classifier](projects/fastai/resnet34/ALL-FastAI-Resnet-34.ipynb) | Acute Lymphoblastic Leukemia classifier created using FastAI & ResNet34 | Complete | [Salvatore Raieli](https://github.com/salvatorera) |
| [FastAI ResNet18 classifier](projects/fastai/resnet18/ALL-FastAI-Resnet-18.ipynb) | Acute Lymphoblastic Leukemia classifier created using FastAI & ResNet18 | Complete | [Salvatore Raieli](https://github.com/salvatorera) |
| [QuantisedCode](projects/keras/QuantisedCode/QuantisedCode.ipynb) | Acute Lymphoblastic Leukemia classifier created using Keras with TensorFlow backend, [Paper 1](https://airccj.org/CSCP/vol7/csit77505.pdf) and the original [Dataset 2](https://homes.di.unimi.it/scotti/all/#datasets) | Complete | [Dr Amita Kapoor](https://www.petermossamlallresearch.com/team/amita-kapoor/profile) & [Taru Jain](https://www.petermossamlallresearch.com/students/student/taru-jain/profile) |
| [AllCNN](projects/keras/allcnn/Paper-1/ALL-IDB1/Non-Augmented/AllCNN.ipynb) | Acute Lymphoblastic Leukemia classifier created using Keras with TensorFlow backend, [Paper 1](https://airccj.org/CSCP/vol7/csit77505.pdf) and the original [Dataset 1](https://homes.di.unimi.it/scotti/all/#datasets) | Complete | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |
| [AllCNN](projects/keras/allcnn/Paper-1/ALL-IDB2/Non-Augmented/AllCNN.ipynb) | Acute Lymphoblastic Leukemia classifier created using Keras with TensorFlow backend, [Paper 1](https://airccj.org/CSCP/vol7/csit77505.pdf) and the original [Dataset 2](https://homes.di.unimi.it/scotti/all/#datasets) | Ongoing | [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) |

## Team publications

A series of articles/tutorials by [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) that take you through attempting to replicate the work carried out in the ["Acute Myeloid Leukemia Classification Using Convolution Neural Network In Clinical Decision Support System"](https://airccj.org/CSCP/vol7/csit77505.pdf) paper:

- [Acute Lymphoblastic Leukemia Data Augmentation (Intel AI Developer Program)](https://software.intel.com/en-us/articles/acute-myeloidlymphoblastic-leukemia-data-augmentation)
- [Inception V3 Deep Convolutional Architecture For Classifying Acute Myeloid/Lymphoblastic Leukemia (Intel AI Developer Program)](https://software.intel.com/en-us/articles/inception-v3-deep-convolutional-architecture-for-classifying-acute-myeloidlymphoblastic)
- [Introduction to Convolutional Neural Networks in Caffe](https://software.intel.com/content/www/us/en/develop/articles/detecting-acute-lymphoblastic-leukemia-using-caffe-openvino-neural-compute-stick-2-part-1.html)
- [Preparing the Acute Lymphoblastic Leukemia Dataset](https://software.intel.com/content/www/us/en/develop/articles/detecting-acute-lymphoblastic-leukemia-using-caffe-openvino-neural-compute-stick-2-part-2.html)

## Contributing

The Peter Moss Acute Myeloid & Lymphoblastic Leukemia AI Research Project encourages, and welcomes, code contributions, bug fixes and enhancements from GitHub. Please read the [CONTRIBUTING](CONTRIBUTING.md) document for a full guide to forking our repositories and submitting your pull requests. You will also find information about our code of conduct on this page.

## Contributors

- [Adam Milton-Barker](https://www.leukemiaresearchassociation.ai/team/adam-milton-barker) — Asociacion de Investigation en Inteligencia Artificial Para la Leucemia Peter Moss, President & Lead Developer, Sabadell, Spain
- [Salvatore Raieli](https://www.leukemiaresearchassociation.ai/team/salvatore-raieli) — Asociacion de Investigation en Inteligencia Artificial Para la Leucemia Peter Moss, Bioinformatics & Immunology AI R&D, Bologna, Italy
- [Dr Amita Kapoor](https://www.leukemiaresearchassociation.ai/team/amita-kapoor) — Asociacion de Investigation en Inteligencia Artificial Para la Leucemia Peter Moss, Student Program Team R&D, Delhi, India

## Student contributors

- [Taru Jain](https://www.leukemiaresearchassociation.ai/student-program/student/taru-jain) — Asociacion de Investigation en Inteligencia Artificial Para la Leucemia Peter Moss Student Program, Delhi, India

## Versioning

We use SemVer for versioning.

## License

This project is licensed under the MIT license — see the [LICENSE](LICENSE) file for details.

## Bugs/issues

We use the [repo issues](issues) to track bugs and general requests related to using this project.
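As a rough illustration of the data augmentation idea used by the augmentation project in the table above — and explicitly not that project's actual code — a minimal OpenCV sketch that expands a dataset with flipped, rotated, and brightness-adjusted variants of each cell image (the file names are placeholders):

```python
# Generate simple augmented variants of a single image to enlarge a dataset.
import cv2
import numpy as np

def augment(img):
    """Yield flipped, rotated and gamma-adjusted variants of one image."""
    yield cv2.flip(img, 0)                          # vertical flip
    yield cv2.flip(img, 1)                          # horizontal flip
    h, w = img.shape[:2]
    for angle in (90, 180, 270):
        m = cv2.getRotationMatrix2D((w / 2, h / 2), angle, 1.0)
        yield cv2.warpAffine(img, m, (w, h))        # rotation about center
    for gamma in (0.7, 1.3):                        # brightness variants
        table = ((np.arange(256) / 255.0) ** gamma * 255).astype("uint8")
        yield cv2.LUT(img, table)

img = cv2.imread("cell.jpg")                        # placeholder input
for i, aug in enumerate(augment(img)):
    cv2.imwrite(f"cell_aug_{i}.png", aug)
```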
artificial-intelligence artificial-neural-networks artificial-intelligence-algorithms computer-vision convolutional-neural-networks deep-neural-networks lymphoblastic-leukemia-classifiers intel-movidius caffe fastai data-augmentation python-classifiers openvino
ai
defi-compound-api
# Introduction

This is a backend blockchain development app which uses a REST API to interact with the DeFi project Compound (a lending protocol).

## API architecture

We interface with Compound by building an API that runs on a Node.js server. The client side makes an HTTP request to this API; the API forwards the request to Compound's smart contracts on the blockchain using web3; the Compound smart contract sends a response back to the API; and the API returns the response to the client. A client can be a browser, a server, or a mobile app.

## Setup development environment

1. Install Node.js.
2. `mkdir <directory name>`
3. Inside the directory, initialize the project: `npm init -y`
4. Create an HTTP server for the API. This can be done directly with Node.js, but it is easier to use a framework like Koa.
5. Install the Koa framework (a lightweight alternative to Express): `npm install koa koa-router`
6. Install web3: `npm install web3`

## Connect to the Ethereum blockchain (mainnet)

Use Infura (ETH nodes as a service — a service which provides nodes connected to mainnet and access to these nodes):

1. Open an account.
2. Create a new project.
3. Copy the API key from Endpoints (choose mainnet).
4. Copy the API key into the `.env` file.
5. Install dotenv.

## Create a web3 contract instance

To point to the smart contract of Compound, find and copy the token smart contract address and ABI (under the Documentation of Compound's Networks section) into `config.json`.

## Create endpoint APIs

1. **Token balance** — API for the balance of a lent token. Tokens sent to Compound contracts have names starting with "c". When returning your token balance, the balance will include the interest you earned. From Etherscan, search for an example cDAI/DAI contract: `0xd7a8843025d55405637a5c952f625bb2fbc258a2`. Try out the API using this contract:

```
curl http://localhost:3000/tokenbalance/cdai/0xd7a8843025d55405637a5c952f625bb2fbc258a2
```

2. **cToken balance** — API for lending a token to the Compound protocol. The funds will be locked in the cToken contract; in exchange you will get some cTokens, which represent the tokens you sent to Compound. These can be used to redeem the tokens that you initially invested, plus the earned interest. Try:

```
curl http://localhost:3000/ctokenbalance/cdai/0xd7a8843025d55405637a5c952f625bb2fbc258a2
```

3. **Mint new cTokens** — API for lending tokens to the Compound protocol. You need a private key to sign, and an address. To generate an address, go to Vanity-ETH, store it in `.env`, and add the private key to web3.

4. **Redeem cTokens** — API to return the amount of cTokens that we want to redeem. E.g., redeeming 10 cTokens doesn't mean you get back 10 tokens; it depends on the exchange rate between the token and the cToken, and also on the interest you've earned.

## Deploy to Heroku

1. Create a Heroku account.
2. Install the Heroku CLI.
3. Install git.
4. Create a repository in git: `git init`, `git add -A`, `git commit -m "init"`
5. In the command line: `heroku login`
6. After login, create a Heroku project: `heroku create`
7. Since we do not upload the `.env` file, set the env variables in Heroku: `heroku config:set INFURA_URL=<the link>`, and likewise the private key (`<your private key>`).
8. Upload the project to Heroku: `git push heroku master`

## Endpoint API testing

Try cDAI/DAI:

```
curl https://localhost:3000/tokenbalance/cdai/0xc2b58e6b037b19cfba17b1290b1fbbebc00bd967
curl https://localhost:3000/ctokenbalance/cdai/0xc2b58e6b037b19cfba17b1290b1fbbebc00bd967
```

## Author

Cheryl Kwong — email: cherylkwong@gmail.com

Project developed with: Node.js, REST API, Koa, Infura, web3, Heroku.
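A quick way to smoke-test the two read-only endpoints above from a script is shown below. The route shapes follow the curl examples in this README; the host and port assume a local dev server on `:3000`:

```python
# Smoke test for the tokenbalance / ctokenbalance endpoints.
import requests

BASE = "http://localhost:3000"
CDAI = "0xd7a8843025d55405637a5c952f625bb2fbc258a2"  # example contract above

for route in ("tokenbalance", "ctokenbalance"):
    r = requests.get(f"{BASE}/{route}/cdai/{CDAI}", timeout=10)
    r.raise_for_status()
    print(route, "->", r.text)
```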
server
Belajar-Computer-Vision
# Belajar Computer Vision

(Indonesian for "Learning Computer Vision." Session titles below are translated to English; file names keep their original Indonesian.)

- **Session 1 — Introduction to concepts, tools, libraries & programming**: [Video](https://www.youtube.com/watch?v=phjhe1oyqk), [Slides](01%20Pengenalan%20Konsep%20Tool%20Library%20&%20Pemrograman/01%20Pengenalan%20Konsep%20Tool%20Library%20&%20Pemrograman.pptx)
- **Session 2 — Basic Python, Part 1 (variables, data types, operators)**: [Video](https://www.youtube.com/watch?v=qvdrelzr9oo), [Notebook](02%20Basic%20Python%20-%20Part%201/Basic%20Python%20-%20Part%201.ipynb)
- **Session 3 — Basic Python, Part 2 (Python data structures)**: [Video](https://www.youtube.com/watch?v=1yjbogzeexw), [Notebook](03%20Basic%20Python%20-%20Part%202/03%20Basic%20Python%20-%20Part%202.ipynb)
- **Session 4 — Basic Python, Part 3 (if, for, while, functions)**: [Video](https://www.youtube.com/watch?v=t6y1td48mm), [Notebook](04%20Basic%20Python%20-%20Part%203/04%20Basic%20Python%20-%20Part%203.ipynb)
- **Session 5 — Basic Python, Part 4 (classes, objects, inheritance, scope, modules)**: [Video](https://www.youtube.com/watch?v=n_l76lu2jqq), [Notebook](05%20Basic%20Python%20-%20Part%204/05%20Basic%20Python%20-%20Part%204.ipynb)
- **Session 6 — Python NumPy fundamentals**: [Video](https://www.youtube.com/watch?v=xrqombuimkk), [Notebook](06%20Python%20Numpy%20Fundamentals/06%20Python%20Numpy%20Fundamentals.ipynb)
- **Session 7 — OpenCV, Part 1 (load, display, save image & video)**: [Video](https://youtu.be/quwad0uvdds), [Notebook](07%20OpenCV%20-%20Part%201/OpenCV%20-%20Part%201.ipynb)
- **Session 8 — OpenCV, Part 2 (crop, resize, blend, convert color image)**: [Video](https://youtu.be/rfhmgrsolqs), [Notebook](08%20OpenCV%20-%20Part%202/08%20OpenCV%20-%20Part%202.ipynb)
- **Session 9 — OpenCV, Part 3 (image smoothing, image thresholding, edge detection)**: [Video](https://youtu.be/tlvthcsrmwg), [Notebook](09%20OpenCV%20-%20Part%203/OpenCV%20-%20Part%203.ipynb)
- **Session 10 — OpenCV, Part 4 (drawing tools, finding contours, drawing contours, contour features)**: [Video](https://youtu.be/4bqgcq3tt00), [Notebook](10%20OpenCV%20-%20Part%204/10%20OpenCV%20-%20Part%204.ipynb)
- **Session 11 — OpenCV, Part 5 (region mask, range thresholding, Hough transform)**: [Video](https://youtu.be/7fm5_o5zs2u), [Notebook](11%20OpenCV%20-%20Part%205/11%20OpenCV%20-%20Part%205.ipynb)
- **Session 12 — OpenCV, Part 6 (morphological transforms & operations)**: [Video](https://youtu.be/j2kuok4jx7s), [Notebook](12%20OpenCV%20-%20Part%206/12%20OpenCV%20-%20Part%205.ipynb)
- **Session 13 — OpenCV, Part 7 (image pyramids, geometric transforms)**: [Video](https://youtu.be/1dycsakajr8), [Notebook](13%20OpenCV%20-%20Part%207/13%20OpenCV%20-%20Part%207.ipynb)
- **Session 14 — OpenCV, Part 8 (cascade classifier, cascade classifier training)**: [Video](https://www.youtube.com/watch?v=fchp2qbfsdq), [Notebook](14%20OpenCV%20-%20Part%208/14%20OpenCV%20-%20Part%208.ipynb)
- **Session 15 — OpenCV DNN, Part 1 (OpenCV DNN, Darknet YOLO training & inferencing)**: [Video](https://youtu.be/vagooe4x_we), [Notebook](15%20OpenCV%20DNN%20-%20Part%201/15%20OpenCV%20DNN%20-%20Part%201.ipynb)
- **Session 16 — OpenCV DNN, Part 2 (OpenCV DNN, TensorFlow EAST model & Tesseract OCR for an ALPR system)**: [Video](https://www.youtube.com/watch?v=tu9zsyv9o1q), [Notebook](16%20OpenCV%20DNN%20-%20Part%202/16%20OpenCV%20DNN%20-%20Part%202.ipynb)
- **Session 17 — OpenCV ML (OCR using SVM on OpenCV ML)**: [Video](https://youtu.be/nq8_ne8btq0), [Notebook](17%20OpenCV%20ML/17%20OpenCV%20ML.ipynb)
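As a small taste of the material covered in Sessions 7–8, here is a minimal OpenCV sketch (load, display, convert, and save an image); the input file name is a placeholder:

```python
# Load, display, convert, and save an image with OpenCV.
import cv2

img = cv2.imread("input.jpg")                 # BGR image as a NumPy array
if img is None:
    raise FileNotFoundError("input.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # color conversion (Session 8)
cv2.imshow("gray", gray)                      # display in a window
cv2.waitKey(0)                                # wait for any key press
cv2.destroyAllWindows()
cv2.imwrite("output.png", gray)               # save to disk
```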
ai
CloudVandana-Assignment
# CloudVandana Assignment

Welcome to the CloudVandana assignments repository. Here you will find a collection of programming assignments completed with diligence and creativity. These assignments were provided by CloudVandana, challenging us to explore the realms of Java, JavaScript, HTML, and software engineering. Each task has been meticulously crafted and thoroughly tested, showcasing a blend of technical expertise and problem-solving prowess.
html-css-javascript java-8 javascript
cloud
JP-Website
# JP-Website

A website designed for a client, with an online ordering system, using modern HTML5/CSS3 design principles, jQuery, and embedded APIs.
os
cst816s-nuttx
# Touch panel calibration for Pine64 PineDio Stack BL604 RISC-V board

![PineDio Stack touch panel](https://lupyuen.github.io/images/touch-title.jpg)

Hynitron CST816S touch controller driver for Apache NuttX RTOS. Used by [PineDio Stack BL604](https://lupyuen.github.io/articles/pinedio2).

Read the article: ["NuttX Touch Panel Driver for PineDio Stack BL604"](https://lupyuen.github.io/articles/touch)

Watch the demo: [PineDio Stack demo on YouTube](https://www.youtube.com/shorts/2nzjrlp5lce)

## Install driver

If you're using NuttX on PineDio Stack, there's no need to install the driver:

- [lupyuen/incubator-nuttx (pinedio branch)](https://github.com/lupyuen/incubator-nuttx/tree/pinedio)
- [lupyuen/incubator-nuttx-apps (pinedio branch)](https://github.com/lupyuen/incubator-nuttx-apps/tree/pinedio)

Otherwise, to add this repo to your NuttX project:

```bash
pushd nuttx/nuttx/drivers/input
git submodule add https://github.com/lupyuen/cst816s-nuttx cst816s
ln -s cst816s/cst816s.c .
popd

pushd nuttx/nuttx/include/nuttx/input
ln -s ../../../drivers/input/cst816s/cst816s.h .
popd
```

Next, update the Makefile and Kconfig: [see the modified Makefile and Kconfig](https://github.com/lupyuen/incubator-nuttx/commit/5dbf67df8f36cdba2eb0034dac0ff8ed0f8e73e1).

Then update the NuttX build config:

```bash
## TODO: Change this to the path of our incubator-nuttx folder
cd nuttx/nuttx

## Preserve the build config
cp .config ../config

## Erase the build config and Kconfig files
make distclean

## For BL602: Configure the build for BL602
./tools/configure.sh bl602evb:nsh

## For PineDio Stack BL604: Configure the build for BL604
./tools/configure.sh bl602evb:pinedio

## For ESP32: Configure the build for ESP32
## TODO: Change "esp32-devkitc" to our ESP32 board
./tools/configure.sh esp32-devkitc:nsh

## Restore the build config
cp ../config .config

## Edit the build config
make menuconfig
```

In menuconfig:

- Enable the Hynitron CST816S driver under "Device Drivers → Input Device Support".
- Enable I2C warnings, because of the [I2C workaround for CST816S](https://github.com/lupyuen/cst816s-nuttx#i2c-logging). Click "Build Setup → Debug Options" and check the boxes for:
  - Enable Warnings Output
  - I2C Warnings Output
- Optional: to enable logging for the CST816S driver, check the boxes for:
  - Enable Error Output
  - Enable Informational Debug Output
  - Enable Debug Assertions
  - Input Device Error Output
  - Input Device Warnings Output
  - Input Device Informational Output

Note that "Enable Informational Debug Output" must be unchecked for the LoRaWAN test app `lorawan_test` to work, because the LoRaWAN timers are time-sensitive.

Edit the function [`bl602_i2c_transfer`](https://github.com/lupyuen/incubator-nuttx/blob/touch/arch/risc-v/src/bl602/bl602_i2c.c#L671-L773) and apply this workaround: [I2C logging](https://github.com/lupyuen/cst816s-nuttx#i2c-logging).

Edit the function `bl602_bringup` (or `esp32_bringup`) in this file:

```text
## For BL602:
nuttx/boards/risc-v/bl602/bl602evb/src/bl602_bringup.c

## For ESP32: Change "esp32-devkitc" to our ESP32 board
nuttx/boards/xtensa/esp32/esp32-devkitc/src/esp32_bringup.c
```

And call [`cst816s_register` to load our driver](https://github.com/lupyuen/incubator-nuttx/blob/touch/boards/risc-v/bl602/bl602evb/src/bl602_bringup.c#L826-L843):

```c
#ifdef CONFIG_INPUT_CST816S
/* I2C Address of CST816S Touch Controller */

#define CST816S_DEVICE_ADDRESS 0x15
#include <nuttx/input/cst816s.h>
#endif /* CONFIG_INPUT_CST816S */

#ifdef CONFIG_INPUT_CST816S
int bl602_bringup(void)
{
  ...
  /* Init I2C bus for CST816S */

  struct i2c_master_s *cst816s_i2c_bus = bl602_i2cbus_initialize(0);
  if (!cst816s_i2c_bus)
    {
      _err("ERROR: Failed to get I2C%d interface\n", 0);
    }

  /* Register the CST816S driver */

  ret = cst816s_register("/dev/input0", cst816s_i2c_bus,
                         CST816S_DEVICE_ADDRESS);
  if (ret < 0)
    {
      _err("ERROR: Failed to register CST816S\n");
    }
  ...
}
#endif /* CONFIG_INPUT_CST816S */
```
cst816s n endif config input cst816s here s how we created the cst816s driver for nuttx on pinedio stack bl604 cypress mbr3108 nuttx driver for cypress mbr3108 touch controller looks structurally similar to pinedio stack s cst816s so we copy n paste into our cst816s driver nuttx driver for cypress mbr3108 https github com lupyuen incubator nuttx blob master drivers input cypress mbr3108 c i2c scan pinedio stack s touch panel is a peculiar i2c device it won t respond to i2c scan unless we tap the screen and wake it up building a rust driver for pinetime s touch controller https lupyuen github io articles building a rust driver for pinetimes touch controller gpio interrupt pinedio stack s touch panel triggers a gpio interrupt when we tap the screen here s how we handle the gpio interrupt c int cst816s register far const char devpath far struct i2c master s i2c dev uint8 t i2c devaddr prepare interrupt line and handler ret bl602 irq attach board touch int cst816s isr handler priv if ret 0 kmm free priv ierr attach interrupt failed n return ret ret bl602 irq enable false if ret 0 kmm free priv ierr disable interrupt failed n return ret source https github com lupyuen cst816s nuttx blob main cst816s c l593 l661 bl602 irq attach is defined below c attach interrupt handler to gpio interrupt for touch controller based on https github com lupyuen incubator nuttx blob touch boards risc v bl602 bl602evb src bl602 gpio c l477 l505 static int bl602 irq attach gpio pinset t pinset far isr handler callback far void arg int ret 0 uint8 t gpio pin pinset gpio pin mask gpio pin shift far struct bl602 gpint dev s dev null todo debugassert callback null configure the pin that will be used as interrupt input warning check glb gpio int trig neg pulse todo bl602 expander set intmod gpio pin 1 glb gpio int trig neg pulse ret bl602 configgpio pinset if ret 0 gpioerr failed to configure gpio pin d n gpio pin return ret make sure the interrupt is disabled bl602 expander pinset pinset bl602 expander callback callback bl602 expander arg arg bl602 expander intmask gpio pin 1 irq attach bl602 irq gpio int0 bl602 expander interrupt dev bl602 expander intmask gpio pin 0 gpioinfo attach p n callback return 0 source https github com lupyuen cst816s nuttx blob main cst816s c l686 l727 note that we re calling bl602 expander to handle interrupts there doesn t seem to be a way to do this with the current bl602 gpio driver bl602evb bl602 gpio c we are building bl602 expander here lupyuen bl602 expander https github com lupyuen bl602 expander to test interrupts we uncomment define test cst816s interrupt c int cst816s register far const char devpath far struct i2c master s i2c dev uint8 t i2c devaddr uncomment this to test interrupts tap the screen define test cst816s interrupt ifdef test cst816s interrupt warning testing cst816s interrupt bl602 irq enable true endif test cst816s interrupt source https github com lupyuen cst816s nuttx blob main cst816s c l593 l661 there s bug with bl602 gpio interrupts that we have fixed for our driver https github com apache incubator nuttx issues 5810 issuecomment 1098633538 test gpio interrupt tapping the screen on pinedio stack correctly triggers a gpio interrupt text gpio pin register registering dev gpio0 gpio pin register registering dev gpio1 gpint enable disable the interrupt gpio pin register registering dev gpio2 bl602 gpio set intmod gpio pin 115 int ctlmod 1 int trgmod 0 spi test driver register devpath dev spitest0 spidev 0 cst816s register bl602 expander set intmod gpio pin 9 int 
ctlmod 1 int trgmod 0 bl602 irq attach attach 0x2305e9de bl602 irq enable disable interrupt cst816s register driver registered bl602 irq enable enable interrupt nuttshell nsh nuttx 10 2 0 rc0 nsh bl602 expander interrupt interrupt callback 0x2305e9de arg 0x42020a60 bl602 expander interrupt call callback 0x2305e9de arg 0x42020a60 cst816s poll notify bl602 expander interrupt interrupt callback 0x2305e9de arg 0x42020a60 bl602 expander interrupt call callback 0x2305e9de arg 0x42020a60 cst816s poll notify bl602 expander interrupt interrupt callback 0x2305e9de arg 0x42020a60 bl602 expander interrupt call callback 0x2305e9de arg 0x42020a60 cst816s poll notify bl602 expander interrupt interrupt callback 0x2305e9de arg 0x42020a60 bl602 expander interrupt call callback 0x2305e9de arg 0x42020a60 cst816s poll notify lvgl test app lvgltest fails to open dev input0 but that s ok because we haven t implemented the i2c part text nsh ls dev dev console gpio0 gpio1 gpio2 i2c0 input0 lcd0 null spi0 spitest0 timer0 urandom zero nsh lvgltest tp init opening dev input0 bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s probe device family id 0x34 device id 0x00aa device rev 35 cst816s probe device probe failed dev id mismatch cst816s probe device expected family id 0x9a device id 0x0a03 device rev 1 tp init open dev input0 failed 6 terminating bl602 expander interrupt interrupt callback 0x2305e9e8 arg 0 bl602 expander interrupt call callback 0x2305e9e8 arg 0 bl602 expander interrupt interrupt callback 0x2305e9e8 arg 0 bl602 expander interrupt call callback 0x2305e9e8 arg 0 bl602 expander interrupt interrupt callback 0x2305e9e8 arg 0 bl602 expander interrupt call callback 0x2305e9e8 arg 0 touch data apache nuttx rtos has a standard data format for touch panels let s implement this for pinedio stack c this structure contains information about a single touch point positional units are device specific struct touch point s uint8 t id unique identifies contact same in all reports for the contact uint8 t flags see touch definitions above int16 t x x coordinate of the touch point uncalibrated int16 t y y coordinate of the touch point uncalibrated int16 t h height of touch point uncalibrated int16 t w width of touch point uncalibrated uint16 t gesture gesture of touchscreen contact uint16 t pressure touch pressure uint64 t timestamp touch event time stamp in microseconds the typical touchscreen driver is a read only input character device driver the driver write method is not supported and any attempt to open the driver in any mode other than read only will fail data read from the touchscreen device consists only of touch events and touch sample data this is reflected by struct touch sample s this structure is returned by either the driver read method on some devices multiple touchpoints may be supported so this top level data structure is a struct touch sample s that contains a set of touch points each touch point is managed individually using an id that identifies a touch from first contact until the end of the contact struct touch sample s int npoints the number of touch points in point struct touch point s point 1 actual dimension is npoints source https github com lupyuen incubator nuttx blob touch include nuttx input touchscreen h l113 l148 read touch data here s how we read the touched coordinates in our driver c static int cst816s get touch data far struct cst816s dev s dev far 
void buf iinfo n struct touch sample s data uint8 t readbuf 7 int ret read the raw touch data ret cst816s i2c read dev cst816s reg touchdata readbuf sizeof readbuf if ret 0 iinfo read touch data failed n return ret interpret the raw touch data uint8 t id readbuf 5 4 uint8 t touchpoints readbuf 2 0x0f uint8 t xhigh readbuf 3 0x0f uint8 t xlow readbuf 4 uint8 t yhigh readbuf 5 0x0f uint8 t ylow readbuf 6 uint8 t event readbuf 3 6 0 touch down 1 touch up 2 contact uint16 t x xhigh 8 xlow uint16 t y yhigh 8 ylow if touch coordinates are invalid return the last valid coordinates bool valid true if x 240 y 240 iwarn invalid touch data id d touch d x d y d n id touchpoints x y if last event 0xff quit if we have no last valid coordinates ierr can t return touch data id d touch d x d y d n id touchpoints x y return einval valid false id last id x last x y last y remember the last valid touch data last event event last id id last x x last y y set the touch data fields memset data 0 sizeof data data npoints 1 data point 0 id id data point 0 x x data point 0 y y set the touch flags if event 0 touch down iinfo down id d touch d x d y d n id touchpoints x y if valid touch coordinates were valid data point 0 flags touch down touch id valid touch pos valid else touch coordinates were invalid data point 0 flags touch down touch id valid else if event 1 touch up iinfo up id d touch d x d y d n id touchpoints x y if valid touch coordinates were valid data point 0 flags touch up touch id valid touch pos valid else touch coordinates were invalid data point 0 flags touch up touch id valid else reject contact iinfo contact id d touch d x d y d n id touchpoints x y return einval return the touch data memcpy buf data sizeof data iinfo id d n data point 0 id iinfo flags 02x n data point 0 flags iinfo x d n data point 0 x iinfo y d n data point 0 y return sizeof data source https github com lupyuen cst816s nuttx blob main cst816s c l213 l302 note that our nuttx driver for pinedio stack s touch panel returns 4 possible states touch down vs touch up valid vs invalid we got this code thanks to jf s cst816s driver for the self test firmware pinedio stack selftest drivers cst816s c https codeberg org jf002 pinedio stack selftest src branch master drivers cst816s c and from our previous work on pinetime which also uses cst816s building a rust driver for pinetime s touch controller https lupyuen github io articles building a rust driver for pinetimes touch controller cst816s driver in rust https github com lupyuen stm32bluepill mynewt sensor blob pinetime rust app src touch sensor rs hynitron reference driver https github com lupyuen hynitron i2c cst0xxse blob master cst0xx core c l407 l466 test touch data nuttx driver for pinedio stack touch panel responds correctly to touch pinedio stack touch screen feels laggy on apache nuttx rtos right now 2 things we can fix 1 increase spi frequency 2 switch to spi dma eventually watch the demo on youtube https www youtube com shorts 2nzjrlp5lce update we have bumped up the spi frequency to max 40 mhz still feels laggy https github com lupyuen incubator nuttx blob touch boards risc v bl602 bl602evb configs pinedio defconfig l580 here s the detailed log text gpio pin register registering dev gpio0 gpio pin register registering dev gpio1 gpint enable disable the interrupt gpio pin register registering dev gpio2 bl602 gpio set intmod gpio pin 115 int ctlmod 1 int trgmod 0 spi test driver register devpath dev spitest0 spidev 0 cst816s register path dev input0 addr 21 bl602 expander set 
intmod gpio pin 9 int ctlmod 1 int trgmod 0 bl602 irq attach attach 0x2305e596 bl602 irq enable disable interrupt cst816s register driver registered bl602 irq enable enable interrupt nuttshell nsh nuttx 10 2 0 rc0 nsh lvgltest tp init opening dev input0 cst816s open bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 touch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 ouch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 touch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 9 touch 2 x 639 y 1688 cst816s get touch data up id 0 touch 2 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 222 cst816s get touch data y 23 bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0xd900db ransfer success cst816s get touch data down id 0 touch 0 x 219 y 217 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 219 cst816s get touch data y 217 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0xd900db ransfer success cst816s get touch data down id 0 touch 0 x 219 y 217 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 219 cst816s get touch data y 217 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 
bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 4 touch 2 x 636 y 3330 cst816s get touch data up id 0 touch 2 x 219 y 217 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 219 cst816s get touch data y 217 bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0xdb0022 ransfer success cst816s get touch data down id 0 touch 0 x 34 y 219 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 34 cst816s get touch data y 219 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0xdb0022 ransfer success cst816s get touch data down id 0 touch 0 x 34 y 219 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 34 cst816s get touch data y 219 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 4 touch 2 x 636 y 3330 cst816s get touch data up id 0 touch 2 x 34 y 219 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 34 cst816s get touch data y 219 bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x180018 ransfer success cst816s get touch data down id 0 touch 0 x 24 y 24 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 24 cst816s get touch data y 24 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x180018 ransfer success cst816s get touch data down id 0 touch 0 x 24 y 24 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 24 cst816s get touch data y 24 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 4 touch 2 x 636 y 3330 cst816s get touch data up id 0 touch 2 x 24 y 24 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 24 cst816s get touch data y 24 bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c 
read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x8d0076 ransfer success cst816s get touch data down id 0 touch 0 x 118 y 141 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 118 cst816s get touch data y 141 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x8d0076 ransfer success cst816s get touch data down id 0 touch 0 x 118 y 141 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 118 cst816s get touch data y 141 cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 4 touch 2 x 636 y 3330 cst816s get touch data up id 0 touch 2 x 118 y 141 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 118 cst816s get touch data y 141 tp cal result offset x 23 y 24 range x 194 y 198 invert x y 1 x 0 y 1 let s break down the log enable gpio interrupt at nuttx startup we register the cst816s driver as dev input0 and enable the gpio interrupt text gpio pin register registering dev gpio0 gpio pin register registering dev gpio1 gpint enable disable the interrupt gpio pin register registering dev gpio2 bl602 gpio set intmod gpio pin 115 int ctlmod 1 int trgmod 0 spi test driver register devpath dev spitest0 spidev 0 cst816s register path dev input0 addr 21 bl602 expander set intmod gpio pin 9 int ctlmod 1 int trgmod 0 bl602 irq attach attach 0x2305e596 bl602 irq enable disable interrupt cst816s register driver registered bl602 irq enable enable interrupt nuttshell nsh nuttx 10 2 0 rc0 nsh start lvgl app we run the lvgl test app lvgltest text nsh lvgltest tp init opening dev input0 cst816s open which calls cst816s open https github com lupyuen cst816s nuttx blob main cst816s c l384 l420 to open our cst816s driver the app begins the touchscreen calibration process read touch data the lvgl test app calls cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l328 l382 repeatedly on the cst816s driver to get touch data c bool tp read struct lv indev drv t indev drv lv indev data t data read one sample nbytes read fd sample sizeof struct touch sample s source https github com lupyuen lvgltest nuttx blob main tp c l115 l132 since the screen hasn t been touched and we have no touch data yet our driver returns an error einval c static ssize t cst816s read far struct file filep far char buffer size t buflen int ret einval read the touch data only if screen has been touched or if we re waiting for touch up if priv int pending last event 0 buflen outlen ret cst816s get touch data priv buffer source https github com lupyuen cst816s nuttx blob main cst816s c l336 l370 int pending becomes true when a gpio interrupt gets triggered later last event becomes 0 when we get a touch down event later why do we check int pending to reduce contention on the i2c bus we only read the touch data over i2c when the screen has been touched we ll see this in a while but the lvgl test app really shouldn t 
call read repeatedly it ought to call poll and block until touch data is available why do we check last event the touch controller triggers a gpio interrupt only upon touch down not on touch up so after touch down we allow cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l328 l382 to call cst816s get touch data to fetch the touch data repeatedly until we see the touch up event we ll see this in a while trigger gpio interrupt we touch the screen and trigger a gpio interrupt text bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify the interrupt handler in our driver sets int pending to true c static int cst816s isr handler int irq far void context far void arg far struct cst816s dev s priv far struct cst816s dev s arg irqstate t flags debugassert priv null flags enter critical section priv int pending true leave critical section flags cst816s poll notify priv return 0 source https github com lupyuen cst816s nuttx blob main cst816s c l598 l611 and calls cst816s poll notify https github com lupyuen cst816s nuttx blob main cst816s c l472 l498 to unblock all poll callers and notify them that touch data is available but the lvgl test app doesn t poll our driver so this doesn t affect anything touch down event remember that the lvgl test app keeps calling cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l328 l382 repeatedly to get touch data now that int pending is true our driver proceeds to call cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 and fetch the touch data over i2c text cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 touch 0 x 222 y 23 the touch data that was read from cst816s over i2c text cst816s get touch data down id 0 touch 0 x 222 y 23 gets returned directly to the lvgl test app as a touch down event text cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 sets last event to 0 because it s a touch down event cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l372 l382 sets int pending to false touch down event again the lvgl test app is still calling cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l328 l382 repeatedly to get touch data now that last event is 0 touch down our driver proceeds to call cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 and fetch the touch data over i2c text cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 touch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 cst816s get touch
data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de ransfer success cst816s get touch data down id 0 touch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 this happens twice because we haven t received a touch up event touch up event when our finger is no longer touching the screen cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 receives a touch up event text cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success cst816s get touch data invalid touch data id 9 touch 2 x 639 y 1688 cst816s get touch data up id 0 touch 2 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 222 cst816s get touch data y 23 for touch up events the touch coordinates are invalid text cst816s get touch data invalid touch data id 9 touch 2 x 639 y 1688 the driver patches the touch coordinates with the data from the last touch down event text cst816s get touch data up id 0 touch 2 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 0c cst816s get touch data x 222 cst816s get touch data y 23 and returns the valid coordinates to the lvgl test app the patching is done here c static int cst816s get touch data far struct cst816s dev s dev far void buf if touch coordinates are invalid return the last valid coordinates bool valid true if x 240 y 240 iwarn invalid touch data id d touch d x d y d n id touchpoints x y if last event 0xff quit if we have no last valid coordinates ierr can t return touch data id d touch d x d y d n id touchpoints x y return einval valid false id last id x last x y last y remember the last valid touch data last event event last id id last x x last y y set the touch data fields memset data 0 sizeof data data npoints 1 data point 0 id id data point 0 x x data point 0 y y source https github com lupyuen cst816s nuttx blob main cst816s c l258 l282 last event is now set to 1 touch up cst816s read https github com lupyuen cst816s nuttx blob main cst816s c l328 l382 will no longer call cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 to fetch the touch data until the screen is touched again screen calibration result when we have touched the 4 screen corners the lvgl test app displays the screen calibration result text tp cal result offset x 23 y 24 range x 194 y 198 invert x y 1 x 0 y 1 which will be used to tweak the touch coordinates in the apps screen is sideways according to the touch data from the lvgl test app our screen is rotated sideways top left x 181 y 12 top right x 230 y 212 bottom left x 9 y 10 bottom right x 19 y 202 so be careful when mapping the touch coordinates we can rotate the display in the st7789 driver but first we need to agree which way is up https twitter com mistertechblog status 1514438646568415232 i2c logging cst816s get touch data https github com lupyuen cst816s nuttx blob main cst816s c l222 l326 won t return any valid touch data unless we enable i2c logging could be an i2c timing issue or race condition with i2c logging enabled we get the touch down event with valid touch data 
text nsh lvgltest tp init opening dev input0 cst816s open bl602 expander interrupt interrupt callback 0x2305e596 arg 0x42020a70 bl602 expander interrupt call callback 0x2305e596 arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c transfer success bl602 i2c transfer subflag 0 subaddr 0x0 sublen 0 bl602 i2c transfer i2c tbl602 i2c recvdata count 7 temp 0x500 bl602 i2c recvdata count 3 temp 0x1700de transfer success cst816s get touch data down id 0 touch 0 x 222 y 23 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 222 cst816s get touch data y 23 with i2c logging disabled we only get the touch up event with invalid touch data text nsh lvgltest tp init opening dev input0 cst816s open bl602 expander interrupt interrupt callback 0x2305e55e arg 0x42020a70 bl602 expander interrupt call callback 0x2305e55e arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read cst816s get touch data invalid touch data id 9 touch 2 x 639 y 1688 cst816s get touch data can t return touch data id 9 touch 2 x 639 y 1688 bl602 expander interrupt interrupt callback 0x2305e55e arg 0x42020a70 bl602 expander interrupt call callback 0x2305e55e arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read cst816s get touch data invalid touch data id 9 touch 2 x 639 y 1688 cst816s get touch data can t return touch data id 9 touch 2 x 639 y 1688 this happens before and after we have reduced the number of i2c transfers by checking gpio interrupts via int pending the workaround is to call i2cwarn in the bl602 i2c driver https github com lupyuen incubator nuttx blob touch arch risc v src bl602 bl602 i2c c to force this specific log to be printed c static int bl602 i2c transfer struct i2c master s dev struct i2c msg s msgs int count if priv i2cstate ev i2c end int i2cinfo i2c transfer success n ifdef config input cst816s workaround for cst816s see https github com lupyuen cst816s nuttx i2c logging i2cwarn i2c transfer success n endif config input cst816s source https github com lupyuen incubator nuttx blob touch arch risc v src bl602 bl602 i2c c l753 l761 after patching the workaround we get the touch down event with valid touch data text nsh lvgltest tp init opening dev input0 cst816s open bl602 expander interrupt interrupt callback 0x2305e55e arg 0x42020a70 bl602 expander interrupt call callback 0x2305e55e arg 0x42020a70 cst816s poll notify cst816s get touch data cst816s i2c read bl602 i2c transfer i2c transfer success bl602 i2c transfer i2c transfer success cst816s get touch data down id 0 touch 0 x 200 y 26 cst816s get touch data id 0 cst816s get touch data flags 19 cst816s get touch data x 200 cst816s get touch data y 26 lorawan test app lorawan test also tested ok with the patch todo investigate the internals of the bl602 i2c driver https github com lupyuen incubator nuttx blob touch arch risc v src bl602 bl602 i2c c look for i2c timing issues or race conditions todo probe the i2c bus with a logic analyser watch for i2c hardware issues todo why must we disable logging eventually we must disable config debug info informational debug output because the lorawan test app lorawan test fails when config debug info is enabled due to lorawan timers todo lorawan test app lorawan library sx1262 library nimble porting layer spi test driver should have their own flags for logging todo move cst816s interrupt handler to bl602 gpio expander https github com lupyuen bl602 expander 
todo implement spi dma on nuttx so that the touchscreen feels less laggy todo add a button https docs lvgl io 7 11 get started quick overview html button with label and a message box to the lvgl test app lvgltest https github com lupyuen lvgltest nuttx blob main lvgltest c l110 l198 to demo the touchscreen
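to round out the walkthrough, here is a minimal sketch of how an application could consume this driver through the standard nuttx touchscreen interface described above. this is illustrative code, not part of the repo: it assumes the driver has been registered as /dev/input0, and it busy-retries instead of blocking on poll() to keep the sketch short. the flag values match the logs above (flags 19 = touch down + id valid + pos valid, flags 0c = touch up + id valid).

```c
/* Hypothetical test app: read touch samples from our CST816S driver */

#include <nuttx/config.h>
#include <nuttx/input/touchscreen.h>  /* struct touch_sample_s, TOUCH_* flags */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(int argc, FAR char *argv[])
{
  struct touch_sample_s sample;

  /* Touchscreen drivers may only be opened read-only */
  int fd = open("/dev/input0", O_RDONLY);
  if (fd < 0)
    {
      printf("open /dev/input0 failed\n");
      return 1;
    }

  /* Our driver returns -EINVAL until the screen is tapped, so keep
   * retrying. (A real app should poll() and block instead.) */
  for (;;)
    {
      ssize_t nbytes = read(fd, &sample, sizeof(sample));
      if (nbytes == sizeof(sample))
        {
          printf("id=%d flags=%02x x=%d y=%d\n",
                 sample.point[0].id, sample.point[0].flags,
                 sample.point[0].x, sample.point[0].y);

          /* Quit after the touch up event */
          if (sample.point[0].flags & TOUCH_UP)
            {
              break;
            }
        }

      usleep(10 * 1000);  /* Don't hog the CPU between retries */
    }

  close(fd);
  return 0;
}
```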
bl602 bl604 cst816s i2c nuttx pinecone pinedio riscv32
os
learning-gcp
learning gcp learning how to get better with google cloud platform technologies
cloud
libgp
getting started libgp is a c library for gaussian process regression a gaussian process defines a distribution over functions and inference takes place directly in function space it is fully specified by a mean function and a positive definite covariance function this library uses two types of covariance functions simple and composite composite functions can be composed of other composite functions allowing flexible structures building the code follow the standard cmake method of building mkdir build cd cmake make testing the build once everything is built you can check that all works fine with the following tests cd tests gptest and running an example cd examples gpdense which should return a mse building the documentation there are doxygen comments in the header files to compile make sure you have doxygen installed mkdir doc cd doxygen doxygen doxyfile open doc html index html with your favorite browser for the documentation if you want a pdf go into latex and run pdflatex refman tex implemented covariance functions simple covariance functions linear covariance function linear covariance function with automatic relevance detection matern covariance function with nu 1 5 and isotropic distance measure matern covariance function with nu 2 5 and isotropic distance measure independent covariance function white noise radial basis covariance function with compact support isotropic rational quadratic covariance function squared exponential covariance function with automatic relevance detection squared exponential covariance function with isotropic distance measure composite covariance functions sums of covariance functions mean function the mean function is fixed to zero training a model fig1 jpg fig2 jpg fig3 jpg fig4 jpg initialize the model by specifying the input dimensionality and the covariance function gaussianprocess gp 2 covsum covseiso covnoise set log hyperparameter of the covariance function see the doxygen documentation parameters should be given in order as listed gp covf set loghyper params add data to the training set input vectors x must be provided as double and targets y as double gp add pattern x y predict value or variance of an input vector x f gp f x v gp var x read and write use write function to save a gaussian process model and the complete training set to a file void write const char filename a new instance of the gaussian process can be instantiated from this file using the following constructor gaussianprocess const char filename advanced topics hyper parameter optimization custom covariance functions the libgp file format hyper parameter optimization this library contains two methods for hyper parameter optimization the conjugate gradient method and rprop resilient backpropagation we recommend using rprop for an example of how to call the optimizers see test optimizer cc reasons for using rprop can be found in blum riedmiller 2013 optimization of gaussian process hyperparameters using rprop european symposium on artificial neural networks computational intelligence and learning requirements cmake http www cmake org cross platform open source build system eigen3 http eigen tuxfamily org template library for linear algebra googletest http code google com p googletest optional release notes 2012 10 11 version 0 1 4 log likelihood function and gradient computation hyper parameter optimization using rprop online updates of the cholesky decomposition 2011 09 28 version 0 1 3 improved organization of training data improved interfaces 2011 06 03 version 0 1 2 added matern5 
covariance function added isotropic rational quadratic covariance function added function to draw random data according to covariance function 2011 05 27 version 0 1 1 google tests added added matern3 covariance function various bugfixes 2011 05 26 version 0 1 0 basic functionality for standard gp regression most important covariance functions implemented capability to read and write models to disk
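putting the snippets above together, here is a complete minimal program. this is a sketch, assuming gp.h and eigen are on the include path and that the covariance function exposes get_param_dim() for sizing the hyperparameter vector; the hyperparameter values and the toy target function are arbitrary placeholders.

```cpp
// Toy example: train a GP on noisy samples of f(x0, x1) = x0^2 + x1,
// then predict the mean and variance at a test point.
#include <cstdlib>
#include <iostream>
#include <Eigen/Dense>
#include "gp.h"

int main()
{
  // 2-D inputs; squared exponential covariance plus white noise
  libgp::GaussianProcess gp(2, "CovSum(CovSEiso, CovNoise)");

  // Log hyperparameters, given in order as listed in the documentation:
  // length scale and signal variance for CovSEiso, then the noise level
  Eigen::VectorXd params(gp.covf().get_param_dim());
  params << 0.0, 0.0, -2.0;
  gp.covf().set_loghyper(params);

  // Add noisy training patterns (inputs as double[], targets as double)
  for (int i = 0; i < 200; ++i)
    {
      double x[] = {drand48() * 4 - 2, drand48() * 4 - 2};
      double y = x[0] * x[0] + x[1] + 0.1 * (drand48() - 0.5);
      gp.add_pattern(x, y);
    }

  // Predict value and variance at a test input
  double xstar[] = {0.5, -0.5};
  std::cout << "f = " << gp.f(xstar)
            << ", var = " << gp.var(xstar) << std::endl;
  return 0;
}
```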
ai
notes
contents algorithms sorting algorithms https github com bautistaj notes blob master algoritms sort note md software architecture fundamentals of software architecture https github com bautistaj notes blob master architecture basic note md professional growth time management https github com bautistaj notes blob master crecimiento gestiontiempo note md study strategies https github com bautistaj notes blob master crecimiento estrategias note md databases database fundamentals https github com bautistaj notes blob master bd bd apuntes md sql and mysql https github com bautistaj notes blob master bd bdbasic apuntes md mongodb https github com bautistaj notes blob master bd mongo note md backend rest api fundamentals https github com bautistaj notes blob master back end api rest note md ecmascript https github com bautistaj notes blob master back end javascript ecmas6 md node https github com bautistaj notes blob master back end node note md regular expressions https github com bautistaj notes blob master back end regular expressions note md frontend web components https github com bautistaj web components pwas with react https github com bautistaj notes blob master front end react pwa note md preprocessors https github com bautistaj notes blob master front end preprocesadores note md servers linux server administration https github com bautistaj notes blob master server linux server note md aws https github com bautistaj notes blob master server aws note md digitalocean https github com bautistaj notes blob master server digitalocean note md english practical english course fundamentals https github com bautistaj notes blob master english note md practical english course grammar https github com bautistaj notes blob master english curso 20de 20ingl c3 a9s 20pr c3 a1ctico 20gram c3 a1tica note md
database javascript react
front_end
Embedded_System_Design_S19
esds19 labs spring 2019 this repository is fully owned by rushi james macwan all content on this repository is solely the work of rushi james macwan however all external support and guidance taken in completing the work available on this repository has been clearly cited and credited as per the course guidelines this repository contains all the embedded system design ecen 5613 lab submissions of rushi james macwan the content on this repository should not be used for any purpose without acquiring prior permission from the owner rushi james macwan is not liable for any misuse of the content available on this repository thank you repository overview the embedded system design lab projects involve integrating a wide spectrum of hardware and software elements including mcus memories plds etc with equal emphasis on the software elements that drive the system e g interrupts schedulers timers peripheral interfaces etc the course capstone project will use an over the air data transmission mechanism to send refined and reliable data to a cloud foundry for providing cloud services the project stretches from a simple pcb design to a completely working prototype of a system delivering quality service detailed course topics board bring up involving c8051 oscillators decoupling caps reset circuitry latches splds memory interfaces e g nvram sram lcd interface and device driver isp programming rs 232 interfacing atmel flip utility intra board communications i e uart i2c spi wire wrapping etc detailed project topics integrated sensor network cloud services exhaustive bare metal firmware development using ti msp432 cc3120 boost and sensor fusion coupled with ibm cloud services
os
Research_Serverless-FastAPI-Deployment
research serverless fastapi deployment deployment of fastapi with serverless computing and operational efficiency in cloud native environments korea conference on software engineering 2023
cloud
dusk-blockchain
dusk network node official golang reference implementation of the dusk network protocol actions status https github com dusk network dusk blockchain workflows continuous 20integration badge svg https github com dusk network dusk blockchain actions codecov https codecov io gh dusk network dusk blockchain branch master graph badge svg https codecov io gh dusk network dusk blockchain go report card https goreportcard com badge github com dusk network dusk blockchain style flat square https goreportcard com report github com dusk network dusk blockchain pkggodev https pkg go dev badge github com dusk network dusk blockchain https pkg go dev github com dusk network dusk blockchain specification requirements the following requirements are defined for running an active dusk node depending on the role your node plays and how much functionality it exposes the utilization of the node might vary significantly minimum specifications cpu ram storage network connection 4 cores 2 ghz 4 gb 100 gb 10 mbps recommended specifications cpu ram storage network connection 8 cores 2 ghz 8 gb 250 gb 25 mbps installation guide this guide is for building the node from source if you would like to just download the compiled program head over to the releases https github com dusk network dusk blockchain releases page which should include a pre built dusk node and a pre built wallet executable note this guide assumes you are building and running from a unix like operating system the node is not tested on windows requirements go https golang org 1 17 or newer installation download the codebase and navigate into the folder bash git clone git github com dusk network dusk blockchain git cd dusk blockchain get the project dependencies by running bash go get github com dusk network dusk blockchain to build the binary simply run bash make build finally to start your node type bash bin dusk config dusk toml wallet the wallet is hosted in a separate repository found here https github com dusk network wallet cli how to use the wallet for more information on how to install configure and run the cli wallet see the documentation here https github com dusk network wallet cli tree main src bin rusk rusk is an important separate service that should be ran next to the node rusk is a powerful wrapper around the vm execution engine that provides the genesis contracts and gives the vm access to host functions rusk is hosted in a separate repository found here https github com dusk network rusk how to use rusk for more information on how to install configure and run the rusk see the documentation here https github com dusk network rusk readme license the dusk network blockchain client is licensed under the mit license see the license file license for details contributing please see the contribution guidelines contributing md for details
blockchain cryptocurrency golang
blockchain
backyard-design-system
backyard https www lowes com middot github license https img shields io badge license mit blue svg https github com bryantjderosier templates blob main license prs welcome https img shields io badge prs welcome brightgreen svg https www lowes com a target blank href https github com lowes backyard design system img alt backyard logo src imgs backyard logo png a br backyard is lowe s open source design system for digital products and experiences with the backyard design tokens as its foundation the system consists of working code design tools and resources human interface guidelines and a vibrant community of contributors br guiding principles backyard is open the design system is a distributed effort guided by the principles of the open source movement backyard s users are also its makers and everyone is encouraged to contribute backyard is inclusive it s designed and built to be accessible to all regardless of ability or situation backyard is modular and flexible the system s modularity ensures maximum flexibility in execution its components are designed to work seamlessly with each other in whichever combination suits the needs of the user backyard builds consistency based on the backyard design tokens every element and component of backyard was designed from the ground up to work elegantly together to ensure consistent cohesive user experiences br ecosystem project status description tokens tokens status tokens pkg the styling engine for backyard icons icons status icons pkg icon pack as react components react react status react pkg react ui component library tokens https github com lowes backyard design system icons https github com lowes backyard design system react https github com lowes backyard design system tokens status https img shields io npm v lowes tech bds tokens svg icons status https img shields io npm v lowes tech bds icons svg react status https img shields io npm v lowes tech bds react svg tokens pkg https www npmjs com package lowes tech bds tokens icons pkg https www npmjs com package lowes tech bds icons react pkg https www npmjs com package lowes tech bds react br installation npm bash npm install save dev lowes tech bds tokens lowes tech bds icons lowes tech bds react yarn bash yarn add d lowes tech bds tokens lowes tech bds icons lowes tech bds react pnpm bash pnpm add d lowes tech bds tokens lowes tech bds icons lowes tech bds react br setup create the theme provider jsx import react from react import themeprovider from lowes tech bds react const app props return themeprovider theme light props children themeprovider add global styles js globals ts import createglobalstyle from styled components import fonts from lowes tech bds react import themevariables from lowes tech bds tokens const globalstyles createglobalstyle themevariables fonts other css global code eg resets export globalstyles export default globalstyles jsx app tsx import react from react import themeprovider from lowes tech bds react import globalstyles from globals const app props return themeprovider theme light font roboto globalstyles props children themeprovider export app export default app br contributing the main purpose of this repository is to continue evolving backyard making it faster and easier to use development of backyard happens in the open on github and we are grateful to the community for contributing bugfixes and improvements read below to learn how you can take part in improving react code of conduct code of conduct md backyard has adopted a code of conduct that we expect 
project participants to adhere to please read the full text code of conduct md so that you can understand what actions will and will not be tolerated contributing guide contributing md read our contributing guide contributing md to learn about our development process how to propose bugfixes and improvements and how to build and test your changes to backyard good first issues https github com bryantjderosier templates labels good 20first 20issue to help you get your feet wet and get you familiar with our contribution process we have a list of good first issues https github com bryantjderosier templates labels good 20first 20issue that contain bugs that have a relatively limited scope this is a great place to get started license backyard is mit licensed license
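as a quick orientation for new users, here is a hedged usage sketch that builds on the setup above. the Button component and its variant prop are hypothetical placeholders (left in comments so the sketch still compiles); check the package's actual exports before copying this.

```jsx
// Hypothetical example: render a Backyard component inside the provider.
// `Button` and its `variant` prop are assumed for illustration only.
import React from 'react'
import { ThemeProvider } from '@lowes-tech/bds-react'
import GlobalStyles from './globals'

const Example = () => (
  <ThemeProvider theme="light" font="roboto">
    <GlobalStyles />
    {/* Any Backyard component goes here, e.g. a primary action button: */}
    {/* <Button variant="primary">Add to Cart</Button> */}
  </ThemeProvider>
)

export default Example
```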
os
composer
hyperledger composer warning warning warning as of the 29th august 2019 the hyperledger composer project is in deprecated status none of the maintainers are actively developing new features none of the maintainers are actively providing support via github issues however if you wish to submit code changes via pull requests these will be merged it is highly recommended that you use hyperledger fabric v1 4 instead which features significant improvements to the developer experience including a new programming model more information available here what s new in hyperledger fabric v1 4 https hyperledger fabric readthedocs io en release 1 4 whatsnew html improved programming model for developing applications warning warning warning hyperledger composer is an application development framework which simplifies and expedites the creation of hyperledger fabric https hyperledger fabric readthedocs io en latest blockchain applications if you re new to blockchain hyperledger fabric or hyperledger composer we recommend that you start at the hyperledger composer website stable release website https hyperledger github io composer next release website https hyperledger github io composer next this site will help you get up and running by developing a sample blockchain application to buy and sell houses and apartments in a digital property business network build status https travis ci org hyperledger composer svg branch master https travis ci org hyperledger composer cii best practices https bestpractices coreinfrastructure org projects 1071 badge https bestpractices coreinfrastructure org projects 1071 for additional help with hyperledger composer the following are good places ask a question on stack overflow http stackoverflow com questions tagged hyperledger composer chat on the rocket chat discussion channels https chat hyperledger org channel composer contributing to this repository we welcome contributions to the code base if you are interested in becoming a contributor please read the contributing guide contributing md that covers the following getting started contrib notes getting started md coding guidelines contrib notes coding guidelines md raising an issue contrib notes raising issues md submitting a pull request contrib notes submitting pull request md there is a specific channel https chat hyperledger org channel composer dev on rocketchat for contributors getting started with building an application try the online playground https composer playground mybluemix net to get going quickly suggested reading list is introduction https hyperledger github io composer latest introduction introduction html introduction video https www youtube com watch v fdfusrsv5iw t 23s quick start https hyperledger github io composer latest installing installing index html tutorials https hyperledger github io composer latest tutorials tutorials html getting in touch if you have a question on using hyperledger composer rocket chat discussion channels https chat hyperledger org channel composer stack overflow http stackoverflow com questions tagged hyperledger composer where the question should be tagged with hyperledger composer if you have found a defect or want to raise a feature requests all tracked on github please read how to raise contrib notes raising issues md if you want to contribute to the develop of hyperledger composer come introduce yourself on the contributors rocketchat channel https chat hyperledger org channel composer dev please read the contributing guide contributing md license a name license a 
hyperledger project source code files are made available under the apache license version 2 0 apache 2 0 located in the license file hyperledger project documentation files are made available under the creative commons attribution 4 0 international license cc by 4 0 available at http creativecommons org licenses by 4 0
hyperledger composer blockchain distributed-ledger
blockchain
react-starter
react starter create react app https create react app dev eslint https eslint org babel https babeljs io typescript jest postcss npm prettier https prettier io eslint husky https github com typicode husky lintstaged https www npmjs com package lint staged eslint commitlint https github com conventional changelog commitlint husky git commit message gh pages https www npmjs com package gh pages github pages https pages github com css in js styled components https www styled components com react router https reacttraining com favicon favorite icons https realfavicongenerator net public splash splash https appsco pe developer splash screens public splash components containers pages yarn vscode eslint prettier vscode commitizen markdownlint react starter https colors ichuantong cn github https stars yangerxiao com repo https github com zerosoul chinese colors https works yangerxiao com honeyed words generator https works yangerxiao com breathe relaxer https works yangerxiao com strong password generator http works yangerxiao com http works yangerxiao com
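the hook wiring that ties these tools together typically looks like the sketch below, shown as husky v4-style config files; this starter's actual setup may differ, and the glob patterns and commands are placeholders.

```javascript
// .huskyrc.js -- run checks on Git hooks (husky v4-style; illustrative)
module.exports = {
  hooks: {
    // Lint and format only the staged files before each commit
    'pre-commit': 'lint-staged',
    // Reject commit messages that don't follow the conventional format
    'commit-msg': 'commitlint -E HUSKY_GIT_PARAMS',
  },
};
```

```javascript
// lint-staged.config.js -- what "lint-staged" runs on staged files
module.exports = {
  '*.{js,jsx}': ['eslint --fix', 'prettier --write'],
};
```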
boilerplate starter-kit react
front_end
Machine-Learning-Flappy-Bird
machine learning for flappy bird using neural network and genetic algorithm here is the source code for a html5 project that implements a machine learning algorithm in the flappy bird video game using neural networks and a genetic algorithm the program teaches a little bird how to flap optimally in order to fly safely through barriers as long as possible you can find the complete tutorial with many more details and a demo here http www askforgametask com tutorial machine learning algorithm flappy bird http www askforgametask com tutorial machine learning algorithm flappy bird here you can also watch a short video with a simple presentation of the algorithm https www youtube com watch v aewmdojejf0 https www youtube com watch v aewmdojejf0 all code is written in html5 using phaser framework http phaser io and synaptic neural network library https synaptic juancazala com for neural network implementation flappy bird screenshot https raw githubusercontent com ssusnic machine learning flappy bird master screenshots flappy 10 png flappy bird screenshot neural network architecture to play the game each unit bird has its own neural network consisting of the following 3 layers 1 an input layer with 2 neurons representing what a bird sees 1 horizontal distance between the bird and the closest gap 2 height difference between the bird and the closest gap 2 a hidden layer with 6 neurons 3 an output layer with 1 neuron used to provide an action as follows if output > 0.5 then flap else do nothing flappy bird neural network https raw githubusercontent com ssusnic machine learning flappy bird master screenshots flappy 06 png flappy bird neural network the synaptic neural network library https synaptic juancazala com is used to implement the entire artificial neural network instead of making a new one from scratch the main concept of machine learning the main concept of machine learning implemented in this program is based on the neuro evolution form it uses evolutionary algorithms such as a genetic algorithm to train artificial neural networks here are the main steps 1 create a new population of 10 units birds with a random neural network 2 let all units play the game simultaneously by using their own neural networks 3 for each unit calculate its fitness function to measure its quality as fitness = total travelled distance - distance to the closest gap flappy bird fitness https raw githubusercontent com ssusnic machine learning flappy bird master screenshots flappy 08 png flappy bird fitness 4 when all units are killed evolve the current population into the next one using genetic algorithm operators selection crossover and mutation as follows 1 sort the units of the current population in decreasing order by their fitness ranking 2 select the top 4 units and mark them as the winners of the current population 3 the 4 winners are directly passed on to the next population 4 to fill the rest of the next population create 6 offspring as follows 1 offspring is made by a crossover of the two best winners 3 offspring are made by a crossover of two random winners 2 offspring are direct copies of two random winners 5 to add some variation apply random mutations on each offspring 5 go back to step 2 implementation requirements since the program is written in html5 using phaser framework http phaser io and synaptic neural network library https synaptic juancazala com you need these files phaser min js synaptic min js gameplay js the entire game logic is implemented in the gameplay js file it consists of the following classes app
main the main routine with the following essential functions preload to preload all assets create to create all objects and initialize a new genetic algorithm object update to run the main loop in which the flappy bird game is played by using ai neural networks and the population is evolved by using genetic algorithm drawstatus to display information of all units treegroup class extended phaser group class to represent a moving barrier this group contains a top and a bottom tree sprite tree class extended phaser sprite class to represent a tree sprite bird class extended phaser sprite class to represent a bird sprite text class extended phaser bitmaptext class used for drawing text genetic js the genetic algorithm is implemented in genetic js file which consists of the following class geneticalgorithm class the main class to handle all genetic algorithm operations it needs two parameters max units to set a total number of units in population and top units to set a number of top units winners used for evolving population here are its essential functions reset to reset genetic algorithm parameters createpopulation to create a new population activatebrain to activate the ai neural network of an unit and get its output action according to the inputs evolvepopulation to evolve the population by using genetic operators selection crossover and mutations selection to select the best units from the current population crossover to perform a single point crossover between two parents mutation to perform random mutations on an offspring
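the evolution step described above maps to code roughly as follows. this is an illustrative javascript sketch, not the repo's exact implementation: it treats each unit's brain as a flat array of connection weights, whereas the real geneticalgorithm class operates on synaptic network objects.

```javascript
// Single-point crossover: splice two parent weight arrays at a random cut.
function crossover(parentA, parentB) {
  const cut = Math.floor(Math.random() * parentA.length);
  return parentA.slice(0, cut).concat(parentB.slice(cut));
}

// Mutation: randomly perturb some weights to add variation.
function mutate(weights, rate = 0.2) {
  return weights.map(w =>
    Math.random() < rate ? w + (Math.random() - 0.5) * 0.5 : w);
}

// One evolution step for a population already sorted by fitness (best first).
function evolvePopulation(sorted) {
  const winners = sorted.slice(0, 4);
  const pick = () => winners[Math.floor(Math.random() * winners.length)];

  const next = winners.map(w => w.slice());              // 4 winners pass through
  next.push(mutate(crossover(winners[0], winners[1])));  // 1 child of the two best
  for (let i = 0; i < 3; i++) next.push(mutate(crossover(pick(), pick())));
  for (let i = 0; i < 2; i++) next.push(mutate(pick().slice()));
  return next;                                           // 10 units in total
}
```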
machine-learning flappy-bird genetic-algorithm machine-learning-algorithm artificial-intelligence neuroevolution ai-tutorial ai artificial-evolution game-programming html5 javascript phaser phaser-tutorial machinelearning machine-intelligence neural-networks neural-network genetic-algorithms flappybird
ai
CPS353
cps353 software engineering cloud based business website
cloud
IIT
iit introduction to information technology
server
embedded
embedded lab code from embedded system design class
os
DL-ML-project
dl ml project deep learning machine learning project content convolutional extractor https github com roguesir dl project tree master convolutional extractor expression recognition https github com roguesir dl project tree master expression recognition face scoring https github com roguesir dl project tree master face scoring web recruitment data analysis https github com roguesir dl project tree master web recruitment data analysis sentiment analysis for comment data https github com roguesir dl ml project tree master sentiment analysis for comment data note tensorflow
deep-learning
ai
Papers4NLPwithLLM
downstream tasks with llm dataset 1 is gpt 3 a good data annotator paper https arxiv org abs 2212 10450 2 chatgpt outperforms crowd workers for text annotation tasks paper https arxiv org abs 2303 15056 3 exploiting asymmetry for synthetic training data generation synthie and the case of information extraction paper https arxiv org pdf 2303 04132 pdf ie 1 code4struct code generation for few shot structured prediction from natural language paper https arxiv org pdf 2210 12810 pdf 2 exploiting asymmetry for synthetic training data generation synthie and the case of information extraction paper https arxiv org pdf 2303 04132 pdf 3 zero shot information extraction via chatting with chatgpt paper https arxiv org abs 2302 10205 code https github com cocacola lab chatie 4 large language model is not a good few shot information extractor but a good reranker for hard samples paper https arxiv org abs 2303 08559 5 thinking about gpt 3 in context learning for biomedical ie paper https aclanthology org 2022 findings emnlp 329 code https github com dki lab few shot bioie 6 large language models are few shot clinical information extractors paper https arxiv org abs 2205 12689 8 zero shot clinical entity recognition using chatgpt paper https arxiv org pdf 2303 16416v1 pdf 9 yes but can chatgpt identify entities in historical documents paper https arxiv org abs 2303 17322 10 evaluation of chatgpt on information extraction paper https github com ridonghan evaluation of chatgpt on information extraction 11 improving chatbot responses with information extraction paper https ai fools com knowledge information extraction 12 https huggingface co spaces shad0ws information extraction with chatgpt https huggingface co spaces shad0ws information extraction with chatgpt 13 gpt4ie code https github com cocacola lab gpt4ie summarization 1 chatgpt as a factual inconsistency evaluator for abstractive text summarization paper https arxiv org abs 2303 15621 2 large language models are diverse role players for summarization evaluation paper https arxiv org abs 2303 1507 machine translation 1 towards making the most of chatgpt for machine translation paper https arxiv org pdf 2303 13780 pdf 2 error analysis prompting enables human like translation evaluation in large language models a case study on chatgpt paper https arxiv org pdf 2303 13809 pdf error correction 1 chatgpt or grammarly evaluating chatgpt on grammatical error correction benchmark paper https arxiv org pdf 2303 13648 pdf 2 an analysis of gpt 3 s performance in grammatical error correction paper https arxiv org abs 2303 14342 others 1 is chatgpt a general purpose natural language processing task solver 2 comprehensive capability analysis of gpt 3 and gpt 3 5 series models paper https arxiv org ftp arxiv papers 2303 2303 10420 pdf 3 chatdoctor a medical chat model fine tuned on llama model using medical domain paper https arxiv org pdf 2303 14070v1 pdf
ai
auvsi-cv-all
auvsi cv all this entry contains all of the files and material for the online computer vision training for auvsi foundation teams
ai
tutorials
h2020 waziup wazihub projects waziup open innovation platform for iot big data in sub saharan africa feb 2016 jan 2019 wazihub accelerating open iot and big data innovation in africa may 2018 apr 2021 waziup is an h2020 european collaborative research project using cutting edge technological research applications on iot and related big data management and advanced analytics issues the project liaises with the whole iot european research cluster ierc and leading research and development organizations in africa the project is driven by a consortium of 5 eu partners and 7 partners from 4 sub saharan african countries it has support from multiple african stakeholders and public bodies with the aim of defining a new innovation space to advance the african rural economy the potential of iot in sub saharan africa can only be realized if the cost issue is resolved as most of the rural population in africa is at the poverty level waziup takes this challenge as the main one to be addressed wazihub swahili for open hub is an innovation project for africa aiming to create an open hub of iot and big data cutting edge and african grade solutions co designed by african people the vision of wazihub is to exploit iot potential and share iot technologies and best practices through the involvement of innovation communities and stakeholders e g young entrepreneurs including women startups developers and innovation hubs from local district regional national and africa wide levels the project aims to enable the creation of open hubs throughout africa where iot technology solutions can then be adapted to match local service needs the project goal is to iterate and extract value from spinning off value added iot innovative services e g monitoring controlling data analytics based on the technologies developed in waziup online tutorial new online arduino sensor lora tutorial http diy waziup io for training hackathons bootcamps entrepreneur s days here is the direct link to the lora part http diy waziup io sensors lora sensor lora sensor html slide based tutorials this set of step by step tutorials is part of wp2 on the open iot sensing and communication platform describing in images how to build low cost iot devices and gateways using the lora radio technology iot lpwan iot4all intro lr is a popular science presentation showing how the iot revolution can be made possible for everybody it reviews the main iot technologies then presents waziup s approach for low cost iot the presentation also shows various real world deployment campaigns done in the context of waziup for the latest version see the f iot 2a and f iot 2b courses on http diy waziup io http diy waziup io index html waziup iot courses lpwan review pdf is a review of lpwan technologies focusing mainly on lora for the latest version see the a iot 1 course on http diy waziup io http diy waziup io index html waziup iot courses smyle deploying low cost iot pdf is a summary of lpwan iot and waziup s objectives when deploying low cost and long range internet of things in developing countries this presentation was given at the smyle event in september 2016 devices demo slides pdf is a set of slides that we use for demonstrating our low cost lora iot framework faq pdf is our low cost lora framework frequently asked questions it is advised to read it first low cost iot hardware parts pdf lists all the parts you need to build both the low cost device and the gateway low cost lora iot step by step pdf shows how to build a simple lora iot sensing device with the simple temperature example and an
arduino pro mini running on 4 aa batteries for several months see also the d iot 1 course on http diy waziup io http diy waziup io index html waziup iot courses low cost lora iot outdoor step by step pdf shows how you can improve the design for outdoor usage low cost lora device leaflet pdf is a leaflet summarizing the iot device side low cost lora iot supported sensors pdf explains in a didactic manner how physical sensors can be connected and how they can be integrated into our generic framework low cost lora collar pdf shows a use case where a lora collar is used for preventing cattle rustling there are 2 versions a simple beacon version using only the lora radio module and a gps version with a gps module ublox 6 7 8 where gps coordinates of the collar can be received the arduino code for the collar is available here simple beacon https github com congducpham lowcostloragw tree master arduino arduino lora simple beaconcollar and gps https github com congducpham lowcostloragw tree master arduino arduino lora gps low cost lora imageiot step by step pdf shows how you can build a long range image sensor based on the teensy32 and a 4d system ucamii camera it also explains how the gateway receives the transmitted images and provides display features for them gateway low cost lora gw step by step pdf shows how you can build and configure the lora gateway with a raspberry pi to start pushing data to the cloud this tutorial explains in more detail the internals and architecture of the low cost gateway see also the d gw 1 and d gw 2 courses on http diy waziup io http diy waziup io index html waziup iot courses low cost lora gw web admin pdf explains the web admin interface extension to easily configure and update your gateway for the latest version see the d gw 4 course on http diy waziup io http diy waziup io index html waziup iot courses low cost lora gw leaflet pdf is a leaflet summarizing the gateway side low cost lora iot antennacable pdf is a tutorial on how to assemble an antenna cable with sma and or n connectors to match your antenna and radio module connectors this is mainly required when you want to use a higher gain antenna or when you want to place the antenna outdoors and have your gateway indoors to simplify deployment for the latest version see the d gw 3 course on http diy waziup io http diy waziup io index html waziup iot courses waziup deployment guidelines pdf describes some deployment issues and best practices when deploying iot devices and gateways for the latest version see the d iot 2 course on http diy waziup io http diy waziup io index html waziup iot courses talks ressacs16 low cost lora iot step by step pdf shows how you can use the gateway program to have a simple interactive sender node and how you can push data to the cloud this tutorial is complementary to the low cost lora iot step by step pdf tutorial low cost lora ghana ispace public event pdf is another presentation on how to build low cost iot with our framework low cost lora iot using demo kit pdf explains how the waziup long range demo kit can be used for demonstration purposes it will show how to use the out of the box gateway distribution raspberrypi jessie waziup demo iso zip http cpham perso univ pau fr lora waziup raspberrypi jessie waziup demo iso zip and the arduino lora simple temp sketch https github com congducpham lowcostloragw tree master arduino arduino lora simple temp example video tutorials there are also 3 tutorial videos on youtube build your low cost long range iot device with waziup https www youtube com watch v
yskbjeeav m extreme low cost low power lora iot for real world deployment https www youtube com watch v 2 vqpccwdd8 build your low cost lora gateway with waziup https www youtube com watch v mj8itka14py that show in images all the steps to build the whole framework from scratch enjoy c pham university of pau uppa
server
18ECO108J-Arduino
embedded system design using arduino sri ramaswami memorial srm ist all lab practicals of subject 18eco108j embedded system design using arduino software used proteus ide used arduino index 1 blink led 2 seven segment display 3 lcd display interface 4 temperature sensor interface with led indicator 5 servo motor interface 6 dc motor interface 7 stepper motor interface 8 serial communication watch implementation videos of all above mentioned projects https youtube com playlist list pl n i vft7fvzojkxnwd eledgb2zthb2
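the labs themselves are arduino sketches simulated in proteus, but lab 8 (serial communication) pairs naturally with a host-side reader. here is a minimal python sketch using pyserial; the port name and baud rate are assumptions, adjust them for your own setup.

```python
# read the arduino's serial output (e.g. from the temperature sensor lab)
import serial  # pip install pyserial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as port:
    for _ in range(10):  # read ten lines, then stop
        line = port.readline().decode("ascii", errors="replace").strip()
        if line:
            print("arduino says:", line)
```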
os
CalcHub
h1 align center calchub h1 h3 align center this open source repository is all about making new calculators on different aspects of mathematics h3 div align center a href https github com vasu 1 calchub img src https sloc xyz github vasu 1 calchub alt total lines a a href https github com vasu 1 calchub img src https img shields io github stars vasu 1 calchub alt stars a a href https github com vasu 1 calchub network members img src https img shields io github forks vasu 1 calchub alt forks a a href https github com vasu 1 calchub graphs contributors img alt github contributors src https img shields io github contributors vasu 1 calchub color 2b9348 a a href https github com vasu 1 calchub img src https badges frapsoft com os v2 open source svg alt open source a a href https open vscode dev organization repository img src https open vscode dev badges open in vscode svg alt open in visual studio code a div tech stack html https img shields io badge html5 20 23e34f26 svg style for the badge logo html5 logocolor white css https img shields io badge css3 20 231572b6 svg style for the badge logo css3 logocolor white js https img shields io badge javascript 20 23323330 svg style for the badge logo javascript logocolor 23f7df1e img alt bootstrap src https img shields io badge bootstrap 23563d7c svg style for the badge logo bootstrap logocolor white getting started fork this repository clone the repository git clone url of this repo raise an issue wait for the issue to be assigned to you create a branch git checkout b your new branch name put your code make a new folder in the calculators folder put your code files e g index html style css app js into your newly created folder add all necessary information like functionalities screenshots working video if required in the readme md file you will have to create it in your newly created folder add your folder s link in the main readme md file of the repo push changes to github git push origin add your branch name submit your changes for review and boom you re done i will review and merge your changes into the master branch of this project you will be automatically notified via e mail once the changes have been merged calculators list no name of calculator 1 affine cipher calculator calculators affine cipher calculator 2 age calculator calculators age calculator 3 arithmetic equation solver calculators arithmatic 20equation 20solver 4 basex calculator calculators basex calculator 5 basic physics calculator calculators basic physics 20calculator 6 binary calculator calculators binary calculator 7 binomial coefficient calculator calculators binomial coeff calc 8 bitwise calculator calculators bitwise calculator 9 bmi calculator calculators bmi calculator 10 electric vs petrol economy price calculator calculators electric 20vs 20petrol 20economy 20price 20calculator 11 catalan number calculator calculators catalan number calculator 12 color code converter calculators color code converter 13 complex number calculator calculators complex number calculator 14 compound interest calculator calculators compoundinterest calculator 15 covid calculator calculators covid 20calculator 16 cubic equation calculator calculators cubic equation calc 17 currency calculator calculators currency calculator 18 daily water intake calculator calculators daily water intake calculator 19 day calculator calculators daycalculator 20 digital storage calculator calculators digital storage calculator 21 divisors calculator calculators divisors 20calculator 22 emi
calculator calculators emi calculator 23 employee salary calculator calculators employee 20salary 20calculator 24 euclidean algorithm calculator lcm gcd calculators euclidean algorithm calculator 25 fraction calculator calculators fraction 20calculator 26 gpa calculator calculators gpa calculator index html 27 grade calculator calculators grade calculator 28 graph calculator calculators graph 20calculator 29 kinetic theory of gases calculator calculators kinetic theory of gases calculator 30 log antilog calculator calculators log calculator 31 matrix adjoint and inverse calculator calculators matrix 20adjoint 20and 20inverse 20calculator 32 modulo inverse modulo calculator calculators modulo inverse modulo calculator 33 multiplication table calculators multiplication table 34 number system conversion calculators numbersystemcoversion calculator 35 permutation and combination calculator calculators permutation 20and 20combination 20calculator 36 prime factor calculator calculators primefactors calculator 37 quadratic equation calculator calculators quadratic equation calculator 38 resonant frequency calculator calculators resonant frequency calculator 39 roman to integer calculator calculators roman to integer calculator 40 scientific calculator calculators scientific calculator 41 simple arithmetic calculators simple arithmetic 42 simple interest calculator calculators simple interest 43 simple matrix calculator calculators matrix operations calculator 44 sleep calculator calculators sleep calculator 45 statistics calculator calculators statistics 20calculator 46 stock profit and loss calculator calculators stock profit 20 and loss calculator 47 tax calculator calculators incometaxcalculator 48 temperature calculator calculators temperature calculator 49 tip calculator calculators tip calculator 50 transpose 3x3 matrix calculator calculators transpose 3x3 matrix calculator 51 twin paradox calculator calculators twin paradox 20calculator 52 unit converter calculators unit 20converter 53 vat calculator calculators vat calculator 54 vigenere cipher calculator calculators vigenerecipher calculator 55 visual type scale calculator calculators visual 20type 20scale 20calculator 56 volume calculator calculators volume 20calculator 57 wind chill calculator calculators wind chill calculator 58 pythagoras calculator calculators pythagorus 20calculator 59 numerical methods calculator calculators numerical methods calculator 60 time remaining calculator calculators time 20remaining contributing guidelines read our contributing guidelines github contributingguidelines md to learn about our development process how to propose bugfixes and improvements and how to build calchub code of conduct this project and everyone participating in it is governed by the code of conduct code of conduct md by participating you are expected to uphold this code credits this project exists thanks to all the people who contribute a href graphs contributors img src https contrib rocks image repo vasu 1 calchub a contribution is fun forthebadge https forthebadge com images badges built with love svg https forthebadge com
calculators open-source html css javascript canva collaborate ghdesktop github github-campus-experts github-pages icons8 namecheap tech bootstrap hacktoberfest hacktoberfest-accepted
front_end
BasicDSPLibrary
basicdsplibrary this is a library of basic dsp functions intended for creating various filters primarily for use with embedded systems as an arduino library it is in active use reducing noise on joystick and potentiometer inputs filter builders this implements some simple builders that let users construct fir and iir filters by just inputting coefficients
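to make the coefficient-driven idea concrete, here is a minimal python sketch of a direct-form fir filter built from user-supplied coefficients (the actual library is c++ for arduino, and the 5-tap moving-average coefficients here are just an example for smoothing a jittery potentiometer reading).

```python
from collections import deque

class FIRFilter:
    """direct-form fir filter defined entirely by its coefficients."""
    def __init__(self, coefficients):
        self.coeffs = list(coefficients)
        self.history = deque([0.0] * len(self.coeffs), maxlen=len(self.coeffs))

    def update(self, sample):
        # shift in the newest sample and return the weighted sum of the taps
        self.history.appendleft(sample)
        return sum(c * x for c, x in zip(self.coeffs, self.history))

# 5-tap moving average: smooths jitter on an analog input
smoother = FIRFilter([0.2, 0.2, 0.2, 0.2, 0.2])
for raw in [512, 518, 509, 700, 515]:  # 700 is a noise spike
    print(round(smoother.update(raw), 1))
```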
os
app
kickback dapp dev version https rinkeby kickback events live version https kickback events dev guide currently only the frontend is open sourced if you are working on the integration and need to connect to our backend please contact the kickback team contribution style guide at kickback we adhere to some basic front end coding style guides so if you are looking to contribute please follow these rules styles styles are added via the library emotion which is a styled component style library you can add a new styled component using the emotion styled library js import styled from emotion styled const calltoaction styled div background green color red styles can be kept in the same file as the component you re adding them into unless they are being reused somewhere else in that case they can be abstracted out of the file and imported like a normal component components components need to be inside the components folder we are currently migrating to functional components and hooks so all new components must be functional components and use hooks if they need state or access to external apis do not use arrays to group sibling jsx instead use the short fragment syntax js function component return button this is a button button othercomponent importing and exporting for imports we use the es6 import syntax over commonjs we import all npm packages first such as react emotion or apollo then we group imports based on what they are and separate them with a space js import react from react import usequery from react apollo import get profile from graphql queries import set profile from graphql mutations import button from components button for exports we generally have one default export for components if it makes sense to have multiple exports in a component we generally separate that component into a new file the only exception is a group of reusable styled components that have no specific hierarchy such as buttons or inputs js normal component export default function somecomponent component internals js reusable components export const button1 styled button export const button2 styled button variable declaration generally we use const for everything especially styled components any variable that needs to be reassigned will use let var is not used at all throughout the code base all variables are camel cased unless they are a constant in which case they are capitalised and snake cased if the constant is used throughout the project abstract the constant to an appropriate file level so that all components or files can import it without going up and down the file tree you should only need to go up the file tree to find the constant js const component styled div function sidebar const constant value 42 let reassignablevar null const data usequery query most destructured values will use const unless otherwise needed rest of the component file naming all components are pascal cased all javascript files that aren t components are camel cased anything that isn t a component is also camel cased connecting to external data sources blockchain and backend currently kickback injects data into its react front end using the apollo graphql library to connect to the blockchain we use a client side resolver that connects to web3 this way our app does not need to know how it is connecting to the blockchain that is all handled by our data library apollo in the same way we also connect to our back end via apollo which does not require anything apart from the graphql queries and mutations setup clone the repo and install dependencies git
clone https github com wearekickback app git cd app yarn generate src config env json yarn setup rinkeby start the development server yarn start test creating an event locally go to http localhost 3000 create fill in the event details leave the password blank and press submit automated e2e tests with cypress not available to the public make sure the local environment is up and running with the local contract ganache and server when running ganache run it with 500 accounts ganache cli accounts 500 b 3 run the seeding script to deploy some seed parties bash yarn run seedparty creating admin profile deployed new party at address 0x073c8e6c4653a150178d4cdf501e455e55c26ba4 new rsvp 0x5656d12b67cca4cf8d300a6d7f541bab0965e443 at party super duper at address 0x073c8e6c4653a150178d4cdf501e455e55c26ba4 new rsvp 0xb37e1d697753ee89d572d6cf56defcdfb55236d5 at party super duper at address 0x073c8e6c4653a150178d4cdf501e455e55c26ba4 admin account adm1547825705218 already exists deployed new party at address 0xf4d12f3e5ca4d66c2196942135fda78c8f3a90d1 new rsvp 0xb37e1d697753ee89d572d6cf56defcdfb55236d5 at party super duper 2 at address 0xf4d12f3e5ca4d66c2196942135fda78c8f3a90d1 new rsvp 0x5656d12b67cca4cf8d300a6d7f541bab0965e443 at party super duper 2 at address 0xf4d12f3e5ca4d66c2196942135fda78c8f3a90d1 seeding parties complete ready to run cypress tests done in 14 11s note if the output does not show contract addresses there is a possibility that the web3 version you are using may have a bug https github com ethereum web3 js issues 1916 if the bug is not fixed try to manually change the web3 js code as described here https ethereum stackexchange com questions 61073 uncaught error returned values arent valid did it run out of gas bash yarn run cypress open the seed script must be re run to test again deploying to xdai env manually vercel login vercel switch wearekickback vercel f local config deploy now xdai json public prod deploying to polygon env manually vercel login vercel switch wearekickback 1 make sure vercel project json points to the polygon project yarn build release polygon vercel f local config deploy now polygon json public prod
front_end
lively4-core
an explorative self supporting web based development environment build status github https github com livelykernel lively4 core actions workflows ci yml badge svg https github com livelykernel lively4 core actions query workflow 3aci getting started lively import src https lively kernel org lively4 lively4 petrinet doc navigation html alt lively import directly start developing in your chrome by visiting lively4 https lively kernel org lively4 lively4 core start html and following the getting started guide doc tutorial index md through checking out and serving git repositories with the lively4 server doc lively4 server md we can directly edit and use the environment in a self supporting way a project or branch of a project on github can be checked out in multiple directories at once working on the same url allows you to collaborate in a very tight way working in a separate directory provides more distance lively4 core https lively kernel org lively4 lively4 core start html lively4 stable https lively kernel org lively4 lively4 stable start html lively4 jens https lively kernel org lively4 lively4 jens start html the lively4 server and github sync tools can check out arbitrary projects such as the code of the lively4 server https lively kernel org lively4 lively4 server itself or the source of a paper hosted by overleaf authors contributors software architecture group https www hpi uni potsdam de hirschfeld hasso plattner institute https www hpi de 2015 2021 mit license license jens lincke stefan ramson tim felgentreff fabio niephaus robert hirschfeld marcel taeumel seminars live programming live 2021 https lively kernel org lively4 lively4 seminars live2021 software design swd 2015 https lively kernel org lively4 lively4 seminars swd2015 index md swd 2016 https lively kernel org lively4 lively4 seminars swd2015 index md swd 2021 https lively kernel org lively4 lively4 seminars swd2021 web based development webdev 2016 https lively kernel org lively4 lively4 seminars webdev2016 index md webdev 2017 18 https lively kernel org lively4 lively4 seminars webdev2017 index md programming experience px 2018 https lively kernel org lively4 lively4 seminars px2018 index md reactive programming rp 2018 https lively kernel org lively4 lively4 seminars rp2018 index md imprint imprint md
lively4 lively-kernel live-programming development-environment
front_end
badbankfinal
badbank mern description this full stack banking application was created as a capstone project for the mit xpro professional certificate in coding full stack development with mern bootcamp it was initially built via create react app and connected to the backend with express and a mongodb cloud solution installation guidelines the app can be run via a local server which requires the following steps clone the repository to your local machine install node js run npm install to create the node modules in the project folder run npm install express run npm install cors run npm install mongodb run node index js to start the server in the project folder browse to http localhost 3000 technology used mern mongodb express react node js heroku features create user login various simulated bank functions withdraw deposit balance all data etc license mit screenshot 1 https user images githubusercontent com 90288071 176186071 fa768958 717c 4415 8fbe 4ef2c29c1711 png deployed as a heroku app as well https badbankfinal herokuapp com screenshot 2 https user images githubusercontent com 90288071 176171113 dadd0abc 0672 4e67 95d4 150fae511417 png
express mongodb nodejs react
server
pwa-ecommerce-demo
convert an e commerce app to a pwa e commerce demo app for the progressive web applications instructor led training this is not an official google product in maintenance mode this is based on web starter kit 0 6 4 which is no longer being maintained https github com google web starter kit issues 940 the code still runs but note the deprecation warnings from npm this repository is in maintenance mode security issues are getting fixed but new development would require moving it off the web starter kit the paymentrequest code fails on safari https github com google developer training pwa ecommerce demo issues 18 the linked issue supplements the lab notes and contains instructions for fixing it getting started to get started check out the instructions on developers google com https developers google com web ilt pwa challenge convert an e commerce site to a pwa
front_end
embedded-system-learning
embedded system learning a group of embedded system projects and related diagrams and schematics documenting my journey of re learning embedded systems design i haven t done much with embedded systems since college i have been wanting to get back into it for some robotics projects i plan to design and create but before i can jump into a big project like that i need to grasp the basics once again which is exactly what this directory is for traffic lights a simple starting project using 3 leds the lights are lit up in the pattern green yellow red before all turning back off each step is 1 second long img src traffic light image jpeg alt image of traffic light project width 400 br binary counter a 0 7 counter using three leds to represent the value in binary and a button to increase the value by one each press img src binary counter image jpeg alt image of binary counter project width 400
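the projects run on microcontroller hardware, but the counter logic itself is easy to sketch in plain python; the msb-first bit order and wrap-around behaviour below are my assumptions for illustration, not taken from the project code.

```python
# simulate the 0-7 binary counter: each button press increments the value
# and the three leds display its bits
def led_states(value):
    return [(value >> bit) & 1 for bit in (2, 1, 0)]  # msb first (assumed order)

value = 0
for press in range(10):          # ten presses, showing the wrap past 7
    print(f"value={value} leds={led_states(value)}")
    value = (value + 1) % 8      # 0..7, then back to 0
```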
os
Duke-MLSS-2018
duke machine learning summer school 2018 welcome to the duke tsinghua machine learning summer school 2018 https www fuqua duke edu machine learning summer school this repository will contain the lecture materials and assignments for the hands on tensorflow sessions while there is no hard requirement to attend these sessions or complete the exercises we do strongly recommend them many of the machine learning concepts being covered throughout the week are best learned and reinforced by implementing the ideas in code yourself also if you are a duke student hoping for a certification of completion of the course on your transcript the assignments are how we determine if you pass please come ready to code before you arrive required please have python 3 and tensorflow installed we will be doing a lot of our development in ipython notebooks so you ll likely want to have jupyter installed as well or have access to colab https research google com colaboratory if you don t already have the aforementioned software installed please go through the notebook labeled 00a tensorflow installation ipynb https github com duke mlss duke mlss 2018 blob master 00a tensorflow installation ipynb installing these tools should take about 5 10 minutes optional given the pace of the course we ll be assuming some background knowledge for scientific computing in python if you are unfamiliar with ipython notebooks or python coding environments a brief introduction can be found in 00b coding environments ipynb https github com duke mlss duke mlss 2018 blob master 00b coding environments ipynb if you haven t used python before or want a refresher we recommend python like you mean it https www pythonlikeyoumeanit com intro html by ryan soklaski this free e book consists of five short modules introducing python for scientific computing and data analysis modules 1 and 2 on installing python and python essentials will be especially useful module 3 which concerns the manipulation of matrices and vectors in python is very relevant but optional reading as we will also be covering those topics in our sessions additionally we ll be releasing new lecture materials to this github repository each day of the course if you re familiar with git the most seamless way to keep your files up to date is by cloning forking this repository and pulling if you need a primer on git there s one available in 00c git basics ipynb https github com duke mlss duke mlss 2018 blob master 00c git basics ipynb but learning how to use git isn t required we ll distribute the materials in other ways as well
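before the first session it is worth confirming the setup actually imports; a minimal sanity check, assuming python 3 with tensorflow and numpy installed in the active environment:

```python
import sys
import numpy as np
import tensorflow as tf

print(sys.version)       # expect a python 3.x version string
print(np.__version__)
print(tf.__version__)    # confirms tensorflow is importable
```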
ai
Twitter-Sentiment-Analysis-Classical-Approach-VS-Deep-Learning
twitter sentiment analysis classical approach vs deep learning img src images love scrable jpg style width 1000px margin bottom 15px span photo by a href https unsplash com gaellemarcel utm source unsplash amp utm medium referral amp utm content creditcopytext gaelle marcel a on a href https unsplash com s photos computer text utm source unsplash amp utm medium referral amp utm content creditcopytext unsplash a span overview this project s aim is to explore the world of natural language processing nlp by building what is known as a sentiment analysis model a sentiment analysis model is a model that analyses a given piece of text and predicts whether this piece of text expresses positive or negative sentiment center img src images sentiment classification png style width 800px margin bottom 15px center to this end we will be using the sentiment140 dataset containing data collected from twitter an impressive feature of this dataset is that it is perfectly balanced i e the number of examples in each class is equal citing the creators http help sentiment140 com for students of this dataset our approach was unique because our training data was automatically created as opposed to having humans manually annotate tweets in our approach we assume that any tweet with positive emoticons like were positive and tweets with negative emoticons like were negative we used the twitter search api to collect these tweets by using keyword search after a series of cleaning and data processing and after visualizing our data in a word cloud we will be building a naive bayesian model this model s goal would be to properly classify positive and negative tweets in terms of sentiment next we will propose a much more advanced solution using a deep learning model lstm this process will require a different kind of data cleaning and processing also we will discover word embeddings dropout and many other machine learning related concepts throughout this notebook we will take advantage of every result visualization and failure in order to try and further understand the data extract insights and information from it and learn how to improve our model from the type of words used in positive negative sentiment tweets to the vocabulary diversity in each case and the day of the week in which these tweets occur to the overfitting concept and grasping the huge importance of the data while building a given model i really hope that you ll enjoy going through this notebook and gain not only technical skills but also analytical skills from it this notebook is written by joseph assaker feel free to reach out for any feedback on this notebook via email mailto lb josephassaker gmail com or linkedin https www linkedin com in joseph assaker now let s start with the fun table of contents 1 importing and discovering the dataset 2 cleaning and processing the data 2 1 tokenization 2 2 lemmatization 2 3 cleaning the data 3 visualizing the data 4 naive bayesian model 4 1 splitting the data 4 2 training the model 4 3 testing the model 4 4 asserting the model 5 deep learning model lstm 5 1 data pre processing nbsp nbsp nbsp nbsp 5 1 1 word embeddings nbsp nbsp nbsp nbsp 5 1 2 global vectors for word representation glove nbsp nbsp nbsp nbsp 5 1 3 data padding 5 2 data transformation 5 3 building the model 5 4 training the model 5 5 investigating possibilities to improve the model nbsp nbsp nbsp nbsp 5 5 1 regularization dropout nbsp nbsp nbsp nbsp 5 5 2 inspecting the data unknown words 5 6 predicting on custom data 5 7 inspecting wrongly predicted
data 6 bonus section 7 extra tip pickling 8 further work continue reading the whole notebook here https github com josephassaker twitter sentiment analysis classical approach vs deep learning blob master twitter 20sentiment 20analysis 20 20classical 20approach 20vs 20deep 20learning ipynb you can also find this notebook and give it an upvote on kaggle https www kaggle com josephassaker twitter sentiment analysis classical vs lstm
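as a hedged illustration of the deep-learning pipeline the notebook describes (tokenize, pad, embed, lstm, dropout), here is a minimal keras sketch; the vocabulary size, sequence length, embedding dimension, and toy data are stand-ins, not the notebook's actual settings.

```python
import numpy as np
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dropout, Dense

texts = ["i love this so much", "this is awful and sad"]  # stand-in for the tweet corpus
labels = np.array([1, 0])                                 # 1 = positive, 0 = negative

tokenizer = Tokenizer(num_words=10000)  # keep the 10k most frequent words
tokenizer.fit_on_texts(texts)
padded = pad_sequences(tokenizer.texts_to_sequences(texts), maxlen=50)

model = Sequential([
    Embedding(input_dim=10000, output_dim=100),  # word embeddings (glove vectors could be loaded here)
    LSTM(64),
    Dropout(0.5),                                # regularization against overfitting
    Dense(1, activation="sigmoid"),              # probability that the tweet is positive
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(padded, labels, epochs=1, verbose=0)
```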
nlp python sentiment-analysis sentiment-classification natural-language-processing lstm deep-learning
ai
sql-challenge
sql challenge readme md sql homework employee database a mystery in two parts it is a beautiful spring day and it is two weeks since you have been hired as a new data engineer at pewlett hackard your first major task is a research project on employees of the corporation from the 1980s and 1990s background all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data in other words you will perform data modeling data engineering data analysis before you begin create a new repository for this project called sql challenge do not add this homework to an existing repository clone the new repository to your computer inside your local git repository create a directory for the sql challenge use a folder name to correspond to the challenge employeesql add your files to this folder push the above changes to github instructions data modeling inspect the csvs and sketch out an erd of the tables feel free to use a tool like http www quickdatabasediagrams com data engineering use the information you have to create a table schema for each of the six csv files remember to specify data types primary keys foreign keys and other constraints import each csv file into the corresponding sql table data analysis once you have a complete database do the following list the following details of each employee employee number last name first name gender and salary list employees who were hired in 1986 list the manager of each department with the following information department number department name the manager s employee number last name first name and start and end employment dates list the department of each employee with the following information employee number last name first name and department name list all employees whose first name is hercules and last names begin with b list all employees in the sales department including their employee number last name first name and department name list all employees in the sales and development departments including their employee number last name first name and department name in descending order list the frequency count of employee last names i e how many employees share each last name
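as one illustration of the data analysis step, here is a minimal python sketch of the first query (employee number, last name, first name, gender, and salary). it assumes sqlite and table/column names inferred from the csv descriptions; your own schema design and database engine may differ.

```python
import sqlite3

# hypothetical schema: an `employees` table plus a `salaries` table keyed by emp_no
conn = sqlite3.connect("employees.db")
query = """
    SELECT e.emp_no, e.last_name, e.first_name, e.gender, s.salary
    FROM employees AS e
    JOIN salaries  AS s ON s.emp_no = e.emp_no
"""
for row in conn.execute(query).fetchmany(5):  # peek at the first few rows
    print(row)
conn.close()
```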
server
cuda-uvm-gpt2
cuda uvm gpt2 this repo evaluates the performance of pytorch uvm https github com kooyunmo pytorch uvm tree 53e458826f1895ab92c7b31a1c66fa60c29f84cd with extremely large scale language models e g gpt 2 gpt 3 pytorch uvm adopts cuda unified virtual memory a k a uvm to serve memory intensive models preventing the program execution from oom up to the cpu memory capacity uvm makes both cpu and gpu share the same virtual address space therefore even when the gpu memory is physically oversubscribed a case where vanilla pytorch raises oom a victim memory block is implicitly migrated to cpu physical memory space without any explicit data off loading command the evicted data can be migrated to gpu memory again when it is needed on demand how to build pytorch uvm prerequisites ubuntu 18 04 anaconda3 cuda 11 0 cudnn 8 0 4 for cuda 11 0 correct environment variables bash git clone recursive https github com kooyunmo cuda uvm gpt2 cd cuda uvm gpt2 pytorch uvm git checkout uvm create new conda environment conda create n uvm pytorch python 3 8 y conda activate uvm pytorch environment variables we need this setting for every installation and experiment export cuda home your cuda 11 0 path export cuda nvcc executable cuda home bin nvcc export cudnn lib dir cuda home lib64 export cudnn include dir cuda home include export cudnn library cuda home lib64 export path cuda home bin path export ld library path cuda home lib64 ld library path export cmake prefix path conda prefix dirname which conda install dependencies ensure prerequisites for pytorch build conda install numpy ninja pyyaml mkl mkl include setuptools cmake cffi typing y conda install c pytorch magma cuda110 y install onnx conda install c conda forge onnx y downgrade protobuf why https github com onnx onnx issues 2434 conda install c conda forge protobuf 3 9 y ensure prerequisites for caffe2 build pip install future run setup py build test 0 use distributed 0 use nccl 0 use numa 0 use mpi 0 python setup py install evaluate bash install requirements pip install r requirements txt run inference cuda visible devices gpu id python run gpt2 py model model name enable prefetch enable cudnn benchmark num streams num streams warmups num warmup step
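a minimal sketch of the behaviour described above, assuming the pytorch-uvm build is the one installed: keep allocating cuda tensors past physical gpu memory and, instead of the oom that vanilla pytorch raises, the uvm build should keep going by migrating victim blocks to cpu memory. the block size and count are arbitrary assumptions; size them past your gpu's capacity.

```python
import torch

blocks = []
for i in range(64):  # 64 x ~4 GiB far exceeds most single-gpu memories
    blocks.append(torch.empty(1024, 1024, 1024, device="cuda"))  # ~4 GiB of fp32 each
    print(f"allocated block {i + 1}, ~{4 * (i + 1)} GiB total")   # no oom expected under uvm
```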
ai
threadx
azure rtos threadx this advanced real time operating system rtos is designed specifically for deeply embedded applications among the multiple benefits it provides are advanced scheduling facilities message passing interrupt management and messaging services azure rtos threadx has many advanced features including picokernel architecture preemption threshold event chaining and a rich set of system services here are the key features and modules of threadx threadx key features docs threadx features png getting started azure rtos has been integrated into semiconductor vendors sdks and development environments you can develop using the tools of your choice from stmicroelectronics https www st com content st com en campaigns x cube azrtos azure rtos stm32 html nxp https www nxp com design software embedded software azure rtos for nxp microcontrollers azure rtos renesas https github com renesas azure rtos and microchip https mu microchip com get started simplifying your iot design with azure rtos we also provide a getting started guide https github com azure rtos getting started and samples https github com azure rtos samples using hero development boards from semiconductor vendors that you can build and test with see the overview of azure rtos threadx https learn microsoft com en us azure rtos threadx overview threadx for the high level overview and all documentation and apis can be found in the azure rtos threadx documentation https learn microsoft com en us azure rtos threadx there is also a dedicated learning path azure rtos threadx https learn microsoft com training paths azure rtos threadx for systematic learning repository structure and usage directory layout cmake cmakelist files for building the project common core threadx files common modules core threadx module files common smp core threadx smp files docs documentation supplements ports architecture and compiler specific files see below for directory breakdown cortex m7 iar example iar compiler sample project example build iar workspace and sample project files inc tx port h for this architecture src source files for this architecture ac6 example ac6 keil sample project gnu example gnu sample project ports modules architecture and compiler specific files for threadx modules ports smp architecture and compiler specific files for threadx smp samples demo threadx c utility test cases and utilities branches releases the master branch has the most recent code with all new features and bug fixes it does not represent the latest general availability ga release of the library each official release preview or ga will be tagged to mark the commit and push it into the github releases tab e g v6 2 rel when you see xx xx xxxx 6 x or x x in a function header this means the file is not officially released yet they will be updated in the next release see the example below function release tx initialize low level cortex m23 gnu 6 x author scott larson microsoft corporation description this function is responsible for any low level processor initialization including setting up interrupt vectors setting up a periodic timer interrupt source saving the system stack pointer for use in isr processing later and finding the first available ram memory address for tx application define input none output none calls none called by tx initialize kernel enter threadx entry function release history date name description 09 30 2020 scott larson initial version 6 1 xx xx xxxx scott larson include tx user h resulting in version 6 x supported architecture ports threadx arc em cortex a12 cortex m0 cortex
r4 arc hs cortex a15 cortex m23 cortex r5 arm11 cortex a17 cortex m3 cortex r7 arm9 cortex a34 cortex m33 c667x cortex a35 cortex m4 linux cortex a5 cortex m55 risc v32 cortex a53 cortex m7 rxv1 cortex a55 cortex m85 rxv2 cortex a57 rxv3 cortex a5x win32 cortex a65 xtensa cortex a65ae cortex a7 cortex a72 cortex a73 cortex a75 cortex a76 cortex a76ae cortex a77 cortex a8 cortex a9 threadx modules azure rtos threadx modules https learn microsoft com azure rtos threadx modules chapter1 component provides an infrastructure for applications to dynamically load modules that are built separately from the resident portion of the application cortex a35 cortex a35 smp cortex a7 cortex m0 cortex m23 cortex m3 cortex m33 cortex m4 cortex m7 cortex r4 rxv2 threadx smp azure rtos threadx smp https learn microsoft com azure rtos threadx threadx smp chapter1 is a high performance real time smp kernel designed specifically for embedded applications arc hs smp cortex a34 smp cortex a35 smp cortex a53 smp cortex a55 smp cortex a57 smp cortex a5x smp cortex a5 smp cortex a65ae smp cortex a65 smp cortex a72 smp cortex a73 smp cortex a75 smp cortex a76ae smp cortex a76 smp cortex a77 smp cortex a78 smp cortex a7 smp cortex a9 smp linux adaptation layer for threadx azure rtos threadx is an advanced real time operating system rtos designed specifically for deeply embedded applications to help ease application migration to azure rtos threadx provides adaption layers https github com azure rtos threadx tree master utility rtos compatibility layers for various legacy rtos apis freertos posix osek etc component dependencies the main components of azure rtos are each provided in their own repository but there are dependencies between them as shown in the following graph this is important to understand when setting up your builds dependency graph docs deps png you will have to take the dependency graph above into account when building anything other than threadx itself building and using the library instruction for building the threadx as static library using arm gnu toolchain and cmake if you are using toolchain and ide from semiconductor you might follow its own instructions to use azure rtos components as explained in the getting started getting started section 1 install the following tools cmake https cmake org download version 3 0 or later arm gnu toolchain for arm none eabi https developer arm com downloads arm gnu toolchain downloads ninja https ninja build org 1 cloning the repo bash git clone https github com azure rtos threadx git 1 define the features and addons you need in tx user h and build together with the component source code you can refer to tx user sample h https github com azure rtos threadx blob master common inc tx user sample h as an example 1 building as a static library each component of azure rtos comes with a composable cmake based build system that supports many different mcus and host systems integrating any of these components into your device app code is as simple as adding a git submodule and then including it in your build using the cmake add subdirectory while the typical usage pattern is to include threadx into your device code source tree to be built linked with your code you can compile this project as a standalone static library to confirm your build is set up correctly an example of building the library for cortex m4 bash cmake bbuild gninja dcmake toolchain file cmake cortex m4 cmake cmake build build professional support professional support plans https azure microsoft com 
support options are available from microsoft for community support and others see the resources resources section below licensing license terms for using azure rtos are defined in the license txt file of this repo please refer to this file for all definitive licensing information no additional license fees are required for deploying azure rtos on hardware defined in the licensed hardware txt file if you are using hardware not defined in the licensed hardware txt file or have licensing questions in general please contact microsoft directly at https aka ms azrtos license resources the following are references to additional azure rtos resources product introduction and white papers https azure com rtos general technical questions https aka ms qna azure rtos product issues and bugs or feature requests https github com azure rtos threadx issues licensing and sales questions https aka ms azrtos license product roadmap and support policy https aka ms azrtos lts blogs and videos http msiotblog com and https aka ms iotshow azure rtos tracex installer https aka ms azrtos tracex installer you can also check previous questions https stackoverflow com questions tagged azure rtos threadx or ask new ones on stackoverflow using the azure rtos and threadx tags security azure rtos provides oems with components to secure communication and to create code and data isolation using underlying mcu mpu hardware protection mechanisms it is ultimately the responsibility of the device builder to ensure the device fully meets the evolving security requirements associated with its specific use case contribution please follow the instructions provided in the contributing md contributing md for the corresponding repository
embedded iot mcu microcontroller azure-rtos rtos real-time
os
llm-alignment-survey
llm alignment survey a curated reading list for large language model llm alignment take a look at our new survey large language model alignment a survey https arxiv org abs 2309 15025 on arxiv for more details feel free to open an issue pr or e mail thshen tju edu cn and dyxiong tju edu cn if you find any missing areas papers or datasets we will keep updating this list and survey if you find our survey useful please kindly cite our paper bibtex article shen2023alignment title large language model alignment a survey author shen tianhao and jin renren and huang yufei and liu chuang and dong weilong and guo zishan and wu xinwei and liu yan and xiong deyi journal arxiv preprint arxiv 2309 15025 year 2023 table of contents llm alignment survey llm alignment survey related surveys related surveys why llm alignment why llm alignment llm generated content llm generated content undesirable content undesirable content unfaithful content unfaithful content malicious uses malicious uses negative impacts on society negative impacts on society potential risks associated with advanced llms potential risks associated with advanced llms what is llm alignment what is llm alignment outer alignment outer alignment non recursive oversight non recursive oversight rl based methods rl based methods sl based methods sl based methods scalable oversight scalable oversight inner alignment inner alignment mechanistic interpretability mechanistic interpretability attacks on aligned language models attacks on aligned language models privacy attacks privacy attacks backdoor attacks backdoor attacks adversarial attacks adversarial attacks alignment evaluation alignment evaluation factuality evaluation factuality evaluation ethics evaluation ethics evaluation toxicity evaluation toxicity evaluation task specific evaluation task specific evaluation llm centered evaluation llm centered evaluation stereotype and bias evaluation stereotype and bias evaluation task specific evaluation task specific evaluation 1 llm centered evaluation llm centered evaluation 1 hate speech detection hate speech detection general evaluation general evaluation related surveys 1 aligning large language models with human a survey yufei wang et al arxiv 2023 paper https arxiv org abs 2307 12966 2 trustworthy llms a survey and guideline for evaluating large language models alignment yang liu et al arxiv 2023 paper https arxiv org abs 2308 05374 3 bridging the gap a survey on integrating human feedback for natural language generation patrick fernandes et al arxiv 2023 paper https arxiv org abs 2305 00955 4 augmented language models a survey gr goire mialon et al arxiv 2023 paper https arxiv org abs 2302 07842 5 an overview of catastrophic ai risks dan hendrycks et al arxiv 2023 paper https arxiv org abs 2306 12001 6 a survey of large language models wayne xin zhao et al arxiv 2023 paper https arxiv org abs 2303 18223 7 a survey on universal adversarial attack chaoning zhang et al ijcai 2021 paper https arxiv org abs 2103 01498 8 survey of hallucination in natural language generation ziwei ji et al acm computing surveys 2022 paper https arxiv org abs 2202 03629 9 automatically correcting large language models surveying the landscape of diverse self correction strategies liangming pan et al arxiv 2023 paper https arxiv org abs 2308 03188 10 automatic detection of machine generated text a critical survey ganesh jawahar et al coling 2020 paper https arxiv org abs 2011 01314 why llm alignment 1 synchromesh reliable code generation from pre trained language 
models gabriel poesia et al iclr 2022 paper https openreview net forum id kmtvd97j43e 2 llm planner few shot grounded planning for embodied agents with large language models chan hee song et al iccv 2023 paper https arxiv org abs 2212 04088 3 language models as zero shot planners extracting actionable knowledge for embodied agents wenlong huang et al pmlr 2022 paper https proceedings mlr press v162 huang22a html 4 tool learning with foundation models yujia qin et al arxiv 2023 paper https arxiv org abs 2304 08354 5 ethical and social risks of harm from language models laura weidinger et al arxiv 2021 paper https arxiv org abs 2112 04359 llm generated content undesirable content 1 predictive biases in natural language processing models a conceptual framework and overview deven shah et al arxiv 2019 paper https arxiv org abs 1912 11078 2 realtoxicityprompts evaluating neural toxic degeneration in language models samuel gehman et al arxiv 2020 paper https arxiv org abs 2009 11462 3 extracting training data from large language models nicholas carlini et al arxiv 2020 paper https arxiv org abs 2012 07805 4 stereoset measuring stereotypical bias in pretrained language models moin nadeem et al arxiv 2020 paper https arxiv org abs 2004 09456 5 crows pairs a challenge dataset for measuring social biases in masked language models nikita nangia et al emnlp 2020 paper https arxiv org abs 2010 00133 6 honest measuring hurtful sentence completion in language models debora nozza et al naacl 2021 paper https aclanthology org 2021 naacl main 191 7 language models are few shot learners tom brown et al neurips 2020 paper https proceedings neurips cc paper 2020 hash 1457c0d6bfcb4967418bfb8ac142f64a abstract html 8 persistent anti muslim bias in large language models abubakar abid et al aies 2021 paper https dl acm org doi 10 1145 3461702 3462624 9 gender and representation bias in gpt 3 generated stories li lucy et al wnu 2021 paper https aclanthology org 2021 nuse 1 5 unfaithful content 1 measuring and improving consistency in pretrained language models yanai elazar et al tacl 2021 paper https aclanthology org 2021 tacl 1 60 pdf 2 gpt 3 creative fiction gwern 2023 blog https gwern net gpt 3 3 gpt 3 what s it good for robert dale natural language engineering 2020 paper https www cambridge org core journals natural language engineering article gpt3 whats it good for 0e05cfe68a7ac8bf794c8ecbe28aa990 4 scaling language models methods analysis insights from training gopher jack w rae et al arxiv 2021 paper https arxiv org abs 2112 11446 5 truthfulqa measuring how models mimic human falsehoods stephanie lin et al acl 2022 paper https arxiv org abs 2109 07958 6 towards tracing knowledge in language models back to the training data ekin akyurek et al emnlp 2022 paper https aclanthology org 2022 findings emnlp 180 7 sparks of artificial general intelligence early experiments with gpt 4 s bastien bubeck et al arxiv 2023 paper https arxiv org abs 2303 12712 8 navigating the grey area expressions of overconfidence and uncertainty in language models kaitlyn zhou et al arxiv 2023 paper https arxiv org abs 2302 13439 9 patient and consumer safety risks when using conversational assistants for medical information an observational study of siri alexa and google assistant reza asadi et al 2018 paper https www academia edu 87073068 patient and consumer safety risks when using conversational assistants for medical information an observational study of siri alexa and google assistant 10 will chatgpt replace lawyers kate rattray
2023 blog https www clio com blog chat gpt lawyers 11 constitutional ai harmlessness from ai feedback yuntao bai et al arxiv 2022 paper https arxiv org abs 2212 08073 malicious uses 1 truth lies and automation how language models could change disinformation ben buchanan et al center for security and emerging technology 2021 paper https cset georgetown edu publication truth lies and automation 2 understanding the capabilities limitations and societal impact of large language models alex tamkin et al arxiv 2021 paper https arxiv org abs 2102 02503 3 deal or no deal end to end learning for negotiation dialogues mike lewis et al arxiv 2017 paper https arxiv org abs 1706 05125 4 evaluating large language models trained on code mark chen et al arxiv 2021 paper https arxiv org abs 2107 03374 5 artificial intelligence and biological misuse differentiating risks of language models and biological design tools jonas b sandbrink arxiv 2023 paper https arxiv org abs 2306 13952 negative impacts on society 1 sustainable ai ai for sustainability and the sustainability of ai aimee van wynsberghe ai and ethics 2021 paper https link springer com article 10 1007 s43681 021 00043 6 2 unraveling the hidden environmental impacts of ai solutions for environment anne laure ligozat et al arxiv 2021 paper https arxiv org abs 2110 11822 3 gpts are gpts an early look at the labor market impact potential of large language models tyna eloundou et al arxiv 2023 paper https arxiv org abs 2303 10130 potential risks associated with advanced llms 1 formalizing convergent instrumental goals tsvi benson tilsen et al aaai aies workshop 2016 paper https www semanticscholar org paper formalizing convergent instrumental goals benson tilsen soares d7b321d8d88381a2a84e9d6e8f8f34ee2ed65df2 2 model evaluation for extreme risks toby shevlane et al arxiv 2023 paper https arxiv org abs 2305 15324 3 aligning ai optimization to community well being jonathan stray international journal of community well being 2020 paper https europepmc org article med 34723107 4 what are you optimizing for aligning recommender systems with human values jonathan stray et al icml 2020 paper https arxiv org abs 2107 10939 5 human level play in the game of diplomacy by combining language models with strategic reasoning meta fundamental ai research diplomacy team fair et al science 2022 paper https vlgiitr github io papers we read summaries cicero html 6 characterizing manipulation from ai systems micah carroll et al arxiv 2023 paper https arxiv org abs 2303 09387 7 deceptive alignment monitoring andres carranza et al icml advml workshop 2023 paper https arxiv org abs 2307 10569 8 the superintelligent will motivation and instrumental rationality in advanced artificial agents nick bostrom minds and machines 2012 paper https link springer com article 10 1007 s11023 012 9281 3 9 is power seeking ai an existential risk joseph carlsmith arxiv 2023 paper https arxiv org abs 2206 13353 10 optimal policies tend to seek power alexander matt turner et al neurips 2021 paper https proceedings neurips cc paper files paper 2021 file c26820b8a4c1b3c2aa868d6d57e14a79 paper pdf 11 parametrically retargetable decision makers tend to seek power alexander matt turner et al neurips 2022 paper https proceedings neurips cc paper files paper 2022 file cb3658b9983f677670a246c46ece553d paper conference pdf 12 power seeking can be probable and predictive for trained agents victoria krakovna et al arxiv 2023 paper https arxiv org abs 2304 06528 13 discovering language model behaviors with model written evaluations ethan perez et al arxiv 2022 paper https arxiv org abs 2212 09251 what is llm alignment 1 some moral and technical consequences of automation as machines learn they may develop unforeseen strategies at rates that baffle their programmers norbert wiener science 1960 paper https doi org 10 1126 science 131 3410 1355 2 coherent extrapolated volition eliezer yudkowsky singularity institute for artificial intelligence 2004 paper https intelligence org files cev pdf 3 the basic ai drives stephen m omohundro agi 2008 paper https dl acm org doi 10 5555 1566174 1566226 4 the superintelligent will motivation and instrumental rationality in advanced artificial agents nick bostrom minds and machines 2012 paper https doi org 10 1007 s11023 012 9281 3 5 general purpose intelligence arguing the orthogonality thesis stuart armstrong analysis and metaphysics 2013 paper https www fhi ox ac uk wp content uploads orthogonality analysis and metaethics 1 pdf 6 aligning superintelligence with human interests an annotated bibliography nate soares machine intelligence research institute 2015 paper http intelligence org files annotatedbibliography pdf 7 concrete problems in ai safety dario amodei et al arxiv 2016 paper https arxiv org abs 1606 06565 8 the mythos of model interpretability zachary c lipton arxiv 2017 paper https arxiv org abs 1606 03490 9 ai safety gridworlds jan leike et al arxiv 2017 paper https arxiv org abs 1711 09883 10 overview of current ai alignment approaches micah carroll 2018 paper https micahcarroll github io assets valuealignment pdf 11 risks from learned optimization in advanced machine learning systems evan hubinger et al arxiv 2019 paper https arxiv org abs 1906 01820 12 an overview of 11 proposals for building safe advanced ai evan hubinger arxiv 2020 paper https arxiv org abs 2012 07532 13 unsolved problems in ml safety dan hendrycks et al arxiv 2021 paper https arxiv org abs 2109 13916 14 a mathematical framework for transformer circuits nelson elhage et al transformer circuits thread 2021 paper https transformer circuits pub 2021 framework index html 15 alignment of language agents zachary kenton et al arxiv 2021 paper https arxiv org abs 2103 14659 16 a general language assistant as a laboratory for alignment amanda askell et al arxiv 2021 paper https arxiv org abs 2112 00861 17 a transparency and interpretability tech tree evan hubinger 2022 blog https www lesswrong com posts nbq2bwlcymsgup9af a transparency and interpretability tech tree 18 understanding ai alignment research a systematic analysis j kirchner et al arxiv 2022 paper https arxiv org abs 2206 02841 19 softmax linear units nelson elhage et al transformer circuits thread 2022 paper https transformer circuits pub 2022 solu index html 20 the alignment problem from a deep learning perspective richard ngo arxiv 2022 paper https arxiv org abs 2209 00626 21 paradigms of ai alignment components and enablers victoria krakovna 2022 blog https www lesswrong com posts jc7ajzjt2wvxxffgz paradigms of ai alignment components and enablers 22 progress measures for grokking via mechanistic interpretability neel nanda et al arxiv 2023 paper https arxiv org abs 2301 05217 23 agentized llms will change the alignment landscape seth herd 2023 blog https www lesswrong com posts dcoxvehafycov2la6 agentized llms will change the alignment landscape 24 language models can explain neurons in language models steven bills et al 2023 paper
https openaipublic blob core windows net neuron explainer paper index html 25 core views on ai safety when why what and how anthropic 2023 blog https www anthropic com index core views on ai safety outer alignment non recursive oversight rl based methods 1 proximal policy optimization algorithms john schulman et al arxiv 2017 paper https arxiv org abs 1707 06347 2 fine tuning language models from human preferences daniel m ziegler et al arxiv 2019 paper https arxiv org abs 1909 08593 3 learning to summarize from human feedback nisan stiennon et al neurips 2020 paper https proceedings neurips cc paper 2020 file 1f89885d556929e98d3ef9b86448f951 paper pdf 4 training language models to follow instructions with human feedback long ouyang et al neurips 2022 paper https arxiv org abs 2203 02155 5 training a helpful and harmless assistant with reinforcement learning from human feedback yuntao bai et al arxiv 2022 paper https arxiv org abs 2204 05862 6 rl4f generating natural language feedback with reinforcement learning for repairing model outputs afra feyza akyürek et al arxiv 2023 paper https arxiv org abs 2305 08844 7 improving language models with advantage based offline policy gradients ashutosh baheti et al arxiv 2023 paper https arxiv org abs 2305 14718 8 scaling laws for reward model overoptimization leo gao et al icml 2023 paper https proceedings mlr press v202 gao23h gao23h pdf 9 improving alignment of dialogue agents via targeted human judgements amelia glaese et al arxiv 2022 paper https arxiv org abs 2209 14375 10 aligning language models with preferences through f divergence minimization dongyoung go et al arxiv 2023 paper https arxiv org abs 2302 08215 11 aligning large language models through synthetic feedback sungdong kim et al arxiv 2023 paper https arxiv org abs 2305 13735 12 rlhf ansh radhakrishnan lesswrong 2022 blog https www lesswrong com posts rqh4grmpmjyjtmptn rlhf 13 guiding large language models via directional stimulus prompting zekun li et al arxiv 2023 paper https arxiv org abs 2302 11520 14 aligning generative language models with human values ruibo liu et al naacl 2022 findings paper https aclanthology org 2022 findings naacl 18 15 second thoughts are best learning to re align with human values from text edits ruibo liu et al neurips 2022 paper https openreview net pdf id u6ofmagiya1 16 secrets of rlhf in large language models part i ppo rui zheng et al arxiv 2023 paper https arxiv org abs 2307 04964 17 principled reinforcement learning with human feedback from pairwise or k wise comparisons banghua zhu et al arxiv 2023 paper https arxiv org abs 2301 11270 18 open problems and fundamental limitations of reinforcement learning from human feedback stephen casper et al arxiv 2023 paper https arxiv org abs 2307 15217 sl based methods 1 self diagnosis and self debiasing a proposal for reducing corpus based bias in nlp timo schick et al tacl 2021 paper https aclanthology org 2021 tacl 1 84 2 the cringe loss learning what language not to model leonard adolphs et al arxiv 2022 paper https arxiv org abs 2211 05826 3 leashing the inner demons self detoxification for language models canwen xu et al aaai 2022 paper https ojs aaai org index php aaai article view 21406 21155 4 calibrating sequence likelihood improves conditional language generation yao zhao et al arxiv 2022 paper https arxiv org abs 2210 00045 5 raft reward ranked finetuning for generative foundation model alignment hanze dong et al arxiv 2023 paper https arxiv org abs 2304 06767 6 chain of hindsight aligns
language models with feedback hao liu et al arxiv 2023 paper https arxiv org abs 2302 02676 7 training socially aligned language models in simulated human society ruibo liu et al arxiv 2023 paper https arxiv org abs 2305 16960 8 direct preference optimization your language model is secretly a reward model rafael rafailov et al arxiv 2023 paper https arxiv org abs 2305 18290 9 training language models with language feedback at scale jérémy scheurer et al arxiv 2023 paper https arxiv org abs 2303 16755 10 preference ranking optimization for human alignment feifan song et al arxiv 2023 paper https arxiv org abs 2306 17492 11 rrhf rank responses to align language models with human feedback without tears zheng yuan et al arxiv 2023 paper https arxiv org abs 2304 05302 12 slic hf sequence likelihood calibration with human feedback yao zhao et al arxiv 2023 paper https arxiv org abs 2305 10425 13 lima less is more for alignment chunting zhou et al arxiv 2023 paper https arxiv org abs 2305 11206 scalable oversight 1 supervising strong learners by amplifying weak experts paul christiano et al arxiv 2018 paper https arxiv org abs 1810 08575 2 scalable agent alignment via reward modeling a research direction jan leike et al arxiv 2018 paper https arxiv org abs 1811 07871 3 ai safety needs social scientists geoffrey irving and amanda askell distill 2019 paper https distill pub 2019 ai safety needs social scientists 4 learning to summarize from human feedback nisan stiennon et al neurips 2020 paper https proceedings neurips cc paper 2020 hash 52b6c8e5a34e5e7e11e466a3d508d6a5 abstract html 5 task decomposition for scalable oversight agisf distillation charbel-raphaël segerie 2023 blog https www lesswrong com posts ffz6h35gy6barhxkc task decomposition for scalable oversight agisf distillation 6 measuring progress on scalable oversight for large language models samuel r bowman et al arxiv 2022 paper https arxiv org abs 2211 03540 7 constitutional ai harmlessness from ai feedback yuntao bai et al corr 2022 paper https arxiv org abs 2212 08073 8 improving factuality and reasoning in language models through multiagent debate yilun du et al arxiv 2023 paper https arxiv org abs 2305 14325 9 evaluating superhuman models with consistency checks lukas fluri et al arxiv 2023 paper https arxiv org abs 2306 09983 10 ai safety via debate geoffrey irving et al arxiv 2018 paper https arxiv org abs 1805 00899 11 ai safety via market making evan hubinger 2020 blog https www lesswrong com posts ywwzccgbchmjmpt45 ai safety via market making 12 encouraging divergent thinking in large language models through multi agent debate tian liang et al arxiv 2023 paper https arxiv org abs 2305 19118 13 let s verify step by step hunter lightman et al arxiv 2023 paper https arxiv org abs 2305 20050 14 introducing superalignment openai 2023 blog https openai com blog introducing superalignment 15 principle driven self alignment of language models from scratch with minimal human supervision zhiqing sun et al arxiv 2023 paper https arxiv org abs 2305 03047 inner alignment 1 risks from learned optimization in advanced machine learning systems evan hubinger et al arxiv 2019 paper https arxiv org abs 1906 01820 2 goal misgeneralization in deep reinforcement learning lauro langosco et al icml 2022 paper https arxiv org abs 2105 14111 3 goal misgeneralization why correct specifications aren t enough for correct goals rohin shah et al arxiv 2022 paper https arxiv org abs 2210 01790 4 defining capability and alignment in gradient descent edouard
harris lesswrong 2020 blog https www lesswrong com posts xg2yycefcnlyrccjy defining capability and alignment in gradient descent 5 categorizing failures as outer or inner misalignment is often confused rohin shah lesswrong 2023 blog https www lesswrong com posts jkwrdwsarisxtv9ur categorizing failures as outer or inner misalignment is 6 inner alignment failures which are actually outer alignment failures john wentworth lesswrong 2020 blog https www lesswrong com posts hyerofgze6j9tuigi inner alignment failures which are actually outer alignment 7 relaxed adversarial training for inner alignment evan hubinger lesswrong 2019 blog https www lesswrong com posts 9dy5yraocxh9zujqa relaxed adversarial training for inner alignment 8 the inner alignment problem evan hubinger et al lesswrong 2019 blog https www lesswrong com posts pl56xponilvtmdq4j the inner alignment problem 9 three scenarios of pseudo alignment eleni angelou lesswrong 2022 blog https www lesswrong com posts w5nnfgwkcpxdvjmpe three scenarios of pseudo alignment 10 deceptive alignment evan hubinger et al lesswrong 2019 blog https www lesswrong com s r9tykb2a8fp4dn8yb p zthdpajh9w6ytbeks 11 what failure looks like paul christiano ai alignment forum 2019 blog https www alignmentforum org posts hbxe6wdjxk239zajf more realistic tales of doom 12 concrete experiments in inner alignment evan hubinger lesswrong 2019 blog https www lesswrong com posts usdpa9nrsgmxctdkn concrete experiments in inner alignment 13 a central ai alignment problem capabilities generalization and the sharp left turn nate soares lesswrong 2022 blog https www lesswrong com posts gnhmpawcfbcasy8e6 a central ai alignment problem capabilities generalization 14 clarifying the confusion around inner alignment rauno arike ai alignment forum 2022 blog https www alignmentforum org posts xdtnd8xcdzpgfngme clarifying the confusion around inner alignment 15 2 d robustness vladimir mikulik ai alignment forum 2019 blog https www alignmentforum org posts 2mhfmgtajfjesasyr 2 d robustness 16 monitoring for deceptive alignment evan hubinger lesswrong 2022 blog https www lesswrong com posts km9shjhtsbdbgwkyi monitoring for deceptive alignment mechanistic interpretability 1 notions of explainability and evaluation approaches for explainable artificial intelligence giulia vilone et al information fusion 2021 paper https www sciencedirect com science article pii s1566253521001093 2 a comprehensive mechanistic interpretability explainer glossary neel nanda 2022 blog https www lesswrong com posts vnoclyewxcaxtddnp a comprehensive mechanistic interpretability explainer and 3 the mythos of model interpretability zachary c lipton arxiv 2017 paper https arxiv org abs 1606 03490 4 ai research considerations for human existential safety arches andrew critch et al arxiv 2020 paper https arxiv org abs 2006 04948 5 concrete problems for autonomous vehicle safety advantages of bayesian deep learning rt mcallister et al ijcai 2017 paper https dl acm org doi 10 5555 3171837 3171951 6 in context learning and induction heads catherine olsson et al transformer circuits thread 2022 paper https arxiv org abs 2209 11895 7 transformer feed forward layers are key value memories mor geva et al emnlp 2021 paper https arxiv org abs 2012 14913 8 transformer feed forward layers build predictions by promoting concepts in the vocabulary space mor geva et al emnlp 2022 paper https arxiv org abs 2203 14680 9 softmax linear units nelson elhage et al transformer circuits thread 2022 paper https transformer circuits pub 2022 solu
index html 10 toy models of superposition nelson elhage et al transformer circuits thread 2022 paper https arxiv org abs 2209 10652 11 mechanistic interpretability variables and the importance of interpretable bases chris olah 2022 paper https transformer circuits pub 2022 mech interp essay index html 12 knowledge neurons in pretrained transformers damai dai et al acl 2022 paper https aclanthology org 2022 acl long 581 13 locating and editing factual associations in gpt kevin meng et al neurips 2022 paper https arxiv org abs 2202 05262 14 inference time intervention eliciting truthful answers from a language model kenneth li et al arxiv 2023 paper https arxiv org abs 2306 03341 15 leace perfect linear concept erasure in closed form nora belrose et al arxiv 2023 paper https arxiv org abs 2306 03819 attacks on aligned language models privacy attacks 1 jailbreaker automated jailbreak across multiple large language model chatbots gelei deng et al arxiv 2023 paper https arxiv org abs 2307 08715 2 multi step jailbreaking privacy attacks on chatgpt haoran li et al arxiv 2023 paper https arxiv org abs 2304 05197 backdoor attacks 1 prompt injection attack against llm integrated applications yi liu et al arxiv 2023 paper https arxiv org abs 2306 05499 2 prompt as triggers for backdoor attack examining the vulnerability in language models shuai zhao et al arxiv 2023 paper https arxiv org abs 2305 01219 3 more than you ve asked for a comprehensive analysis of novel prompt injection threats to application integrated large language models kai greshake et al arxiv 2023 paper https arxiv org abs 2302 12173 4 backdoor attacks for in context learning with language models nikhil kandpal et al arxiv 2023 paper https arxiv org abs 2307 14692 5 badgpt exploring security vulnerabilities of chatgpt via backdoor attacks to instructgpt jiawen shi et al arxiv 2023 paper https arxiv org pdf 2304 12298 pdf adversarial attacks 1 universal and transferable adversarial attacks on aligned language models andy zou et al arxiv 2023 paper https arxiv org abs 2307 15043 2 are aligned neural networks adversarially aligned nicholas carlini et al arxiv 2023 paper https arxiv org abs 2306 15447 3 visual adversarial examples jailbreak large language models xiangyu qi et al arxiv 2023 paper https arxiv org pdf 2306 13213 pdf alignment evaluation factuality evaluation 1 factscore fine grained atomic evaluation of factual precision in long form text generation sewon min et al arxiv 2023 paper https arxiv org abs 2305 14251 2 factuality enhanced language models for open ended text generation nayeon lee et al neurips 2022 paper https arxiv org abs 2206 04624 3 truthfulqa measuring how models mimic human falsehoods stephanie lin et al acl 2022 paper https aclanthology org 2022 acl long 229 4 summac re visiting nli based models for inconsistency detection in summarization philippe laban et al tacl 2022 paper https aclanthology org 2022 tacl 1 10 5 qafacteval improved qa based factual consistency evaluation for summarization alexander r fabbri et al arxiv 2021 paper https arxiv org abs 2112 08542 6 true re evaluating factual consistency evaluation or honovich et al arxiv 2022 paper https arxiv org abs 2204 04991 7 alignscore evaluating factual consistency with a unified alignment function yuheng zha et al arxiv 2023 paper https arxiv org abs 2305 16739 ethics evaluation 1 social chemistry 101 learning to reason about social and moral norms maxwell forbes et al emnlp 2020 paper https aclanthology org 2020 emnlp main 48 2 aligning ai with shared human values dan
hendrycks et al arxiv 2020 paper https arxiv org abs 2008 02275 3 would you rather a new benchmark for learning machine alignment with cultural values and social preferences yi tay et al acl 2020 paper https aclanthology org 2020 acl main 477 4 scruples a corpus of community ethical judgments on 32 000 real life anecdotes nicholas lourie et al aaai 2021 paper https arxiv org abs 2008 09094 toxicity evaluation task specific evaluation 1 detecting offensive language in social media to protect adolescent online safety ying chen et al passat socialcom 2012 paper http www cse psu edu sxz16 papers socialcom2012 pdf 2 offensive language detection using multi level classification amir h razavi et al canadian ai 2010 paper https www cs csustan edu mmartin lds razavi pdf 3 hateful symbols or hateful people predictive features for hate speech detection on twitter zeerak waseem and dirk hovy naacl student research workshop 2016 paper https aclanthology org n16 2013 pdf 4 measuring the reliability of hate speech annotations the case of the european refugee crisis björn ross et al nlp4cmc 2016 paper https linguistics rub de forschung arbeitsberichte 17 pdf page 12 5 ex machina personal attacks seen at scale ellery wulczyn et al www 2017 paper https arxiv org pdf 1610 08914 pdf 6 predicting the type and target of offensive posts in social media marcos zampieri et al naacl hlt 2019 paper https arxiv org pdf 1902 09666 llm centered evaluation 1 recipes for safety in open domain chatbots jing xu et al arxiv 2020 paper https arxiv org pdf 2010 07079 2 realtoxicityprompts evaluating neural toxic degeneration in language models samuel gehman et al emnlp 2020 findings paper https arxiv org pdf 2009 11462 pdf 3 cold a benchmark for chinese offensive language detection jiawen deng et al emnlp 2022 paper https arxiv org pdf 2201 06025 stereotype and bias evaluation task specific evaluation 1 gender bias in coreference resolution rachel rudinger et al naacl 2018 paper https arxiv org pdf 1804 09301 2 gender bias in coreference resolution evaluation and debiasing methods jieyu zhao et al naacl 2018 paper https arxiv org pdf 1804 06876 3 the winograd schema challenge hector levesque et al kr 2012 paper https cdn aaai org ocs 4492 4492 21843 1 pb pdf 4 toward gender inclusive coreference resolution an analysis of gender and bias throughout the machine learning lifecycle yang trista cao and hal daumé iii computational linguistics 2021 paper https aclanthology org 2021 cl 3 19 pdf 5 evaluating gender bias in machine translation gabriel stanovsky et al acl 2019 paper https arxiv org pdf 1906 00591 6 investigating failures of automatic translation in the case of unambiguous gender adithya renduchintala and adina williams acl 2022 paper https arxiv org pdf 2104 07838 7 towards understanding gender bias in relation extraction andrew gaut et al acl 2020 paper https arxiv org pdf 1911 03642 8 addressing age related bias in sentiment analysis mark díaz et al chi 2018 paper https dl acm org doi pdf 10 1145 3173574 3173986 9 examining gender and race bias in two hundred sentiment analysis systems svetlana kiritchenko and saif m mohammad naacl hlt 2018 paper https arxiv org pdf 1805 04508 10 on measuring and mitigating biased inferences of word embeddings sunipa dev et al aaai 2020 paper https ojs aaai org index php aaai article view 6267 6123 11 social bias frames reasoning about social and power implications of language maarten sap et al acl 2020 paper https arxiv org pdf 1911 03891 12 towards identifying social bias in dialog
systems framework dataset and benchmark jingyan zhou et al emnlp 2022 findings paper https arxiv org pdf 2202 08011 13 corgi pm a chinese corpus for gender bias probing and mitigation ge zhang et al arxiv 2023 paper https arxiv org pdf 2301 00395 llm centered evaluation 1 stereoset measuring stereotypical bias in pretrained language models moin nadeem et al acl 2021 paper https arxiv org pdf 2004 09456 2 crows pairs a challenge dataset for measuring social biases in masked language models nikita nangia et al emnlp 2020 paper https arxiv org pdf 2010 00133 3 bold dataset and metrics for measuring biases in open ended language generation jwala dhamala et al facct 2021 paper https arxiv org abs 2101 11718 4 i m sorry to hear that finding new biases in language models with a holistic descriptor dataset eric michael smith et al emnlp 2022 paper https aclanthology org 2022 emnlp main 625 pdf 5 multilingual holistic bias extending descriptors and patterns to unveil demographic biases in languages at scale marta r costa-jussà et al arxiv 2023 paper https arxiv org pdf 2305 13198 6 unqovering stereotyping biases via underspecified questions tao li et al emnlp 2020 findings paper https arxiv org pdf 2010 02428 pdf 7 bbq a hand built bias benchmark for question answering alicia parrish et al acl 2022 findings paper https arxiv org pdf 2110 08193 8 cbbq a chinese bias benchmark dataset curated with human ai collaboration for large language models yufei huang and deyi xiong arxiv 2023 paper https arxiv org pdf 2306 16244 hate speech detection 1 automated hate speech detection and the problem of offensive language thomas davidson et al icwsm 2017 paper https www researchgate net profile ingmar weber publication 314942659 automated hate speech detection and the problem of offensive language links 58e76b6a4585152528de68f2 automated hate speech detection and the problem of offensive language pdf 2 deep learning for hate speech detection in tweets pinkesh badjatiya et al www 2017 paper https arxiv org pdf 1706 00188 3 detecting hate speech on the world wide web william warner and julia hirschberg naacl hlt 2012 paper https aclanthology org w12 2103 pdf 4 a survey on hate speech detection using natural language processing anna schmidt and michael wiegand socialnlp 2017 paper https aclanthology org w17 1101 pdf 5 hate speech detection with comment embeddings nemanja djuric et al www 2015 paper https djurikom github io pdfs djuric2015wwwb pdf 6 are you a racist or am i seeing things annotator influence on hate speech detection on twitter zeerak waseem nlp css emnlp 2016 paper https aclanthology org w16 5618 pdf 7 tweetblm a hate speech dataset and analysis of black lives matter related microblogs on twitter sumit kumar and raj ratn pranesh arxiv 2021 paper https arxiv org pdf 2108 12521 8 hate speech dataset from a white supremacy forum ona de gibert et al alw2 2018 paper https arxiv org pdf 1809 04444 9 the gab hate corpus a collection of 27k posts annotated for hate speech brendan kennedy et al lre 2022 paper https www researchgate net profile brendan kennedy 4 publication 346608617 the gab hate corpus a collection of 27k posts annotated for hate speech links 5fc932eba6fdcc697bdb7175 the gab hate corpus a collection of 27k posts annotated for hate speech pdf 10 finding microaggressions in the wild a case for locating elusive phenomena in social media posts luke breitfeller et al emnlp 2019 paper https aclanthology org d19 1176 pdf 11 learning from the worst dynamically generated datasets to improve online hate
detection bertie vidgen et al acl 2021 paper https arxiv org pdf 2012 15761 12 hate speech detection challenges and solutions sean macavaney et al plos one 2019 paper https journals plos org plosone article id 10 1371 journal pone 0221152 13 racial microaggressions in everyday life implications for clinical practice derald wing sue et al american psychologist 2007 paper https www law stanford edu wp content uploads sites default files event 263076 media slspublic sue radicalmicroagressionsineverydaylife pdf 14 the impact of racial microaggressions on mental health counseling implications for clients of color kevin l nadal et al journal of counseling development 2014 paper https www researchgate net profile kevin nadal publication 262412771 the impact of racial microaggressions on mental health counseling implications for clients of color links 606fb0d94585150fe993b16b the impact of racial microaggressions on mental health counseling implications for clients of color pdf 15 a preliminary report on the relationship between microaggressions against black people and racism among white college students jonathan w kanter et al race and social problems 2017 paper https link springer com article 10 1007 s12552 017 9214 0 16 microaggressions and traumatic stress theory research and clinical treatment kevin l nadal american psychological association 2018 paper https psycnet apa org record 2017 58590 000 doi 1 17 arabs as terrorists effects of stereotypes within violent contexts on attitudes perceptions and affect muniba saleem and craig a anderson psychology of violence 2013 paper http www craiganderson org wp content uploads caa abstracts 2010 2014 13sa pdf 18 mean girls the influence of gender portrayals in teen movies on emerging adults gender based attitudes and beliefs elizabeth behm morawitz and dana e mastro journalism and mass communication quarterly 2008 paper https www researchgate net profile elizabeth behm morawitz publication 237797930 mean girls the influence of gender portrayals in teen movies on emerging adults gender based attitudes and beliefs links 57bdfc6c08ae6f1737689537 mean girls the influence of gender portrayals in teen movies on emerging adults gender based attitudes and beliefs pdf 19 exposure to hate speech increases prejudice through desensitization wiktor soral michał bilewicz and mikołaj winiewski aggressive behavior 2018 paper https www academia edu download 55409445 4 ab pdf 20 latent hatred a benchmark for understanding implicit hate speech mai elsherief et al emnlp 2021 paper https arxiv org pdf 2109 05322 21 toxigen a large scale machine generated dataset for adversarial and implicit hate speech detection thomas hartvigsen et al acl 2022 paper https arxiv org pdf 2203 09509 22 an empirical study of metrics to measure representational harms in pre trained language models saghar hosseini hamid palangi and ahmed hassan awadallah arxiv 2023 paper https arxiv org pdf 2301 09211 general evaluation 1 trustgpt a benchmark for trustworthy and responsible large language models yue huang et al arxiv 2023 paper https arxiv org pdf 2306 11507 pdf 2 safety assessment of chinese large language models hao sun et al arxiv 2023 paper https arxiv org pdf 2304 10436 pdf 3 flask fine grained language model evaluation based on alignment skill sets seonghyeon ye et al arxiv 2023 paper https arxiv org pdf 2307 10928 pdf 4 judging llm as a judge with mt bench and chatbot arena lianmin zheng et al arxiv 2023 paper https arxiv org pdf 2306 05685 pdf 5 beyond the imitation game quantifying
and extrapolating the capabilities of language models aarohi srivastava et al arxiv 2022 paper https arxiv org pdf 2206 04615 pdf 6 a critical evaluation of evaluations for long form question answering fangyuan xu et al arxiv 2023 paper https arxiv org pdf 2305 18201 pdf 7 alpacaeval an automatic evaluator of instruction following models xuechen li et al github 2023 github https github com tatsu lab alpaca eval 8 alpacafarm a simulation framework for methods that learn from human feedback yann dubois et al arxiv 2023 paper https arxiv org pdf 2305 14387 pdf 9 pandalm an automatic evaluation benchmark for llm instruction tuning optimization yidong wang et al arxiv 2023 paper https arxiv org pdf 2306 05087 pdf 10 large language models are not fair evaluators peiyi wang et al arxiv 2023 paper https arxiv org pdf 2305 17926 pdf 11 g eval nlg evaluation using gpt 4 with better human alignment yang liu et al arxiv 2023 paper https arxiv org pdf 2303 16634 pdf 12 benchmarking foundation models with language model as an examiner yushi bai et al arxiv 2023 paper https arxiv org pdf 2306 04181 pdf 13 prd peer rank and discussion improve large language model based evaluations ruosen li et al arxiv 2023 paper https arxiv org pdf 2307 02762 pdf 14 self instruct aligning language models with self generated instructions yizhong wang et al arxiv 2022 paper https arxiv org pdf 2212 10560 pdf
ai
Blockchain-for-maintaining-Digital-Assets
build status https travis ci org ibm blockchain for maintaining digital assets svg branch master https travis ci org ibm blockchain for maintaining digital assets blockchain for maintaining digital assets note this developer pattern creates a blockchain network on ibm blockchain platform version 2 5 using the hyperledger fabric version 1 4 in this code pattern we will be building a digital asset management application by creating and deploying a smart contract on a hyperledger fabric network created on ibm blockchain platform we will then interact with this application via a user interface created using vuejs digital asset management systems ensure that operations are only performed on a digital asset by individuals or organizations that have the right access rights and permissions for the asset the digital asset is defined as the content an image a music file a document a video file etc and its metadata the metadata could be as simple as the name of the asset the name of the owner of the asset and the date of creation of the asset or it could be something more complex such as extracted speech from a video subtitles in any digital asset management system there can be any number of users and these users can have the ability to perform various actions on the asset in the system based on the permissions they have examples of such actions that are being covered in this developer pattern are 1 user registration and user login 2 viewing all existing assets in the system 3 viewing assets owned by the user that is currently logged in 4 uploading a new asset 5 deleting an existing asset 6 suggesting edits to an existing asset 7 viewing suggested edits for an asset that is owned by the user that is currently logged in 8 approving or denying suggested edits for an asset that is owned by the user that is currently logged in 9 allowing other users the permission to update an asset owned by the user that is currently logged in 10 assigning another user as the owner of an asset that is owned by the user that is currently logged in 11 downloading assets the large number of users participants in this use case as well as the different kinds of actions transactions that can be executed indicate that this is a good use case for blockchain blockchain will also allow for the history of the transactions to be maintained in the ledger thereby ensuring that there is always a chain of record for any changes that have been made to any asset we will start by packaging the node js smart contract using the ibm blockchain platform extension for vs code next we will create a hyperledger fabric network on ibm blockchain platform where we will install and instantiate the smart contract we will also set up an ibm cloud object storage instance where we can retain the digital assets uploaded to the digital asset management application and a fake smtp testing server using mailtrap io to test the email notifications sent by the application finally the vuejs web application which makes use of the hyperledger fabric sdk can be used to interact with the network when you have completed this code pattern you will understand how to package a blockchain smart contract using the ibm blockchain platform extension for vs code set up a hyperledger fabric network on ibm blockchain platform install and instantiate a smart contract package through ibm blockchain platform set up an instance of the ibm cloud object storage service and connect it with the node js application test the blockchain network by executing a node js application with
the hyperledger fabric sdk to interact with the deployed network by issuing transactions architecture flow p align center img src https user images githubusercontent com 8854447 72009715 7288c980 3224 11ea 9a85 30a4aac1f5eb png p 1 the blockchain operator sets up the ibm blockchain platform service 2 the ibm blockchain platform service creates a hyperledger fabric network on an ibm cloud kubernetes service and the blockchain operator installs and instantiates the smart contract on the network 3 the node js application server uses the fabric sdk to interact with the deployed network on ibm blockchain platform ibm cloud object storage instance and the mailtrap server fake smtp testing server and creates apis for a web client 4 the vue js client uses the node js application api to interact with the network 5 the user interacts with the vue js web interface to interact with the digital asset management application included components ibm blockchain platform https www ibm com cloud blockchain platform gives you total control of your blockchain network with a user interface that can simplify and accelerate your journey to deploy and manage blockchain components on the ibm cloud kubernetes service ibm cloud kubernetes service https www ibm com cloud container service creates a cluster of compute hosts and deploys highly available containers a kubernetes cluster lets you securely manage the resources that you need to quickly deploy update and scale applications ibm blockchain platform extension for vs code https marketplace visualstudio com items itemname ibmblockchain ibm blockchain platform is designed to assist users in developing testing and deploying smart contracts including connecting to hyperledger fabric environments ibm cloud object storage https www ibm com cloud object storage is a highly scalable cloud storage service designed for high durability resiliency and security mailtrap io https mailtrap io is a test mail server solution that allows testing email notifications without sending them to the real users of your application featured technologies hyperledger fabric v1 4 https hyperledger fabric readthedocs io en release 1 4 is a platform for distributed ledger solutions underpinned by a modular architecture that delivers high degrees of confidentiality resiliency flexibility and scalability node js https nodejs org en is an open source cross platform javascript run time environment that executes server side javascript code vue js 2 6 10 https vuejs org is an open source javascript framework for building user interfaces and single page applications prerequisites ibm cloud account https cloud ibm com registration target 2fdashboard 2fapps node v10 x and npm v6 x or greater https nodejs org en download vscode version 1 38 0 or greater https code visualstudio com ibm blockchain platform extension for vscode https marketplace visualstudio com items itemname ibmblockchain ibm blockchain platform watch the video introduction and demo note click on the image below to view the video on youtube for google chrome press the ctrl key the left mouse button and say open link https user images githubusercontent com 8854447 72086129 6eb48000 32d4 11ea 8869 d6362dd7556a png https youtu be tfnrfdfwhuc running the application follow these steps to set up and run this code pattern the steps are described in detail below steps 1 clone the repo 2 package the smart contract 3 create the mailtrap server 4 create ibm cloud services
5 build a network 6 deploy blockchain for maintaining digital assets smart contract on the network 7 connect application to the network 8 run the application 1 clone the repo clone this repository in a folder of your choice git clone https github com ibm blockchain for maintaining digital assets git 2 package the smart contract we will use the ibm blockchain platform extension on vs code to package the smart contract open visual studio code and open the contract folder from blockchain for maintaining digital assets repository that was cloned earlier it is important that you are opening the contract folder and not the entire blockchain for maintaining digital assets directory otherwise you will see an error that states that it doesn t understand what programming language you are using press the f1 key to see the different vs code options choose ibm blockchain platform package open project p align center img src https user images githubusercontent com 8854447 71910509 05036d00 3140 11ea 8b15 7c8aeb403974 png p click the ibm blockchain platform extension button on the left this will show the packaged contracts on top and the blockchain connections on the bottom p align center img height 500 src https user images githubusercontent com 8854447 85961325 ae9d1b80 b977 11ea 89d9 4c9b2e627c59 png p next right click on the packaged contract in this case select blockchain for maintaining digital assets 0 0 1 to export it and choose export package choose a location on your machine and save the cds file we will use this packaged smart contract later to deploy on the ibm blockchain platform service now we will start setting up the different services required for configuring our hyperledger fabric network on the ibm cloud and for running our application using this network 3 create the mailtrap server create the mailtrap server https mailtrap io you can sign up using your google or github account or using your email address once the account has been created and you have logged in create a new inbox by typing in an inbox name and clicking on create inbox p align center img height 500 src https user images githubusercontent com 8854447 71910507 046ad680 3140 11ea 9317 aa9219ae1383 gif p 4 create ibm cloud services create the ibm cloud kubernetes service https cloud ibm com kubernetes catalog cluster you can find the service in the catalog for this code pattern we can use the free cluster and give it a name note that the ibm cloud allows one instance of a free cluster which expires after 30 days note it could take 20 minutes for the ibm cloud kubernetes service setup to complete br p align center img src https user images githubusercontent com 8854447 71910506 046ad680 3140 11ea 9f4b 8bcb4d2a651b gif p br create the ibm cloud object storage https cloud ibm com catalog services cloud object storage service on the ibm cloud you can find the service in the catalog and give it a name br p align center img src https user images githubusercontent com 8854447 71918961 80b9e580 3151 11ea 8efc 8d4a08b55380 gif p br create the ibm blockchain platform https cloud ibm com catalog services blockchain platform service on the ibm cloud you can find the service in the catalog and give it a name br p align center img src https user images githubusercontent com 8854447 71910502 046ad680 3140 11ea 9853 3598b9363d91 gif p br after your kubernetes
cluster is up and running you can deploy your ibm blockchain platform on the cluster again wait for the ibm cloud kubernetes service to indicate it was deployed the ibm blockchain platform service walks through a few steps and finds your cluster on the ibm cloud to deploy the service on br p align center img src https user images githubusercontent com 8854447 71910501 046ad680 3140 11ea 8440 9d2fef0be426 gif p br once the blockchain platform is deployed on the kubernetes cluster you can launch the console to start configuring your blockchain network 5 build a network we will build a network as provided by the ibm blockchain platform documentation https cloud ibm com docs services blockchain howto topic blockchain ibp console build network ibp console build network this will include creating a channel with a single peer organization with its own msp and ca certificate authority and an orderer organization with its own msp and ca we will create the respective identities to deploy peers and operate nodes create your peer organization ca navigate to the b nodes b tab in the left navigation and click b add certificate authority b click b create a certificate authority b and click b next b give it a b ca display name b of org1 ca a b ca administrator enroll id b of admin and a b ca administrator enroll secret b of adminpw then click b next b review the summary and click b add certificate authority b br p align center img src https user images githubusercontent com 8854447 85798060 cf146e00 b70a 11ea 856b ef3264428fbc gif p br associate the peer organization ca admin identity in the nodes tab select the b org1 ca b once it is running indicated by the green box in the tile click b associate identity b on the ca overview panel on the side panel select the b enroll id b tab provide an b enroll id b of admin and an b enroll secret b of adminpw use the default value of org1 ca admin for the b identity display name b click b associate identity b to associate the admin identity with the b org1 ca b br p align center img src https user images githubusercontent com 8854447 85799219 dfc5e380 b70c 11ea 80a6 afccb0e526fc gif p br use peer organization ca to register the peer and org1 admin identities select the b org1 ca b certificate authority and ensure the admin identity that was created for the ca is visible in the table the next step is to register an admin for the organization org1 click on the b register user b button give an b enroll id b of org1admin and an b enroll secret b of org1adminpw set the b type b for this identity as admin specify to b use root affiliation b leave the b maximum enrollments b field blank click b next b skip the section to add attributes to this user and click b register user b repeat the process to create an identity of the peer click on the b register user b button give an b enroll id b of peer1 and an b enroll secret b of peer1pw set the b type b for this identity as peer specify to b use root affiliation b leave the b maximum enrollments b field blank click b next b skip the section to add attributes to this user and click b register user b br p align center img src https user images githubusercontent com 8854447 85800394 e35a6a00 b70e 11ea 967a f37334a685a3 gif p br create the peer organization msp definition navigate to the b organizations b tab in the left navigation and click b create msp definition b enter the b msp display name b as org1msp and the b msp id b as org1msp click b next b specify org1 ca as the b root certificate authority b click b next b select the b
new identity b tab give the b enroll id b and b enroll secret b for your organization admin i e org1admin and org1adminpw respectively then give the b identity name b as org1 admin click the b generate b button to enroll this identity as the admin of your organization and add the identity to the wallet click b export b to export the admin certificates to your file system click b next b review all the information and click b create msp definition b br p align center img src https user images githubusercontent com 8854447 85800904 cbcfb100 b70f 11ea 9b95 376d9ef72caa gif p br create a peer navigate to the b nodes b tab in the left navigation and click b add peer b click b create a peer b and then click b next b give the b peer display name b as peer org1 and click b next b on the next screen select org1 ca as the b certificate authority b then give the b peer enroll id b and b peer enroll secret b as peer1 and peer1pw respectively select the b organization msp b as org1msp leave the b tls csr hostname b blank and select 1 4 7 0 in the drop down for b fabric version b click b next b provide org1 admin as the b peer administrator identity b and click b next b review the summary and click b add peer b br p align center img src https user images githubusercontent com 8854447 86380230 e0ed9800 bc59 11ea 9440 50e0ca1921cd gif p br create your orderer organization ca navigate to the b nodes b tab in the left navigation and click b add certificate authority b click b create a certificate authority b and click b next b give it a b ca display name b of orderer ca a b ca administrator enroll id b of admin and a b ca administrator enroll secret b of adminpw then click b next b review the summary and click b add certificate authority b br p align center img src https user images githubusercontent com 8854447 85802348 c4f66d80 b712 11ea 801b 9f2fbbb66593 gif p br associate the orderer organization ca admin identity in the nodes tab select the b orderer ca b once it is running indicated by the green box in the tile click b associate identity b on the ca overview panel on the side panel select the b enroll id b tab provide an b enroll id b of admin and an b enroll secret b of adminpw use the default value of orderer ca admin for the b identity display name b click b associate identity b to associate the admin identity with the b orderer ca b br p align center img src https user images githubusercontent com 8854447 85802898 e2780700 b713 11ea 82c7 9ffc09686f0b gif p br use orderer organization ca to register orderer and orderer admin identities select the b orderer ca b certificate authority and ensure the admin identity that was created for the ca is visible in the table the next step is to register an admin for the organization orderer click on the b register user b button give an b enroll id b of ordereradmin and an b enroll secret b of ordereradminpw set the b type b for this identity as admin specify to b use root affiliation b leave the b maximum enrollments b field blank click b next b skip the section to add attributes to this user and click b register user b repeat the process to create an identity of the orderer click on the b register user b button give an b enroll id b of orderer and an b enroll secret b of ordererpw set the b type b for this identity as orderer specify to b use root affiliation b leave the b maximum enrollments b field blank click b next b skip the section to add attributes to this user and click b register user b br p align center img src https user images githubusercontent com 
8854447 85803027 4995bb80 b714 11ea 81c7 b4b4cb2ec49d gif p br create the orderer organization msp definition navigate to the b organizations b tab in the left navigation and click b create msp definition b enter the b msp display name b as orderermsp and the b msp id b as orderermsp click b next b specify orderer ca as the b root certificate authority b click b next b select the b new identity b tab give the b enroll id b and b enroll secret b for your organization admin i e ordereradmin and ordereradminpw respectively then give the b identity name b as orderer admin click the b generate b button to enroll this identity as the admin of your organization and add the identity to the wallet click b export b to export the admin certificates to your file system click b next b review all the information and click b create msp definition b br p align center img src https user images githubusercontent com 8854447 85803287 caed4e00 b714 11ea 94e7 305880e6ba63 gif p br create an orderer navigate to the b nodes b tab in the left navigation and click b add ordering service b click b create an ordering service b and then click b next b give the b ordering service display name b as orderer and click b next b on the next screen select orderer ca as the b certificate authority b then give the b ordering service enroll id b and b ordering service enroll secret b as orderer and ordererpw respectively select the b organization msp b as orderermsp leave the b tls csr hostname b blank and select 1 4 7 0 in the drop down for b fabric version b click b next b provide orderer admin as the b orderer administrator identity b and click b next b review the summary and click b add ordering service b br p align center img src https user images githubusercontent com 8854447 86392493 8c521900 bc69 11ea 9a7f 544bfadfb34e gif p br add organization as consortium member on the orderer to transact navigate to the b nodes b tab and click on the b orderer b that was created under b consortium members b click b add organization b select the b existing msp id b tab from the drop down list select org1msp org1msp as this is the msp that represents the peer s organization org1 click b add organization b br p align center img src https user images githubusercontent com 8854447 85803823 105e4b00 b716 11ea 9ee3 28e0d30ffa95 gif p br create the channel navigate to the b channels b tab in the left navigation and click b create channel b click b next b give the b channel name b as mychannel select orderer from the b ordering service b drop down list click b next b under b organizations b select org1msp org1msp from the drop down list to add the organization org1 as a member of this channel click the b add b button set the permissions for this member as b operator b click b next b leave the b policy b as the default value i e 1 out of 1 click b next b select the b channel creator msp b as org1msp org1msp and the b identity b as org1 admin click b next b review the summary and click b create channel b br p align center img src https user images githubusercontent com 8854447 85804332 5a93fc00 b717 11ea 81e7 a4b6955575ee gif p br join your peer to the channel click on the newly created channel b mychannel b in the side panel that opens under b choose from available peers b select peer org1 once the peer is selected a check mark will be displayed next to it ensure that b make anchor peer s b is marked as yes click b join channel b br p align center img src https user images githubusercontent com 8854447 85804533 e60d8d00 b717 11ea 8066 
64d66e4b4d33 gif p br 6 deploy blockchain for maintaining digital assets smart contract on the network install a smart contract navigate to the b smart contracts b tab in the left navigation and click b install smart contract b click on b add file b browse to the location of the blockchain for maintaining digital assets smart contract package file it is probably named blockchain for maintaining digital assets 0 0 1 cds which we packaged earlier using the ibm blockchain platform extension for visual studio code once the contract is uploaded click b install smart contract b br p align center img src https user images githubusercontent com 8854447 85815413 562b0b80 b736 11ea 95c7 dbc2293d7e7d gif p br instantiate smart contract under b installed smart contracts b find the smart contract from the list note ours is called blockchain for maintaining digital assets installed on our peer and click b instantiate b from the overflow menu on the right side of the row on the side panel that opens select the channel mychannel on which to instantiate the smart contract click b next b select org1msp as the organization member to be included in the endorsement policy click b next b skip the b setup private data collection b step and simply click b next b leave the b function name b and b arguments b blank click b instantiate smart contract b br p align center img src https user images githubusercontent com 8854447 85815857 9dfe6280 b737 11ea 9883 02f86dcaa9c1 gif p br 7 connect application to the network connect with sdk through connection profile navigate to the b organizations b tab in the left navigation and click on b org1msp b click on b download connection profile b in the side panel that opens up select yes as the response for b include org1 ca for user registration and enrollment b under b select peers to include b select peer org1 then click b download connection profile b this will download the connection json which we will use to establish a connection between the node js web application and the blockchain network br p align center img src https user images githubusercontent com 8854447 85816453 5973c680 b739 11ea 9ddb 370ae50f9f81 gif p br create an application admin navigate to the b nodes b tab in the left navigation and under b certificate authorities b choose b org1 ca b click on the b register user b button give an b enroll id b of app admin and an b enroll secret b of app adminpw set the b type b for this identity as client specify to b use root affiliation b leave the b maximum enrollments b field blank click b next b click on b add attribute b enter the b attribute name b as hf registrar roles and the b attribute value b as click b register user b br p align center img src https user images githubusercontent com 8854447 85872406 b0ab8280 b79d 11ea 80e8 632d2bd39285 gif p br update application connection profile copy the connection profile you downloaded into the config folder web app server config update the config json web app server config config json file with the connection json file name you downloaded the b enroll id b and b enroll secret b for your app admin which we earlier provided as app admin and app adminpw respectively the orgmsp id which we provided as org1msp the caname which can be found in your connection json file under organizations org1msp certificateauthorities this would be like an ip address and a port the peername which can be found in your connection json file under organizations org1msp peers this would be like an ip address and a port update gateway discovery to enabled true and aslocalhost false to connect to ibm blockchain platform
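as a rough illustration of how the server side uses these values, here is a minimal sketch of opening a gateway connection with the fabric-network v1.4 sdk, using the downloaded connection profile, the app admin identity and the gateway discovery options described above; the file paths, wallet folder name and contract name below are assumptions for illustration, not necessarily the repository's exact code

```js
// Minimal sketch (fabric-network v1.4 API). The connection profile file name,
// the wallet folder and the contract name are assumed for illustration and
// may differ in the actual repository.
const { FileSystemWallet, Gateway } = require('fabric-network');
const fs = require('fs');

async function connectToNetwork() {
  // Connection profile downloaded from the IBM Blockchain Platform console
  const ccp = JSON.parse(fs.readFileSync('./config/connection-profile.json', 'utf8'));

  // File system wallet holding the app-admin identity enrolled earlier
  const wallet = new FileSystemWallet('./config/idwallet');

  const gateway = new Gateway();
  await gateway.connect(ccp, {
    wallet,
    identity: 'app-admin',
    // Mirrors the gatewayDiscovery setting in config.json: service discovery
    // on, asLocalhost off, since the peers run on IBM Blockchain Platform
    discovery: { enabled: true, asLocalhost: false }
  });

  // Channel and contract names as created and instantiated in the steps above
  const network = await gateway.getNetwork('mychannel');
  const contract = network.getContract('blockchain-for-maintaining-digital-assets');
  return { gateway, contract };
}
```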
to enabled true aslocalhost false to connect to ibm blockchain platform go to your inbox on mailtrap io and choose nodemailer in the dropdown under integrations obtain the host port auth user and auth pass values and specify them as smtphost smtpport smtpusername and smtppassword values in the config json file br p align center img src https user images githubusercontent com 8854447 85883435 fc1a5c80 b7ae 11ea 885b 20796d4dba81 gif p br go to your ibm cloud object storage instance and go to buckets in the left hand navigation pane and click on create bucket choose standard under predefined buckets provide a unique bucket name as per the naming rules specified skip the upload files step and click next skip the test bucket out step and click next once the bucket is successfully created obtain the following information from the webpage under bucket details obtain the bucket name and specify it as cos bucketname in the config json under service credentials obtain the apikey and resource instance id values and specify them as the cos apikeyid and cos serviceinstanceid respectively in the config json under endpoints obtain the public endpoint and specify this value as the cos endpoint in the config json file br p align center img src https user images githubusercontent com 8854447 85887092 2838dc00 b7b5 11ea 8dbd e70da8f2c9b3 gif p br once all this is done your config json should look something like this bash channel name mychannel smart contract name blockchain for maintaining digital assets connection file org1msp profile json appadmin app admin appadminsecret app adminpw orgmspid org1msp caname 184 172 229 220 31844 peername 184 172 229 220 30884 gatewaydiscovery enabled true aslocalhost false smtphost smtp mailtrap io smtpport 2525 smtpusername cb49e25f8cbe5f smtppassword 3734c09cfdj05f senderemail no reply digitalassetscodepattern com cos endpoint s3 us south cloud object storage appdomain cloud cos apikeyid qrc2rlbkjems755xr88 78sedgd2ai8diqxvd74g21je cos serviceinstanceid crn v1 bluemix public cloud object storage global a 86ac1b16b6f8b9639124a38d8edbd301 2f8d9627 46ff 46e9 a053 9d3e7121eedf cos bucketname blockchain digital assets bucket 8 run the application in a new terminal navigate to the server web app server directory bash cd blockchain for maintaining digital assets web app server build the node dependencies bash npm install enroll the admin and add identity to the wallet note this creates public and private key files for the app admin in the idwallet folder inside the config folder web app server config if a folder named app admin exists in the idwallet folder then the following command will not enroll the app admin as it already exists in the wallet remove the app admin folder and then run the following command bash node enrolladmin js start the server bash npm start in a separate terminal navigate to the client web app client directory bash cd blockchain for maintaining digital assets web app client build the node dependencies bash npm install start the client bash npm run serve once both the server and client have successfully started the ui can be accessed at http localhost 8080 http localhost 8080 main page of application br p align center img src https user images githubusercontent com 8854447 71941831 28063f00 3189 11ea 9f02 dfe2f78a6cbb png p br you can have a look at the introduction and demo video watch the video introduction and demo for examples of actions that can be taken within the application containerize the application here are instructions for containerizing 
the application the advantage of containerizing is that you get all of the benefits of kubernetes including standing up the front end client and backend server on a public ip address so that anyone can access them here are the steps

build tag and push the image to a container registry

```
docker build -f Dockerfile -t commpaper .
docker tag commpaper us.icr.io/commpaper/commpaper
docker push us.icr.io/commpaper/commpaper
```

ensure you have set up the kubernetes configmaps for your server

```
cd blockchain-for-maintaining-digital-assets/web-app/server/config
kubectl delete configmap configuration
kubectl create configmap configuration --from-file=config.json --from-file=connection_profile.json
```

ensure you have set up the kubernetes configmaps for your client

```
cd blockchain-for-maintaining-digital-assets/web-app
kubectl delete configmap images
kubectl delete configmap assets
kubectl create configmap assets --from-file=client/src/assets/logo.png
kubectl create configmap images --from-file=client/public/images/favicon.ico
```

deploy your application to kubernetes

```
cd blockchain-for-maintaining-digital-assets/web-app
kubectl delete -f kubernetes_deployment.yaml
kubectl apply -f kubernetes_deployment.yaml
```

note make sure to edit the kubernetes_deployment.yaml file with the correct information

troubleshooting if you get an error that says error calling register endpoint failed with error error self signed certificate you can get past this by adding "httpOptions": {"verify": false} to the certificateauthorities section of the connection profile that was downloaded from ibm blockchain platform br p align center img src https user images githubusercontent com 8854447 85960220 fb7cf400 b96f 11ea 9821 b9e21d6382e5 png p br br p align center img src https user images githubusercontent com 8854447 85960318 b2796f80 b970 11ea 9fcc b8af15bf4b38 png p br extending the code pattern this application can be extended by adding additional metadata for the digital assets adding enhanced features for registering and logging in users adding encryption to the ibm cloud object storage bucket links hyperledger fabric docs http hyperledger fabric readthedocs io en latest ibm code patterns for blockchain https developer ibm com patterns category blockchain license this code pattern is licensed under the apache software license version 2 separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses contributions are subject to the developer certificate of origin version 1 1 dco https developercertificate org and the apache software license version 2 https www apache org licenses license 2 0 txt apache software license asl faq https www apache org foundation license faq html whatdoesitmean
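the configuration steps above populate config.json with many credentials that are easy to get wrong, so a small sanity check before starting the server can save a debugging round trip. this is a minimal sketch only: the key names are transcribed from the flattened sample config above, and their exact casing and spelling must be adapted to your own config.json, it is not an official schema.

```python
import json
import sys

# Sanity-check sketch for the config.json assembled in step 7.
# REQUIRED_KEYS is an assumption based on the sample config shown above;
# adjust the names to match your actual file.
REQUIRED_KEYS = [
    "channel_name", "smart_contract_name", "connection_file",
    "appAdmin", "appAdminSecret", "orgMSPID", "caName", "peerName",
    "gatewayDiscovery", "smtpHost", "smtpPort", "smtpUsername",
    "smtpPassword", "senderEmail", "cos_endpoint", "cos_apiKeyId",
    "cos_serviceInstanceId", "cos_bucketName",
]

def check_config(path="web-app/server/config/config.json"):
    with open(path) as f:
        cfg = json.load(f)
    missing = [key for key in REQUIRED_KEYS if key not in cfg]
    if missing:
        sys.exit(f"config.json is missing: {', '.join(missing)}")
    print("config.json looks complete")

if __name__ == "__main__":
    check_config()
```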
blockchain
twitter-scraper-selenium
h1 twitter scraper selenium h1 p python s package to scrape twitter s front end easily with selenium p pypi license https img shields io pypi l ansicolortags svg https opensource org licenses mit python 3 6 9 https img shields io badge python 3 6 blue svg https www python org downloads release python 360 maintenance https img shields io badge maintained yes green svg https github com shaikhsajid1111 facebook page scraper graphs commit activity table of contents h2 table of contents h2 details open open summary table of contents summary ol li a href getting started getting started a ul li a href prerequisites prerequisites a li li a href installation installation a ul li a href sourceinstallation installing from source a li li a href pypiinstallation installing with pypi a li ul li ul li li a href usage usage a ul li a href availablefunction available functions in this package summary a li ul ul li a href profiledetail scraping profile s details a ul li a href profiledetailexample in json format example a li li a href profiledetailargument function argument a li li a href profiledetailkeys keys of the output a li ul li ul ul li a href profile scraping profile s tweets a ul li a href profilejson in json format example a li li a href profilecsv in csv format example a li li a href profileargument function arguments a li li a href profileoutput keys of the output data a li ul li a href to scrape user tweets with api scraping user s tweet using api a li ul li a href to scrape user tweets with api in json format example a li li a href users api parameter function arguments a li li a href scrape user with api args keys keys of the output a li ul li a href proxy using scraper with proxy a ul li a href unauthenticatedproxy unauthenticated proxy a li li a href authenticatedproxy authenticated proxy a li ul li li ul li li a href privacy privacy a li li a href license license a li ol details table of contents br hr h2 id prerequisites prerequisites h2 li internet connection li li python 3 6 li li chrome or firefox browser installed on your machine li hr h2 id installation installation h2 h3 id sourceinstallation installing from the source h3 p download the source code or clone it with p git clone https github com shaikhsajid1111 twitter scraper selenium p open terminal inside the downloaded folder p br python3 setup py install h3 id pypiinstallation installing with a href https pypi org pypi a h3 pip3 install twitter scraper selenium hr h2 id usage usage h2 h3 id availablefunction available function in this package summary h3 div table thead tr td function name td td function description td td scraping method td td scraping speed td tr thead tr td code scrape profile code td td scrape s twitter user s profile tweets td td browser automation td td slow td tr tr td code get profile details code td td scrape s twitter user details td td http request td td fast td tr tr td code scrape profile with api code td td scrape s twitter tweets by twitter profile username it expects the username of the profile td td browser automation http request td td fast td tr table p note http request method sends the request to twitter s api directly for scraping data and browser automation visits that page scroll while collecting the data p div br hr h3 id profiledetail to scrape twitter profile details h3 div id profiledetailexample python from twitter scraper selenium import get profile details twitter username twitterapi filename twitter api data browser firefox headless true get profile details twitter username twitter 
username filename filename browser browser headless headless output js id 6253282 id str 6253282 name twitter api screen name twitterapi location san francisco ca profile location null description the real twitter api tweets about api changes service issues and our developer platform don t get an answer it s on my website url https t co 8ikczcdr19 entities url urls url https t co 8ikczcdr19 expanded url https developer twitter com display url developer twitter com indices 0 23 description urls protected false followers count 6133636 friends count 12 listed count 12936 created at wed may 23 06 01 13 0000 2007 favourites count 31 utc offset null time zone null geo enabled null verified true statuses count 3656 lang null contributors enabled null is translator null is translation enabled null profile background color null profile background image url null profile background image url https null profile background tile null profile image url null profile image url https https pbs twimg com profile images 942858479592554497 bbazlo9l normal jpg profile banner url null profile link color null profile sidebar border color null profile sidebar fill color null profile text color null profile use background image null has extended profile null default profile false default profile image false following null follow request sent null notifications null translator type null div br div id profiledetailargument p code get profile details code arguments p table thead tr td argument td td argument type td td description td tr thead tbody tr td twitter username td td string td td twitter username td tr tr td output filename td td string td td what should be the filename where output is stored td tr tr td output dir td td string td td what directory output file should be saved td tr tr td proxy td td string td td optional parameter if user wants to use proxy for scraping if the proxy is authenticated proxy then the proxy format is username password host port td tr tbody table div hr br div h4 id profiledetailkeys keys of the output p detail of each key can be found a href https developer twitter com en docs twitter api v1 data dictionary object model user here a h4 div br hr h3 id profile to scrape profile s tweets h3 p id profilejson in json format p python from twitter scraper selenium import scrape profile microsoft scrape profile twitter username microsoft output format json browser firefox tweets count 10 print microsoft output javascript 1430938749840629773 tweet id 1430938749840629773 username microsoft name microsoft profile picture https twitter com microsoft photo replies 29 retweets 58 likes 453 is retweet false retweet link posted time 2021 08 26t17 02 38 00 00 content easy to use and efficient for all u2013 windows 11 is committed to an accessible future n nhere s how it empowers everyone to create connect and achieve more https msft it 6009x6tbw hashtags mentions images videos tweet url https twitter com microsoft status 1430938749840629773 link https blogs windows com windowsexperience 2021 07 01 whats coming in windows 11 accessibility ocid fy22 soc omc br tw windows ac hr p id profilecsv in csv format p python from twitter scraper selenium import scrape profile scrape profile twitter username microsoft output format csv browser firefox tweets count 10 filename microsoft directory home user downloads output br table class table table bordered table hover table condensed style line height 14px overflow hidden white space nowrap thead tr th title field 1 tweet id th th title field 2 username th 
th title field 3 name th th title field 4 profile picture th th title field 5 replies th th title field 6 retweets th th title field 7 likes th th title field 8 is retweet th th title field 9 retweet link th th title field 10 posted time th th title field 11 content th th title field 12 hashtags th th title field 13 mentions th th title field 14 images th th title field 15 videos th th title field 16 post url th th title field 17 link th tr thead tbody tr td 1430938749840629773 td td microsoft td td microsoft td td https twitter com microsoft photo td td align right 64 td td align right 75 td td align right 521 td td false td td td td 2021 08 26t17 02 38 00 00 td td easy to use and efficient for all windows 11 is committed to an accessible future br br here 39 s how it empowers everyone to create connect and achieve more https msft it 6009x6tbw td td td td td td td td td td https twitter com microsoft status 1430938749840629773 td td https blogs windows com windowsexperience 2021 07 01 whats coming in windows 11 accessibility ocid fy22 soc omc br tw windows ac td tr tbody table p p br hr div id profileargument p code scrape profile code arguments p table thead tr td argument td td argument type td td description td tr thead tbody tr td twitter username td td string td td twitter username of the account td tr tr td browser td td string td td which browser to use for scraping only 2 are supported chrome and firefox default is set to firefox td tr tr td proxy td td string td td optional parameter if user wants to use proxy for scraping if the proxy is authenticated proxy then the proxy format is username password host port td tr tr td tweets count td td integer td td number of posts to scrape default is 10 td tr tr td output format td td string td td the output format whether json or csv default is json td tr tr td filename td td string td td if output parameter is set to csv then it is necessary for filename parameter to passed if not passed then the filename will be same as username passed td tr tr td directory td td string td td if output format parameter is set to csv then it is valid for directory parameter to be passed if not passed then csv file will be saved in current working directory td tr tr td headless td td boolean td td whether to run crawler headlessly default is code true code td tr tbody table div hr br div id profileoutput p keys of the output p table thead tr td key td td type td td description td tr thead tbody tr td tweet id td td string td td post identifier integer casted inside string td tr tr td username td td string td td username of the profile td tr tr td name td td string td td name of the profile td tr tr td profile picture td td string td td profile picture link td tr tr td replies td td integer td td number of replies of tweet td tr tr td retweets td td integer td td number of retweets of tweet td tr tr td likes td td integer td td number of likes of tweet td tr tr td is retweet td td boolean td td is the tweet a retweet td tr tr td retweet link td td string td td if it is retweet then the retweet link else it ll be empty string td tr tr td posted time td td string td td time when tweet was posted in iso 8601 format td tr tr td content td td string td td content of tweet as text td tr tr td hashtags td td array td td hashtags presents in tweet if they re present in tweet td tr tr td mentions td td array td td mentions presents in tweet if they re present in tweet td tr tr td images td td array td td images links if they re present in tweet td tr tr td videos 
td td array td td videos links if they re present in tweet td tr tr td tweet url td td string td td url of the tweet td tr tr td link td td string td td if any link is present inside tweet for some external website td tr tbody table div br hr div id to scrape user tweets with api p to scrap profile s tweets with api p python from twitter scraper selenium import scrape profile with api scrape profile with api elonmusk output filename musk tweets count 100 div br div id users api parameter p code scrape profile with api code arguments p table thead tr td argument td td argument type td td description td tr thead tbody tr td username td td string td td twitter s profile username td tr tr td tweets count td td integer td td number of tweets to scrape td tr tr td output filename td td string td td what should be the filename where output is stored td tr tr td output dir td td string td td what directory output file should be saved td tr tr td proxy td td string td td optional parameter if user wants to use proxy for scraping if the proxy is authenticated proxy then the proxy format is username password host port td tr tr td browser td td string td td which browser to use for extracting out graphql key default is firefox td tr tr td headless td td string td td whether to run browser in headless mode td tr tbody table div br div id scrape user with api args keys p output p js 1608939190548598784 tweet url https twitter com elonmusk status 1608939190548598784 tweet details user details div br hr div h3 id proxy using scraper with proxy http proxy h3 div id unauthenticatedproxy p just pass code proxy code argument to function p python from twitter scraper selenium import scrape profile scrape profile elonmusk headless false proxy 66 115 38 247 5678 output format csv filename musk in ip port format div br div id authenticatedproxy p proxy that requires authentication p python from twitter scraper selenium import scrape profile microsoft data scrape profile twitter username microsoft browser chrome tweets count 10 output json proxy sajid pass123 66 115 38 247 5678 username password ip port print microsoft data div br hr div id privacy h2 privacy h2 p this scraper only scrapes public data available to unauthenticated user and does not holds the capability to scrape anything private p div br hr div id license h2 license h2 mit div
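because scrape_profile with output_format json returns a json string, its results can be post-processed with the standard json module. a small sketch based on the documented output keys above (likes, tweet_url), for example to find the most-liked of the scraped tweets:

```python
import json
from twitter_scraper_selenium import scrape_profile

# Scrape a handful of tweets using the documented API, then parse the
# returned JSON string (a dict keyed by tweet id) and pick the
# most-liked tweet.
raw = scrape_profile(twitter_username="microsoft", output_format="json",
                     browser="firefox", tweets_count=10)
tweets = json.loads(raw)
most_liked = max(tweets.values(), key=lambda t: t["likes"])
print(most_liked["tweet_url"], most_liked["likes"])
```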
python python3 selenium twitter twitter-scraper twitter-bot twitter-profile twitter-hashtag twitter-profiles automation twitter-api pypi json csv web-scraping social-media tweets contribution-welcome open-source hacktoberfest
front_end
FreeRTOSMinGW
freertosmingw

a basic freertos port for x86 using mingw and the winmm library

building

the toolchain used to build the project was mingw the project uses cmake as its build mechanism the proposed ide is clion
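for reference, the cmake plus mingw build flow described above can be scripted. this is a hedged sketch only: the generator name and the build directory are assumptions, not files shipped with this project.

```python
import subprocess

# Configure an out-of-tree build with the MinGW generator, then build.
# "MinGW Makefiles" and the "build" directory are assumptions here.
subprocess.run(["cmake", "-G", "MinGW Makefiles", "-S", ".", "-B", "build"], check=True)
subprocess.run(["cmake", "--build", "build"], check=True)
```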
os
stack_Project_2023_09_20
stack project 2023 09 20 cloud engineering stack project
cloud
developer.bitcoin.com
developer bitcoin com developer tooling cloud and market bitbox bitcoin com s developer platform is based on the popular bitbox javascript framework offering utility methods for mnemonics hdnodes ecpairs crypto address conversion transactions and much more badger integrate bitcoin cash bch into your next app with ease helpers tools and sdks to help connect your website to the badger wallet rest the bch json rpc over http including a fully documented and interactive gui which developers can use to test their ideas and confirm their code is making proper api calls cloud blockchain as a service infrastructure to deploy and scale your apps an ecosystem of add ons for data monitoring logging metrics testing and more all built with bitbox market paid downloads streaming media in app purchases tokens and more ways for you to monetize

install develop

```sh
yarn install
yarn start
```

build deploy

```sh
yarn run build
```

then deploy the public folder somewhere

primary technologies gatsby v2 react markdown graphql styled components love and care content contributing file structure the site follows gatsby v2 structure please refer to the gatsby docs https next gatsbyjs org for an overview data documentation tutorial insight and mastering bitcoin cash chapters are all stored as markdown files as such these pages can all be created removed and edited by just editing the associated md file these are found in src data custom markdown tags we define some custom markdown tags to extend its functionality all content nested in these custom components needs to be written as html tip a bordered box used to render a tip info warning box usage standard tip tip use a tip to display tips tip standard tip with nested i tip use a tip to display tips i in italic i tip warning tip nature warning use warnings to display warnings tip note tip nature note notes for extra info tip image caption caption below an image usage standard tip image text images wow png image caption use a tip to display tips image caption warning tip nature warning use warnings to display warnings tip note tip nature note notes for extra info tip anchor anchor to link to if the auto generated header anchors aren t working usage basic invisible anchor anchor name link to me anchor spacer empty space until we can figure out a better solution usage some extra space spacer spacer
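to make the custom tag idea above concrete, here is a toy illustration of how such tags can be expanded into html. this is not the site's actual gatsby/remark plugin, and the {% tip %} delimiters are made up for the example, since the real delimiters are not recoverable from the flattened usage samples above.

```python
import re

# Toy expander for a hypothetical {% tip %} ... {% endtip %} tag with an
# optional nature=warning/note attribute; purely illustrative.
TIP_RE = re.compile(
    r"\{%\s*tip(?:\s+nature=(\w+))?\s*%\}(.*?)\{%\s*endtip\s*%\}", re.S
)

def expand_tips(markdown: str) -> str:
    def repl(m):
        nature = m.group(1) or "tip"
        body = m.group(2).strip()
        return f'<div class="tip tip-{nature}">{body}</div>'
    return TIP_RE.sub(repl, markdown)

print(expand_tips("{% tip nature=warning %}use warnings to display warnings{% endtip %}"))
```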
front_end
HTML4Vision
html4vision a simple html visualization tool for computer vision research https github com mtli html4vision https github com mtli html4vision p align center img alt demo src examples sort png width 500px p easy table description and generation table description and generation for algorithm comparison and pipeline visualization handy formatting controls formatting to make pretty figures web publishing web publishing for remote browsering interactive tables interactive tables able to sort and toggle display states tile images tile images with optional captions and hyperlinks 3d models 3d models with camera controls installation pip install html4vision pypi version https badge fury io py html4vision svg https badge fury io py html4vision table description and generation you can use glob patterns e g results method 2 png to specify the content and use imagetable to generate the table automatically python from html4vision import col imagetable table description cols col id1 id make a column of 1 based indices col img label map images road label png specify image content for column 2 col img road object map images road roadobj png specify image content for column 3 col img amodel road mask images road amodelroad png specify image content for column 4 html table generation imagetable cols p align center img alt basic example src examples basic png p example examples basic py descriptor syntax the table is described by a list of col objects col type name content subset style href type text for text img for images overlay for image overlay image overlay id0 for zero based indices id1 for one based indices and model for 3d models name the name of the column content for images both img and overlay and 3d models it is a glob pattern https docs python org 3 library glob html or a list of the file paths for text it is a list of strings it is none for all other types indexing is automatic subset subset selection of the content provided if subset is a single integer it is interpreted as length n and the first n items are selected if subset is a tuple it is interpreted in the form of start stop or start step stop if subset is a list it is interpreted as a list of indices to be selected style a string of css rules for the entire column see styling through css styling through css for more href either a glob pattern https docs python org 3 library glob html or a list of urls works in the same way as content for img columns generation syntax basic syntax python imagetable contents cols out file index html title thumbs dir none additional contents summary row none copyright true modifiers pathrep none sortcol none precompute thumbs false thumb quality 95 style imsize none imscale 1 preserve aspect false summary color none sticky header false sort style none zebra false style none interaction overlay toggle false sortable false 3d model viewer auto rotate false camera controls true the only required argument is cols which is a sequence of col objects specifying the content of the table out file optionally names the output file while title sets the title of the generated html page the meaning and format for other arguments can be found in respective sections generating the html file to another folder search path and publish path pathrep size control size control imsize imscale preserve aspect precomputed thumbnails precomputed thumbnails thumbs dir precompute thumbs thumb quality sorting sorting sortcol sortable sticky header sort style zebra summary row and summary color display toggle display toggle overlay 
toggle styling through css styling through css style 3d models 3d models auto rotate and camera controls web publishing web publishing is designed for sharing the visualized results typically it can be used for sharing the results with others or viewing the results on a remote compute node without downloading them in advance of course it can also be used as a general purpose http server the script provided here is functionally similar to the simplehttpserver in python 2 and serves the files in the current directory it supports both python 2 3 and uses multiple threads for a much better web experience for security reasons directory browsing is disabled and accessing files outside of the current directory is not allowed so you need to type in the exact html file name e g http 127 0 0 1 6096 index html to access change the current directory to the directory you want to serve and run python m html4vision server the default port is 6096 to specify a port e g 23333 run python m html4vision server 23333 search path and publishing path the search path the path used to look for the images in the python script might be different from the publishing path the path that is encoded in the generated html it is not uncommon to generate the html file to a path other than the current folder the pathrep argument is designed for this case it can be a string or a tuple of two strings the former specifies the part of path to be removed while the latter specifies the one replacing it example examples pathrep py formatting size control the image size is controlled by imsize imscale and preserve aspect arguments note that all scaling is done through javascript which takes place after the content of the webpage is loaded imsize can be either a single index or a tuple or a list of width and height if it is a single index it means scaling the images in all other columns to match the corresponding image in the column specified by the index the index is zero based and refers to the items in the list of col objects understanding the indexing is important when you also use image overlay where two objects describe a single column for example you can scale the intermediate feature maps of a convolutional neural network cnn to match the size of the input image if imsize is a tuple or a list of width and height then all images will be scaled to that size imscale is a factor to scale the image when used in combination with imsize the imscale is applied after the effects of imsize if preserve aspect is true then the image aspect ratio will be preserved even if the size for imsize differs from the original image aspect ratio in that case the image will be resized so that the maximum image size matches imsize precomputed thumbnails the default thumbnail previews are based on using javascript to resize down the image inside the browser however if the amount of image data is too much then this mode can become prohibitively slow or even crash in the browser due to the system memory resources being exhausted if precompute thumbs is set to true then thumbnail images are precomputed and stored alongside the html file this allows the memory requirements in the browser to be greatly reduced by default a thumbnail directory name is automatically generated by appending thumbs to the name provided for out file however thumbs dir can be used to specify the thumbnail directory explicitly thumbnail generation is also cached using the modification times returned by the system s os stat function so the slow thumbnail generation need not be re 
run if thumbnails have already been computed in a previous run of the script precomputed thumbnails will be generated with the same extension as images already have in case the image extension is jpg thumb quality will control the jpeg compression quality example examples auto thumbs py styling through css the benefit of an html table is the freedom to customize styles with css below is an example of setting the image border and box shadows and highlighting a particular column p align center img alt formatting example src examples formatting png width 700px p example examples formatting py image overlay p align center img alt overlay example src examples overlay png p two consecutive col objects form a single image column the first col object describes the bottom image while the second describes the top image their types need to be img and overlay respectively if the top image by itself is not transparent you can specify its opacity by adding opacity 0 5 value range from 0 to 1 to the style field example examples overlay py interactive tables sorting the sorting feature comes in handy when you have statistics for each data point in the dataset a one time sorting can be done at html generation you can specify sortcol to an index of the column based which you want to sort furthermore you can enable post generation interactive sorting by setting sortable to true once interactive sorting is enabled a style template can be specified with sort style for a list of templates check files contains theme here https cdnjs com libraries jquery tablesorter 2 30 7 zebra stripes can be added to the table if zebra is set to true when you have too many columns it might be useful to enable set sticky header to true to keep track of the columns as you scroll down the page note that some features may seem irrelevant to sorting yet they only work when sortable is true in addition you will find how to specify a summary row with a particular color in the following example example examples sort py display toggle when image overlay is used and overlay toggle is set to true you can click the overlaid images to toggle image overlay example examples overlay py hyperlinks the href field is designed to create a clickable link for table items all column types support href as long as the column content is non empty it works in the same way as the content field of img columns mdash it will synergize with subset selection and the pathrep argument if specified see here search path and publish path for overlay columns href should be attached to the preceding img column and it cannot be used together with overlay toggle true since they both bind to the mouse click example examples href py tile images in addition to the main function imagetable we provide another function to generate a grid display of a list of images imagetile the layout is specified through n col which means each row has items no more than n col the number of rows is calculated automatically optionally you can add captions and urls to the images basic syntax python imagetile contents content n col 3 out file index html title additional contents caption none href none subset none copyright true modifiers pathrep none style imsize none imscale 1 caption bottom true style none most arguments bear the same meaning as in imagetable example examples tile py integrated example for an integrated example of how html4vision is used in practice you can check out the sap repo https github com mtli sap blob master doc tasks md web display 3d models not only does this repo 
tiles images but also 3d models the rendering and control are supported through google s model viewer https github com google model viewer the 3d models used in the example below are also from model viewer p align center img alt 3d model example src examples model png width 500px p note you need to serve the generated html with a server to view the content see web publishing web publishing directly opening the html file locally will yield cross origin error cros example examples model py contributions any contribution is welcome many thanks to these noteworthy community contributions connellybarnes https github com connellybarnes thumbnail generation bertjiazheng http github com bertjiazheng 3d model support
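putting the documented pieces above together, the sketch below builds a comparison table with an index column, hyperlinked input images, and a click-to-toggle overlay. the glob patterns and file layout are placeholders for your own data, not files shipped with the library.

```python
from html4vision import Col, imagetable

# Col fields follow the descriptor syntax above: type, name, content,
# subset, style, href. The second Col of the img/overlay pair forms a
# single image column; opacity is set via its style string.
cols = [
    Col('id1', 'ID'),
    Col('img', 'Input', 'images/input/*.png', href='images/input/*.png'),
    Col('img', 'Prediction', 'results/method_a/*.png'),
    Col('overlay', '', 'results/method_a_mask/*.png', None, 'opacity: 0.4'),
]
imagetable(cols, 'compare.html', 'Method A vs input',
           imscale=0.5, overlay_toggle=True, sortable=True)
```

serve the generated compare.html with python -m html4vision.server as described in the web publishing section.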
visualization computer-vision html
ai
CSSE3010
csse3010

firmware written for embedded systems design and interfacing csse3010 in semester 1 2018 at the university of queensland the course consisted of weekly in class exercises weekly take home demonstrations and two projects the weekly demonstrations and projects were developed focussing on various embedded systems concepts

concepts

week 1 pushbuttons and leds
week 2 pan tilt servo pwm
week 3 joystick adc infrared tx rx timer input capture
week 4 interfacing with radio transceiver fsms
week 5 real time operating systems cli
week 6 additional rtos features

projects

project 1 duplex radio ir communications

this project implemented duplex communications between two stm32f4 dev boards over radio with an acknowledgement scheme over infrared the radio packets used a 7 4 hamming encoding with a parity bit for error detection and correction the acknowledgement scheme implemented ack err and timeouts and employed manchester encoding a reference sketch of this encoding is shown after the project descriptions

project 2 remote plotter controller

this project wirelessly controls a computer numerical controlled cnc plotter and focusses on using rtos in an embedded design the design receives commands over a cli or ir and transmits packets to the remote plotter for drawing
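the 7 4 hamming scheme mentioned in project 1 maps each 4-bit nibble to a 7-bit codeword, with an overall parity bit appended for extra error detection. the following is a minimal reference sketch of that encoding, not the stm32 firmware itself.

```python
# Illustrative (7,4) Hamming encoder with an added overall parity bit.
# Codeword layout: [p1, p2, d1, p3, d2, d3, d4].
def hamming74_encode(d):
    d1, d2, d3, d4 = d              # four data bits
    p1 = d1 ^ d2 ^ d4               # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4               # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4               # covers positions 4, 5, 6, 7
    code = [p1, p2, d1, p3, d2, d3, d4]
    parity = sum(code) % 2          # overall parity bit for detection
    return code + [parity]

print(hamming74_encode([1, 0, 1, 1]))  # one nibble -> 8 transmitted bits
```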
os
gin-user-center
gin user center

based on gin user start https github com pengfeidai gin user start written in go clone the repo

```sh
git clone git@github.com:pengfeidai/gin-user-center.git
```

requires go version 1 13 configure the global go module environment

```sh
export GO111MODULE=on
export GOPROXY=https://goproxy.io
# or
go env -w GOPROXY=https://goproxy.cn,direct
```

run the server

```sh
cd gin-user-center
go run main.go
# Listening and serving HTTP on port: 8000, pid: 15932
```

health check the http server

```sh
curl -X GET 'http://127.0.0.1:8000/health/check?name=world'
```

configuration yaml server port 8000 mode release limitnum 20 mongo usemongo false redis useredis false redis redis addr 127 0 0 1 6379 password db 0 mysql mysql user password path 127 0 0 1 3306 database user center config charset utf8 parsetime true loc local driver mysql maxidleconns 10 maxopenconns 100 log false mongo database url session key size 10 7 86400 7 maxage 604800 path domain httponly true log debug true maxage 7 filename server log dirname opt data gin user center logs file dirname opt data gin user center file dirname users zl workspace go gin user center public file urlprefix http 127 0 0 1 8000 api v1 gin user center file url prefix api v1 gin user center oss endpoint accesskeyid accesskeysecret bucket

build and run with docker

```sh
# docker build
make docker
docker run --name gin-user-center --network host \
  -v /opt/conf/gin-user-center:/opt/conf \
  -v /data/gin-user-center:/data/gin-user-center \
  -d gin-user-center
```
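the health-check endpoint can also be hit from a script instead of curl. a small sketch follows; note that the exact route spelling is inferred from the curl example above and may differ in your build.

```python
import requests

# Quick smoke test against the running service; /health/check is an
# assumption reconstructed from the curl example above.
resp = requests.get("http://127.0.0.1:8000/health/check", params={"name": "world"})
print(resp.status_code, resp.text)
```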
server
bootplus
bootplus v1 0 5 https github io aozora bootplus please note that this project is no longer maintained bootplus is a front end framework for faster and easier web development inspired by the latest google look feel created and maintained by aozora http twitter com aozoralabs bootplus is based on twitter bootstrap http twitter github io bootstrap to get started check out http aozora github com bootplus http aozora github io bootplus quick start three quick start options are available download the latest release http aozora github io bootplus zipball master clone the repo git clone git github com aozora bootplus git read the getting started page http aozora github io bootplus getting started for information on the framework contents templates and examples and more versioning for transparency and insight into our release cycle and for striving to maintain backward compatibility bootplus will be maintained under the semantic versioning guidelines as much as possible releases will be numbered with the following format major minor patch and constructed with the following guidelines breaking backward compatibility bumps the major and resets the minor and patch new additions without breaking backward compatibility bumps the minor and resets the patch bug fixes and misc changes bumps the patch for more information on semver please visit http semver org http semver org bug tracker have a bug or a feature request please open a new issue https github com twitter bootplus issues before opening any issue please search for existing issues and read the issue guidelines https github com necolas issue guidelines written by nicolas gallagher https github com necolas community keep track of development and community news follow aozoralabs on twitter http twitter com aozoralabs compiling css and javascript bootplus includes a makefile makefile with convenient methods for working with the framework before getting started be sure to install the necessary local dependencies package json npm install when completed you ll be able to run the various make commands provided build make runs the recess compiler to rebuild the less files and compiles the docs requires recess and uglify js test make test runs jshint and qunit tests headlessly in phantomjs http code google com p phantomjs used for ci depends on having phantomjs installed watch make watch this is a convenience method for watching just less files and automatically building them whenever you save requires the watchr gem should you encounter problems with installing dependencies or running the makefile commands be sure to first uninstall any previous versions global and local you may have installed and then rerun npm install authors marcello palmitessa http twitter com aozoralabs http twitter com aozoralabs https github com aozora https github com aozora copyright and license bootplus is dual licensed gpl 2 and apache 2 see the license file
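the semver guidelines quoted above can be expressed as a tiny worked example, which makes the reset rules explicit. this is an illustrative helper only, not part of bootplus.

```python
# Mirror of the bump rules described above: breaking change -> major+1
# (minor and patch reset); new addition -> minor+1 (patch reset);
# bug fix / misc change -> patch+1.
def bump(version: str, change: str) -> str:
    major, minor, patch = map(int, version.split("."))
    if change == "breaking":
        return f"{major + 1}.0.0"
    if change == "feature":
        return f"{major}.{minor + 1}.0"
    return f"{major}.{minor}.{patch + 1}"

assert bump("1.0.5", "breaking") == "2.0.0"
assert bump("1.0.5", "feature") == "1.1.0"
assert bump("1.0.5", "bugfix") == "1.0.6"
```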
bootstrap css
front_end
Paryatana
welcome to paryatana https arun9739 github io paryatana forks https badgen net github forks arun9739 paryatana icon github scale 1 2 color red cache 300 nbsp stars https badgen net github stars arun9739 paryatana icon https upload wikimedia org wikipedia commons a a3 orange star svg scale 1 2 color orange cache 300 nbsp license https badgen net badge license mit purple scale 1 2 nbsp br click here https arun9739 github io paryatana to have a look at our website p align center paryatana is a front end website for a travel agency the website contains details about tourist destinations available hotels and car booking services we focus on displaying various attractive places within the country and tourist packages are updated as they are announced by the travel agency details about the founder and manager are included together with links to the paryatana website social media pages p br br p align center img src https forthebadge com images badges built by developers svg img src https forthebadge com images badges uses brains svg br img src https forthebadge com images badges made with javascript svg img src https forthebadge com images badges powered by responsibility svg img src https forthebadge com images badges built with love svg p why contribute contributions are a good way to offer help to project maintainers and improve the functionality and experience of a software it also helps you level up your skillset by working on real world tasks how to contribute if you encounter a bug on the website raise an issue with the proposed fix if you can modify the user interface to improve user experience on the website raise an issue with the proposed changes if you would like to add more functionality to the site raise an issue with the proposed changes contributing guidelines take a look at contributing guidelines contributing md if you re interested in contributing refer github flow https guides github com introduction flow the heart of this project are our contributors a href https github com arun9739 paryatana graphs contributors img src https contrib rocks image repo arun9739 paryatana a mit licensed https github com arun9739 paryatana blob main license
html-css-javascript frontend hacktoberfest hacktoberfest-accepted hacktoberfest2022 javascript
front_end
esp8266-rtos-sdk
introduction

this project builds a docker image published at nsfilho/esp8266-rtos-sdk:3.3 the goal of this docker image is to provide a complete isolated environment for building firmware making the entire process easy

example

you can see a sample of its use in github actions on the e12aio3 firmware https github com nsfilho e12aio3

```yml
name: esp8266 rtos sdk build
on:
  push:
    branches:
      - master
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: nsfilho/esp8266-rtos-sdk@v2
```

thank you and enjoy
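the same image can also be driven locally instead of through github actions. a hedged sketch using the docker sdk for python follows; the in-container build command and volume paths are assumptions to adapt to your project.

```python
import docker

# Run a firmware build inside the published image. "make" and the
# /path/to/firmware mount are placeholders, not documented entry points.
client = docker.from_env()
logs = client.containers.run(
    "nsfilho/esp8266-rtos-sdk:3.3",   # image tag described above
    "make",                            # assumed build command
    volumes={"/path/to/firmware": {"bind": "/project", "mode": "rw"}},
    working_dir="/project",
    remove=True,
)
print(logs.decode())
```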
os
polars
h1 align center img src https raw githubusercontent com pola rs polars static master logos polars github logo rect dark name svg br h1 div align center a href https crates io crates polars img src https img shields io crates v polars svg a a href https pypi org project polars img src https img shields io pypi v polars svg alt pypi latest release a a href https www npmjs com package nodejs polars img src https img shields io npm v nodejs polars svg alt npm latest release a a href https rpolars r universe dev img src https rpolars r universe dev badges polars alt r universe latest release a a href https doi org 10 5281 zenodo 7697217 img src https zenodo org badge doi 10 5281 zenodo 7697217 svg alt doi latest release a div p align center b documentation b a href https pola rs github io polars py polars html reference index html python a a href https docs rs polars latest polars rust a a href https pola rs github io nodejs polars index html node js a a href https rpolars github io index html r a b stackoverflow b a href https stackoverflow com questions tagged python polars python a a href https stackoverflow com questions tagged rust polars rust a a href https stackoverflow com questions tagged nodejs polars node js a a href https stackoverflow com questions tagged r polars r a a href https pola rs github io polars user guide a a href https discord gg 4ufp5cfbe7 discord a p polars blazingly fast dataframes in rust python node js r and sql polars is a dataframe interface on top of an olap query engine implemented in rust using apache arrow columnar format https arrow apache org docs format columnar html as the memory model lazy eager execution multi threaded simd query optimization powerful expression api hybrid streaming larger than ram datasets rust python nodejs r to learn more read the user guide https pola rs github io polars python python import polars as pl df pl dataframe a 1 2 3 4 5 fruits banana banana apple apple banana b 5 4 3 2 1 cars beetle audi beetle beetle beetle embarrassingly parallel execution very expressive query language df sort fruits select fruits cars pl lit fruits alias literal string fruits pl col b filter pl col cars beetle sum pl col a filter pl col b 2 sum over cars alias sum a by cars pl col a sum over fruits alias sum a by fruits pl col a reverse over fruits alias rev a by fruits pl col a sort by b over fruits alias sort a by b by fruits shape 5 8 fruits cars literal stri b sum a by ca sum a by fr rev a by fr sort a by b ng fruits rs uits uits by fruits str str i64 str i64 i64 i64 i64 apple beetle fruits 11 4 7 4 4 apple beetle fruits 11 4 7 3 3 banana beetle fruits 11 4 8 5 5 banana audi fruits 11 2 8 2 2 banana beetle fruits 11 4 8 1 1 sql python create a sql context context pl sqlcontext register a table table pl scan ipc file arrow context register my table table the query we want to run query select sum v1 as sum v1 min v2 as min v2 from my table where id1 id016 limit 10 option 1 run query to materialization context query query shape 1 2 sum v1 min v2 i64 i64 298268 1 option 2 don t materialize the query but return as lazyframe and continue in python lf context execute query lf join other table group by foo agg pl col sum v1 count collect sql commands can also be ran directly from your terminal using the polars cli bash run an inline sql query polars c select sum v1 as sum v1 min v2 as min v2 from read ipc file arrow where id1 id016 limit 10 run interactively polars polars cli v0 3 0 type help for help select sum v1 as sum v1 min v2 as min v2 from read 
ipc file arrow where id1 id016 limit 10 refer to the polars cli repository https github com pola rs polars cli for more information performance blazingly fast polars is very fast in fact it is one of the best performing solutions available see the results in duckdb s db benchmark https duckdblabs github io db benchmark in the tpch benchmarks https www pola rs benchmarks html polars is orders of magnitudes faster than pandas dask modin and vaex on full queries including io lightweight polars is also very lightweight it comes with zero required dependencies and this shows in the import times polars 70ms numpy 104ms pandas 520ms handles larger than ram data if you have data that does not fit into memory polars lazy is able to process your query or parts of your query in a streaming fashion this drastically reduces memory requirements so you might be able to process your 250gb dataset on your laptop collect with collect streaming true to run the query streaming this might be a little slower but it is still very fast setup python install the latest polars version with sh pip install polars we also have a conda package conda install c conda forge polars however pip is the preferred way to install polars install polars with all optional dependencies sh pip install polars all pip install polars numpy pandas pyarrow install a subset of all optional dependencies you can also install the dependencies directly tag description all install all optional dependencies all of the following pandas install with pandas for converting data to and from pandas dataframes series numpy install with numpy for converting data to and from numpy arrays pyarrow reading data formats using pyarrow fsspec support for reading from remote file systems connectorx support for reading from sql databases xlsx2csv support for reading from excel files openpyxl support for reading from excel files with native types deltalake support for reading from delta lake tables pyiceberg support for reading from apache iceberg tables timezone timezone support only needed if are on python 3 9 or you are on windows releases happen quite often weekly every few days at the moment so updating polars regularly to get the latest bugfixes features might not be a bad idea rust you can take latest release from crates io or if you want to use the latest features performance improvements point to the main branch of this repo toml polars git https github com pola rs polars rev optional git tag required rust version 1 71 contributing want to contribute read our contribution guideline contributing md python compile polars from source if you want a bleeding edge release or maximal performance you should compile polars from source this can be done by going through the following steps in sequence 1 install the latest rust compiler https www rust lang org tools install 2 install maturin https maturin rs pip install maturin 3 cd py polars and choose one of the following make build release fastest binary very long compile times make build opt fast binary with debug symbols long compile times make build debug opt medium speed binary with debug assertions and symbols medium compile times make build slow binary with debug assertions and symbols fast compile times append native e g make build release native to enable further optimizations specific to your cpu this produces a non portable binary wheel however note that the rust crate implementing the python bindings is called py polars to distinguish from the wrapped rust crate polars itself however both the python 
package and the python module are named polars so you can pip install polars and import polars use custom rust function in python extending polars with udfs compiled in rust is easy we expose pyo3 extensions for dataframe and series data structures see more in https github com pola rs pyo3 polars going big do you expect more than 2 32 4 2 billion rows compile polars with the bigidx feature flag or for python users install pip install polars u64 idx don t use this unless you hit the row boundary as the default polars is faster and consumes less memory legacy do you want polars to run on an old cpu e g dating from before 2011 or on an x86 64 build of python on apple silicon under rosetta install pip install polars lts cpu this version of polars is compiled without avx https en wikipedia org wiki advanced vector extensions target features sponsors img src https www jetbrains com company brand img jetbrains logo png height 50 https www jetbrains com
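as a concrete instance of the larger-than-ram workflow described above, the sketch below scans a file lazily and collects with the streaming engine, exactly the collect(streaming=True) path the text mentions. the file name and column names are placeholders for your own dataset.

```python
import polars as pl

# Lazily scan, push the filter down, aggregate per group, and run the
# query with the streaming engine so the whole file never has to fit
# in memory. "measurements.csv" and its columns are assumptions.
lf = (
    pl.scan_csv("measurements.csv")
      .filter(pl.col("value") > 0)
      .group_by("station")
      .agg(pl.col("value").mean().alias("mean_value"))
)
df = lf.collect(streaming=True)
print(df)
```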
dataframe-library dataframe dataframes rust arrow python out-of-core polars
front_end
ReForm-Eval
div align center h1 style display inline block font size 48px reform eval h1 div p align center img src https avatars githubusercontent com u 100903507 s 200 v 4 alt fudan disc logo style display inline block vertical align middle height 48px p p align center img src https img shields io badge version v1 0 green img src https img shields io badge licence apache 2 0 green a href https github com fudandisc img src https img shields io badge disc repositories blue a img src https img shields io github stars fudandisc reform eval label stars a href https hits seeyoufarm com img src https hits seeyoufarm com api count incr badge svg url https 3a 2f 2fgithub com 2ffudandisc 2freform eval count bg 23d8659b title bg 23555555 icon icon color 23e7e7e7 title visitors edge flat false a p p align center a href https arxiv org pdf 2310 02569 pdf img src https img shields io badge paper pdf red a a href https arxiv org abs 2310 02569 img src https img shields io badge paper arxiv red a a href https huggingface co datasets aweminus reform eval data tree main img src https img shields io badge hugging face dataset orange a a href https drive google com file d 1gjwvm0f6fkj7vfyskyefb2n kyzxcdyi view img src https img shields io badge google drive dataset orange logo googledrive a p div align center h2 reform eval evaluating large vision language models via unified re formulation of task oriented benchmarks h2 div p align center strong zejun li sup 1 sup sup sup ye wang sup 1 sup sup sup mengfei du sup 1 sup sup sup qingwen liu sup 1 sup sup sup binhao wu sup 1 sup sup sup jiwen zhang sup 1 sup sup sup chengxing zhou sup 2 sup zhihao fan sup 3 sup jie fu sup 4 sup jingjing chen sup 1 sup xuanjing huang sup 1 sup zhongyu wei sup 1 sup sup sup strong p p align center sup 1 sup fudan university sup 2 sup northeastern university sup 3 sup alibaba group sup 4 sup hong kong university of science and technology p p align center sup sup equal contribution sup sup corresponding author p p align center a href https arxiv org abs 2310 02569v1 reform eval paper a a href https huggingface co datasets aweminus reform eval tree main reform eval data a a href https drive google com file d 1gjwvm0f6fkj7vfyskyefb2n kyzxcdyi view google drive a p recent years have witnessed remarkable progress in the development of large vision language models lvlms benefiting from the strong language backbones and efficient cross modal alignment strategies lvlms exhibit surprising capabilities to perceive visual signals and perform visually grounded reasoning however the capabilities of lvlms have not been comprehensively and quantitatively evaluated most existing multi modal benchmarks require task oriented input output formats posing great challenges to automatically assess the freeform text output of lvlms to effectively leverage the annotations available in existing benchmarks and reduce the manual effort required for constructing new benchmarks we propose to re formulate existing benchmarks into unified lvlm compatible formats through systematic data collection and reformulation we present the reform eval benchmark offering substantial data for evaluating various capabilities of lvlms based on reform eval we conduct extensive experiments thoroughly analyze the strengths and weaknesses of existing lvlms and identify the underlying factors our benchmark and evaluation framework will be open sourced as a cornerstone for advancing the development of lvlms we explore ways of re formulating existing benchmarks into unified formats that are 
compatible with lvlms p align center img src short png p span style font size larger existing lvlms evaluation span no quantification the capabilities of existing lvlms are mainly demonstrated only by qualitative examples task oriented most existing multi modal benchmarks cannot be directly utilized to evaluate lvlms since they are designed for specific tasks and rely on structured input output formats for evaluation even need to be fine tuned or learn task specific parameters limited samples limited manual annotation such as around 100 samples per dimension in mme and mmbench could potentially introduce evaluation bias into the results span style font size larger based on the re formulation framework we present our unified multi modal benchmark reform eval span larger data scale reform eval provides a dataset scale almost 100 times larger than existing benchmarks allowing models to be comprehensively evaluated across various dimensions without manual annotation reform eval leverages publicly open resources reducing annotation costs while providing a larger scale dataset universal evaluation unlike lvlm ehub which requires designing complex and dataset specific evaluation strategies reform eval offers greater scalability and a more universally applicable and efficient evaluation approach comprehensive evaluation we re formulate 61 benchmark datasets based on existing data resources the evaluation dimensions range from basic visual perception to high level visual reasoning and dialog unified re formulation multi modal benchmark datasets are re formulated as multiple choice problems or specialized text generation problems additionally generation based black box and likelihood based white box approaches are implemented for evaluation the unified formulation enables universal and comprehensive evaluation for each formulation we design a consistent and reliable evaluation method as mentioned in fu et al 2023 https arxiv org abs 2306 13394 current lvlms may struggle to follow multiple choice instructions we propose both black box and white box approaches to assist 1 guiding lvlms to output in desired formats through in context learning 2 directly calculating the generation probability for options and selecting the one with the highest value considering the sensitivity of lvlms to the input prompts zeng et al 2023 https arxiv org abs 2307 02469 we additionally design an instability aware evaluation strategy and introduce a metric to characterize such instability reform eval serves as a reliable tool for quantitative analysis of lvlms aiding in the research and development of lvlms we welcome a diverse range of large vision and language models to participate in reform eval benchmark evaluation update if you have any questions please send us an email or leave a github issue email yewang22 m fudan edu cn 2023 10 we released the initial version of the reform eval https arxiv org abs 2310 02569 containing interfaces of 16 models and 61 converted reformulated datasets reform eval data https huggingface co datasets aweminus reform eval data tree main contents model performance model performance getting start getting start install install pipeline pipeline load data load data create your own model interface create your own model interface evaluation evaluation demo demo parameters parameters model usage model usage data usage data usage output result output result citation citation acknowledgements acknowledgements related projects related projects model performance we list the average ranking and the 
## Model Performance

We list the average rank and the average score of each model under generation evaluation and likelihood evaluation in the table below. If you obtain results on our benchmark using a new LVLM interface, please contact us to add your model to this table. Email: yewang22@m.fudan.edu.cn

| Model | Gen Avg Rank | Gen Avg Score | Like Avg Rank | Like Avg Score |
| --- | --- | --- | --- | --- |
| BLIP-2 | 2.3 | 62.94 | 4.3 | 62.92 |
| InstructBLIP (F) | 2.0 | 60.77 | 4.0 | 63.48 |
| InstructBLIP (V) | 4.4 | 52.20 | 3.0 | 64.37 |
| LLaVA (V) | 11.1 | 34.24 | 8.7 | 55.49 |
| LLaVA (L2) | 5.9 | 45.78 | 11.2 | 52.97 |
| MiniGPT4 | 7.3 | 43.12 | 7.8 | 56.15 |
| mPLUG-Owl | 10.6 | 37.95 | 10.3 | 53.69 |
| PandaGPT | 13.9 | 26.84 | 15.8 | 41.80 |
| IB-LLM | 13.0 | 30.24 | 14.5 | 47.58 |
| LA-V2 | 12.5 | 32.60 | 12.2 | 50.00 |
| mmGPT | 14.4 | 29.38 | 12.8 | 50.92 |
| Shikra | 11.0 | 36.14 | 7.0 | 58.40 |
| Lynx | 5.0 | 50.00 | 2.8 | 63.93 |
| Cheetor (V) | 6.8 | 44.74 | 8.2 | 56.73 |
| Cheetor (L2) | 7.9 | 41.75 | 10.7 | 52.43 |
| BLIVA | 7.9 | 42.40 | 2.7 | 64.92 |

"Gen Avg Rank" and "Like Avg Rank" represent the average rank under generation and likelihood evaluation; "Gen Avg Score" and "Like Avg Score" are the average scores under generation and likelihood evaluation, respectively.

## Getting Started

### Install

1. Clone our repository via the following commands:

```bash
git clone https://github.com/FudanDISC/ReForm-Eval.git
cd ReForm-Eval
pip install -r requirements.txt
```

If you want to test all 16 existing models, you need to run the following commands instead:

```bash
git clone https://github.com/FudanDISC/ReForm-Eval.git --recursive
cd ReForm-Eval
pip install -r requirements.txt
```

2. Build from source:

```bash
git clone https://github.com/FudanDISC/ReForm-Eval.git
cd ReForm-Eval
pip install .
```

The advantage of building from source is that you can directly replace the `python run_eval.py` and `python run_loader_eval.py` commands with `run_eval` or `run_loader_eval` (by modifying the config file), and they can be executed from any path, including the dataloader function `load_reform_dataset`. Open your shell configuration file:

```bash
vim ~/.bashrc
```

Add the following line at the end of the file:

```bash
export PYTHONPATH=/path/to/ReForm-Eval:$PYTHONPATH
```

Note: once you use `run_eval` or `run_loader_eval` from other paths, the file-directory-related parameters should be set to absolute paths.

## Pipeline

Our benchmark provides accuracy and instability as metrics for each task to quantify model performance. We provide two methods: (a) create the interface in our framework and run it directly; (b) use the data loader we provide, output the inference results, and then use a new script to evaluate against our benchmark, taking the problem formulation and the output JSON file as input.

### Method A

- Step 1: use an existing model interface or create a new model interface based on the ReForm-Eval framework (refer to Create Your Own Model Interface).
- Step 2: create the conda env corresponding to the model and install the necessary packages.
- Step 3: switch to the corresponding conda env, run `run_eval.py` in the root path of this repository, and add the necessary parameters:

```bash
CUDA_VISIBLE_DEVICES=0,1,2,3,4,5,6,7 torchrun --nproc_per_node=8 run_eval.py \
    --model lynx --model_name models/interfaces/lynx/configs/lynx.yaml \
    --dataset_name visdial --output_dir output/lynx/visdial_test_generation \
    --per_gpu_eval_batch_size 4 --formulation SingleChoice \
    --infer_method generation --do_eval --half_evaluation \
    --dataset_duplication 1 --in_context_sample --option_mark upper \
    --dataset_config build/configs/visdial_val_v1.2.yaml
```

- Step 4: check the inference progress and results in the terminal; the accuracy, the format hit rate, and the instability can also be viewed in `output_dir/log.txt`.

### Method B

- Step 1: build a dataset using our data loader and process it into a string with the desired format of the corresponding model:

from build import load_reform
dataset example for loading vqa v2 dataset load reform dataset dataset config please check data usage for available arguments dataset name vqa formulation singlechoice dataset config path to reform eval build configs vqa vqa v2 val yaml inference method generation inference method generation likeligood in context sample true whether to include in context sample random instruct true whether to use different instructions for the same sample data duplication 5 number of multiple tests for the same sample shuffle options true whether to shuffle the options for the same sample load from hf true optional whether to load from huggingface offline from hf false optional whether to load the huggingface data from the local path step 2 the model outputs a json file such as path to tdiuc singlechoice likelihood imagebindllm imagebindllm json based on the dataset built by step 1 step 3 run our new script run loader eval py taking the problem formulation and the output json file as main parameters of input bash python run loader eval py formulation singlechoice infer method likelihood eval stability prediction file test output singlechoice tdiuc singlechoice likelihood imagebindllm imagebindllm json or python from run loader eval import loader eval dataset loader eval formulation singlechoice infer method likelihood multi round eval false eval stability true prediction file path to tdiuc singlechoice likelihood imagebindllm imagebindllm json note there are four types of formulation singlechoice generation ocropenended and kieopenended respectively it can only be set eval stability and multi round eval when formulation singlechoice which means that only singlechoice can measure the instability and be used for the multi round evaluation notice that each sample in the output json are supposed to be specific format python dataset information sample id vqa 0 answer 1 answer options yes no maybe prediction a yes the prediction note during generation based evaluation for multiple choice questions we only consider the format like a a 1 if a prediction does not hit the format it will be considered wrong the requirement for likelihood prediction is int and for generation prediction is str step 4 the accuracy the format hit rate or instability can be viewed in output dir log txt load data there are two ways to load data using our framework directly or using data loader the most recommendation is using hugging face data which we call it reform eval data we introduce how to load reform eval data from hugging face hub or the local path if this still does not work we also provide other loading methods please refer to prepare dataset build prepare dataset md prepare dataset for more details here is the google drive link of reform eval data and you can directly download it to load from the local path download url https drive google com file d 1gjwvm0f6fkj7vfyskyefb2n kyzxcdyi view https drive google com file d 1gjwvm0f6fkj7vfyskyefb2n kyzxcdyi view wget wget https drive google com uc export download id 1gjwvm0f6fkj7vfyskyefb2n kyzxcdyi using reform eval framework if you load data from reform eval framework when running run eval py and run loader eval py you should set the data related parameters including dataset name formulation dataset config dataset duplication in context sample and capitalize please set hf or offline hf if you would like to load reform eval data hf is loading from hugging face hub and offline hf is loading reform eval data from the local path if set at the same time data will be loaded from hugging 
face hub using data loader reform eval provides the direct data loader if you would like to perform evaluation without our framework here is an example python from build import load reform dataset example for loading vqa v2 dataset load reform dataset dataset config please check data usage for available arguments dataset name vqa formulation singlechoice dataset config path to reform eval build configs vqa vqa v2 val yaml inference method generation inference method generation likeligood in context sample true whether to include in context sample random instruct true whether to use different instructions for the same sample data duplication 5 number of multiple tests for the same sample shuffle options true whether to shuffle the options for the same sample load from hf true optional whether to load from huggingface option mark upper optional the option mark to use number upper lower random offline from hf false optional whether to load the huggingface data from the local path notice that each sample of the loaded dataset will be a dict containing all information like sample id vqa 000 image pil jpegimageplugin jpegimagefile image mode rgb size 640x484 question is there a cat in the image answer 2 answer options yes no maybe instruct based on the image answer the question with the provided options question with option is there a cat in the image options a yes b no c maybe you may need to process them into a string with the desired format you may be intersted in the preprocessors models prepare models md preprocessors we used in reform eval to gather the information into a dialogue like string as the input for you model all valid datasets and corresponding arguments are in the data usage data usage please set load from hf true or offline from hf true if you would like to load reform eval data load from hf true is loading from hugging face hub and offline from hf true is loading reform eval data from the local path if true is set at the same time data will be loaded from hugging face hub create your own model interface to add new models you need to create the corresponding model interface for the unified evaluation for a general new model interface please refer to the interface template in path to reform eval models interfaces base interface py here we provide a step by step guide for the convenience of your implementation taking lynx as an example step 1 configure the code path add the lynx project as a submodule to path to reform eval models interfaces bash cd models interfaces git submodule add https github com bytedance lynx llm git step 2 model loading refer to the code for loading the model in the original lynx project python def main args config print evaluating flush true device torch device args device seed args seed utils get rank torch manual seed seed np random seed seed random seed seed cudnn benchmark true print config json dumps config flush true print output path args output path flush true print creating model flush true from models lynx import lynxbase model lynxbase config config freeze vit config freeze vit freeze llm config freeze llm load bridge false so we can implement the init function for model loading in our interface python class lynx interface nn module def init self model config none device none half false inference method generation none super lynx interface self init setup the model device if device is none self device torch device cuda if torch cuda is available else cpu else self device torch device device loading the model self config yaml load open model 
config r loader yaml loader self model lynxbase config self config freeze vit self config freeze vit freeze llm self config freeze llm load bridge false locate the model to half precision and target device if needed self prec half half if self prec half self model self model half self model self model to self device setup the inference method self inference method inference method step 3 implement the inference function generation based black box evaluation we provide the black box generation based inference method black box generation based inference method args image list pil image the batch of input images each element is loaded as pil image prompt list str the batch of input textual prompts prompts should be formulated as a dialoge by the model preprocessor see utils preprocessors py temperature float optional a generation related parameter the temperature parameter in the generation process of language models max new tokens int optional a generation related parameter the maximal number of tokens a model can generate returns outputs list str the generated output response in text an example is provided below python an example of vqa for llava from models interfaces llava interface import llava interface from pil import image image image open path to image convert rgb model llava interface path to llava device cuda 0 prompt a chat between a curious human and an artificial intelligence assistant the assistant gives helpful detailed and polite answers to the human s questions human image n can you see the image options a yes b no assistant the answer is a yes human what color is the truck options a blue b orange assistant the answer is generation based inference outputs model raw batch generate image prompt outputs b orange then find the generation related code in the original lynx project python torch no grad def evaluation model data loader device config test model eval result for n idx vision input input ids input atts in enumerate data loader vision input vision input to device non blocking true input ids input ids to device input atts input atts to device text outputs model generate vision input vision input input ids input ids input atts input atts use nucleus sampling config get use nucleus sampling false apply lemmatizer config apply lemmatizer num beams config num beams min length config min length length penalty config get length penalty 1 0 no repeat ngram size config get no repeat ngram size 1 top p config get top p 0 9 top k config get top k 3 max new tokens config get max new tokens 64 for i output in zip idx text outputs result append index i text output output strip return result therefore in lynx interface py we can implement the generation inference function as python torch no grad def raw generate self image prompt temperature 1 max new tokens 30 vision input self load vision inp image unsqueeze 0 if self prec half vision input vision input to torch float16 input ids input atts self process text prompt answer self model generate vision input vision input input ids input ids input atts input atts use nucleus sampling self config get use nucleus sampling false apply lemmatizer self config apply lemmatizer num beams 3 self config num beams min length self config min length length penalty self config get length penalty 1 0 no repeat ngram size self config get no repeat ngram size 1 top p self config get top p 0 9 top k self config get top k 3 max new tokens max new tokens temperature temperature return answer 0 in this function you have to use the internal vision processor 
to get the vision input open and get the image and the internal tokenizer to get the input ids and input atts all of these codes can be directly found and implemented from the original project python def load vision inp self vision inp if vision inp is none return none elif isinstance vision inp list or isinstance vision inp np ndarray return self get frames vision inp elif isinstance vision inp str if os path exists vision inp image image open vision inp convert rgb else base64 encoding try image image open io bytesio b64decode vision inp convert rgb except exception as e raise valueerror f check whether it is a rpath and not exist vision inp e else image vision inp image self img transform image return image to self device def process text self text text text strip if self lower text text text lower input ids self tokenizer bos token self tokenizer tokenize text print input ids input ids self tokenizer convert tokens to ids input ids input atts torch longtensor 1 len input ids input ids torch longtensor input ids return input ids to self device input atts to self device likelihood based white box evaluation we provide the white box likelihood based inference method white box likelihood based inference method args image list pil image the batch of input images each element is loaded as pil image prompt list str the batch of input textual prompts prompts should be formulated as a dialoge by the model preprocessor see utils preprocessors py candidates list list str the list of candidate lists each element candidates i is the candidate list of the corresponding question returns outputs list int the generated output prediction index each element outputs i is the selected index of the corresponding candidates the prediction is therefore candidates i outputs i here is an example python an example of vqa for llava from models interfaces llava interface import llava interface from pil import image image image open path to image convert rgb model llava interface path to llava device cuda 0 prompt a chat between a curious human and an artificial intelligence assistant the assistant gives helpful detailed and polite answers to the human s questions human what color is the truck assistant candidates orange blue likelihood based inference outputs model raw batch predict image prompt candidates outputs 1 to support the likelihood evaluation we add the following function in our model file path to reform eval models interfaces lynx models lynx py to calculate the loss neg log likelihood for each sequence python def forward likelihood self vision input input ids input atts labels likelihood reduction sum text embeds self embed tokens input ids if vision input is not none vision embeds vision atts self get vision embeds vision input v2t feats v2t atts self bridge vision embeds vision embeds vision atts vision atts inputs embeds torch cat v2t feats text embeds dim 1 attention mask torch cat v2t atts input atts dim 1 else inputs embeds text embeds attention mask input atts outputs self llm inputs embeds inputs embeds attention mask attention mask labels labels return dict true reduction none loss outputs loss reshape inputs embeds shape 0 1 if likelihood reduction sum loss loss sum 1 elif likelihood reduction mean valid num targets loss 0 sum 1 loss loss sum 1 valid num targets elif likelihood reduction none loss loss else raise valueerror return loss hence in lynx interface py we can use self model forward likelihood at the raw predict function python def raw predict self image prompt candidates likelihood 
reduction sum loading the image text pair vision input self load vision inp image unsqueeze 0 if self prec half vision input vision input to torch float16 input ids attention mask self process text prompt get the embedding from the input num cand len candidates input seq len input ids shape 1 tokenize the candidates current padding side self tokenizer padding side current truncation side self tokenizer truncation side self tokenizer padding side right self tokenizer truncation side right if self lower text candidates cand lower for cand in candidates candidates tokens self tokenizer candidates return tensors pt padding longest to self device self tokenizer padding side current padding side self tokenizer truncation side current truncation side construct the inputs ids and lm targets candidates ids candidates tokens input ids 1 remove the s token candidates att candidates tokens attention mask 1 remove the s token mask the lm targets with pad cand targets candidates ids clone cand targets cand targets masked fill cand targets self tokenizer pad token id 100 mask the targets for inputs part targets torch cat 100 torch ones num cand input seq len self config num bridge tokens dtype torch long device self device cand targets dim 1 concatenate the inputs for the model attention mask torch cat attention mask repeat interleave num cand dim 0 candidates att dim 1 full input ids torch cat input ids repeat interleave num cand dim 0 candidates ids dim 1 calculate the loss neg log likelihood for each candidate with torch inference mode outputs self model forward likelihood vision input vision input repeat interleave num cand dim 0 input ids full input ids input atts attention mask labels targets likelihood reduction likelihood reduction neg likelihood outputs select the one with the highest likelihood lowest loss output class ranks torch argsort neg likelihood dim 1 0 item return output class ranks step 4 implement the preprocessor preprocessors are used to formulate the structural information in order to get the correct form of dialogue our preprocessor is in path to reform eval utils preprocessors py python class convsinglechoiceprocessor object def init self sep sep2 none roles question answer system msg none first query fn none init conv none sep style two alphabet choice none infer method generation response prefix none preprocessors to convert input information into a dialogue string args sep str the text separator 1 sep2 str the text separator 2 roles list str role names of the dialogue roles 0 is the role of users while roles 1 is the name of assistants system msg str optional the system message that appears at the beginning first query fn function optional the function to process the first query mainly for adding img marks init conv list list str the initial conversation each element is a list str str where the first is the role name and the second is the message sep style str the dialogue style alphabet choice str optional the option mark used for multiple choice questions defaults to random infer method str optional the inference method generation or likelihood response prefix str optional the prefix text for the response of lvlm assistants we use the answer is to help with multiple choice questions returns output str the constructed dialogue text here is an example of the n separated preprocessor python proc convsinglechoiceprocessor n roles user bot first query fn lambda x image x sep style one infer method model args inference method response prefix the answer is system message a chat 
between a curious human and an artificial intelligence assistant the assistant gives helpful detailed and polite answers to the human s questions the input sample is a json style dict inputs sample id 287626 3 round id 3 image image path jpg question is there a cat in the image answer 2 answer options yes no maybe history from human value can you see the image options a yes b no from assistant value the answer is a yes therefore the final content will be a chat between a curious human and an artificial intelligence assistant the assistant gives helpful detailed and polite answers to the human s questions user image can you see the image options a yes b no n bot the answer is a yes n user is there a cat in the image options a yes b no c maybe n bot the answer is for other supported sep style please refer to path to reform eval utils preprocessors py init conv can also be used to add image marks if it is init conv user image this means that a new conversation will be started user image user bot step 5 add model loader implement the model loading function in path to reform eval models interfaces lynx interface py python def get lynx model config none model args map the general input arguments to the model specific arguments if model config is not none valid args model name device half inference method target args model config device half inference method for i arg in enumerate valid args if arg in model config model args target args i model config arg configure the dialogue preprocessor proc convsinglechoiceprocessor n roles user bot sep style one infer method model args inference method response prefix the answer is return lynx interface model args proc additionally you should add the following codes in path to reform eval models init py python elif model name lynx from interfaces lynx interface import get lynx return get lynx model config done finally you can use the following model arguments in the main entrance to evaluate your model bash model lynx model name models interfaces lynx configs lynx yaml if you have trouble incorporating new models into our framework please let us know through github issues or emails for more details about models and preprocessors please refer to prepare models models prepare models md prepare models evaluation our benchmark supports multi gpu evaluation if the half evaluation is set the evaluation can be run on a single machine within cuda memory of 24g on a single card for 7b models under limited equipment conditions demo we provide one example of running the benchmark test using lynx model for visdial evaluation bash cuda visible devices 0 1 2 3 4 5 6 7 torchrun nproc per node 8 run eval py model lynx model name models interfaces lynx configs lynx yaml dataset name visdial output dir output lynx visdial test generation per gpu eval batch size 4 formulation singlechoice infer method generation do eval half evaluation dataset duplication 1 in context sample option mark upper dataset config build configs visdial val v1 2 yaml the num of nproc per node must be equal to the num of cuda visible devices output dir is the path of output result formulation must be generation singlechoice ocropenended or kieopenended infer method must be generation or likelihood if you infer in generation mode you should use in context sample to assist models to generate option marks for most questions dataset config is the path of the dataset config file parameters all parameters used are listed below and you can modify any parameter to customize your evaluation settings python def 
main parser argparse argumentparser model related parameters parser add argument model type str default none help the model family name parser add argument model name type str default none help the model name to load parser add argument model type type str default none help the model type to set dataset related parameters parser add argument dataset name type str default none help the dataset name to evaluate on parser add argument formulation type str default none help the problem formulation to perform must be in generation singlechoice parser add argument dataset config type str default none help the config file path using the default path without explicit parser add argument dataset duplication type int default 1 help duplicate the sample for evaluating the stability parser add argument in context sample action store true help whether to provide in context learning samples parser add argument capitalize action store true help whether to capitalize the qa 0805 add parser add argument yesno instruct action store true help whether add please answer yes or no to the full instruct parser add argument answer space instruct action store true help whether add answer space to the full instruct running parameters parser add argument per gpu eval batch size type int default 1 help the batch size per gpu parser add argument num workers type int default 4 help workers in dataloader parser add argument half evaluation action store true help whether to use half precision for evluation general evaluation setup parser add argument do eval action store true help whether to evluate the output parser add argument eval stability action store true help whether to evaluate the stability parameters for model generation parser add argument temperature type float default none help the temperature for generation parser add argument max new tokens type int default none help max new tokens to generate parameters for likelihood measurement parser add argument likelihood reduction type str default none help the reduction method for likelihood measurement parameters for singlechoice problem parser add argument infer method type str default generation help the inference method to use must be in generation likelihood parser add argument option mark type str default none help the index mark for options in single shoice questions number for 1 2 3 4 lower for a b c d while upper for a b c d parameters for randomness control parser add argument random instruct action store true help whether to use random instructions parser add argument shuffle options action store true help whether to shuffle options parameters for multi round problem parser add argument options in history action store true help whether to put options in history parser add argument online multi round action store true help make online update to the history during dialog parser add argument multi round eval action store true help whether to evaluate multi round performance output setup parser add argument output dir type str default output help the path to save the output debug mode parser add argument dataset debug action store true help debug on the dataset setup parser add argument dataset subsample type int default none help only n sub samples of the dataset core parser add argument core eval action store true help only eval on the core datasets hugging face parser add argument hf action store true help whether to load the dataset directly from hugging face parser add argument offline hf action store true help whether to load the hugging face data from 
the local path args parser parse args model usage when running the evaluation these model related parameters must be applied for specific models some models require additional forward likelihood function please refer to likelihood based white box evaluation in create your own model interface create your own model interface we only list a few examples of blip 2 and instructblip here for the remaining models please refer to the complete model usage models complete model usage md complete model usage blip 2 instructblip bash blip 2 flant5 model blip2 model name blip2 t5 model type pretrain flant5xl instructblip flan t5 model blip2 model name blip2 t5 instruct model type flant5xl instructblip vicuna model blip2 model name blip2 vicuna instruct model type vicuna7b you also have to put bert base uncased and google flan t5 xl folders on the root directory of our repository reform eval bert base uncased google flan t5 xl build commands metrics models if you load blip2 t5 you need to add the predict class function in blip2 t5 py python def predict class self samples candidates n segments 1 if candidates is a list of lists each sample has its candidates then we need to iterate one by one if type candidates 0 list results for i in range samples image size 0 add support for different prompts for different samples this sample image samples image i unsqueeze 0 prompt samples prompt i if type samples prompt list else samples prompt if text input in samples keys this sample text input samples text input i if context in samples keys this sample context samples context i if history in samples keys this sample history samples history i if caption in samples keys this sample caption samples caption i this result self predict class this sample candidates i n segments results append this result try results torch cat results dim 0 except results res tolist 0 for res in results return results return self predict class samples candidates n segments def predict class self samples candidates n segments 1 args samples dict a dictionary containing the following keys image torch tensor a tensor of shape batch size 3 h w prompt the instruction candidates list a list of candidate class names n segments int split the candidates into n segments and predict one by one this is useful when the number of candidates is too large returns output class predicted class index image samples image prompt samples prompt bs image size 0 if isinstance prompt str prompt prompt bs else assert len prompt bs the number of prompts must be equal to the batch size if text input in samples keys if type samples text input 0 list prompt prompt i format samples text input i for i in range len prompt else prompt prompt i format samples text input i for i in range len prompt scienceqa if context in samples keys and samples context prompt f context samples context i prompt i for i in range len prompt visual dialog if history in samples keys and samples history 0 prompt f dialog history samples history i n prompt i for i in range len prompt if caption in samples keys and samples caption 0 prompt f this image has the caption samples caption i prompt i for i in range len prompt query tokens self query tokens expand bs 1 1 if image dim 5 inputs t5 atts t5 for j in range image size 2 this frame image j with self maybe autocast frame embeds self ln vision self visual encoder this frame frame atts torch ones frame embeds size 1 dtype torch long to image device frame query output self qformer bert query embeds query tokens encoder hidden states frame embeds 
encoder attention mask frame atts return dict true frame inputs t5 self t5 proj frame query output last hidden state query tokens size 1 frame atts t5 torch ones frame inputs t5 size 1 dtype torch long to image device inputs t5 append frame inputs t5 atts t5 append frame atts t5 inputs t5 torch cat inputs t5 dim 1 atts t5 torch cat atts t5 dim 1 else with self maybe autocast image embeds self ln vision self visual encoder image image atts torch ones image embeds size 1 dtype torch long to image device query output self qformer bert query embeds query tokens encoder hidden states image embeds encoder attention mask image atts return dict true inputs t5 self t5 proj query output last hidden state query tokens size 1 atts t5 torch ones inputs t5 size 1 dtype torch long to image device input tokens self t5 tokenizer prompt padding longest return tensors pt to image device output tokens self t5 tokenizer candidates padding longest return tensors pt to image device encoder atts torch cat atts t5 input tokens attention mask dim 1 n cands len candidates with self maybe autocast dtype torch bfloat16 inputs embeds self t5 model encoder embed tokens input tokens input ids inputs embeds torch cat inputs t5 inputs embeds dim 1 encoder outputs self t5 model encoder inputs embeds inputs embeds attention mask encoder atts all losses for n in range n segments seg len n cands n segments if n n segments 1 seg len n cands seg len n segments 1 this encoder outputs copy deepcopy encoder outputs this encoder outputs basemodeloutput last hidden state encoder outputs 0 clone this encoder outputs last hidden state this encoder outputs 0 repeat interleave seg len dim 0 this encoder atts encoder atts repeat interleave seg len dim 0 start i n n cands n segments end i start i seg len this output tokens ids output tokens input ids start i end i repeat bs 1 this output tokens atts output tokens attention mask start i end i repeat bs 1 this targets this output tokens ids masked fill this output tokens ids self t5 tokenizer pad token id 100 outputs self t5 model encoder outputs this encoder outputs attention mask this encoder atts decoder attention mask this output tokens atts return dict true labels this targets reduction none loss outputs loss loss loss reshape bs seg len output class ranks torch argsort loss dim 1 all losses append loss all losses torch cat all losses dim 1 output class ranks torch argsort all losses dim 1 return output class ranks then you should run the following command to implement the modification cd models lavis pip install e data usage for data related parameters we list required parameters of different tasks for comprehensive evaluation coarse grained perception coarse grained perception cg is the ability to recognize the overall layout and main objects at the image level flowers102 bash dataset name flowers102 formulation singlechoice dataset config build configs imageclassification flowers102 val yaml cifar10 bash dataset name cifar10 formulation singlechoice dataset config build configs imageclassification cifar10 val yaml imagenet 1k bash dataset name imagenet 1k formulation singlechoice dataset config build configs imageclassification imagenet1k val yaml pets37 bash dataset name pets37 formulation singlechoice dataset config build configs imageclassification pets37 val yaml vizwiz yesno bash dataset name vizwiz formulation singlechoice dataset config build configs imagequality vizwiz yesno val yaml vizwiz singlechoice bash dataset name vizwiz formulation singlechoice dataset config build 
/configs/imagequality_vizwiz_singlechoice_val.yaml

- **TDIUC (Sport):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_sport.yaml`
- **TDIUC (Scene):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_scene.yaml`
- **MEDIC:** `--dataset_name MEDIC --formulation SingleChoice --dataset_config build/configs/disastertype_val.yaml`

### Fine-Grained Perception

Fine-grained perception (FG) requires detailed sensing at the object level.

- **MSCOCO (MCI):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/multiclassidentification_val.yaml`
- **MSCOCO (GOI):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/groundedobjidentification_val.yaml`
- **MSCOCO (MOS):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/missingobjectselection_val.yaml`
- **TDIUC (Color):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_color.yaml`
- **TDIUC (Utility):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_utility.yaml`
- **TDIUC (Position):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_position.yaml`
- **TDIUC (Detection):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_detection.yaml`
- **TDIUC (Counting):** `--dataset_name TDIUC --formulation SingleChoice --dataset_config build/configs/tdiuc_counting.yaml`
- **RefCOCO:** `--dataset_name RefCOCO --formulation SingleChoice --dataset_config build/configs/referringexpression_val.yaml`
- **MSCOCO (OC):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/objectcounting_mscoco_val.yaml`

### Visually Grounded Reasoning

A reliable LVLM is supposed to perform reasoning based on multi-modal contextual information. In order to assess this capability, we adopt the commonly applied visual question answering (VQA) task and its variant, knowledge-based visual question answering (K-VQA), which further requires models to utilize internally stored knowledge.

- **VQA v2:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_vqa_v2_val.yaml`
- **GQA:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_gqa_val_v2.0.yaml`
- **Whoops:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_whoops_val.yaml`
- **OK-VQA:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_okvqa_val.yaml`
- **ScienceQA:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_scienceqa_val_v2.0.yaml`
- **VizWiz:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_vizwiz_val_v2.0.yaml`
- **ViQuAE:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_viquae_val.yaml`
- **K-ViQuAE:** `--dataset_name KVQA --formulation SingleChoice --dataset_config build/configs/kvqa_viquae_val.yaml`
- **A-OKVQA:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_aokvqa_val.yaml`
- **A-OKVQRA:** `--dataset_name VQRA --formulation SingleChoice --dataset_config build/configs/vqra_aokvqa_val.yaml`
- **A-OKVQAR:** `--dataset_name VQAR --formulation SingleChoice --dataset_config build/configs/vqar_aokvqa_val.yaml`
- **ImageNetVC:** `--dataset_name VQA --formulation SingleChoice --dataset_config build/configs/vqa_imagenetvc_val.yaml`

### Spatial Understanding

Spatial understanding is key to the real-life application of LVLMs on robots. This task requires a comprehensive understanding of both the object-object and object-observer relationships so as to make reasonable behaviors.

- **CLEVR:** `--dataset_name CLEVR --formulation SingleChoice --dataset_config build/configs/spatial_clevr_val.yaml`
- **VSR:** `--dataset_name VSR --formulation SingleChoice --dataset_config build/configs/spatial_vsr_val.yaml`
- **MP3D:** `--dataset_name MP3D --formulation SingleChoice --dataset_config build/configs/spatial_mp3d_val.yaml`

### Multi-Turn Dialogue

ReForm-Eval evaluates the performance of LVLMs in multi-turn dialogues.

- **VQA-MT:** `--dataset_name VisDial --formulation SingleChoice --dataset_config build/configs/vqa_vqa_multiround_val.yaml --online_multi_round --num_workers 0`
- **VisDial:** `--dataset_name VisDial --formulation SingleChoice --dataset_config build/configs/visdial_val_v1.2.yaml --online_multi_round --num_workers 0`

Please refer to the online multi-round dialogue section of build/prepare_dataset.md for the details of the setup of online multi-round dialogues.

### Cross-Modal Inference

We consider two tasks: image-text matching (ITM) requires models to measure the cross-modal similarities, and visual entailment (VE) demands that models check whether the information is entailed across modalities.

- **MSCOCO (ITM):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/imagetextmatching_val.yaml`
- **MSCOCO (ITS):** `--dataset_name MSCOCO --formulation SingleChoice --dataset_config build/configs/imagetextselection_val.yaml`
- **WikiHow:** `--dataset_name WikiHow --formulation SingleChoice --dataset_config build/configs/temporalordering_val.yaml`
- **Winoground:** `--dataset_name CaptionSelection --formulation SingleChoice --dataset_config build/configs/captionselection_winoground_val.yaml`
- **SNLI-VE:** `--dataset_name SNLI_VE --formulation SingleChoice --dataset_config build/configs/visualentailment_val.yaml`
- **MOCHEG:** `--dataset_name MCV --formulation SingleChoice --dataset_config build/configs/mcv_mocheg_val.yaml`

### Scene Text Perception

Scene text perception enables LVLMs to identify, understand, and perform inference based on text in images.

- **Grounded IC15:** `--dataset_name IC15 --formulation OCROpenEnded --dataset_config build/configs/groundocr_ic15_val.yaml`
- **IC15:** `--dataset_name IC15 --formulation OCROpenEnded --dataset_config build/configs/ocr_ic15_val.yaml`
- **Grounded COCO-Text:** `--dataset_name COCO_Text --formulation OCROpenEnded --dataset_config build/configs/groundocr_cocotext_val.yaml`
- **COCO-Text:** `--dataset_name COCO_Text --formulation OCROpenEnded --dataset_config build/configs/ocr_cocotext_val.yaml`
- **Grounded TextOCR:** `--dataset_name TextOCR --formulation OCROpenEnded --dataset_config build/configs/groundocr_textocr_val.yaml`
- **TextOCR:** `--dataset_name TextOCR --formulation OCROpenEnded --dataset_config build/configs/ocr_textocr_val.yaml`
- **CUTE80:** `--dataset_name CUTE80 --formulation OCROpenEnded --dataset_config build/configs/ocr_cute80_val.yaml`
- **IIIT5K:** `--dataset_name IIIT5K --formulation OCROpenEnded --dataset_config build/configs/ocr_iiit5k_val.yaml`
- **WordArt:** `--dataset_name WordArt --formulation OCROpenEnded --dataset_config build/configs/ocr_wordart_val.yaml`
- **FUNSD:** `--dataset_name FUNSD --formulation KIEOpenEnded --dataset_config build/configs/kie_funsd_val.yaml`
- **POIE:** `--dataset_name POIE --formulation OCROpenEnded --dataset_config build/configs/kie_poie_val.yaml`
- **SROIE:** `--dataset_name SROIE --formulation OCROpenEnded --dataset_config build/configs/kie_sroie_val.yaml`
- **TextVQA:** `--dataset_name OCR --formulation OCROpenEnded --dataset_config build/configs/ocr_textvqa_val.yaml`
- **DocVQA:** `--dataset_name OCR --formulation OCROpenEnded --dataset_config build/configs/ocr_docvqa_val.yaml`
- **OCR-VQA:** `--dataset_name OCR --formulation OCROpenEnded --dataset_config build/configs/ocr_ocrvqa_val.yaml`

### Visual Description

Visual description is an inherent capability of LVLMs as generative models.

- **MSCOCO:** `--dataset_name MSCOCO --formulation Generation --dataset_config build/configs/caption_mscoco_val.yaml`
- **TextCaps:** `--dataset_name TextCaps --formulation Generation --dataset_config build/configs/caption_textcaps_val.yaml`
- **NoCaps:** `--dataset_name NoCaps --formulation Generation --dataset_config build/configs/caption_nocaps_val.yaml`
- **Flickr30K:** `--dataset_name Flickr30K --formulation Generation --dataset_config build/configs/caption_flickr30k_val.yaml`

## Output Result

The output JSON file is generated under your `output_dir` path, and you can directly look up the corresponding JSON file for the final result. You can also run the following commands with IPython in the terminal:

```python
import json
res = json.load(open('/path/to/your/prediction_file.json'))  # load the output JSON file
res[0]
res[n]  # n can be any number within the generated results
```

## Citation

If ReForm-Eval has been beneficial to your research and work, please cite our work using the following format:

```latex
@misc{li2023reformeval,
      title={ReForm-Eval: Evaluating Large Vision-Language Models via Unified Re-Formulation of Task-Oriented Benchmarks},
      author={Zejun Li and Ye Wang and Mengfei Du and Qingwen Liu and Binhao Wu and Jiwen Zhang and Chengxing Zhou and Zhihao Fan and Jie Fu and Jingjing Chen and Xuanjing Huang and Zhongyu Wei},
      year={2023},
      eprint={2310.02569},
      archivePrefix={arXiv},
      primaryClass={cs.CV}
}
```

## Acknowledgements

We thank [MME](https://github.com/BradyFU/Awesome-Multimodal-Large-Language-Models/tree/Evaluation), [MMBench](https://github.com/open-compass/MMBench), [LVLM-eHub](http://lvlm-ehub.opengvlab.com/index.html), [M3IT](https://huggingface.co/datasets/MMInstruction/M3IT), and other repositories that have made great contributions to multi-modal large model evaluation. In addition, we are very grateful that many LVLMs are open-sourced and can participate in our evaluation, enriching the results of our benchmark.

## Related Projects

- [MME: A Comprehensive Evaluation Benchmark for Multimodal Large Language Models](https://github.com/BradyFU/Awesome-Multimodal-Large-Language-Models/tree/Evaluation)
- [MMBench: Is Your Multi-modal Model an All-around Player?](https://github.com/open-compass/MMBench)
- [LVLM-eHub: A Comprehensive Evaluation Benchmark for Large Vision-Language Models](http://lvlm-ehub.opengvlab.com/index.html)
- [M3IT: A Large-Scale Dataset towards Multi-Modal Multilingual Instruction Tuning](https://huggingface.co/datasets/MMInstruction/M3IT)
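As a companion to the instability-aware evaluation mentioned in the introduction, here is one simple way to quantify instability: the entropy of a model's choices across duplicated, shuffled versions of the same multiple-choice question. This is an illustrative stand-in, not necessarily the exact metric used in the paper.

```python
from collections import Counter
import math

def choice_entropy(predictions):
    """Entropy (bits) of the option indices a model picked across repeated
    tests of one question; 0.0 means the model was perfectly stable."""
    counts = Counter(predictions)
    total = len(predictions)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# e.g. 5 duplicated tests of one question under shuffled options:
print(choice_entropy([0, 0, 2, 0, 1]))  # ~1.37 bits -> unstable
print(choice_entropy([1, 1, 1, 1, 1]))  # 0.0 -> stable
```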
gpt4 instruction-tuning large-language-models llm multimodal pre-training large-vision-language-models benchmark embodied-ai in-context-learning instruction-following multimodal-chain-of-thought visual-chain-of-thought reformulation
ai
Face_Emotion_Recognition
# Face Emotion Recognition

**Dataset link:** https://www.kaggle.com/deadskull7/fer2013

The dataset contains 35,887 images of people showing 7 unique emotions: anger, happy, disgust, sad, surprise, neutral, and fear.

## Steps for running on a local computer

Step 1: Install all the dependencies mentioned in the `requirements.txt` file. You can run the following command in your command prompt to install all the dependencies:

```
pip install -r requirements.txt
```

Step 2: After installing all the dependencies, open a command prompt in the directory where the `app.py` file is present and run the following command:

```
python app.py
```

That's it! You can see the web application running at your localhost. A rough sketch of the inference step is included at the end of this README.

## Live video of face emotion recognition
(GIF: https://github.com/venugopalkadamba/Face_Emotion_Recognition/blob/master/live_video.gif)

## Emotions detected in an image
(Image: https://github.com/venugopalkadamba/Face_Emotion_Recognition/blob/master/all_emotions_detection.jpg)

## Model architecture
(Image: https://github.com/venugopalkadamba/Face_Emotion_Recognition/blob/master/model.png)

**Please star this repo if you liked my work!**
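For readers who want a rough picture of what happens inside an app like this, here is a minimal sketch of face-emotion inference with Keras and OpenCV. The weights file name (`model.h5`), the Haar-cascade choice, and the label order are assumptions for illustration and may differ from this repository's actual code.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Assumed label order; check the training script for the real one
EMOTIONS = ["anger", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

model = load_model("model.h5")  # hypothetical weights file name
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def predict_emotions(frame_bgr):
    """Detect faces in a BGR frame and return (bounding box, emotion label) pairs."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # FER2013 images are 48x48
        face = face.astype("float32")[None, :, :, None] / 255.0  # add batch and channel dims
        probs = model.predict(face, verbose=0)[0]
        results.append(((x, y, w, h), EMOTIONS[int(np.argmax(probs))]))
    return results
```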
face-emotion-recognition cnn deep-learning flask keras
ai
Ros-Robot
rosrobot ros
front_end
Py-NLP-applied-to-Event-Driven-Investing-Profit-Warning-Prediction
# NLP Applied to Event-Driven Investing: Profit Warning Prediction

Applying NLP (natural language processing) techniques to predict profit warnings using conference-call transcripts.

## Data

Scrapy and Selenium were used to scrape conference-call transcript data. The two key websites used to obtain the data were (i) RTT News (www.rttnews.com) for data about US profit warnings and (ii) Seeking Alpha (seekingalpha.com) for publicly available conference-call transcripts. In total, 93 conference-call transcripts were scraped, containing 42 transcripts from future profit-warning stocks and 51 transcripts from healthy stocks.

## Coverage

Considered only US companies within the industrial sector, mainly capital-goods companies, for the period 4Q16 to 1Q17.

## Feature engineering

More than 90 features were created from the whole conference-call transcripts and their MD&A and Q&A parts. The Python NLP-related libraries used in this analysis were textstat, NLTK, VADER, pysentiment, spaCy, and gensim. Several NLP dimensions were measured in order to generate reliable and significant predictor categories, related to: text physical properties (size, number of words, number of syllables, etc.); text complexity (readability indices like the SMOG index, padding, lexicon complexity, number of difficult words, the Brown dictionary); and semantic and syntactic sentiment indices. A small worked example is included at the end of this README.

## Notes

Read the attached PowerPoint or the NYCDSA blog post in order to get familiar with the data and the different sources of information: https://nycdatascience.com/blog/student-works/predicting-profit-warnings-nlp-applied-conference-call-transcripts-analysis

Different users warned me, as of 2018, about problems scraping Seeking Alpha's website using my Scrapy crawler program. Seeking Alpha has apparently changed its website since I last crawled the data in 2017. If you need to obtain more conference-call data from SA's website, I recommend you tweak my crawler, and also check some good tips in https://stackoverflow.com/questions/48756326/web-scraping-results-in-403-forbidden-error. The best advice is to try to obtain the conference-call transcripts, if possible, from a subscription-based data provider (FactSet, Bloomberg, etc.) in a user-friendly format, in order to save yourself the web-scraping pain.

## Pending future work

Future work in this project will be focused on:
- inclusion of other sectors and industries
- extension of the time span to at least five years
- tagging and standalone analysis of the different management team members (CEO, CFO, COO, etc.) and analysts
- addition of more complex modeling methods, such as neural networks
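As an illustration of the kind of features described above, the sketch below computes a few readability and sentiment predictors for one transcript using textstat and NLTK's VADER, two of the libraries named in this project; the actual feature set here is much broader.

```python
import textstat
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

def transcript_features(text):
    """Return a small dict of readability and sentiment features for a transcript."""
    sentiment = SentimentIntensityAnalyzer().polarity_scores(text)
    return {
        "n_words": textstat.lexicon_count(text),             # text physical properties
        "n_syllables": textstat.syllable_count(text),
        "smog_index": textstat.smog_index(text),              # readability
        "difficult_words": textstat.difficult_words(text),    # lexicon complexity
        "vader_compound": sentiment["compound"],              # overall sentiment
    }
```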
ai
sharc-reusable-components
# SAM Reusable Modules

This repository contains a collection of code packaged into reusable components, designed to facilitate easy deployment into demo or production systems. Unless otherwise specified, all of these components work in both FreeRTOS and main-loop applications. Some components are derived from open-source (OSS) projects and have BSD, MIT, or similar commercial-friendly licenses.

## 3rd-party
This directory contains 3rd-party proprietary packages which, in most cases, have been modified to ease integration.

## adi-a2b
This directory contains the A2B stack. The stack is a cross-platform, scalable A2B network discovery and life-cycle management environment designed to run on a wide array of system processors and architectures.

## adi-drivers
Components in this directory are derived from existing ADI device drivers or system services.

## oss-services
Components in this directory are derived from a variety of OSS projects and adapted, where appropriate, to work here.

## simple-apps
This directory contains a collection of make-based starter applications.

## simple-drivers
Components in this directory are part of the simple driver collection of device drivers, designed to provide easy integration, reasonable flexibility, and high performance.

## simple-services
This directory contains middleware components not derived from open-source projects.

## License and notice files
For detailed component license information, please refer to the LICENSE and NOTICE files located throughout this repository.
os
design-system-sketch
# Priceline One Sketch

A Sketch library of components and styles designed to be the single source of truth for user-interface standards. Check out our [React components](https://github.com/pricelinelabs/design-system) to start building and to learn more about the Priceline One design system.

## Install Priceline One

- Download the [latest release](https://github.com/pricelinelabs/design-system-sketch/releases)
- In Sketch, open Preferences and navigate to Libraries
- Import the `PricelineOne.sketch` and `PricelineOneIcons.sketch` files
- Optionally, import the `PricelineOneEmail.sketch` file for access to email-specific components

## Recommended plugins

- [Relabel Button](https://github.com/kenmoore/sketch-relabel-button)
- [Shared Text Styles](https://github.com/nilshoenson/shared-text-styles)
- [Sketch Palettes](https://github.com/andrewfiorillo/sketch-palettes)
design-system sketch
os
World-War-II-Weather-Dashboard
# Visualizing Weapons, Weather, and Aircraft Data from World War II

Live app: https://wwii.herokuapp.com

"Every day, memories of World War II, its sights and sounds, its terrors and triumphs, disappear." (www.nationalww2museum.org)

## Background

![Intro](images/1.png)

Welcome to the World War II visualizations dashboard, where you will be able to dive deeper into the historical events of this war occurring between 1942 and 1945. The goal of our visualizations is to provide the user with a comprehensive overview of major bombing events, where and when they occurred, and how weather conditions may have affected these missions.

![Aircraft](images/2.png)

## Visualizations

The first visual is an interactive map that allows the user to explore data we found on WWII aerial bombing events that occurred between 1942 and 1945 within all theaters. This tells a visual story of the major bombing events that took place during WWII and how these may have influenced the overall trajectory of the war.

Our second visual is an interactive table chart that shows the detailed stats of the data we found, including the aircraft used to transport the bombs, weapons, locations, etc., and gives the user the opportunity to see the war unfold in chronological order, with the option to select and interact with specific dates in history. (A small data-preparation sketch follows at the end of this README.)

## Datasets used

![Bombing](images/3.png)

- https://www.kaggle.com/smid80/weatherww2
- https://data.world/datamil/world-war-ii-thor-data

Andrew Zamora, Vanessa Simpson, Ibet Hernandez

UTSA Trilogy Data Analytics Bootcamp, 2020
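For a sense of how a chronological view like the one above can be prepared, here is a minimal pandas sketch that aggregates THOR bombing records by month. The file name and the column names ("Mission Date", "Theater of Operations") are assumptions based on the public THOR data and may need adjusting to the actual export.

```python
import pandas as pd

# Hypothetical CSV export of the data.world THOR WWII dataset
df = pd.read_csv("thor_wwii.csv", parse_dates=["Mission Date"])

# Monthly mission counts per theater, 1942-1945: the shape of data behind a timeline chart
monthly = (
    df[df["Mission Date"].between("1942-01-01", "1945-12-31")]
    .groupby([pd.Grouper(key="Mission Date", freq="MS"), "Theater of Operations"])
    .size()
    .rename("missions")
    .reset_index()
)
print(monthly.head())
```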
server
azure-iot-preview
# Azure RTOS SDK for Azure IoT

**The Azure RTOS SDK for Azure IoT is now officially released.** The Azure RTOS SDK is now part of [NetX Duo](https://github.com/azure-rtos/netxduo), version 6.1 or above. The Azure IoT module is in the NetX Duo addons under [azure_iot](https://github.com/azure-rtos/netxduo/tree/master/addons/azure_iot), and the NX Cloud module is in [addons/cloud](https://github.com/azure-rtos/netxduo/tree/master/addons/cloud). **This repository is now archived and is no longer maintained or monitored.**

This repository contains the SDK for Azure IoT services. The SDK uses [ThreadX](https://github.com/azure-rtos/threadx) and [NetX Duo](https://github.com/azure-rtos/netxduo) to connect to Azure IoT.

## Documentation

Documentation for this library can be found here: docs/azure_rtos_iot_sdk_api.md

## Key Features

(✔️ feature available; ✔️* feature partially available, see description for details; ✖️ feature planned but not supported)

| Feature | Azure RTOS SDK for Azure IoT services | Description |
| --- | --- | --- |
| [Send device-to-cloud message](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-d2c) | ✔️ | Send device-to-cloud messages to IoT Hub, with the option to add custom message properties. |
| [Receive cloud-to-device messages](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-messages-c2d) | ✔️ | Receive cloud-to-device messages and associated properties from IoT Hub. |
| [Device Twins](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-device-twins) | ✔️ | IoT Hub persists a device twin for each device that you connect to IoT Hub. The device can perform operations like getting the twin document and subscribing to desired-property updates. |
| [Direct Methods](https://docs.microsoft.com/azure/iot-hub/iot-hub-devguide-direct-methods) | ✔️ | IoT Hub gives you the ability to invoke direct methods on devices from the cloud. |
| [DPS (Device Provisioning Service)](https://docs.microsoft.com/azure/iot-dps) | ✔️ | This SDK supports connecting your device to the Device Provisioning Service via, for example, [individual enrollment](https://docs.microsoft.com/azure/iot-dps/concepts-service#enrollment) using an [X.509 leaf certificate](https://docs.microsoft.com/azure/iot-dps/concepts-security#leaf-certificate). |
| Protocol | MQTT | The Azure RTOS SDK for Azure IoT services supports only MQTT. |
| [IoT Plug and Play](https://docs.microsoft.com/azure/iot-pnp/overview-iot-plug-and-play) | ✔️ | IoT Plug and Play Preview enables solution developers to integrate devices with their solutions without writing any embedded code. |
| [ASC for IoT](https://docs.microsoft.com/azure/asc-for-iot) | ✔️ | The Azure Security Center for IoT security module provides a comprehensive security solution for Azure RTOS devices. |

## Building and using the library

### Prerequisites

Install the following tools:

- [CMake](https://cmake.org/download/) version 3.13.0 or later
- [GCC compilers for arm-none-eabi](https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads)
- [Ninja](https://ninja-build.org/)

### Cloning the repo

```bash
git clone https://github.com/azure-rtos/azure-iot-preview.git
git submodule update --init
```

### Building the sample

```bash
cd samples
cmake -GNinja -Bbuild -DCMAKE_TOOLCHAIN_FILE=cmake/cortex_m4.cmake
cmake --build build
```

## Repository structure and usage

### Branches & releases

The master branch has the most recent code with all new features and bug fixes. It does not represent the latest general availability (GA) release of the library. Each official release (preview or GA) will be tagged to mark the commit and published to the GitHub Releases tab, e.g. `v6.0-beta1`.

### Directory layout

`azure_iot` (with `docs` and `nx_cloud`), and `samples` (with `cmake`, `lib` containing `netxduo` and `threadx`, `ports/cortex_m4/gnu`, `ports/cortex_m7/gnu`, and `sample_azure_iot_embedded_sdk`).

### Sample projects

Sample project zip files can be downloaded from the [releases](https://github.com/azure-rtos/azure-iot-preview/releases) associated with this repository. Note: these zip files are completely self-contained and include appropriate code from the other Azure RTOS repositories. Please refer to the LICENSE.txt file in each zip file for licensing requirements.

## Security

Azure RTOS provides OEMs with components to secure communication and to create code and data isolation using underlying MCU/MPU hardware protection mechanisms. It is ultimately the responsibility of the device builder to ensure the device fully meets the evolving security requirements associated with its specific use case. In the meanwhile, with built-in support of [Azure Security Center for IoT](https://docs.microsoft.com/azure/asc-for-iot/iot-security-azure-rtos), you can detect malicious network activities and create a baseline with the predefined customer alert rules.

## Licensing

License terms for using Azure RTOS are defined in the LICENSE.txt file of this repo. Please refer to this file for all definitive licensing information. No additional license fees are required for deploying Azure RTOS on hardware defined in the LICENSED-HARDWARE.txt file. If you are using hardware not defined in the LICENSED-HARDWARE.txt file, or have licensing questions in general, please contact Microsoft directly at https://azure-rtos.ms.iot.contact.com

## Contribution, feedback, issues, and professional support

If you encounter any bugs, have suggestions for new features, or if you would like to become an active contributor to this project, please follow the instructions provided in the contribution guideline for the corresponding repo. For basic support, click Issues in the command bar or post a question to Stack Overflow using the `threadx` and `azure-rtos` tags. [Professional support plans](https://azure.microsoft.com/en-us/support/options/) are available from Microsoft.

## Additional resources

The following are references to additional Azure RTOS and Azure IoT resources in general:

- [Azure RTOS website](https://azure.microsoft.com/en-us/services/rtos/)
- Azure RTOS sales questions: https://azure-rtos.ms.iot.contact.com
- [Microsoft Q&A for Azure IoT](https://docs.microsoft.com/en-us/answers/products/azure?product=iot)
- Internet of Things Show: aka.ms/iotshow
- IoT Tech Community: aka.ms/iottechcommunity
os
Blog
# Blog

## Getting Started

These instructions will get you a copy of the project up and running on your local machine for development and testing purposes.

### Prerequisites

What you need to install and how to install it:

* Node v12.x.x
* PostgreSQL 11.x

### Environment variables used in the application

| Variable | Description | Example |
| --- | --- | --- |
| SERVER_PORT | Set this variable to run your app on this port. | 3002 |
| DB_CONNECTION_STRING | Used to connect to the database. | dialect://username:password@host:port/database_name |

### Installing

A step-by-step series to get a development environment running:

```bash
cd server
npm ci
```

### Installing and running via docker-compose

A step-by-step series to run the application via docker-compose. Use the following link to set up docker-compose on your operating system: [docker-compose installation commands](https://docs.docker.com/compose/install/).

Run the following command to build a Docker image for the application:

```bash
#!/bin/bash
cd ops/local
sudo chmod a+x build.sh
./build.sh
```

Use the following commands to check, start, and stop Docker containers and images on your local machine:

```bash
docker ps -aq                        # list all containers (IDs only)
docker stop $(docker ps -aq)         # stop all running containers
docker stop <container_id>           # stop a single running container
docker rm $(docker ps -aq)           # remove all containers
docker rm <container_id>             # remove a single stopped container
docker rmi $(docker images -q)       # remove all images
docker rmi <image_id>                # remove a single Docker image
docker-compose ps                    # check running docker-compose services
docker-compose ps -q <service_name>  # check the status of a particular service
docker-compose kill                  # kill running docker-compose services
```

### Database migrations

Create the database:

```bash
node_modules/.bin/sequelize db:create --url 'dialect://username:password@host:port/database_name'
```

| Keyword | Example | Description |
| --- | --- | --- |
| dialect | postgres | Database dialect we are using |
| username | root | Username for the database |
| password | postgres | Password for the database |
| host | localhost IP | Host on which the database is running |
| port | 5432 | Port for the database |
| database_name | sample_database | Database name for the microservice |

Create a migration:

```bash
node_modules/.bin/sequelize migration:create --name <migration_name>
```

Run migrations (the URL takes the same keywords as above, with database_name naming the database the migrations will run against):

```bash
node_modules/.bin/sequelize db:migrate --url 'dialect://username:password@host:port/database_name'
```

### Run the server

For development:

```bash
npm run start:dev
```

For production:

```bash
npm run start
```

### Run the test cases

```bash
npm run test
```

For validating modules:

```bash
npm run test:nsp
```

For linting:

```bash
npm run test:lint
```
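Putting the steps above together, a minimal local bring-up might look like the following. This is a sketch: the environment variable names (`SERVER_PORT`, `DB_CONNECTION_STRING`) and the concrete connection values are illustrative assumptions, not confirmed by the project.

```bash
# Hypothetical end-to-end local setup (variable names and values are examples).
export SERVER_PORT=3002
export DB_CONNECTION_STRING='postgres://root:postgres@localhost:5432/sample_database'

cd server
npm ci

# Create the database and apply migrations using the same connection string.
node_modules/.bin/sequelize db:create --url "$DB_CONNECTION_STRING"
node_modules/.bin/sequelize db:migrate --url "$DB_CONNECTION_STRING"

# Start the server in development mode.
npm run start:dev
```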
server
machine-learning-deployment
# Machine Learning Deployment

Tutorials on launching machine learning models into production using Flask, Docker, etc.

## Citing

If you find this code useful in your research, please consider citing the blog:

```
@misc{sagardeploy,
  author  = {Abhinav Sagar},
  title   = {How to Easily Deploy Machine Learning Models Using Flask},
  year    = {2019},
  journal = {Towards Data Science}
}
```

## 1. Predict Sales

Check out the corresponding Medium blog post: [How to Easily Deploy Machine Learning Models Using Flask](https://towardsdatascience.com/how-to-easily-deploy-machine-learning-models-using-flask-b95af8fe34d4).

### Environment and tools

1. scikit-learn
2. pandas
3. numpy
4. flask

### Installation

```bash
pip install scikit-learn pandas numpy flask
python model.py
python app.py
```

![logo](i1.png)

## 2. Predict House Prices

Download the dataset from [here](https://www.kaggle.com/shivachandel/kc-house-data).

### Environment and tools

1. scikit-learn
2. pandas
3. numpy
4. flask
5. docker

### Installation

```bash
docker-compose up --build
```

```bash
curl -X POST -H 'Content-Type: application/json' -d @to_predict.json http://localhost:8080/predict_price
```

where `to_predict.json` contains:

```json
{
  "grade": 9.0,
  "lat": 37.45,
  "long": 12.09,
  "sqft_living": 1470.08,
  "waterfront": 0.0,
  "yr_built": 2008.0
}
```

or:

```bash
curl -X POST -H 'Content-Type: application/json' \
  -d '{"grade": 9.0, "lat": 37.45, "long": 12.09, "sqft_living": 1470.08, "waterfront": 0.0, "yr_built": 2008.0}' \
  http://localhost:8080/predict_price
```

Output:

```json
{"predict_cost": 1022545.34768284}
```

## License

MIT License

Copyright (c) 2019 Abhinav Sagar

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
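For the first tutorial, a quick smoke test after starting the app might look like this. This is a sketch: Flask's default port 5000 is assumed, and the `/predict` endpoint and form fields below are illustrative guesses, not taken from the repo.

```bash
# Hypothetical smoke test for the Predict Sales app (part 1).
# Assumes app.py serves on Flask's default port 5000; the endpoint
# and form fields are illustrative, not confirmed by the repo.
python model.py     # train and serialize the model
python app.py &     # start the Flask server in the background
sleep 2
curl -X POST -F 'rate=4.0' -F 'sales_in_first_month=200' \
  -F 'sales_in_second_month=400' http://localhost:5000/predict
```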
flask machine-learning machine-learning-deploy predictive-modeling predictive-analytics linear-regression docker docker-deployment deployment machine-learning-algorithms machine-learning-models flask-deploy
ai
Resilient-Response
# Resilient Response

Resilient Response is a web application developed to foster connectivity and provide vital information during periods of crisis. It was conceived as part of the 2023 Solution Challenge, which aligns with the overarching mission of addressing the United Nations' 17 Sustainable Development Goals by leveraging Google technology. The Resilient Response web app lets users register and access chat interfaces to stay informed about events in their geographic area. In addition to real-time weather updates, the application also displays the locations of available shelters during emergencies.

<img src="public/firebase.png" width="175"> <img src="public/tensorflow-ar21.png" width="175">

![banner](public/banner.png)

[Our video presentation](https://www.youtube.com/watch?v=a0wbma13rkm) &emsp; [Live demo](https://resilient-response.vercel.app)

Our team is excited to announce the development of a cutting-edge web application, for both mobile and desktop, for the GDSC Solution Challenge 2023. With a mission to empower communities to withstand and recover from natural disasters, our app focuses on building resilience through innovative technology. We believe that by harnessing the power of technology we can build stronger, more resilient communities and help protect our planet for future generations. It helps achieve United Nations Sustainable Development Goals 1, 11, and 13:

* Target 1.5: [Build resilience to environmental, economic and social disasters](https://sdg-tracker.org/no-poverty)
* Target 11.5: [Reduce the adverse effects of natural disasters](https://sdg-tracker.org/cities)
* Target 13.1: [Strengthen resilience and adaptive capacity to climate-related disasters](https://sdg-tracker.org/climate-change)

## Google technologies used

* **TensorFlow** <img src="public/tensorflow-logo.png" width="15">: @tensorflow/tfjs, @tensorflow-models/handpose
* **Google Firebase** <img src="public/firebase-logo.png" width="18">: Realtime Database, Authentication, Cloud Messaging, Cloud Functions

## Other libraries

* LeafletJS <img src="public/leaflet-logo.png" width="15">
* MapTiler <img src="public/maptiler-logo.png" width="15">

## Application features

* Keeps track of who has signed up, so they can be contacted in case of an emergency
* Sharing of information related to disasters and emergency preparedness
* Enables users to organize and collaborate with each other during an emergency
* Provides a map of nearby emergency shelters that users can go to during a disaster
* Donate disaster gift boxes to those in need during and after a disaster
* Provides a list of emergency contacts that users can call or message during an emergency
* Provides real-time weather updates and alerts
* Users can use computer vision technology to detect hand gestures and poses

## Screenshots

* [Main page](https://i.ibb.co/ndm7hbt/homepage.jpg)
* [Signup page](https://i.ibb.co/wbqd9bk/signup-page.jpg)
* [Chat login](https://i.ibb.co/0slmw0d/chat-login.jpg)
* [Community chat](https://i.ibb.co/f59twh7/community-chat.jpg)
* [Donations page](https://i.ibb.co/k44zgyx/donation-page.jpg)
* [Shelter locations](https://i.ibb.co/wzbv5sd/shelter-address.jpg)
* [Real-time weather](https://i.ibb.co/d60km3z/real-time-weather.jpg)
* [Hand detector page](https://i.ibb.co/5mbz8db/hand-detector.jpg)

## Let's start building

If you wish to build this project, you should create your own Firebase project and use your own `firebase.json`. The project should be on the Blaze plan and have a Realtime Database, Cloud Functions, Storage, and Authentication enabled.

<img src="public/building.gif" width="300" align="right">

In the project directory, install the dependencies and libraries:

```bash
npm install --legacy-peer-deps
```

After that, run the project:

```bash
npm start
```

This runs the app in development mode. Open http://localhost:3000 to view it in your browser. The page will reload when you make changes; you may also see any lint errors in the console.

```bash
npm test
```

Launches the test runner in interactive watch mode. See the section about running tests for more information.

```bash
npm run build
```

## Authors / Contact us

* N Divij — [GitHub](https://github.com/n-45div) · ndivij2004@gmail.com
* Vivek Yadav — [GitHub](https://github.com/enpvivek) · enpvivek@gmail.com
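For convenience, the steps above can be condensed into one sequence. This is a sketch: the clone URL and the CRA-style `REACT_APP_FIREBASE_API_KEY` variable are illustrative assumptions, since the project's actual Firebase wiring is the config described above.

```bash
# Hypothetical local setup for Resilient Response.
git clone https://github.com/n-45div/Resilient-Response.git   # assumed repo URL
cd Resilient-Response
npm install --legacy-peer-deps

# Example only: if the app reads Firebase credentials from CRA-style
# environment variables, they would be exported like this.
export REACT_APP_FIREBASE_API_KEY='<your-api-key>'

npm start   # serves the app at http://localhost:3000
```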
firebase react open-source machine-learning dashboard hacktoberfest chat tensorflowjs
front_end
EmbeddedSystem_MediaCenter
# EmbeddedMediaCenter

An embedded system design for the MCB1700 development board. The media center was created using the C programming language and the Keil µVision IDE. Its features include a main menu, a photo gallery, a music player, an original video game, and a paint application. Also see the [IEEE report](https://drive.google.com/file/d/1t9bg8mxxofsusiagq_q_4v1gezmk38gx/view?usp=sharing).
os
COMP9900-LinkTime
# COMP9900-LinkTime

20T1 Accommodation Booking Web Portal.

## 1. Install the frontend environment and run the frontend

If Node or npm is not installed on your computer, please check [this website](https://nodejs.org/en/) for Node installation. After installing the Node environment, open your command console and execute the following commands in the folder where the project is located to start the frontend:

```bash
cd frontend
yarn install
yarn start
```

Open your browser and visit [http://localhost:3000](http://localhost:3000); you will see the homepage of this project.

## 2. Install the backend environment and run the backend

You can create a virtual env with conda (recommended). If conda is not installed on your computer, please check [this website](https://docs.conda.io/projects/conda/en/latest/user-guide/install/) for conda installation. After installing conda, execute the following commands to create and activate a virtual environment:

```bash
conda create -n comp9900 python=3.7
conda activate comp9900
```

This method creates a space in which the backend can run without clashing with any other Python packages or issues on your local machine. If you don't care, you can run the backend in the global space like this:

```bash
cd backend
pip3 install -r requirements.txt
python3 run.py
```

Open your browser and visit [http://localhost:5000](http://localhost:5000); you will see the backend docs of this project.

If you want to exit the virtual environment, execute:

```bash
conda deactivate
```

To make sure everything is working correctly, we strongly suggest you read the instructions in both backend and frontend and try to start both servers.

### Frontend and backend quick start

You can log in with the following user info:

1. username: link, password: 123123
2. username: lin, password: 123123

## 3. Source code navigation

You can view the frontend source code in an editor like Sublime or VS Code:

```
src/           the main code folder
  common/      header and footer components
  pages/       all frontend pages
  redux/       redux data warehouse
  style/       default css styles
  utils/       helper functions
  App.js       route settings for the website
  Admin.js     combines header, page contents, and footer
  index.js     the main entry point
```

You can view the backend source code in an editor like Sublime or VS Code:

```
apis/             all API code
db/               initialization code and data
utils/            API models and helper functions
uploads/          all uploaded pictures
requirements.txt  relevant Python packages
run.py            the entry point for the backend
```
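Putting sections 1 and 2 together, both servers can be brought up in one pass. This sketch only recombines the commands above (backend on port 5000, frontend on port 3000):

```bash
# Combined local bring-up for LinkTime.
# Backend, inside the conda environment:
conda activate comp9900
cd backend
pip3 install -r requirements.txt
python3 run.py &       # serves the API docs at http://localhost:5000

# Frontend, once the backend is up:
cd ../frontend
yarn install
yarn start             # opens http://localhost:3000
```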
front-end react react-redux antd backend flask sqlalchemy
server