names: string (length 1 to 98)
readmes: string (length 8 to 608k)
topics: string (length 0 to 442)
labels: string (6 classes)
Quora_question_pairs_NLP_Kaggle
duplicate question detection using word2vec xgboost and autoencoders https i imgur com 2ylty1y jpg in this post i tackle the problem of classifying question pairs based on whether they are duplicate or not duplicate this is important for companies like quora or stack overflow where multiple questions posted are duplicates of questions already answered if a duplicate question is spotted by an algorithm the user can be directed to it and reach the answer faster an example of two duplicate questions is how do i read and find my youtube comments and how can i see all my youtube comments and an example of non duplicate questions is what s causing someone to be jealous and what can i do to avoid being jealous of someone two approaches are applied to this problem 1 sequence encoder trained by auto encoder approach and dynamic pooling for classification 2 bag of words model with logistic regression and xgboost classifier bag of words model with ngrams 4 and min df 0 achieves an accuracy of 82 with xgboost as compared to 89 5 which is the best accuracy reported in the literature with bi lstm and attention the encoder approach implemented here achieves 63 8 accuracy which is lower than the other approaches i found it interesting because of the autoencoder implementation and the approach considers similarity between phrases as well as the words for variable length sequences perhaps the efficiency could be improved by changing the dimensions of the dynamically pooled matrix a different approach in cleaning the data as well as spelling checks classifiers can be compared based on three different evaluation metrics log loss auc and accuracy log loss or the cross entropy loss is an indicator of how different the probability distribution of the output of the classifier is relative to the true probability distribution of the class labels receiver operating characteristic plots the true positive rate vs the false positive rate and an area under the curve auc of 0 5 corresponds to a random classifier the higher the auc the better the classifier accuracy is a simple metric which calculates the fraction of correctly predicted labels in this post i use accuracy as a metric for comparison as there is no specific reason to do otherwise bow model https i imgur com 2wqyptt png https i imgur com pbb57pm png as shown in the figure as min df is changed from 0 to 600 the accuracy decreases from 80 to 72 for ngram 4 min df thresholds the ngrams appearing in the vocabulary according to count any ngram with frequency of appearance below min df in the corpus is ignored ngrams beyond 4 are not used as there is a negligible change in accuracy as ngrams are increased from 3 to 4 tf idf vectorizer instead of count vectorizer is used to speed up computation and it also increases the accuracy by a small amount less than 1 for one data point an accuracy of 82 is obtained by running the same input through xgboost for the bow model parameter sweep vocabulary size ranges from 703912 n grams 4 and min df 0 to 1018 ngrams 1 and min df 600 auto encoder and dynamic pooling cnn classifier https i imgur com xlkpkab png the figure above shows the implemented model which is similar to socher et al word2vec embedding is generated with a vocabulary size of 100000 according to the tensorflow word2vec opensource release using the skip gram model in these embeddings words which share similar context have smaller cosine distance the key problem is dealing with questions of different lengths the information content of a sentence is compressed by training an auto encoder the main
motivation behind this approach is to find similarity between sentences by comparing the entire sentence as well as the phrases in the sentence the problem of different lengths is circumvented by upsampling and dynamic pooling as described below https i imgur com 59wshfu png sentences are encoded using the approach shown in the left figure the three words and the two encodings are considered as input to generate the similarity matrix the auto encoder is trained as shown in the right figure using tensorflow the right figure describes the encoder decoder architecture i used a single layer neural network for the encoder and the decoder multiple hidden layers could also be considered multiple batches of words are concatenated and fed into the encoder and in the ideal case the output of the decoder should match the input mean squared error loss of the neural net is minimized with a gradient descent optimizer with a learning rate of 0 1 an l2 regularization coefficient of 1e 4 is used for the encoder and decoder weights the autoencoder here uses any two words for training and can be batch trained it is different from the approach used by socher et al where the author encodes the entire sentence and decodes it by unfolding it into a question the unfolding autoencoder is difficult or maybe even impossible to implement in tensorflow dynamic computational graph construction tools like pytorch could potentially be a better fit to implement the full approach the entire sentence with its intermediate encodings can be used as input to the upsampling and dynamic pooling phase in the upsampling phase the smaller vector of the question pair considered is upsampled by repeating randomly chosen encodings of the vector to match the length of the other question encodings a pairwise similarity matrix is generated for each phrase vector and the variable dimension matrix is pooled into a matrix of npool x npool i used npool 28 this matrix is fed into a cnn classifier to classify as duplicate or not a hyper parameter optimization of npool could also increase the accuracy the accuracy of this model is 63 8 issues i faced some issues with sklearn s logistic regression the model did output the right class labels but wrong probabilities i haven t figured out a solution to this problem there was no such problem with xgboost references best question pair matching method wang zhiguo wael hamza and radu florian bilateral multi perspective matching for natural language sentences arxiv preprint arxiv 1702 03814 2017 understanding cross entropy loss and visualizing information http colah github io posts 2015 09 visual information unfolding recursive autoencoder approach socher richard et al dynamic pooling and unfolding recursive autoencoders for paraphrase detection advances in neural information processing systems 2011 word2vec embeddings tensorflow opensource release https github com tensorflow tensorflow blob r1 9 tensorflow examples tutorials word2vec word2vec basic py tensorflow https www tensorflow org
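As a concrete sketch of the bag-of-words baseline discussed above, the snippet below builds n-gram tf-idf features with scikit-learn and classifies with xgboost. The column names follow the Kaggle Quora question-pairs file; joining the two questions by simple concatenation and the particular xgboost settings are assumptions for illustration, not details taken from the post.

```python
# Minimal sketch of the bag-of-words baseline described above.
# Assumptions: question pairs are joined by simple concatenation and the
# XGBoost hyper-parameters below are illustrative only.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

df = pd.read_csv("train.csv").fillna("")          # Quora question-pairs file (hypothetical path)
text = df["question1"] + " " + df["question2"]    # join the pair into one document
labels = df["is_duplicate"]

# n-grams up to 4; min_df=1 keeps every n-gram (equivalent in effect to the min_df=0 setting above)
vectorizer = TfidfVectorizer(ngram_range=(1, 4), min_df=1)
X = vectorizer.fit_transform(text)

X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = XGBClassifier(n_estimators=200, max_depth=6, learning_rate=0.1)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With n-grams up to 4 the vocabulary reaches roughly the 700k features quoted above, so keeping the tf-idf matrix sparse is what makes this tractable.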
nlp nlp-machine-learning quora-question-pairs tensorflow xgboost scikit-learn
ai
OCSystem2.0
oc onlychain oc onlychain ocsystem lgpl ocsystem ocsystem wabt apache wavm bsd 3 onlychain ocsystem ocsystem onlychain ocsystem 1 oc mpt token 2 oc bft dpos 100 2 1 bft bft bft pbft bft bft https images gitee com uploads images 2020 0627 151311 07728aa7 459087 jpeg bft jpg 2 2 dpos oc oc 21 9 n 21 2 2 1 30 0 30 3 oc 2 3 21 63 126 63 100 100 x y 10 1 x 1000 y y y y y 0 5 y 0 5 y 0 5 y 0 5 y y y y 10 y 10 y 0 5 y 0 5 0 5s 10s 0 5s 10s 4 a 20 14 6 14 a 1 2 https images gitee com uploads images 2020 0627 152226 9f6cd204 459087 jpeg trandata jpg 4 1 1 14 a 2 https github com ibukisaar erasurecoding 5 oc secp256k1 https images gitee com uploads images 2020 0627 152855 4323f653 459087 jpeg key jpg 6 0296bf07ae95af84e0c77259c72de5f609ca772f 0000000000000000000000000000000000000001 0296bf07ae95af84e0c77259c72de5f609ca772e 7 1 2 hash 3 hash 4 hash 5 30 hash 6 30 7 hash 8 9 8 kademlia kademlia b b ddos action nonce hash 0 hash action 9 smart contract 1994 nick szabo oc oc lgpl 3 0 license license license
blockchain
nimble
div align center img src docs nimble logo icon svg width 100px p b ni nimble b p div nimble nimble angular npm version and repo link https img shields io npm v ni nimble angular svg label ni nimble angular https www npmjs com package ni nimble angular nimble blazor nuget version and repo link https img shields io nuget v nimbleblazor svg label nimbleblazor https www nuget org packages nimbleblazor nimble components npm version and repo link https img shields io npm v ni nimble components svg label ni nimble components https www npmjs com package ni nimble components nimble tokens npm version and repo link https img shields io npm v ni nimble tokens svg label ni nimble tokens https www npmjs com package ni nimble tokens the ni nimble design system styled ui components for ni applications storybook page https img shields io badge storybook white svg logo storybook https ni github io nimble storybook example angular app https img shields io badge example 20angular 20app dd0031 svg logo angular https ni github io nimble storybook example client app example blazor app https img shields io badge example 20blazor 20app 572b8a svg logo blazor https ni github io nimble storybook blazor client app wwwroot if you are at ni lucky you reach out to ask questions directly in the design system teams channel https teams microsoft com l team 19 3awo8vmmkmshfltkxxc0bczzo x4jlqsv5vxparjdh13k1 40thread tacv2 conversations groupid 9ee065d7 3898 4245 82f6 76e86084b8b1 tenantid 87ba1f9a 44cd 43a6 b008 6fdb45a5204e getting started this repository contains the source for the following packages ni nimble angular angular workspace projects ni nimble angular styled angular components for use in ni angular applications ni nimble blazor packages nimble blazor styled blazor components for use in ni blazor applications ni nimble components packages nimble components styled web components for use in other applications also used by nimble angular and nimble blazor ni nimble tokens packages nimble tokens design tokens used by the component packages and some additional utility packages ni xliff to json converter packages xliff to json converter a utility to convert translation files from xliff to json for angular localization consult the readme md for each package to learn more including how to use it in an application the above packages follow semantic versioning https semver org consult the changelog md for each package to see the changes in each version including instructions for adapting your application in response to breaking changes community we welcome feedback and contributions the fastest way to ask questions is to join the discussion on teams https teams microsoft com l team 19 3awo8vmmkmshfltkxxc0bczzo x4jlqsv5vxparjdh13k1 40thread tacv2 conversations groupid 9ee065d7 3898 4245 82f6 76e86084b8b1 tenantid 87ba1f9a 44cd 43a6 b008 6fdb45a5204e accessible to ni employees only you can also start a discussion on github by filing an issue using the discussion template https github com ni nimble issues new choose requesting new components and features is nimble missing a component that your team needs search the issues list https github com ni nimble issues to see if it s on our radar if an issue exists already comment with your use cases if no issue exists yet file a new one using the feature request template filing bugs to report a bug with an existing component file an issue using the bug report template learning architecture docs architecture md architecture of the design system packages and monorepo contributing see 
getting started in contributing md contributing md getting started to get started with building the monorepo component status view status of components that are completed and on the roadmap in the component status https ni github io nimble storybook path docs component status docs page
os
cpp-cv-project-template
title resource project title png it s never been easier than this purpose fire this repo contains a c project template for developing computer vision applications features star2 currently this repo is going under a rigorous change stay tuned for a big update the project comprise a widely used c project structure the project supports installing essential toolchains for c programming and debugging git build essentials cppcheck cmake clang gcc clang tools clang tidy lldb lld libc libomp heavy check mark the project supports the following 3rdparty libraries python3 basic packages numpy pandas matplotlib jupyter notebook voila tqdm nbconvert heavy check mark open3d python only for now heavy check mark eigen heavy check mark opencv heavy check mark non free algorithms currently disabled ceres solver heavy check mark gtsam heavy check mark pangolin heavy check mark pcl heavy check mark visualization disabled opengl heavy check mark the project supports various tools to make good c project practices gtest heavy check mark spdlog heavy check mark fast cpp csv parser heavy multiplication x nlohmann json heavy multiplication x status legend heavy check mark fully supported white check mark partially supported build only heavy multiplication x not supported yet how to use book you need python3 https www python org to use the automation scripts for project setup and build linux bash install python 3 and required modules sudo apt install python3 python3 pip pip3 install pyyaml gitpython edit setup config yaml to configure project gedit thirdparty packages yaml install dependencies build dependencies py edit the packages yaml thirdparty packages yaml file to configure your project dependencies you can also use the optional password your password argument to avoid manually typing your linux password for every internal sudo command usage you can also use the optional d argument to also build debug libraries you can also use the optional system argument to build inside the system windows for now windows native build is not supported instead you may use windows subsystems for linux wsl https docs microsoft com en gb windows wsl install win10 to use the build scripts macos for now macos native build is not supported license bank this repo is licensed under mit license click here license https github com changh95 cpp cv project template blob main license to view the license contributors sunglasses thanks goes to these wonderful people all contributors list start do not remove or modify this section prettier ignore start markdownlint disable table tr td align center a href https github com changh95 img src https avatars githubusercontent com u 39010111 v 4 width 100px alt br sub b changh95 b sub a br a href https github com changh95 cpp cv project template commits author changh95 title commits a td td align center a href https github com pacientes img src https avatars githubusercontent com u 22834091 v 4 width 100px alt br sub b pacientes b sub a br a href https github com changh95 cpp cv project template commits author pacientes title commits a td td align center a href https github com gst img src https avatars githubusercontent com u 38055 v 4 width 100px alt br sub b gst b sub a br a href https github com changh95 cpp cv project template commits author gst title commits a td tr table markdownlint restore prettier ignore end all contributors list end
ai
dashninja-fe
dash ninja front end dashninja fe by alexandre aka elbereth devilliers check the running live website at https www dashninja pl this is part of what makes the dash ninja monitoring application it contains public rest api using php and phalcon framework public web pages using static html5 css javascript feel free to donate to xkfkhqmnhvqovo7kxqjvnnifnqhrnzycsz special thanks to jetbrains https www jetbrains com requirement you will need a running website official at https dashninja pl uses nginx for the rest api php v5 6 with mysqli works tested with php 7 1 phalcon v2 0 x should work with any version up to v3 2 x mysql database with dash ninja database check dashninja db repository optional dash ninja control script installed and running to have a populated database install go to the root of your website for dash monitoring ex cd home dashninja2 www get latest code from github shell git clone https github com elbereth dashninja fe git configure php to answer only to calls to api index php rewriting to end point api add api cron php script to your crontab activate for main and or test net both for blocks24h and for masternode full list configuration todo
php phalcon dashpay monitoring blockchain
front_end
iot-nodejs
ibm watson iot platform javascript sdk build status https travis ci org ibm watson iot iot nodejs svg branch master https travis ci org ibm watson iot iot nodejs coverage status https coveralls io repos github ibm watson iot iot nodejs badge svg branch master https coveralls io github ibm watson iot iot nodejs branch master github issues https img shields io github issues ibm watson iot iot nodejs svg https github com ibm watson iot iot nodejs issues github https img shields io github license ibm watson iot iot nodejs svg https github com ibm watson iot iot nodejs blob master license installation npm install wiotp sdk save usage application javascript import applicationclient applicationconfig from wiotp sdk let appconfig applicationconfig parseenvvars let appclient new applicationclient appconfig appclient connect do stuff appclient disconnect device javascript import deviceclient deviceconfig from wiotp sdk let deviceconfig deviceconfig parseenvvars let deviceclient new deviceclient deviceconfig deviceclient connect do stuff deviceclient disconnect gateway javascript import gatewayclient gatewayconfig from wiotp sdk let gwconfig gatewayconfig parseenvvars let gwclient new gatewayclient gwconfig gwclient connect do stuff gwclient disconnect development build npm i npm run build publish npm login npm publish
iot nodejs sdk
server
trainer-backend
readme this readme would normally document whatever steps are necessary to get the application up and running things you may want to cover ruby version system dependencies configuration database creation database initialization how to run the test suite services job queues cache servers search engines etc deployment instructions
server
CRM-CustomerRelationshipManager
crm customer relationship manager a basic crud app using spring framework for backend development a simple project for spring framework beginners where you can create read update or delete from your database in addition as a plus feature you can search for a customer index png getting started prerequisites 1 must install maven or if you re using eclipse jee ide make sure that m2e plugin is installed 2 check sql scripts to create the database built with spring 5 https spring io projects spring framework including spring mvc spring core spring aop maven https maven apache org dependency management hibernate orm http hibernate org orm object relational mapping eclipse https www eclipse org java ide mysql https www mysql com relational database management system
spring-framework hibernate
server
spring-boot-hello-world
spring boot hello world development of hello world web application with spring boot 1 go to eclipse help market place and search spring tools to install 2 after spring boot tool installation let s create a spring boot starter project here project name is helloworld 3 let s check the pom xml and set java version as 8 if it s different 4 spring boot will create a helloworldapplication java class in the com example hello package this class will be annotated with springbootapplication this is the entry point of the spring boot application to start package com example hello import org springframework boot springapplication import org springframework boot autoconfigure springbootapplication springbootapplication public class helloworldapplication public static void main string args springapplication run helloworldapplication class args 5 now run the created project as spring boot app this will start the inbuilt servlet container and run the application by default the web application will run at port 8080 let s check the application in browser http localhost 8080 it will show an error message as we haven t created any request handler 6 let s create a controller which will accept requests the controller class will have various annotations like controller getmapping to handle the get request a controller to inform framework that this class is a controller b getmapping to inform framework about get request mapping for a specific path here path is defined as hello model is used to transfer data from controller to view 7 in the above example the controller returns hello this will force framework to redirect transfer control to the hello html file here make sure hello html file is available inside the resource templates directory 8 we have used the thymeleaf template spring devtools helps to update the container at runtime i e changes will be automatically pushed to the container let s check the web application http localhost 8080 hello hope this quick tutorial will help in basic understanding and encourage you to develop more complex applications
front_end
MelbusRtos
author vincent gijsen work in progress repo to hold work in progress melbus port do note only for hu 850 i wasn t able to sniff other radios hw stm32f103 bluepill bm20 bluetooth module aka is2020 from microchip software freertos libopencm3 heavily inspired copy paste of initial is2020 library from https github com tomaskovacik is2020 for structure ide stm32workbench working bluetooth comms with is2020 extract metadata from songs running event driven updates due to song end user actions on phone control of music playback console via stm32f103 usb virtual hu interface to send hu keys and get single line display back tested with screen terminal todo implement melbus logic for initialisation and generating of text sending requests to hu850 improve state machine for settings menu voice assistant
os
trueblocks-core
markdownlint disable md033 md036 md041 b if you have an existing installation complete applicable migrations https github com trueblocks trueblocks core blob develop migrations md before proceeding b hr h1 trueblocks unchained index h1 github repo size https img shields io github repo size trueblocks trueblocks core github contributors https img shields io github contributors trueblocks trueblocks core https github com trueblocks trueblocks core contributors github stars https img shields io github stars trueblocks trueblocks core style 3dsocial https github com trueblocks trueblocks core stargazers github forks https img shields io github forks trueblocks trueblocks core style social https github com trueblocks trueblocks core network members twitter follow https img shields io twitter follow trueblocks style social https twitter com trueblocks table of contents introduction introduction installing installing command line command line troubleshooting troubleshooting the unchained index the unchained index docker version docker version documentation documentation linting linting contributing contributing contact contact list of contributors list of contributors introduction trueblocks improves access to blockchain data for any evm compatible chain particularly ethereum mainnet while remaining entirely local features include chifra init and chifra scrape which builds the unchained index an index of address appearances that provides lightning fast access to transactional histories an optional binary cache which speeds up queries to the rpc by orders of magnitude enhanced command line options enabling much better access to chain data for data scientists and analysts for example easily extract all logs produced by a smart contract or view all erc 20 holdings for an account etc advanced tools for producing reconciled bank statements and accounting export for any token including eth an infinite number of other things restricted only by your imagination installing please see the installation instructions https trueblocks io docs install install trueblocks on our website searching account histories while optional you most likely want to use the unchained index to search account histories to do so get the index https trueblocks io docs install get the index account explorer you may use the command line of course to access data but you may also wish to run an api server shell chifra daemon use curl to pull data or use it to drive our pre beta account explorer see installing the explorer https trueblocks io docs install install explorer the api provides the identical tools and options as the command line and it documented here https trueblocks io api generate grpc files developers only to regenerate grpc files you have to install protobuf tools shell brew install protobuf mac go install google golang org protobuf cmd protoc gen go v1 28 go install google golang org grpc cmd protoc gen go grpc v1 2 command line the trueblocks command line tool is called chifra this gives you access to all the other tools shell chifra help get more help on any sub command with chifra cmd help full documentation is available on our website https trueblocks io getting data let s look at the first subcommand called status shell chifra status if you get a bunch of data congratulations your installation is working try this command which shows every 10th block between the first and the 100 000th shell chifra blocks 0 100000 10 you should see a long stream of data kill the display with control c see the entire list of chifra 
commands with chifra help troubleshooting depending on your setup you may get the following error message when you run some chifra commands shell warning a request to your ethereum node http localhost 8545 resulted in the following error could not connect to server specify a valid rpcprovider by editing rootpath trueblocks toml edit the file as instructed you may find helpful answers on our faq https trueblocks io blog faq see our blog https trueblocks io blog for a lot of useful articles on getting started and using trueblocks if you continue to have trouble join our discord discussion https discord gg kafczh2x7k the unchained index the primary data structure produced by trueblocks is an index of address appearances called the unchained index this index provides very quick access to transaction histories for any address you may either build the entire index from scratch requires an evm compatible tracing archive node or you may download a snapshot of the index build from there this process is described in the article indexing addresses https trueblocks io docs install get the index docker version our official docker version https github com trueblocks trueblocks docker is in a separate repo please see that repo for more information on running with docker documentation the trueblocks documentation repo https github com trueblocks trueblocks docs builds the trueblocks website see our website for the best available documentation https trueblocks io linting our build process requires the code you submit to be linted in order to that you must install the golang linters see this page for more information https golangci lint run usage install to install the primary linter called golangci lint run this command shell curl ssfl https raw githubusercontent com golangci golangci lint master install sh sh s b go env gopath bin v1 53 3 verify the installation with golangci lint version you should see something like this shell golangci lint has version 1 50 1 built from commit on date next run golangci lint linters your system should have at least the default list https golangci lint run usage linters to properly lint your submission shell golangci lint linters you are encouraged to use additional linters if you do and you think they re useful please suggest that we add it to our build process contributing a chart showing the number of stars on our repo over time stargazers over time https starchart cc trueblocks trueblocks core svg https starchart cc trueblocks trueblocks core we love contributors please see information about our workflow https github com trueblocks trueblocks core blob develop docs branching md before proceeding 1 fork this repository into your own repo 2 create a branch git checkout b branch name 3 make changes to your local branch and commit them to your forked repo git commit m commit message 4 push back to the original branch git push origin trueblocks trueblocks core 5 create the pull request contact if you have questions comments or complaints please join the discussion on our discord server which is linked from our website https trueblocks io list of contributors thanks to the following people who have contributed to this project tjayrush https github com tjayrush dszlachta https github com dszlachta wildmolasses https github com wildmolasses mysticryuujin https github com mysticryuujin mattdodsonenglish https github com mattdodsonenglish crodnun https github com crodnun
ethereum blockchain command-line-tools indexing
blockchain
iot-visualization-examples
iot visualization examples sonarcloud status https sonarcloud io api project badges measure project alwxkxk iot visualization examples metric alert status https sonarcloud io dashboard id alwxkxk iot visualization examples sonarcloud bugs https sonarcloud io api project badges measure project alwxkxk iot visualization examples metric bugs https sonarcloud io component measures metric reliability rating list id alwxkxk iot visualization examples sonarcloud vulnerabilities https sonarcloud io api project badges measure project alwxkxk iot visualization examples metric vulnerabilities https sonarcloud io component measures metric security rating list id alwxkxk iot visualization examples bash install package yarn build npm run build run server npm run dev example datacenter https alwxkxk github io iot visualization examples github page 3diot scaugreen cn http 3diot scaugreen cn
server
Tri-Netra
tri netra an augmented reality based information system for trident academy of technology github issues https img shields io github issues agrmayank tri netra label issues style flat square github pull requests https img shields io github issues pr agrmayank tri netra label pull 20requests style flat square github last commit https img shields io github last commit agrmayank tri netra label last 20commit style flat square github commit activity https img shields io github commit activity m agrmayank tri netra label commit 20activity style flat square github all releases https img shields io github downloads agrmayank tri netra total label downloads style flat square github repo size https img shields io github repo size agrmayank tri netra label repo 20size style flat square p align center br img src assets icons tat building png alt tat building br br p download build download the latest release here https github com agrmayank tri netra releases quickstart download unity https unity3d com get unity download archive version 2018 or above download microsoft visual studio community https visualstudio microsoft com platform specific sdk such as android build tools are also required how to use to build the project you need to go to menu build settings your os build and run don t forget to change the bundle id under the menu build settings your os player preferences hr made with by agrmayank https agrmayank github io
unity unity3d app gps navigation android ios ar augmented-reality augmentedreality 3d 3d-models arfoundation arcore college college-project project
server
tf-dev-nlp
tf dev nlp course materials for course tensorflow developer natural language processing with tensorflow 2 9 all content is copyright data trainers llc and is intended to be used for educational purposes alone
ai
bitwave
bitwave tv bitwave tv banner image static images bitwave banner jpg bitwave tv https bitwave tv an open platform live streaming service for creators to freely express themselves dev setup bash install dependencies npm ci serve with hot reload at localhost 3000 npm run dev production setup bash install dependencies npm ci build for production and launch server npm run build npm start jetbrains logo static images jetbrains 128 png https www jetbrains com from bitwave for additional explanation on how things work checkout vue js https vuejs org v2 guide nuxt js https nuxtjs org video js https docs videojs com vuetify https vuetifyjs com
vue vuejs nuxt nuxtjs livestreaming bitwave
front_end
Data-Science-45min-Intros
data science 45 min intros a href https notebooks azure com import gh drskippy data science 45min intros img src https notebooks azure com launch png a every week our data science team gnip https twitter com gnip aka twitterboulder https twitter com twitterboulder gets together for about 50 minutes to learn something while these started as opportunities to collectively raise the tide on common stumbling blocks in data munging and analysis tasks they have since grown to machine learning statistics and general programming topics anything that will help us do our jobs better is fair game for each session someone puts together the lesson walk through and leads the discussion presentation platforms commonly include well written readmes ipython notebooks knitr documents interactive code sessions the more hands on the better feel free to use these for your own or your team s growth and do submit pull requests if you have something to add ok while we try to do it every week sometimes it doesn t happen in that case we try to guilt trip the person who slacked current topics python object oriented programming concepts modules packaging python oop 101 unit testing with unittest python unittest 101 iterators generators iterators generators 201 introduction to pandas pandas 101 introduction to vertica with vertica python vertica 101 introduction to multiprocessing python multiprocessing python decorators python decorators 101 python interfaces python interfaces python logging python logging 201 bash command line tools using jq jq 102 bash data structures bash 201 regular expressions regex 101 statistics maximum likelihood estimation max likelihood doodle count min algorithm count min 101 a b testing ab testing causal inference causal inference 101 error statistics error statistics 101 classical statistics applied to social data classical stats and social data 101 meaningful comparisons of ordered lists comparing collections counting and maximum likelihood estimation counting and mles estimating the number of classes in a population estimating classes long tail distributions i long tail distributions 001 long tail distributions ii long tail distributions 002 maximum likelihood parameter estimation max likelihood doodle probabilty graph models probgraphmodels 101 machine learning intro to scikit learn sklearn 101 introduction to k means clustering k means 101 choosing k in k means clustering choosing k in kmeans logistic regression logistic regression 101 naive bayes classifier naive bayes classifier 101 introduction to knn knn 101 introduction to adaboost adaboost 101 decision trees decision trees 101 basis expansions kernels ml basis expansion 101 model selection model selection 101 introduction to svm support vector machines 101 text mining with sklearn text mining with sklearn 101 bandit algorithms bandit algorithms 101 kernel smoothing kernel smoothing neural networks i neural networks 101 neural networks ii neural networks 201 natural langugage processing intro to topic modeling topic modeling 101 more on topic modeling a practical example topic modeling 201 part of speech tagging pos tagging text processing text 101 word vector spaces vector spaces network structure network statistics igraph network igraph 101 network analysis using null models networks 201 network analysis community structures networks 202 network analysis centrality metrics networks 203 algorithms count min sketch count min 101 engineering refactoring refactoring 101 geographic information systems shapefile utilties reverse geo 
coding and makefile gis tools 101 web development websockets websockets 101 python flask basics python flask basics visualization d3 and javascript intro d3 101 d3 reusable charts heatmap d3 201 real time data websockets intro websockets 101 introduction to horizon charts horizon charts 101 bokeh bokeh matplotlib graphing for science in python matplotlib 201 databases sql 201 script based data and queries sql 201 vertica vertica 101
ai
CIE_XYZ_NET
cie xyz net unprocessing images for low level computer vision tasks mahmoud afifi https sites google com view mafifi abdelrahman abdelhamed https www eecs yorku ca kamel abdullah abuolaim https sites google com view abdullah abuolaim abhijith punnappurath https abhijithpunnappurath github io and michael s brown http www cse yorku ca mbrown br br york university reference code for the paper cie xyz net unprocessing images for low level computer vision tasks https arxiv org pdf 2006 12709 pdf mahmoud afifi abdelrahman abdelhamed abdullah abuolaim abhijith punnappurath and michael s brown ieee transactions on pattern analysis and machine intelligence tpami 2021 if you use this code or our dataset please cite our paper article ciexyznet title cie xyz net unprocessing images for low level computer vision tasks author afifi mahmoud and abdelhamed abdelrahman and abuolaim abdullah and punnappurath abhijith and brown michael s journal ieee transactions on pattern analysis and machine intelligence tpami pages year 2021 code mit license network design https user images githubusercontent com 37669469 81250194 550b1700 8fee 11ea 8a69 0fde90f1062f jpg pytorch p align left img width 20 src https user images githubusercontent com 37669469 81490764 0c549780 9254 11ea 813c 02de8da42102 png p prerequisite 1 python 3 6 2 opencv python 3 pytorch tested with 1 5 0 4 torchvision tested with 0 6 0 5 cudatoolkit 6 tensorboard optional 7 numpy 8 future 9 tqdm 10 matplotlib the code may work with library versions other than the specified get started demos 1 run demo single image py or demo images py to convert from srgb to xyz and back you can change the task to run only one of the inverse or forward networks 2 run demo single image with operators py or demo images with operators py to apply an operator s to the intermediate layers images the operator code should be located in the pp code directory you should change the code in pp code postprocessing py with your operator code training code run train py to re train our network you will need to adjust the training validation directories accordingly note all experiments in the paper were reported using the matlab version of cie xyz net the pytorch code model is provided to facilitate using our framework with pytorch but there is no guarantee that the torch version gives exactly the same reconstruction rendering results reported in the paper br br matlab p align left img width 25 src https user images githubusercontent com 37669469 81493516 e1c40800 926e 11ea 8685 11f41ade7ed4 png p prerequisite 1 matlab 2019b or higher 2 deep learning toolbox get started run install m demos 1 run demo single image m or demo images m to convert from srgb to xyz and back you can change the task to run only one of the inverse or forward networks 2 run demo single image with operators m or demo images with operators m to apply an operator s to the intermediate layers images the operator code should be located in the pp code directory you should change the code in pp code postprocessing m with your operator code training code run training m to re train our network you will need to adjust the training validation directories accordingly br br srgb2xyz dataset srgb2xyz https user images githubusercontent com 37669469 80854947 4eedf280 8c0a 11ea 8ada e12bea63bdc6 jpg our srgb2xyz dataset contains 1 200 pairs of camera rendered srgb and the corresponding scene referred cie xyz images 971 training 50 validation and 244 testing images training set 11 1 gb part 0 https ln2 sync com dl a2894dbb0 
sp365wf7 rtd9tujt kaqqpcpq mnpph44z part 1 https ln2 sync com dl d55a95be0 zg95xg6u n8nf7kc5 pttv6z8f n4yu3yny part 2 https ln2 sync com dl fb406ca40 j5wmbqdx knia8qia cm9yisub mjcmmbjy part 3 https ln2 sync com dl 508d5e380 tyhx4efv ibirjzzu vid3hjdr m4j2yxan part 4 https ln2 sync com dl e0941e650 hsu3z5dp fa5ird2b uiv8tqjy nfq6cje6 part 5 https ln2 sync com dl 258b02190 9jmarz63 ct33xx4e x4ikhfwt guan99b7 validation set 570 mb part 0 https ln2 sync com dl de4bc8380 xiughx76 cf6xcbp4 vzr73pde 3iyf4spk testing set 2 83 gb part 0 https ln2 sync com dl bb19d1b90 nv38zdmq n4b46kgq hv7sj472 dfxbzz2u part 1 https ln2 sync com dl 17f046300 5qcidmk6 rqhqqy57 dwybi55v f8kz9xku dataset license as the dataset was originally rendered using raw images taken from the mit adobe fivek dataset https data csail mit edu graphics fivek our srgb2xyz dataset follows the original license of the mit adobe fivek dataset https data csail mit edu graphics fivek
computational-photography computer-vision deep-learning deep-neural-networks deeplearning image-processing image-restoration low-level-vision dataset datasets cie-xyz color-processing camera-pipeline cie-xyz-net unprocessing-images srgb2xyz-dataset
ai
Data-Engineering-Bootcamp-Google-Cloud-Wizeline-Academy
data engineering bootcamp google cloud wizeline academy in this challenge i ve calculated various kpis using a car based dataset
cloud
react-universal
react universal app with social login starter kit minimal react redux boilerplate mern for desktop mobile and web app with social login feature inspired by creating universal apps like slack skype etc demo web app https react universal web herokuapp com expo mobile https expo io appetize simulator url https expo io by12380 react universal instructions click tap to play open with chrome click always scroll down the web page and click open project using expo electron app download links mac https www dropbox com s 2vnwx9dttz083or react universal 0 2 7 dmg raw 1 windows https www dropbox com s o97syfutahencpg react universal 20setup 200 2 7 exe raw 1 linux https www dropbox com s zrd413nhrmhibqg react universal 0 2 7 x86 64 appimage raw 1 br img src demo gif width 1000px br sample app using react universal todos app demo https react todo universal herokuapp com br img src https raw githubusercontent com by12380 react todo universal master todos demo gif width 1000px br features react universal redux universal electron desktop expo mobile express app server optional mongo db database optional automatic re login session storage sync app across all devices socket io social login auth0 https auth0 com default setup to demonstrate multiple social login platforms google facebook github twitter etc getting started git clone https github com by12380 react universal git cd react universal general setup auth0 auth0 setup for development web app client react electron app client electron expo app client expo app server server optional auth0 setup for development 1 sign in register auth0 account 2 go to application your app name settings 3 in allowed callback urls add http localhost 3000 callback for web and electron app https auth expo io your expo account username react universal for expo app in allowed logout urls add http localhost 3000 4 go to apis create api 5 set identifier ex https api react universal com hit create 6 go to settings toggle allow offline access save
react redux expo react-native electron social-login socket-io
front_end
NLP_Social_Sector
natural language processing in the social sector supplemental materials supplemental materials to the article six ways to implement natural language processing in the social sector this repo contains the code supporting the online demos https datakind github io nlp social sector as well as sample code for reproducing the results the intention behind sharing this material is to provide a starting point for further exploration of the techniques described in the article repository organization docs docs source code for demo site https datakind github io nlp social sector notebooks notebooks jupyter notebooks to reproduce the results shown in the demo site notebooks data notebooks data input data for notebooks nb most data gets fetched from s3 from the notebook notebooks scrapers notebooks scrapers sample code for scraping the input data credit this work was managed by alfred lee https github com justalfred and ben kinsella https github com bjk127 at datakind https www datakind org we are grateful to our volunteers who contributed to this project scrapers and demos were developed by sarah eltinge https github com eltinge matthew harris https github com dividor and john winter https github com johnwinter jared mcdonald https github com jaredmcdonald contributed the web development none of this would have been possible without their energy expertise and enthusiasm
ai
bytewax_index
bytewax index an index to use with large language models llms to answer questions about bytewax documentation and python api usage make a new virtual env then install the requirements sh pip install r requirements txt an index file must exist before you can ask questions you can run the file query bytewax py to create an index or you can download one here https drive google com file d 1ymyx018hp0g56tnb1vcajpjn fxuhuor view usp share link note that the preloaded index file above only contains a subset of the bytewax repository you can edit the python file to not exclude any folders by default rust source files are not included usage requires at least an openai api key in your environment variables if you want to recreate the index a github api key is needed as well with the current configuration creating an index uses free huggingface models but asking queries requires openai quota to get more relevant answers increase the n sources parameter default is 2 python query bytewax py n 3 example enter query how can i emit a state after each change or update for a given key info root query total llm token usage 1453 tokens info root query total embedding token usage 15 tokens to emit a state after each change or update for a given key you can use a stateful map function this function will take in the key and the new state and then emit the updated state for that key for example in the code snippet provided the stateful map function order book is used to update the orderbook object with the new state and then emit the updated state for that key additionally the code will check if the order should be removed and if not it will update the order if the order was removed it will check to make sure the bid and ask prices are modified if required finally the capture operator is used to use the output builder function that was defined earlier and print out the output to the terminal
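The answer above describes the stateful map pattern in general terms; the sketch below is a plain-Python illustration of that pattern (per-key state folded with each new value, with the updated state emitted on every event), not the actual bytewax operator signature, which should be checked against the bytewax documentation.

```python
# Conceptual illustration of a per-key stateful map: not the bytewax API,
# just the pattern the answer above describes.
from collections import defaultdict
from typing import Any, Callable, Dict, Iterable, Tuple


def stateful_map(events: Iterable[Tuple[str, Any]],
                 step: Callable[[Any, Any], Any]):
    """For each (key, value) event, update that key's state and emit it."""
    state: Dict[str, Any] = defaultdict(dict)
    for key, value in events:
        state[key] = step(state[key], value)   # fold the new value into the state
        yield key, state[key]                  # emit the updated state for this key


# Example: maintain a tiny order book per instrument and emit it on every update.
def update_book(book, order):
    side, price, size = order
    book = dict(book)
    if size == 0:
        book.pop((side, price), None)          # size 0 means the level is removed
    else:
        book[(side, price)] = size             # insert or modify the level
    return book


events = [("BTC", ("bid", 100, 2)), ("BTC", ("ask", 101, 1)), ("BTC", ("bid", 100, 0))]
for key, book in stateful_map(events, update_book):
    print(key, book)
```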
ai
Full-Stack-Development
full stack development mern stack as web app development has taken itself to the next level several technologies are being used together to produce web apps that have extravagant features unlike native apps e g ios apps and android apps web apps have far greater reach because web apps can be accessed on both desktop and mobile devices most of the companies are now looking for full stack developers who have knowledge of back end technologies like mongodb express and node js and front end technologies like react and can create a fully functional web app by managing both the frontend and backend of the web application
server
whitepaper-portal
whitepaper portal whitepaper tools build status https travis ci org whitepapertools whitepaper portal svg branch master https travis ci org whitepapertools whitepaper portal whitepaper whitepaper bash npm install whitepaper bem html css npm install whitepaper core cdn html link rel stylesheet href https whitepaper tools whitepaper min css project stub https github com whitepapertools whitepaper stub
project-stub bemjson
os
BlockChain
simple blockchain example a very simple blockchain implementation intended to illustrate the concept original code august 2015 by marty anstey https marty anstey ca latest code at https github com rhondle blockchain file formats all values are stored as little endian and the hash used is sha256 isam index this file is simply a quick way to access any block in the chain it s intended to be used to fetch the offset and length of any block without walking the entire chain which is particularly useful when appending new blocks to the blockchain header type size description uint32 4 record count records type size description uint32 4 offset uint32 4 length blockchain the blockchain is simply a concatenated collection of blocks the hash from the previous block is stored in the current block forming a cryptographically verifiable chain and hardening the preceding blocks against tampering type size description uint32 4 magic uint8 1 block format uint32 4 timestamp uint8 32 32 previous hash uint32 4 data length data arbitrary data example tool output for the test blockchain dumpindex 0 ofs 0 len 195 1 ofs 195 len 332 2 ofs 527 len 97 3 ofs 624 len 451 4 ofs 1075 len 117 walkchain height 1 magic d5e8a97f version 1 timestamp 1440021658 22 00 58 08 19 2015 prevhash 0000000000000000000000000000000000000000000000000000000000000000 blockhash 87988ac16e72dd2b6878c83e03ef99264d7f6e6955df83ac955ac7e7e6f1185e datalen 150 data aug 19 2015 bitcoin price falls 14 following bitfinex flash crash http www coindesk com bitcoin price falls 14 following bitfinex flash crash height 2 magic d5e8a97f version 1 timestamp 1440021687 22 01 27 08 19 2015 prevhash 87988ac16e72dd2b6878c83e03ef99264d7f6e6955df83ac955ac7e7e6f1185e blockhash 5314b241d82e60d35786ee3a876c883cf6c623ec8f81c443d306ed2cbc808d80 datalen 287 data he had come a long way to this blue lawn and his dream must have seemed so close that he could hardly fail to grasp it he did not know that it was already behind him somewhere back in that vast obscurity beyond the city where the dark fields of the republic rolled on under the night height 3 magic d5e8a97f version 1 timestamp 1440024279 22 44 39 08 19 2015 prevhash 5314b241d82e60d35786ee3a876c883cf6c623ec8f81c443d306ed2cbc808d80 blockhash 25eb4659f629659f3074d3f485b736307f24968490eef359cfe2d364d9ab7048 datalen 52 data how a bitcoin transaction works www bit ly 1hwnbc4 height 4 magic d5e8a97f version 1 timestamp 1440024364 22 46 04 08 19 2015 prevhash 25eb4659f629659f3074d3f485b736307f24968490eef359cfe2d364d9ab7048 blockhash ae0bfb7a9ee6ddfc8a8443320e59b22421971391504a70399eb0a1ff8fe9a60f datalen 406 data https en bitcoin it wiki genesis block a genesis block is the first block of a block chain modern versions of bitcoin assign it block number 0 though older versions gave it number 1 the genesis block is almost always hardcoded into the software it is a special case in that it does not reference a previous block and for bitcoin and almost all of its derivatives it produces an unspendable subsidy height 5 magic d5e8a97f version 1 timestamp 1440024695 22 51 35 08 19 2015 prevhash ae0bfb7a9ee6ddfc8a8443320e59b22421971391504a70399eb0a1ff8fe9a60f blockhash 25940c512f3795f460dedd9e359e6c2cd7ac3f8269531d4aa1a7be2fd74fad69 datalen 72 data rvhbtvbmrtogquxjq0ugu0vorfmgqk9cicqyljk1ifvtrcbbvcaymjo0osaxos84lziwmtu https pastebin com raw php i dpwg7xvy
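Given the layout in the tables above (little-endian integers, SHA-256 hashes), a block record can be packed and hashed with Python's struct module. This is an illustrative sketch based only on the format tables, not code from the original repository; in particular, exactly which bytes the stored block hash covers is an assumption here.

```python
# Illustrative packing of one block following the tables above:
# uint32 magic | uint8 block format | uint32 timestamp | uint8[32] previous hash |
# uint32 data length | data -- integers little-endian, hashes are SHA-256.
import hashlib
import struct
import time
from typing import Optional

MAGIC = 0xD5E8A97F        # magic value shown in the walkchain dump above
BLOCK_FORMAT = 1


def pack_block(prev_hash: bytes, data: bytes, timestamp: Optional[int] = None) -> bytes:
    ts = int(time.time()) if timestamp is None else timestamp
    header = struct.pack("<IBI", MAGIC, BLOCK_FORMAT, ts)   # "<" forces little-endian, no padding
    return header + prev_hash + struct.pack("<I", len(data)) + data


def block_hash(block: bytes) -> bytes:
    return hashlib.sha256(block).digest()


# Chain two blocks: the previous block's hash is embedded in the next block.
genesis = pack_block(b"\x00" * 32, b"example genesis data")
second = pack_block(block_hash(genesis), b"second block data")
print(block_hash(genesis).hex())
print(block_hash(second).hex())
```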
blockchain
vision-explanation-methods
vision explanation methods vision explanation methods is an open source package that implements d rise detector randomized input sampling for explanation https arxiv org abs 2006 03204 towards visual interpretations of object detection models d rise is a black boxed or model agnostic explainability method which can produce saliency maps for any object detection or instance segmentation models provided these models are appropriately wrapped in essence d rise works by randomly masking the input images and isolating the parts that are most pertinent for the detection or segmentation of the object in question drise diagram python vision explanation methods images drisediagram png diagram from petsiuk et al 2020 example outputs example output python vision explanation methods images outputmaps2 png installation to install the vision explanation package run pip install vision explanation methods colab the process of fine tuning an object detection model and visualizing it through d rise is illustrated in this colab notebook https colab research google com drive 1rrjytxf ybld ksoq0k3tphitgs56i5q usp sharing basic usage to generate saliency maps import the package and run res drise runner get drise saliency map imagelocation str model optional object numclasses int savename str nummasks int 25 maskres tuple int int 4 4 maskpadding optional int none devicechoice optional str none wrapperchoice optional object pytorchfasterrcnnwrapper contributing this project welcomes contributions and suggestions most contributions require you to agree to a contributor license agreement cla declaring that you have the right to and actually do grant us the rights to use your contribution for details visit https cla opensource microsoft com when you submit a pull request a cla bot will automatically determine whether you need to provide a cla and decorate the pr appropriately e g status check comment simply follow the instructions provided by the bot you will only need to do this once across all repos using our cla this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments trademarks this project may contain trademarks or logos for projects products or services authorized use of microsoft trademarks or logos is subject to and must follow microsoft s trademark brand guidelines https www microsoft com en us legal intellectualproperty trademarks usage general use of microsoft trademarks or logos in modified versions of this project must not cause confusion or imply microsoft sponsorship any use of third party trademarks or logos are subject to those third party s policies
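The random-masking idea described above can be sketched in a few lines. This is a simplified illustration of a RISE-style saliency loop, not the implementation inside this package; `score_detection` is a hypothetical placeholder for however the wrapped detector scores the target detection on a masked image, and only the defaults of 25 masks and a 4x4 mask grid are taken from the signature shown above.

```python
# Simplified RISE/D-RISE style saliency: average random masks weighted by how
# well the detector still finds the target object in each masked image.
# `score_detection` is a placeholder for a model-specific scoring function.
import numpy as np
import cv2


def drise_saliency(image, score_detection, num_masks=25, mask_res=(4, 4), p=0.5):
    h, w = image.shape[:2]
    saliency = np.zeros((h, w), dtype=np.float64)
    for _ in range(num_masks):
        # low-resolution binary grid, upsampled to a smooth full-size mask
        grid = (np.random.rand(*mask_res) < p).astype(np.float32)
        mask = cv2.resize(grid, (w, h), interpolation=cv2.INTER_LINEAR)
        masked = (image.astype(np.float32) * mask[..., None]).astype(image.dtype)
        weight = score_detection(masked)       # e.g. IoU * class score of the target box
        saliency += weight * mask
    saliency -= saliency.min()
    if saliency.max() > 0:
        saliency /= saliency.max()
    return saliency
```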
ai
capstone-project5
capstone project5

Capstone project for the Cloud DevOps Engineering Nanodegree.

Project info
- my website api endpoint: http aeda75adeb4a74f879adfb3123c57767 2021078310 us west 2 elb amazonaws com
- my github: https github com charleswilkenson capstone project5
- ci cd pipelines (circleci): https app circleci com pipelines github charleswilkenson capstone project5 branch main

Project overview
With the help of CircleCI and an AWS EKS (Amazon Elastic Kubernetes Service) cluster, this project operationalizes a Python Flask example web application for my blog.

1. Continuous integration with CircleCI. To automate the development workflow we have set up a CircleCI pipeline. To assure code quality, we carry out code linting in the pipeline. The application's Docker image is then built for simple access and deployment and pushed to a public Docker registry, specifically Docker Hub.
2. Kubernetes deployment on AWS EKS. We use AWS EKS to administer our Kubernetes cluster and run the Flask application inside it. This lets us manage, scale and orchestrate containers with Kubernetes.
3. Production deployment and rolling updates. When it is time to promote a new version of the application to production, we employ a rolling update strategy. This guarantees minimal downtime and a smooth switch between versions of the program; Kubernetes orchestrates the update.
4. Makefile and shell scripts. The numerous activities in this project are organized into a Makefile to simplify and automate them. This structured approach makes task execution and project management easier.

We have built a solid workflow for building, testing and deploying the hello Flask application using CircleCI, AWS EKS, Docker Hub and Kubernetes, assuring effective development and production deployment procedures.

Tasks
This project follows a CI/CD methodology to build a Docker image and deploy it to a Kubernetes cluster. The key tasks are:
1. Environment setup: create a virtual Python environment (create, setup) and use the make install command to install all required dependencies.
2. Code quality assurance: lint the project's code to check for errors with make lint (Python code, Dockerfile and shell scripts).
3. Docker image creation: create a Dockerfile to containerize the hello application (Dockerfile).
4. Docker image deployment: deploy the containerized application to a public Docker registry, specifically Docker Hub.
5. Kubernetes cluster deployment: use the command make create cluster to deploy a Kubernetes cluster.
6. Application deployment in Kubernetes: make the application available to the Kubernetes cluster by using the make deployment command.
7. Rolling updates: implement an update strategy for the application within the cluster using a rolling update approach (make rolling).

This CI/CD project automates the entire software development lifecycle, from code quality checks to deployment in a Kubernetes cluster, ensuring efficient development and deployment processes.

CI/CD tools and cloud services
1. CircleCI: a cloud-based CI/CD service.
2. Amazon AWS: a provider of cloud services.
3. AWS EKS (Amazon Elastic Kubernetes Service): a managed Kubernetes service.
4. AWS eksctl: the official CLI tool for Amazon EKS.
5. AWS CLI: a command-line tool for interacting with AWS services.
6. CloudFormation: an infrastructure-as-code service for AWS.
7. kubectl: a command-line tool for controlling Kubernetes clusters.
8. Docker Hub: a container image repository service.

Explanation of the files in the repository
- circleci config yml: the CircleCI configuration file.
- app py: entry point containing the code for handling REST requests (the house price prediction endpoint).
- Dockerfile: commands to build the image.
- Makefile: the project build file.
- requirements txt: the Python dependencies (libraries).

Main files
- app: app py, Dockerfile, requirements txt
- Makefile, readme md
- scripts (docker, kubernetes): cleanup sh, create cluster sh, build docker sh, deployment sh, install docker sh, install eksctl sh, run docker sh, install kubectl sh, upload docker sh, rollback sh, rolling sh, update config eks sh
- bin: various scripts and utility files for managing and deploying the application.

Script descriptions
- build docker sh: builds a Docker image.
- cleanup sh: cleans up resources and temporary files.
- create cluster sh: creates a Kubernetes cluster.
- deployment sh: manages deployments.
- install docker sh: installs Docker.
- install eksctl sh: installs the EKS (Amazon Elastic Kubernetes Service) CLI tool.
- install kubectl sh: installs kubectl, the Kubernetes command-line tool.
- rollback sh: rolls back deployments.
- rolling sh: manages rolling updates.
- run docker sh: runs a Docker container.
- update config eks sh: updates EKS configurations.
- upload docker sh: uploads Docker images.

The Makefile triggers the execution of the following shell scripts:
- create cluster sh: creates the EKS cluster (install eksctl sh and install kubectl sh install the eksctl and kubectl tools it depends on).
- deployment sh: deploys and exposes a service in the k8s cluster.
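The repository description above names app py as the Flask entry point but does not show its contents, so the following is only an illustrative sketch of what such an entry point could look like; the route names, port and placeholder prediction logic are assumptions, not taken from the project.

```python
# Hypothetical minimal Flask entry point; the routes, the port and the fake
# "prediction" logic are illustrative assumptions, not the project's code.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/")
def index():
    # Simple liveness message, handy for smoke tests after a rolling update.
    return jsonify(status="ok", message="hello from the capstone app")

@app.route("/predict", methods=["POST"])
def predict():
    # Placeholder for the prediction endpoint mentioned in the file list;
    # here it just echoes the posted JSON payload back to the caller.
    payload = request.get_json(silent=True) or {}
    return jsonify(received=payload)

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the containerized app is reachable from outside the pod.
    app.run(host="0.0.0.0", port=80)
```

In a setup like the one described above, this module would be what the Dockerfile copies into the image and what the container starts, while the Makefile and CircleCI pipeline handle linting, building and deployment around it.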
cloud
blockchain
blockchain

Build a simple blockchain in Python using only built-in constructs: list, dict, json and class.

Getting started:
git clone https github com opensourcebooks blockchain git
cd blockchain
pip3 install -r requirements.txt
python3 blockchain.py
(Python 3 and pip3 are required.)

Every block stores the hash of its predecessor: block 2 records the hash of block 1, block 1 the hash of block 0, and so on, so modifying any block breaks the hash links of everything after it (the figures blockchain svg and blockchain err svg illustrate an intact chain versus a tampered one).

The material is organized into chapters 0 through 10, each in its own directory with a readme md; chapter 8 includes a key-value (kv) sub-section (8 1) and chapter 9 covers hashing.
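The surviving text does not include the tutorial's own code, so the following is only an illustrative sketch of the idea it describes: a chain of dicts in which each block stores the SHA-256 hash of the previous block, making tampering detectable.

```python
# Illustrative sketch of a hash-linked chain of dicts (not the tutorial's
# actual code): each block stores the hash of the previous block's contents.
import hashlib
import json

def block_hash(block):
    # Serialize with sorted keys so the hash is deterministic.
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(chain, data):
    previous = chain[-1]
    block = {
        "index": previous["index"] + 1,
        "data": data,
        "previous_hash": block_hash(previous),
    }
    chain.append(block)
    return block

def is_valid(chain):
    # The chain is valid only if every stored previous_hash still matches.
    return all(
        chain[i]["previous_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

genesis = {"index": 0, "data": "genesis", "previous_hash": None}
chain = [genesis]
new_block(chain, {"from": "alice", "to": "bob", "amount": 5})
new_block(chain, {"from": "bob", "to": "carol", "amount": 2})
print(is_valid(chain))           # True
chain[1]["data"]["amount"] = 50  # tamper with an earlier block
print(is_valid(chain))           # False
```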
blockchain
SDGP-Genius-Tier
sdgp genius tier

This repository is for the group project. It has 5 branches, one for each feature. The front end is developed with Flutter; the back end and the ML model are built with Python.
ci-cd classification dart machine-learning mobile-app python
front_end
motiondetector
title images title png motion detector takes input from video sources such as network cameras web cams files etc and makes intelligent decisions based on analyzing frames motion detector uses a plugin based event driven architecture that allows you to easily extend functionality it is deployed as an intelligent security system but can be configured for your particular scenario reasons to use motion detector you have been disappointed with over priced proprietary cameras which require subscriptions to store your data you have been disappointed with other surveillance software being windows only ispy woefully outdated motion or requiring a special os image kerberos io you want to use advanced computer vision and machine learning algorithms you want a secure camera new iot malware targets 100 000 ip cameras via known flaw http www csoonline com article 3195825 security new iot malware targets 100000 ip cameras via known flaw html that does not rely on crappy proprietary firmware unless of course you want your data transmitted to the chinese mothership there is no free paid version lame accounts to sign up for etc if you want extra features you can request them or write them yourself if you are a developer and sbc tinkerer then the possibilities are endless the primary focus of motion detector is efficient video processing fault tolerance and extensibility while most security themed video monitoring is based on motion detection motion detector places a high value on computer vision for intelligent frame analysis such as hog pedestrian and haar cascade multi scale detection using the pre trained histogram of oriented gradients and linear svm method works better when objects are larger green is motion roi and blue is a detected pedestrian hog images hog jpg using the pre trained haar cascade method works better when objects are smaller cascade images cascade jpg it s important to use the right detectors and configuration to achieve the desired results features hog images camera jpg diy compact cameras are easy to build and install this puts you in control of resolution features processing power etc motion detector has been tested on sbcs such as raspberry pi nanopi m1 chip odroid c1 c2 xu4 pine a64 etc to create compact smart cameras threading and subprocess based architecture allows consistent fps while processing frames writing video files moving files to remote location etc all concurrently run multiple copies on a central server for ip based dumb cameras supports several types of video inputs including usb and ip wired wireless cameras video files etc fault tolerant architecture ensures buggy camera firmware or poor network connectivity will not derail video processing high performance frame capture plugins including python socket based mjpeg decoder high performance hardware video encoding and decoding if your sbc supports it threshold based motion detection ignore mask multiple object marking and video recording pedestrian and human feature detection with the ability to train your own detector add your own plugins single configuration file requirements x86 x86 64 armv7 or armv8 version of ubuntu 18 04 or debian 9 will most likely work on other linux based operating systems as well internet connection camera or video file install opencv https github com sgjava install opencv or some other method to install latest opencv download project and test since most video hardware acceleration is exposed through ffmpeg you may need to install a custom version please check your sbc for special kernels 
or builds of ffmpeg for hardware acceleration sudo apt install git python3 pip ffmpeg sudo h pip3 install ffmpeg python cd git clone depth 1 https github com sgjava motiondetector git cd motiondetector codeferm export pythonpath pythonpath motiondetector python videoloop py you should see the video process and create output in motion motion detection from experience i can tell you that you need to understand the usage scenario simple motion detection will work well with static backgrounds but using it outside you have to deal with cars tree branches blowing sudden light changes etc this is why built in motion detection is mostly useless on security cameras for these types of scenarios you can use ignore bitmaps and roi regions of interest to improve results with dynamic backgrounds for instance i can ignore my palm tree but trigger motion if you walk in my doorway boosting performance i see a lot of posts on the internet about opencv performance on various arm based sbcs being cpu intensive or slow frame capture etc over time i learned the tricks of the trade and kicked it up a few notches from my own research these techniques may not work for all usage scenarios or opencv functions but they do work well for security type applications problem slow or inconsistent fps using usb camera solution use mjpeg compatible usb camera mjpg streamer and my mjpegclient py https github com sgjava motiondetector blob master codeferm mjpegclient py solution use threading and a frame buffer to get consistent fps from camera even with recording video and background events you will get very consistent fps from cameras that allow you to set the fps some cameras have dynamic fps based on contrast and light this can be tricky when dealing with fixed fps video codecs problem opencv functions max out the cpu resulting in low fps solution resize image before any processing check out pedestrian detection opencv http www pyimagesearch com 2015 11 09 pedestrian detection opencv as it covers reduction in detection time and improved detection accuracy the pedestrian hog detector was trained with 64 x 128 images so a 320x240 image is fine for some scenarios as you go up in resolution you get even better performance versus operating on the full sized image this article also touches on non maxima suppression which is basically removing overlapping rectangles from detection type functions solution sample only some frames motion detection using the moving average algorithm works best at around 3 or 4 fps this works to your advantage since that is an ideal time to do other types of detection such as for pedestrians this also works out well as your camera fps goes higher that means 3 fps are processed even at 30 fps you still have to consider video recording overhead since that s still 30 fps solution analyze only motion roi regions of interest by analyzing only roi you can cut down processing time tremendously for instance if only 10 of the frame has motion then the opencv function should run about 900 faster this may not work where there s a large change frame after frame luckily this will not happen for most security type scenarios if a region is too small for the detector it is not processed thus speeding things up even more solution use hardware encoding and decoding when available the odroid xu4 for instance has hardware h 264 acceleration enabled by default you can use the h264 v4l2m2m codec for hardware encoding and decoding run motion detector the default test ini https github com sgjava motiondetector blob master 
config test ini is configured to detect pedestrians from a local video file in the project try this first and make sure it works properly cd motiondetector codeferm export pythonpath pythonpath motiondetector python videoloop py video will record to motion test using camera name default test date for directory and time for file name this is handy for debugging issues or fine tuning using the same file over and over create a new configuration file for videoloop py https github com sgjava motiondetector blob master codeferm videoloop py to suit your needs cp motiondetector config test ini camera ini cd motiondetector codeferm python videoloop py camera ini the same test video should have processed fine with a copy of the test configuration now we can try the default v4l camera which is 1 nano camera ini and change the following name to camera url to 1 videocaptureproperties to cv2 cap prop frame width 1280 cv2 cap prop frame height 720 or a resolution your camera supports frameplugin to codeferm videocapture detectplugin to empty fps to camera s fps watch output for actual fps and tune if necessary mark to false ignoremask to empty historyimage to true try the new configuration cd motiondetector codeferm python videoloop py camera ini videoloop should write video files to motion camera when motion is detected you can adjust startthreshold and stopthreshold as needed let videoloop run and capture videos once you have enough samples take a look at the history images history images use the name of the video file and add png to the end take a look at the example one i created with the sample video ignore mask resources mask png hog images hog jpg i m ignoring that balloon at the top center of the video white pixels are considered for motion detection and black pixels are ignored this only pertains to the motion detection moving average all movement is considered for detection otherwise you might miss an important region of interest roi if you apply this to your situation you can effectively prevent a lot of false motion detection trees bushes cars and other objects can be ignored if the fall into a particular region of the ignore mask use the history image as the basis of your ignore mask it s important not to move the camera after the mask is created or it will not be aligned properly of course you can leave ignoremask empty if you wish to analyze the entire frame for motion if you wish to use the scp plugin then you should generate ssh keypair so you do not have to pass passwords around or save them in a file it s handy to scp video files to a central server or cloud storage after detection ssh keygen ssh copy id user host ssh host configure supervisor to make motion detector more resilient it s wise to run it with a process control system like supervisor http supervisord org motion detector currently fails fast if it gets a bad frame or socket timeout as long as you use a reasonable socket timeout value in the configuration supervisor will automatically restart videoloop py after failure sudo h pip3 install supervisor sudo su echo supervisord conf etc supervisord conf exit sudo nano etc supervisord conf program mjpg streamer command mjpg streamer i usr local lib mjpg streamer input uvc so n f 5 r 1280x720 o usr local lib mjpg streamer output http so w usr local www directory home username user username startsecs 0 autostart true autorestart true stdout logfile tmp mjpg streamer log stderr logfile tmp mjpg streamer err log environment ld library path opt libjpeg turbo lib32 or lib64 program 
videoloop command python videoloop py home username camera ini directory home username motiondetector codeferm user username autostart true autorestart true stdout logfile tmp videoloop log stderr logfile tmp videoloop err log environment pythonpath home username motiondetector supervisord check logs in tmp and fix any errors supervisorctl shutdown sudo nano etc systemd system supervisord service unit description supervisor daemon documentation http supervisord org after network target service execstart usr local bin supervisord n c etc supervisord conf execstop usr local bin supervisorctl options shutdown execreload usr local bin supervisorctl options reload killmode process restart on failure restartsec 42s install wantedby multi user target alias supervisord service sudo systemctl start supervisord check logs in tmp and fix any errors sudo systemctl stop supervisord sudo systemctl enable supervisord camera health check if you want to monitor the health of motion detector you just need to look at health txt timestamp for example if i want to use zabbix agent i could add the following to zabbix agentd conf camera health userparameter camhealth if test find home servadmin motion health txt mmin 2 then echo 1 else echo 0 fi freebsd license copyright c steven p goldsmith all rights reserved redistribution and use in source and binary forms with or without modification are permitted provided that the following conditions are met redistributions of source code must retain the above copyright notice this list of conditions and the following disclaimer redistributions in binary form must reproduce the above copyright notice this list of conditions and the following disclaimer in the documentation and or other materials provided with the distribution this software is provided by the copyright holders and contributors as is and any express or implied warranties including but not limited to the implied warranties of merchantability and fitness for a particular purpose are disclaimed in no event shall the copyright holder or contributors be liable for any direct indirect incidental special exemplary or consequential damages including but not limited to procurement of substitute goods or services loss of use data or profits or business interruption however caused and on any theory of liability whether in contract strict liability or tort including negligence or otherwise arising in any way out of the use of this software even if advised of the possibility of such damage
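Returning to the motion detection approach described earlier: the sketch below is a rough, self-contained illustration of moving-average motion detection with an ignore mask, in the spirit of the sections above. It is not the project's videoloop code; the file names, resolution and thresholds are made-up values for the example.

```python
# Simplified sketch of moving-average motion detection with an ignore mask,
# illustrating the approach described above (not the project's own code).
import cv2
import numpy as np

cap = cv2.VideoCapture("example.mp4")                 # any video source works here
mask = cv2.imread("mask.png", cv2.IMREAD_GRAYSCALE)   # white = analyze, black = ignore
avg = None
MOTION_THRESHOLD = 25         # per-pixel difference threshold
MIN_CHANGED_FRACTION = 0.01   # fraction of pixels that must change to call it motion

while True:
    ok, frame = cap.read()
    if not ok:
        break
    # Work on a small grayscale copy: detection does not need full resolution.
    small = cv2.resize(frame, (320, 240))
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY).astype("float")
    if avg is None:
        avg = gray.copy()
        continue
    # Running (moving) average of the background.
    cv2.accumulateWeighted(gray, avg, 0.05)
    diff = cv2.absdiff(gray, avg)
    _, motion = cv2.threshold(diff.astype("uint8"), MOTION_THRESHOLD, 255, cv2.THRESH_BINARY)
    if mask is not None:
        # Zero out ignored regions (trees, balloons, ...) before counting pixels.
        motion = cv2.bitwise_and(motion, cv2.resize(mask, (320, 240)))
    changed = cv2.countNonZero(motion) / motion.size
    if changed > MIN_CHANGED_FRACTION:
        print("motion detected:", round(changed * 100, 2), "% of frame")

cap.release()
```

In the real project, the regions flagged here would also feed a detection plugin (HOG or Haar cascade) so that only the motion ROI, rather than the full frame, is analyzed further.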
ai
dascoin-blockchain
dascoin blockchain getting started getting started support support using the api using the api accessing restricted api s accessing restricted apis faq faq license license dascoin blockchain is the techsolutions ltd blockchain implementation and command line interface getting started build instructions and additional documentation are available in the wiki https github com techsolutions ltd dascoin blockchain wiki we recommend building on ubuntu 16 04 lts 64 bit build dependencies sudo apt get update sudo apt get install autoconf cmake make automake libtool git libboost all dev libssl dev g libcurl4 openssl dev build script git clone https github com techsolutions ltd dascoin blockchain git cd dascoin blockchain git checkout master may substitute master with current release tag git submodule update init recursive cmake dcmake build type relwithdebinfo make upgrade script prepend to the build script above if you built a prior release git remote set url origin https github com techsolutions ltd dascoin blockchain git git checkout master git remote set head origin auto git pull git submodule update init recursive this command may fail git submodule sync recursive git submodule update init recursive note bitshares requires a boost http www boost org version in the range 1 57 1 65 1 versions earlier than 1 57 or newer than 1 65 1 are not supported if your system s boost version is newer then you will need to manually build an older version of boost and specify it to cmake using dboost root note bitshares requires a 64 bit operating system to build and will not build on a 32 bit os note bitshares now supports ubuntu 18 04 lts note bitshares now supports openssl 1 1 0 after building the witness node can be launched with programs witness node witness node the node will automatically create a data directory including a config file it may take several hours to fully synchronize the blockchain after syncing you can exit the node using ctrl c and setup the command line wallet by editing witness node data dir config ini as follows rpc endpoint 127 0 0 1 8090 important by default the witness node will start in reduced memory ram mode by using some of the commands detailed in memory reduction for nodes https github com techsolutions ltd dascoin blockchain wiki memory reduction for nodes in order to run a full node with all the account history you need to remove partial operations and max ops per account from your config file please note that currently 2018 09 19 a full node will need more than 14gb of ram to operate and required memory is growing fast consider the following table before running a node default full minimal elasticsearch 16g ram 120g ram 4g ram 500g ssd hd 32g ram after starting the witness node again in a separate terminal you can run programs cli wallet cli wallet set your inital password set password password unlock password to import your initial balance import balance account name wif key true if you send private keys over this connection rpc endpoint should be bound to localhost for security use help to see all available wallet commands source definition and listing of all commands is available here https github com techsolutions ltd dascoin blockchain blob master libraries wallet include graphene wallet wallet hpp up to date online doxygen documentation can be found at doxygen https bitshares org doxygen hierarchy html using the api we provide several different api s each api has its own id when running witness node initially two api s are available api 0 provides read only access 
to the database while api 1 is used to login and gain access to additional restricted api s here is an example using wscat package from npm for websockets npm install g wscat wscat c ws 127 0 0 1 8090 id 1 method call params 0 get accounts 1 2 0 id 1 result id 1 2 0 annotations membership expiration date 1969 12 31t23 59 59 registrar 1 2 0 referrer 1 2 0 lifetime referrer 1 2 0 network fee percentage 2000 lifetime referrer fee percentage 8000 referrer rewards percentage 0 name committee account owner weight threshold 1 account auths key auths address auths active weight threshold 6 account auths 1 2 5 1 1 2 6 1 1 2 7 1 1 2 8 1 1 2 9 1 1 2 10 1 1 2 11 1 1 2 12 1 1 2 13 1 1 2 14 1 key auths address auths options memo key gph1111111111111111111111111111111114t1anm voting account 1 2 0 num witness 0 num committee 0 votes extensions statistics 2 7 0 whitelisting accounts blacklisting accounts we can do the same thing using an http client such as curl for api s which do not require login or other session state curl data jsonrpc 2 0 method call params 0 get accounts 1 2 0 id 1 http 127 0 0 1 8090 rpc id 1 result id 1 2 0 annotations membership expiration date 1969 12 31t23 59 59 registrar 1 2 0 referrer 1 2 0 lifetime referrer 1 2 0 network fee percentage 2000 lifetime referrer fee percentage 8000 referrer rewards percentage 0 name committee account owner weight threshold 1 account auths key auths address auths active weight threshold 6 account auths 1 2 5 1 1 2 6 1 1 2 7 1 1 2 8 1 1 2 9 1 1 2 10 1 1 2 11 1 1 2 12 1 1 2 13 1 1 2 14 1 key auths address auths options memo key gph1111111111111111111111111111111114t1anm voting account 1 2 0 num witness 0 num committee 0 votes extensions statistics 2 7 0 whitelisting accounts blacklisting accounts api 0 is accessible using regular json rpc curl data jsonrpc 2 0 method get accounts params 1 2 0 id 1 http 127 0 0 1 8090 rpc accessing restricted api s you can restrict api s to particular users by specifying an api access file in config ini or by using the api access full path to api access json startup node command here is an example api access file which allows user bytemaster with password supersecret to access four different api s while allowing any other user to access the three public api s necessary to use the wallet permission map bytemaster password hash b64 9e9gf7ooxvb9k4bosfniptelxegoz5drgoymj94elay password salt b64 inddm6ici 8 allowed apis database api network broadcast api history api network node api password hash b64 password salt b64 allowed apis database api network broadcast api history api passwords are stored in base64 as salted sha256 hashes a simple python script saltpass py is avaliable to obtain hash and salt values from a password a single asterisk may be specified as username or password hash to accept any value with the above configuration here is an example of how to call add node from the network node api id 1 method call params 1 login bytemaster supersecret id 2 method call params 1 network node id 3 method call params 2 add node 127 0 0 1 9090 note the call to network node is necessary to obtain the correct api identifier for the network api it is not guaranteed that the network api identifier will always be 2 since the network node api requires login it is only accessible over the websocket rpc our doxygen documentation contains the most up to date information about api s for the witness node https bitshares github io doxygen namespacegraphene 1 1app html and the wallet https bitshares github io doxygen classgraphene 1 
1wallet 1 1wallet api html if you want information which is not available from an api it might be available from the database https bitshares github io doxygen classgraphene 1 1chain 1 1database html it is fairly simple to write api methods to expose database methods faq is there a way to generate help with parameter names and method descriptions yes documentation of the code base including apis can be generated using doxygen simply run doxygen in this directory if both doxygen and perl are available in your build environment the cli wallet s help and gethelp commands will display help generated from the doxygen documentation if your cli wallet s help command displays descriptions without parameter names like signed transaction transfer string string string string string bool it means cmake was unable to find doxygen or perl during configuration if found the output should look like this signed transaction transfer string from string to string amount string asset symbol string memo bool broadcast is there a way to allow external program to drive cli wallet via websocket jsonrpc or http yes external programs may connect to the cli wallet and make its calls over a websockets api to do this run the wallet in server mode i e cli wallet s 127 0 0 1 9999 and then have the external program connect to it over the specified port in this example port 9999 is there a way to access methods which require login over http no login is inherently a stateful process logging in changes what the server will do for certain requests that s kind of the point of having it if you need to track state across http rpc calls you must maintain a session across multiple connections this is a famous source of security vulnerabilities for http applications additionally http is not really designed for server push notifications and we would have to figure out a way to queue notifications for a polling client websockets solves all these problems if you need to access graphene s stateful methods you need to use websockets what is the meaning of a b c numbers the first number specifies the space space 1 is for protocol objects 2 is for implementation objects protocol space objects can appear on the wire for example in the binary form of transactions implementation space objects cannot appear on the wire and solely exist for implementation purposes such as optimization or internal bookkeeping the second number specifies the type the type of the object determines what fields it has for a complete list of type id s see enum object type and enum impl object type in types hpp https github com bitshares bitshares 2 blob bitshares libraries chain include graphene chain protocol types hpp the third number specifies the instance the instance of the object is different for each individual object the answer to the previous question was really confusing can you make it clearer all account id s are of the form 1 2 x if you were the 9735th account to be registered your account s id will be 1 2 9735 account 0 is special it s the committee account which is controlled by the committee members and has a few abilities and restrictions other accounts do not all asset id s are of the form 1 3 x if you were the 29th asset to be registered your asset s id will be 1 3 29 asset 0 is special it s bts which is considered the core asset the first and second number together identify the kind of thing you re talking about 1 2 for accounts 1 3 for assets the third number identifies the particular thing how do i get the network add nodes command to work why 
is it so complicated you need to follow the instructions in the accessing restricted api s section to allow a username password access to the network node api then you need to pass the username password to the cli wallet on the command line or in a config file it s set up this way so that the default configuration is secure even if the rpc port is publicly accessible it s fine if your witness node allows the general public to query the database or broadcast transactions in fact this is how the hosted web ui works it s less fine if your witness node allows the general public to control which p2p nodes it s connecting to therefore the api to add p2p connections needs to be set up with proper access controls license dascoin blockchain is under the mit license see license https github com techsolutions ltd dascoin blockchain blob master license txt for more information
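To complement the wscat and curl examples in the using the api section above, here is a hedged Python sketch of the same get_accounts call over the websocket RPC. It assumes the third-party websockets package and a node listening on 127.0.0.1:8090, and it is not part of this repository; the parameter nesting follows the wscat example shown earlier.

```python
# Minimal sketch of calling the public database API (API id 0) over websockets.
# Assumes `pip install websockets` and a witness_node with rpc-endpoint 127.0.0.1:8090.
import asyncio
import json
import websockets

async def get_account(account_id="1.2.0"):
    async with websockets.connect("ws://127.0.0.1:8090") as ws:
        request = {
            "id": 1,
            "method": "call",
            # API 0 (database, read-only), the method name, then its parameters.
            "params": [0, "get_accounts", [[account_id]]],
        }
        await ws.send(json.dumps(request))
        reply = json.loads(await ws.recv())
        return reply["result"]

accounts = asyncio.run(get_account())
print(accounts[0]["name"])  # the committee account for id 1.2.0
```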
blockchain
ESP01-RTOS
esp 01

This project contains material resulting from my experimentation with the ESP-01 board, based on the ESP8266. My main goal with these experiments is to get acquainted with the Espressif RTOS SDK.

The repository is structured as follows:
- code folder: source code
- doc folder: all documentation (reference design, installation, etc.); the main entry point for the documentation is design md (doc design md)
- hw folder: files relating to the hardware side (schematics, reference documentation, etc.)
- wiki: information about builds, delivery, etc.

The development machine I use for this project is a Mac running OS X El Capitan (10.11.1). Please bear in mind that I am not experienced in adapting Eclipse to an embedded toolchain; consequently, the configurations I present in this project may be far from optimized. My current goal is not to set up the perfect development environment, but rather to get a working environment that I can use to develop code right now.
os
Awesome-DAG-Blockchain
awesome dag blockchain

A curated list of resources on DAG-based (directed acyclic graph) blockchains, in the style of the awesome lists (https github com sindresorhus awesome). The Chinese version is readme md and the English version is readme en md; stars, forks, issues and pull requests are welcome at https github com guantau awesome dag blockchain.

Background
- directed acyclic graph (wikipedia): https en wikipedia org wiki directed acyclic graph
- DAGs in version control (git): http ericsink com vcbe html directed acyclic graphs html
- dag, a generalized blockchain (nxt forum, 2014): https nxtforum org proof of stake algorithm dag a generalized blockchain

Papers
- dagcoin (sergio demian lerner, 2015): https bitslog files wordpress com 2015 09 dagcoin v41 pdf
- accelerating bitcoin's transaction processing: fast money grows on trees, not chains (2013): https eprint iacr org 2013 881 pdf
- secure high-rate transaction processing in bitcoin (the GHOST protocol, greedy heaviest observed sub-tree, referenced by ethereum): http www avivz net pubs 15 btc ghost pdf
- inclusive block chain protocols (2015): http www cs huji ac il avivz pubs 15 inclusive btc pdf
- spectre: serialization of proof-of-work events, confirming transactions via recursive elections: https eprint iacr org 2016 1159 pdf
- phantom: a scalable blockdag protocol: https eprint iacr org 2018 104 pdf (study notes by dagfans: https github com dagfans transtudy blob master papers phantom 20 20a 20scalable 20blockdag 20protocol md)

Projects
- iota: https iota org, whitepaper https iota org iota whitepaper pdf, code https github com iotaledger
- byteball: http byteball org, whitepaper https byteball org byteball pdf, code https github com byteball byteball
  - introduction to byteball, part 1 (why): https medium com suirelav introduction to byteball part 1 why ab3ff6a7a8f2
  - introduction to byteball, part 2 (the dag): https medium com suirelav introduction to byteball part 2 the dag ce84ca4c4e01
  - introduction to byteball, part 3 (smart contracts): https medium com suirelav introduction to byteball part 3 smart contracts 81efa010a0b3
  - introduction to byteball, part 4 (adoption): https medium com suirelav introduction to byteball part 4 adoption ff37d87615c9
  - byteball series (chinese), parts 1 to 5: http blog guantau com 2017 12 14 byteball1, http blog guantau com 2017 12 19 byteball2, http blog guantau com 2018 01 19 byteball3, http blog guantau com 2018 01 26 byteball4, http blog guantau com 2018 01 30 byteball5
  - byteball main net under stress test: https blog goodaudience com byteball main net under stress test c131ba85b72b
- nano (formerly raiblocks, 2014 beta): https nano org, whitepaper https nano org zh whitepaper, code https github com nanocurrency raiblocks
  - stress testing the raiblocks network: https hackernoon com stress testing the raiblocks network 568be62fdf6d
  - stress testing the raiblocks network, part ii: https medium com bnp117 stress testing the raiblocks network part ii def83653b21f
- dagcoin (byteball-derived): https dagcoin org, whitepaper https dagcoin org whitepaper pdf, code https github com dagcoin
- trustnote (byteball-derived): https trustnote org, whitepaper https trustnote org trustnote whitepaper pdf, code https github com trustnote
- nerthus: http nerthus io, whitepaper http nerthus io static downfile nerthuswhitepagev0 0 2 pdf
- askcoin: https blog askcoin org askcoin in one page f284bb3d9b42
- iot chain (pbft + dag): https iotchain io, whitepaper https iotchain io whitepaper itcwhitepaper pdf, code https github com iotchaincode
- xdag: announcement https bitcointalk org index php topic 2552368 0, site http xdag io, docs https docs google com document d 1runptvghy0xsb8goa 8syg58ghu ibegnwji2h0ttna edit
- hashgraph: https hashgraph com, whitepaper https www swirlds com downloads swirlds tr 2016 01 pdf, python implementation https github com lapin0t py swirld, node.js implementation https github com thecallsign hashgraph, go implementation https github com babbleio babble
- daglabs (spectre/phantom): https www daglabs com

Talks and comparisons
- get rid of blocks (braids): http slides com davidvorick braids
- iota vs raiblocks: https hackernoon com iota vs raiblocks 413679bb4c3e
- dag coin comparison (byteball, iota, raiblocks, etc.): https web archive org web 20171211100146 https www reddit com r cryptocurrency comments 7iv20r dag coin comparison byteball iota raiblocks etc
- byteball vs iota, battle of two dag cryptocurrencies: https steemit com cryptocurrency jimmco byteball vs iota battle of two dag cryptocurrencies

People
sergio demian lerner, yonatan sompolinsky, yoad lewenberg, aviv zohar, serguei popov, sergey ivancheglo, anton churyumov, leemon baird
blockchain
front-end-do-zero
front end do zero :notebook:

An accessible study guide for becoming a front-end developer. :computer: This is a work-in-progress guide based on my experience learning front-end development on my own and with the support of the web developer guide (guia do desenvolvedor web): https github com hideraldus13 roadmap do desenvolvedor web

Starting from the beginning :smile:
First of all, I believe it is important to really understand how the internet works, and for that this video by Nexo makes everything very clear: https www youtube com watch v hbrdmaxkb8q

Important articles before starting the technical content
- what are front end and back end: https www programaria org o que e front end e back end (programaria)
- taking the leading role in our own learning: https woliveiras com br posts assumindo o papel de protagonista em nossa aprendizagem (william oliveira)
- advice I wish I had heard when I moved into tech: https medium com carolcode conselhos que eu gostaria de ter ouvido quando mudei para tecnologia c75664da2568 (by me)
- what is the HTTP protocol for and when do I use it: http gabsferreira com pra que serve o protocolo http quando eu uso ele (gabs ferreira)

Important videos
- how to study better: https www youtube com watch v is6c9ksgcbk list wl index 13 t 363s (alura)
- productivity for studying: https www youtube com watch v uthmksxlhsi list wl index 17 t 0s (alura)

Computing foundations
- how web browsers work: https www youtube com watch v kdy62zachze list wl index 5 t 18s (alura)
- what is this thing called a compiler: https www youtube com watch v y1m9yohgkbc list wl index 33 t 0s (computação sem caô)

HTML & CSS
The truth is that no building stands without good foundations, and that is why, to build something really good for the web, we need to start from a good beginning. A lot of people out there look down on these markup languages (you will understand the difference soon), but by learning and practicing you can build amazing things, like this image https codepen io ivorjetski pen xmjoyo, made with CSS alone.
- article: markup language vs programming language: http lpsychomamba blogspot com 2013 03 ppsi 1 post 2 html
- free course: HTML 5: https www youtube com watch v epdcjkskmok list plhz arehm4dlanj jjtv29rfxnphduk9o (curso em vídeo)
- free course: CSS 3: https www youtube com watch v frhm6smotfg list plwxqlz3fdtvgf7gutioflc 9axo25iizg (node studio treinamentos)
- paid course (affordable): basic front end, from zero to a page with HTML and CSS: https www udemy com frontend basico do zero a uma pagina com html e css (f bernardo)
- free course: HTML & CSS: https www udemy com girls4tech (ebanx)

Git & GitHub :octocat:
You have probably heard about repositories, open source, Git, GitHub... Think of GitHub as your code portfolio, and to interact with it you use Git. In the links below you will learn more about it.
- video: what is GitHub: https www youtube com watch v zdo f3zibfa t 263s (bruno germano)
- video: differences between Git, GitHub and GitLab: https www youtube com watch v i9dujcn mu (alura)
- article: a plan for studying Git and GitHub while learning to program: https medium com trainingcenter plano para estudar git e github enquanto aprende programa c3 a7 c3 a3o f5d5f986f459 (training center)
- video: Git and GitHub for beginners: https www youtube com watch v umhsklxjuq4 t 344s (loiane groner)
- course: Git and GitHub for beginners: https www udemy com git e github para iniciantes (william justen)
- article: introduction to the terminal: https woliveiras com br posts introdu c3 a7 c3 a3o ao terminal (william oliveira)
- video: how to use Git and GitHub in practice: https www youtube com watch v 2alg7mq6 si list wl index 11 t 0s (rocketseat)

JavaScript
- article: 10 tips to become a JavaScript ninja: https medium com womakerscode 10 dicas para se tornar ninja em javascript 31a963ad17a1 (glaucia lemos)
- course: JavaScript: https rocketseat com br starter (rocketseat)
- video: what are REST and RESTful APIs: https www youtube com watch v ghtrp1x 1as list wl index 7 t 0s (rocketseat)
front_end
IoTClient
h1 align center iotclient h1 english readme zh cn md image https img shields io nuget v iotclient svg https www nuget org packages iotclient image https img shields io nuget dt iotclient svg https www nuget org packages iotclient image https img shields io github license alienwow snowleopard svg this is an iot device communication protocol realization client which will include mainstream plc communication reading modbus protocol bacnet protocol and other common industrial communication protocols this component is based on net standard 2 0 and can be used for cross platform development of net such as windows linux and even run on raspberry pi this component is open source and free for life and adopts the most relaxed mit protocol you can also modify and use it for commercial use commercial use please evaluate and test development tools visual studio 2019 qq exchange group 700324594 https jq qq com wv 1027 k tirmmgbt document directory toc instructions for use instructions for use reference component reference component modbustcp read and write operations modbustcp read and write operations modbusrtu read and write operations modbusrtu read and write operations modbusascii read and write operations modbusascii read and write operations modbusrtuovertcp read and write operations modbusrtuovertcp read and write operations siemensclient siemens read and write operations siemensclient siemens read and write operations note about siemens plc address note about siemens plc address siemensclient best practices siemensclient best practices mitsubishiclient mitsubishi read and write operations mitsubishiclient mitsubishi read and write operations omronfinsclient omron read and write operations omronfinsclient omron read and write operations allenbradleyclient read and write operations allenbradleyclient read and write operations some projects based on iotclient library some projects based on iotclient library iotclient tool desktop program tool open source iotclient tool desktop program tool open source energy management system commercial energy management system commercial e8 83 bd e6 ba 90 e7 ae a1 e7 90 86 e7 8e b0 e5 9c ba e5 8d 95 e9 a1 b9 e7 9b ae e8 83 bd e6 ba 90 e7 ae a1 e7 90 86 e4 ba 91 e7 ab af e5 a4 9a e9 a1 b9 e7 9b ae e8 83 bd e6 ba 90 e7 ae a1 e7 90 86 e7 a7 bb e5 8a a8 e7 ab af haidilao terminal control commercial haidilao terminal control commercial web e6 b5 b7 e5 ba 95 e6 8d 9e e6 9c ab e7 ab af e6 8e a7 e5 88 b6 web e6 b5 b7 e5 ba 95 e6 8d 9e e6 9c ab e7 ab af e6 8e a7 e5 88 b6 e7 a7 bb e5 8a a8 e7 ab af toc instructions for use reference component nuget installation https www nuget org packages iotclient install package iotclient or graphical installation image https user images githubusercontent com 5820324 68722366 2fc5bf00 05f0 11ea 8282 f2b0a58a9f9d png modbustcp read and write operations 1 instantiate the client enter the correct ip and port modbustcpclient client new modbustcpclient 127 0 0 1 502 2 write operation parameters are address value station number function code client write 4 short 33 2 16 2 1 note when writing data you need to clarify the data type client write 0 short 33 2 16 write short type value client write 4 ushort 33 2 16 write ushort type value client write 8 int 33 2 16 write int type value client write 12 uint 33 2 16 write uint type value client write 16 long 33 2 16 write long type value client write 20 ulong 33 2 16 write ulong type value client write 24 float 33 2 16 write float type value client write 28 double 33 2 16 write double type value 
client write 32 true 2 5 write coil type value client write 100 ordercode stationnumber write string 3 read operation the parameters are address station number function code var value client readint16 4 2 3 value 3 1 other types of data reading client readint16 0 stationnumber 3 short type data read client readuint16 4 stationnumber 3 ushort type data read client readint32 8 stationnumber 3 int type data read client readuint32 12 stationnumber 3 uint type data read client readint64 16 stationnumber 3 long type data read client readuint64 20 stationnumber 3 ulong type data read client readfloat 24 stationnumber 3 float type data read client readdouble 28 stationnumber 3 double type data read client readcoil 32 stationnumber 1 coil type data read client readdiscrete 32 stationnumber 2 discrete type data read client readstring 100 stationnumber readlength 10 read string 4 if there is no active open it will automatically open and close the connection every time you read and write operations which will greatly reduce the efficiency of reading and writing so it is recommended to open and close manually client open 5 read and write operations will return the operation result object result var result client readint16 4 2 3 5 1 whether the reading is successful true or false var issucceed result issucceed 5 2 exception information for failed reading var errmsg result err 5 3 read the request message actually sent by the operation var requst result requst 5 4 read the response message from the server var response result response 5 5 read value var value3 result value 6 batch read var list new list modbusinput list add new modbusinput address 2 datatype datatypeenum int16 functioncode 3 stationnumber 1 list add new modbusinput address 2 datatype datatypeenum int16 functioncode 4 stationnumber 1 list add new modbusinput address 199 datatype datatypeenum int16 functioncode 3 stationnumber 1 var result client batchread list 7 other parameters of the constructor ip port timeout time big and small end settings modbustcpclient client new modbustcpclient 127 0 0 1 502 1500 endianformat abcd for more usage of modbustcp please refer to unit test https github com zhaopeiym iotclient blob master iotclient tests modbus tests modbustcpclient tests cs modbusrtu read and write operations instantiate the client com port name baud rate data bits stop bits parity modbusrtuclient client new modbusrtuclient com3 9600 8 stopbits one parity none other read and write operations are the same as modbustcpclient s read and write operations modbusascii read and write operations instantiate the client com port name baud rate data bits stop bits parity modbusasciiclient client new modbusasciiclient com3 9600 8 stopbits one parity none other read and write operations are the same as modbustcpclient s read and write operations modbusrtuovertcp read and write operations serial port transparent transmission i e send rtu format messages in tcp mode instantiate the client ip port timeout big and small end settings modbusrtuovertcpclient client new modbusrtuovertcpclient 127 0 0 1 502 1500 endianformat abcd other read and write operations are the same as modbustcpclient s read and write operations siemensclient siemens read and write operations 1 instantiate the client enter the model ip and port other models siemensversion s7 200 siemensversion s7 300 siemensversion s7 400 siemensversion s7 1200 siemensversion s7 1500 siemensclient client new siemensclient siemensversion s7 200smart 127 0 0 1 102 2 write operation client write q1 3 
true client write v2205 short 11 client write v2209 33 3 read operation var value1 client readboolean q1 3 value var value2 client readint16 v2205 value var value3 client readint32 v2209 value 4 if there is no active open it will automatically open and close the connection every time you read and write operations which will greatly reduce the efficiency of reading and writing so it is recommended to open and close manually client open 5 read and write operations will return the operation result object result var result client readint16 v2205 5 1 whether the reading is successful true or false var issucceed result issucceed 5 2 exception information for failed reading var errmsg result err 5 3 read the request message actually sent by the operation var requst result requst 5 4 read the response message from the server var response result response 5 5 read value var value4 result value note about siemens plc address vb263 vw263 vd263 b w d byte 8 word 16 doubleword 32 when this component passes in the address there is no need to carry the data type just use the corresponding method to read the corresponding type such as vb263 client readbyte v263 vd263 client readfloat v263 vd263 client readint32 v263 db108 dbw4 client readuint16 db108 4 db1 dbx0 0 client readboolean db1 0 0 db1 dbd0 client readfloat db1 0 c data type smart200 1200 1500 300 bit v1 0 db1 dbx1 0 byte vb1 db1 dbb1 shor br ushort vw2 db1 dbw2 int br uint br float vd4 db1 dbd4 siemensclient best practices 1 when not to take the initiative to open siemens plc generally allows up to 8 long connections so when the number of connections is not enough or when doing testing do not take the initiative to open so that the component will automatically open and close immediately 2 when to take the initiative to open when the number of long connections is enough and you want to improve the read and write performance 3 in addition to active open connections batch read and write can also greatly improve read and write performance batch read dictionary string datatypeenum addresses new dictionary string datatypeenum addresses add db4 24 datatypeenum float addresses add db1 434 0 datatypeenum bool addresses add v4109 datatypeenum byte var result client batchread addresses batch write dictionary string object addresses new dictionary string object addresses add db4 24 float 1 addresses add db4 0 float 2 addresses add db1 434 0 true var result client batchwrite addresses 4 note when writing data you need to clarify the data type client write db4 12 9 what is written is of type int client write db4 12 float 9 what is written is a float type 5 siemensclient is a thread safe class due to limited long plc connections siemensclient is designed as a thread safe class you can set siemensclient as a singleton and use the instance of siemensclient to read and write plc between multiple threads mitsubishiclient mitsubishi read and write operations 1 instantiate the client enter the correct ip and port mitsubishiclient client new mitsubishiclient mitsubishiversion qna 3e 127 0 0 1 6000 2 write operation client write m100 true client write d200 short 11 client write d210 33 3 read operation var value1 client readboolean m100 value var value2 client readint16 d200 value var value3 client readint32 d210 value 4 if there is no active open it will automatically open and close the connection every time you read and write operations which will greatly reduce the efficiency of reading and writing so it is recommended to open and close manually client open 5 read 
and write operations will return the operation result object result:
5.1 whether the reading is successful (true or false): var issucceed result issucceed
5.2 exception information for failed reading: var errmsg result err
5.3 read the request message actually sent by the operation: var requst result requst
5.4 read the response message from the server: var response result response
5.5 read value: var value4 result value

omronfinsclient omron read and write operations
1. instantiate the client, enter the correct ip and port: omronfinsclient client new omronfinsclient 127 0 0 1 6000
2. write operation: client write m100 true; client write d200 short 11; client write d210 33
3. read operation: var value1 client readboolean m100 value; var value2 client readint16 d200 value; var value3 client readint32 d210 value
4. if there is no active open, it will automatically open and close the connection every time you read and write, which will greatly reduce the efficiency of reading and writing, so it is recommended to open and close manually: client open
5. read and write operations will return the operation result object result:
5.1 whether the reading is successful (true or false): var issucceed result issucceed
5.2 exception information for failed reading: var errmsg result err
5.3 read the request message actually sent by the operation: var requst result requst
5.4 read the response message from the server: var response result response
5.5 read value: var value4 result value

allenbradleyclient read and write operations
1. instantiate the client, enter the correct ip and port: allenbradleyclient client new allenbradleyclient 127 0 0 1 44818
2. write operation: client write a1 short 11
3. read operation: var value client readint16 a1 value
4. if there is no active open, it will automatically open and close the connection every time you read and write, which will greatly reduce the efficiency of reading and writing, so it is recommended to open and close manually: client open
5. read and write operations will return the operation result object result:
5.1 whether the reading is successful (true or false): var issucceed result issucceed
5.2 exception information for failed reading: var errmsg result err
5.3 read the request message actually sent by the operation: var requst result requst
5.4 read the response message from the server: var response result response
5.5 read value: var value4 result value

some projects based on iotclient library

iotclient tool (desktop program tool, open source)
- download: https github com zhaopeiym iotclient examples releases download 1 0 3 iotclient exe
- source: https github com zhaopeiym iotclient examples
- (screenshots of the tool)

energy management system (commercial)
- (screenshots of the on-site, cloud and mobile versions)

haidilao terminal control (commercial)
- web and mobile (screenshots)
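IoTClient itself is a .NET library, but the ModbusTcp reads shown earlier in this readme are easy to illustrate at the wire level. The sketch below hand-builds a read-holding-registers request (function code 3, address 4, station 2) over a plain TCP socket, purely as a protocol illustration and not as part of this project; the host, port and register values are assumptions.

```python
# Wire-level illustration of the Modbus TCP read used above (read holding
# registers, function code 3): MBAP header + PDU over a plain TCP socket.
# Protocol sketch only; no error/exception-response handling.
import socket
import struct

def read_holding_register(host="127.0.0.1", port=502, station=2, address=4):
    # MBAP: transaction id, protocol id (0), remaining length, unit (station) id.
    mbap = struct.pack(">HHHB", 1, 0, 6, station)
    # PDU: function code 3, starting address, number of registers to read.
    pdu = struct.pack(">BHH", 3, address, 1)
    with socket.create_connection((host, port), timeout=1.5) as sock:
        sock.sendall(mbap + pdu)
        reply = sock.recv(256)
    # Reply layout: 7-byte MBAP, function code, byte count, then register data.
    byte_count = reply[8]
    (value,) = struct.unpack(">h", reply[9:9 + byte_count])
    return value  # signed 16-bit, i.e. the "short" read in the examples above

if __name__ == "__main__":
    print(read_holding_register())
```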
iot iotclient tcp socket modbustcp modbusrtu modbusascii plc bacnet siemens siemens-s7 siemens-plc modbus plc-modbus-bacnet mitsubishi mitsubishi-plc omron-plc omronfins omron
server
nrlpk
nrlpk natural russian language processing by the keys
[flattened results table: numbered entries n1 through n53 (with sub-entries such as n4 1 and n48 1), each giving an analyzed article url from vpk name, habr com or vc together with its per-article numeric text metrics and processing dates, spanning 15 07 2019 to 26 05 2020]
ai
Data-Engineering-Database-Dev-Administer-and-Monitor-Couchbase
data engineering database dev administer and monitor couchbase
server
rtos-benchmark
readme the benchmark project contains a set of tests aimed at measuring the performance of certain os operations it currently supports both the qemu x86 and frdm k64f boards on zephyr and only the frdm k64f board on freertos additional boards and rtoses are expected to be added in the future it is recognized that running a benchmark test suite on qemu is not generally recommended and any results from that should be taken with a grain of salt that being said the primary reason it has been added has been to act as a blueprint for integrating additional boards and architectures setting up common install the gnu arm compiler for the frdm k64f board gnu arm embedded toolchain https developer arm com tools and software open source software developer tools gnu toolchain gnu rm and set armgcc dir to the installation directory zephyr specifics refer to the zephyr getting started guide https docs zephyrproject org latest getting started index html for installing and setting up the zephyr environment this must be done so that a freestanding application can be built as indicated under common be sure to set the armgcc dir environment variable appropriately to enable use of the 3rd party gnu arm embedded toolchain set the following environment variables export zephyr toolchain variant gnuarmemb export gnuarmemb toolchain path path to installed toolchain freertos specifics build and download the frdm k64f sdk from the nxp mcuxpresso sdk builder https mcuxpresso nxp com en welcome select linux as your host os and gcc arm embedded as your toolchain be sure to also check the freertos checkbox to build freertos awareness and examples into the sdk mkdir freertos sdk version frdm k64f unzip downloads sdk version frdm k64f zip d sdk install dir sdk version frdm k64f install pyocd https github com pyocd pyocd tool used to flash frdm k64f vxworks specifics install vxworks version later than 24 03 including building and flashing zephyr on frdm k64f cmake gninja drtos zephyr dboard frdm k64f s b build ninja c build ninja c build flash zephyr on other boards similar steps apply when building for a different board in the cmake command specify the name of the board after dboard for example cmake gninja drtos zephyr dboard qemu x86 s b build remember that the zephyr base environment variable must be set so that the zephyr west tool can be found freertos on frdm k64f cmake gninja drtos freertos dboard frdm k64f dmcux sdk path sdk install dir sdk version frdm k64f s b build ninja c build ninja c build flash vxworks vxworks supports running the rtos benchmark with either posix or non posix interfaces in user space on different boards for example with the bsp nxp s32g274 create and build vsb vxprj vsb create force s bsp nxp s32g274 vsb nxp s32g274 cd vsb nxp s32g274 vxprj vsb config s add wrs config rtos benchmark y make create and build vip the following are the steps for posix interfaces to test non posix interfaces just replace include rtos benchmark posix with include rtos benchmark nonposix and replace rtos benchmark posix vxe with rtos benchmark non posix vxe vxprj create vsb path to vsb nxp s32g274 vip nxp s32g274 cd vip nxp s32g274 vxprj component add include rtos benchmark posix mkdir romfs cp path to vsb nxp s32g274 usr root llvm bin rtos benchmark posix vxe romfs vxprj parameter set rtos benchmark options 1 vxprj component remove include network vxprj build load the image on the target rtos benchmark will automatically run on bootup connecting connect the frdm k64f to your host via usb in another
terminal open a serial terminal such as screen minicom etc to see the output minicom d dev ttyacm0 debugging debugging on both zephyr and freertos is quite similar cmake rtos debug options ninja c build debugserver in another terminal start gdb armgcc dir bin arm none eabi gdb build freertos elf gdb target remote 3333 cmake zephyr debug options nothing extra just the standard cmake instruction cmake gninja drtos zephyr dboard frdm k64f s b build cmake freertos debug options add dcmake build type debug to the build options cmake gninja dcmake build type debug drtos freertos dboard frdm k64f dmcux sdk path sdk version frdm k64f s b build
os
UNetwork
this is the unetwork alpha network build status https travis ci org u network unetwork svg branch master https travis ci org u network unetwork unetwork unetwork is a decentralized distributed network protocol based on blockchain technology and is implemented in golang through a peer to peer network unetwork can be used to digitize assets and provide financial services including asset registration issuance transfer etc highlight features scalable lightweight universal smart contract crosschain interactive protocol quantum resistant cryptography optional module china national crypto standard optional module high optimization of tps distributed storage and file sharding solutions based on ipfs p2p link layer encryption node access control multiple consensus algorithm support dbft rbft sbft configurable block generation time configurable digital currency incentive configurable sharding consensus in progress building the requirements to build unetwork are go version 1 8 or later glide a third party package management tool properly configured go environment deployment to run unetwork successfully at least 4 nodes are required the four nodes can be deployed in the following two ways multi hosts deployment single host deployment contributing can i contribute patches to the unetwork project yes please open a pull request with signed off commits we appreciate your help you can also send your patches as emails to the developer mailing list please join the unetwork mailing list or forum and talk to us about it either way if you don t sign off your patches we will not accept them this means adding a line that says signed off by name email at the end of each commit indicating that you wrote the code and have the right to pass it on as an open source patch also please write good git commit messages a good commit message looks like this header line explain the commit in one line use the imperative the body of the commit message is a few lines of text explaining things in more detail possibly giving some background about the issue being fixed etc etc the body of the commit message can be several paragraphs and please do proper word wrap and keep columns shorter than about 74 characters or so that way git log will show things nicely even when it s indented make sure you explain your solution and why you re doing what you re doing as opposed to describing what you re doing reviewers and your future self can read the patch but might not understand why a particular solution was implemented reported by whoever reported it signed off by your name youremail yourhost com community site http u network license unetwork is licensed under the apache license version 2 0 see license for the full license text
blockchain golang prediction-markets consensus-algorithm unetwork
blockchain
Gardening-Store_Backend
gardening store backend my mini project for app development course backend developed in springtool suite project overview an online store for organic veggies and fruits is a web application that allows customers to purchase fresh organic produce online it provides a platform for farmers suppliers or grocery stores specializing in organic products to showcase and sell their produce to consumers key features of online store for gardening typically includes 1 product catalog the web application features a comprehensive catalog of gardening products categorized into various sections such as tools equipment plants seeds fertilizers pesticides garden decor and more each product listing includes detailed information images pricing and customer reviews 2 search and filtering users can search for specific gardening products or use filtering options to narrow down their search based on categories brands prices ratings and other criteria this helps users find the desired products more efficiently 3 product details each product listing provides detailed information about the gardening items including specifications features usage instructions and customer reviews users can access this information to make informed purchase decisions 4 shopping cart and checkout users can add gardening products to their shopping cart as they browse through the catalog the web application allows users to review their cart adjust quantities and proceed to the secure checkout process users provide shipping details select a payment method and confirm the order 5 payment integration the web application integrates with secure payment gateways to facilitate smooth and secure online transactions customers can use various payment methods such as credit cards debit cards or digital wallets to complete their purchases 6 order tracking after completing a purchase users can track the status of their orders the web application provides order tracking functionality enabling users to monitor the progress of their shipments and receive updates on delivery dates 7 customer accounts customers can create accounts on the web application to save their delivery addresses track order history and manage preferences this feature provides a personalized experience and facilitates easy reordering in the future 8 reviews and ratings customers can leave reviews and ratings for products they purchase this feedback helps other customers make informed decisions and provides valuable insights to the store about the quality of their products 9 support and customer service the web application provides customer support and assistance to address any inquiries or concerns this can include faqs chat support or contact forms to ensure a positive shopping experience
backend java rest-api spring-boot springtoolsuite
server
finance_ml
finance ml python implementations of machine learning helper functions for quantitative finance based on the books advances in financial machine learning https www amazon co jp advances financial machine learning english ebook dp b079kldw21 and machine learning for asset managers https www amazon com machine learning managers elements quantitative dp 1108792898 written by marcos lopez de prado installation execute the following command python python setup py install or simply add your path to finance ml to your pythonpath implementation the following functions are implemented labeling multiprocessing sampling feature selection asset allocation breakout detection examples some example notebooks are found under the folder mlassetmanagers multiprocessing parallel computing using the multiprocessing library here is an example of applying a function to each element with parallelization python import pandas as pd import numpy as np def apply func x return x 2 def func df timestamps f df df loc timestamps for idx x in df items df loc idx f x return df df pd series np random randn 10000 from finance ml multiprocessing import mp pandas obj results mp pandas obj func pd obj timestamps df index num threads 24 df df f apply func print results head output 0 0 449278 1 1 411846 2 0 157630 3 4 949410 4 0 601459 for more detail please refer to the example notebook
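The inline snippet above has been flattened by text extraction; the sketch below restates it in runnable form, following the mp_pandas_obj call shown in the readme (func, pd_obj=(argument name, index to split), num_threads, plus extra keyword arguments forwarded to func). The exact keyword interface should be checked against the installed package.

```python
import numpy as np
import pandas as pd

from finance_ml.multiprocessing import mp_pandas_obj


def apply_func(x):
    # toy per-element transformation
    return x ** 2


def func(df, timestamps, f):
    # each worker processes only the index slice ("molecule") handed to it
    df = df.loc[timestamps]
    for idx, x in df.items():
        df.loc[idx] = f(x)
    return df


if __name__ == "__main__":  # guard recommended when spawning worker processes
    df = pd.Series(np.random.randn(10000))
    results = mp_pandas_obj(func, pd_obj=("timestamps", df.index),
                            num_threads=24, df=df, f=apply_func)
    print(results.head())
```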
ai
Line-Following-Encoders
line following encoders embedded systems car code for ecgr 2252 ece design i line following algorithm using encoders
os
ESP32_SH1106_I2C_RTOS
esp32 sh1106 i2c rtos a simple code example for esp32 sh1106 i2c rtos
os
Java-Machine-Learning-for-Computer-Vision
java machine learning for computer vision video this is the code repository for java machine learning for computer vision video https www packtpub com big data and business intelligence java machine learning computer vision video utm source github utm medium repository utm campaign 9781789130652 published by packt https www packtpub com utm source github it contains all the supporting project files necessary to work through the video course from start to finish about the video course the goal of this course is to walk you through the process of efficiently training deep neural networks for computer vision using the most modern techniques the course is designed to get you through the process of becoming familiar with deep neural networks in order to be able to train them efficiently customize existing state of the art architectures build real world java applications and get great results in a short time you will go through building real world computer vision applications ranging from simple java handwritten digit recognition to real time java autonomous car driving systems and face recognition by the end of the course you will discover the best practices and most modern techniques to build advanced computer vision java applications and get production grade accuracy what you will learn discover how neural networks work and understand the limitations and challenges developers face nowadays best practice methods and parameters and how to build deep neural networks hands on real java applications for image classification real time video object detection face recognition and art generation explore some of the most used machine learning java frameworks utilize your newly acquired machine learning skills to help you delve into the world of data science instructions and navigation assumed knowledge to fully benefit from the coverage included in this course you will need machine learning beginner level intermediate java and programming knowledge no prior heavy math knowledge familiarity with git and github for source control familiarity with the maven build tool and a java ide technical requirements this course has the following software requirements java 1 8 maven 3 x intellij or eclipse windows 10 good hardware ram 8gb cpu core i7 this course has been tested on the following system configuration os windows 10 processor intel core i7 java 8 memory 8 12 16gb hard disk space 100gb video card simple related products hands on scikit learn for machine learning video https www packtpub com big data and business intelligence hands scikit learn machine learning video utm source github utm medium repository utm campaign 9781789137132 machine learning and tensorflow the google cloud approach video https www packtpub com application development machine learning and tensorflow google cloud approach video utm source github utm medium repository utm campaign 9781789614398 java se 8 programmer 2 part ii integrated course https www packtpub com application development java se 8 programmer 2 part ii integrated course utm source github utm medium repository utm campaign 9781788297530
ai
ZocSec.SecurityAsCode.GitHub
zocsec securityascode github welcome to the zocdoc information security team zocsec securityascode repository for github we use aws s built in technologies to automate the remediation of common security problems in this repository zocsec presents the code and configuration used to lock down our github environment project list these are the projects we re currently ready to share github inventory tool this github python script collects all repositories private and public from an authenticated github account github automated security an automated means to secure private github repositories from unintentionally becoming public and to enable scanning for vulnerable dependencies github enable vuln scan a simple python script that enables scanning for vulnerable dependencies on all repos under any github organization we will be sharing more of our projects in the future contributions we welcome contributions and pull requests to this repo give us feedback the primary contributors to this effort are jay ball veggiespam https github com veggiespam and gary tsai garymalaysia https github com garymalaysia this project was released to the public as part of zocdoc s zocsec securityascode initiative copyright 2018 2019 zocdoc inc www zocdoc com
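The inventory script itself lives in the github inventory tool project; the sketch below is only an illustration of the same idea (enumerating every private and public repository visible to an authenticated account) and is not the repository's actual code. It assumes the third-party PyGithub package and a personal access token supplied via a GITHUB_TOKEN environment variable.

```python
# Illustrative sketch only -- not the zocsec inventory script itself.
# Assumes PyGithub (pip install PyGithub) and a token in GITHUB_TOKEN.
import os

from github import Github


def list_all_repos():
    gh = Github(os.environ["GITHUB_TOKEN"])
    # get_repos() on the authenticated user returns private and public repos
    for repo in gh.get_user().get_repos():
        print(f"{repo.full_name}\tprivate={repo.private}")


if __name__ == "__main__":
    list_all_repos()
```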
open-source
server
embedded-systems-design
embedded systems design this repository contains the source files for all the code developed in the articles in my embedded systems series blog url https asogwa001 hashnode dev
os
Cognitive-Vision-DotNetCore
vision docs images vision png vision api net core client library sample branch build status develop build status https travis ci org microsoft cognitive vision dotnetcore svg branch develop https travis ci org microsoft cognitive vision dotnetcore master build status https travis ci org microsoft cognitive vision dotnetcore svg branch master https travis ci org microsoft cognitive vision dotnetcore overview this repo contains the net core client library samples for the microsoft computer vision api an offering within microsoft cognitive services https www microsoft com cognitive services formerly known as project oxford learn about the computer vision api https www microsoft com cognitive services en us computer vision api read the documentation https www microsoft com cognitive services en us computer vision api documentation find more sdks samples https www microsoft com cognitive services en us sdk sample api computer 20vision before you can use the sdk or run the sample application you must subscribe to the computer vision api which is part of microsoft cognitive services you can learn how to subscribe here https www microsoft com cognitive services en us sign up the client library the client library is a thin net core client wrapper for the vision api the easiest way to use this client library is to get the microsoft projectoxford vision dotnetcore package from nuget please go to the net core vision api package in nuget https www nuget org packages microsoft projectoxford vision dotnetcore for more details contributing contributions are welcome feel free to file issues and pull requests on the repo and we ll address them as we can learn more about how you can help on our contribution rules guidelines contributing md this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments license all microsoft cognitive services sdks and samples are licensed with the mit license for more details see license license md developer code of conduct developers using this project are expected to follow the developer code of conduct for microsoft cognitive services at http go microsoft com fwlink linkid 698895 http go microsoft com fwlink linkid 698895 disclaimer the image voice video or text understanding capabilities of microsoft projectoxford vision dotnetcore use microsoft cognitive services microsoft will receive the images audio video and other data that you upload via this app for service improvement purposes report abuse to report abuse of the microsoft cognitive services to microsoft please visit the microsoft cognitive services website at https www microsoft com cognitive services https www microsoft com cognitive services and use the report abuse link at the bottom of the page to contact microsoft privacy policy for more information about microsoft privacy policies please see their privacy statement here https go microsoft com fwlink linkid 521839 https go microsoft com fwlink linkid 521839
ai
2020-fall-cs160-team-snorlax
postcard 2020 fall cs160 team snorlax by johanna mike samuel jake and rabia zenhub https app zenhub com workspaces team snorlax 5f7e751d9864e80020434391 board about and vision for people who browse the internet and value an easy to upload share and view content service postcard is a web based service that allows users to upload images to the internet and create a shareable link that will allow users to view the images unlike imgur photobucket flickr our product maintains anonymity by letting users upload images and albums without creating an account and allows simple shareable links project code postcard s main code can be found in the directory project src main project readme the main readme md file with instructions on usage of postcard is within the project directory project testing postcard s test cases utilizing selenium and junit can be found under project src test java com snorlax snorlaxapplicationtests java initial prototype an initial prototype to test postcard s high level design using react spring boot and h2 database can be found in the prototype directory the functionality of the prototype is an example usage of the three components together and does not reflect postcard s functionality please refer to the sept 27th commit for the prototype docker image the docker image file can be found under the project directory
server
bnpy
bnpy bayesian nonparametric machine learning for python project website https bnpy readthedocs io en latest 8226 example gallery https bnpy readthedocs io en latest examples 8226 installation https bnpy readthedocs io en latest installation html 8226 team team 8226 academic papers academic papers 8226 report an issue https github com bnpy bnpy issues about this python module provides code for training popular clustering models on large datasets we focus on bayesian nonparametric models based on the dirichlet process but also provide parametric counterparts bnpy supports the latest online learning algorithms as well as standard offline methods our aim is to provide an inference platform that makes it easy for researchers and practitioners to compare models and algorithms supported probabilistic models aka allocation models mixture models finitemixturemodel fixed number of clusters dpmixturemodel infinite number of clusters via the dirichlet process topic models aka admixtures models finitetopicmodel fixed number of topics this is latent dirichlet allocation hdptopicmodel infinite number of topics via the hierarchical dirichlet process hidden markov models hmms finitehmm markov sequence model with a fixture number of states hdphmm markov sequence models with an infinite number of states supported data observation models aka likelihoods multinomial for bag of words data mult gaussian for real valued vector data gauss full covariance diaggauss diagonal covariance zeromeangauss zero mean full covariance auto regressive gaussian autoreggauss supported learning algorithms expectation maximization em full dataset variational bayes vb memoized variational can process small minibatches per update step movb stochastic variational can process small minibatches per update step sovb these are all variants of variational inference a family of optimization algorithms example gallery you can find many examples of bnpy in action in our curated example gallery https bnpy readthedocs io en latest examples these same demos are also directly available as python scrips inside the examples folder of the project github repository https github com bnpy bnpy tree master examples quick start you can use bnpy from a command line terminal or from within python both options require specifying a dataset an allocation model an observation model likelihood and an algorithm optional keyword arguments with reasonable defaults allow control of specific model hyperparameters algorithm parameters etc below we show how to call bnpy to train a 8 component gaussian mixture model on a default toy dataset stored in a csv file on disk in both cases log information is printed to stdout and all learned model parameters are saved to disk calling from the terminal command line python m bnpy run path to my dataset csv finitemixturemodel gauss em k 8 output path tmp my dataset results calling directly from python import bnpy bnpy run path to dataset csv finitemixturemodel gauss em k 8 output path tmp my dataset results advanced examples train dirichlet process gaussian mixture model dp gmm via full dataset variational algorithm aka vb for variational bayes python m bnpy run path to dataset csv dpmixturemodel gauss vb k 8 train dp gmm via memoized variational with birth and merge moves with data divided into 10 batches python m bnpy run path to dataset csv dpmixturemodel gauss memovb k 8 nbatch 10 moves birth merge quick help print help message for required arguments python m bnpy run help print help message for specific keyword options for 
gaussian mixture models python m bnpy run path to dataset csv finitemixturemodel gauss em kwhelp installation to use bnpy for the first time follow the documentation s installation instructions https bnpy readthedocs io en latest installation html team primary investigators mike hughes assistant professor aug 2018 present tufts university dept of computer science website https www michaelchughes com erik sudderth professor university of california irvine website https www ics uci edu sudderth contributors soumya ghosh dae il kim geng ji william stephenson sonia phene gabe hope leah weiner alexis cook mert terzihan mengrui ni jincheng li xi chen tufts academic papers conference publications based on bnpy nips 2015 hdp hmm paper our nips 2015 paper describes inference algorithms that can add or remove clusters for the sticky hdp hmm scalable adaptation of state complexity for nonparametric hidden markov models michael c hughes william stephenson and erik b sudderth nips 2015 paper http michaelchughes com papers hughesstephensonsudderth nips 2015 pdf supplement http michaelchughes com papers hughesstephensonsudderth nips 2015 supplement pdf scripts to reproduce experiments http bitbucket org michaelchughes x hdphmm nips2015 aistats 2015 hdp topic model paper our aistats 2015 paper describes our algorithms for hdp topic models reliable and scalable variational inference for the hierarchical dirichlet process michael c hughes dae il kim and erik b sudderth aistats 2015 paper http michaelchughes com papers hugheskimsudderth aistats 2015 pdf supplement http michaelchughes com papers hugheskimsudderth aistats 2015 supplement pdf bibtex http cs brown edu people mhughes papers hugheskimsudderth aistats2015 memoizedhdp bibtex txt nips 2013 dp mixtures paper our nips 2013 paper introduced memoized variational inference algorithm and applied it to dirichlet process mixture models memoized online variational inference for dirichlet process mixture models michael c hughes and erik b sudderth nips 2013 paper http michaelchughes com papers hughessudderth nips 2013 pdf supplement http michaelchughes com papers hughessudderth nips 2013 supplement pdf bibtex http cs brown edu people mhughes papers hughessudderth nips2013 memoizeddp bibtex txt workshop papers our short paper from a workshop at nips 2014 describes the vision for bnpy as a general purpose inference engine bnpy reliable and scalable variational inference for bayesian nonparametric models michael c hughes and erik b sudderth probabilistic programming workshop at nips 2014 paper http michaelchughes com papers hughessudderth nipsprobabilisticprogrammingworkshop 2014 pdf target audience primarly we intend bnpy to be a platform for researchers by gathering many learning algorithms and popular models in one convenient modular repository we hope to make it easier to compare and contrast approaches we also hope that the modular organization of bnpy enables researchers to try out new modeling ideas without reinventing the wheel
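The command-line calls above map one-to-one onto the Python entry point shown in the quick start; the sketch below restates the DP-GMM memoized-variational example as a direct bnpy.run call. The dataset and output paths are placeholders, and the keyword spellings simply mirror the CLI flags shown above.

```python
# Sketch: the "DP-GMM via memoized variational with birth and merge moves"
# example above, called from Python instead of the command line.
# Paths are placeholders; keyword names mirror the CLI flags in the readme.
import bnpy

bnpy.run(
    '/path/to/dataset.csv',   # one observation per row
    'DPMixtureModel',         # allocation model: Dirichlet-process mixture
    'Gauss',                  # observation model: full-covariance Gaussian
    'memoVB',                 # memoized variational inference
    K=8,                      # initial number of clusters
    nBatch=10,                # data divided into 10 minibatches
    moves='birth,merge',      # enable birth and merge proposals
    output_path='/tmp/my_dataset/results/',
)
```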
ai
govuk-frontend-aspnetcore
asp net core integration for gov uk design system ci https github com gunndabad govuk frontend aspnetcore workflows ci badge svg nuget with prereleases https img shields io nuget vpre govuk frontend aspnetcore targets gds frontend v4 7 0 https github com alphagov govuk frontend releases tag v4 7 0 installation 1 install nuget package install the govuk frontend aspnetcore nuget package https www nuget org packages govuk frontend aspnetcore install package govuk frontend aspnetcore or via the net core command line interface dotnet add package govuk frontend aspnetcore 2 configure your asp net core application add services to your application s startup class cs using govuk frontend aspnetcore public class startup public void configureservices iservicecollection services services addgovukfrontend 3 register tag helpers in your viewimports cshtml file razor addtaghelper govuk frontend aspnetcore 4 configure your page template you have several options for configuring your page template https design system service gov uk styles page template using the govukpagetemplate razor view a razor view is provided with the standard page template markup and razor sections where you can add in your header footer and any custom markup you require in your layout cshtml file razor layout govukpagetemplate section header your header markup goes here renderbody section footer your footer markup goes here the view can be customised by defining the following sections and viewdata viewbag variables section name description beforecontent add content that needs to appear outside main element br for example the back link docs components back link md component breadcrumbs docs components breadcrumbs md component phase banner docs components phase banner md component bodyend add content just before the closing body element bodystart add content after the opening body element br for example the cookie banner component footer override the default footer component head add additional items inside the head element br for example meta name description content my page description header override the default header component headicons override the default icons used for gov uk branded pages br for example link rel shortcut icon href favicon ico type image x icon skiplink override the default skip link docs components skip link md component viewdata key type description bodyclasses string add class es to the body element containerclasses string add class es to the container this is useful if you want to make the page wrapper a fixed width htmlclasses string add class es to the html element htmllang string set the language of the whole document if your title and main element are in a different language to the rest of the page use htmllang to set the language of the rest of the page mainclasses string add class es to the main element mainlang string set the language of the main element if it s different to htmllang opengraphimageurl string set the url for the open graph image meta tag the url must be absolute including the protocol and domain name title string override the default page title title element themecolor string set the toolbar colour on some devices https developers google com web updates 2014 11 support for theme color in chrome 39 for android create your own razor view if the standard template above is not sufficient you can create your own razor view by default references to the gds frontend css and script assets will be added automatically to the head and body elements if you want to control the asset references 
yourself you can disable the automatic import cs services addgovukfrontend options options addimportstohtml false the pagetemplatehelper class defines several methods that can simplify the css and script imports generatestyleimports imports css stylesheets and should be added to head generatejsenabledscript declares some inline javascript that adds the js enabled class to the body and should be placed at the start of body generatescriptimports imports javascript files and should be added to the end of body the latter two methods take an optional cspnonce parameter when provided a nonce attribute will be added to the inline scripts pagetemplatehelper can be injected into your view and used like so razor inject govuk frontend aspnetcore pagetemplatehelper pagetemplatehelper pagetemplatehelper generatestyleimports content security policy csp there are two built in mechanisms to help in generating a script src csp directive that works correctly with the inline scripts used by the page template the preferred option is to use the getcspscripthashes method on pagetemplatehelper this will return a string that can be inserted directly into the script src directive in your csp alternatively a csp nonce can be appended to the generated script tags a delegate must be configured on govukfrontendoptions that retrieves a nonce for a given httpcontext cs services addgovukfrontend options options getcspnonceforrequest context return your nonce here see the samples mvcstarter project for an example of this working gds assets this package serves the gds frontend assets stylesheets javascript fonts inside the host application so these do not need to be imported separately components accordion docs components accordion md back link docs components back link md breadcrumbs docs components breadcrumbs md button docs components button md checkboxes docs components checkboxes md character count docs components character count md date input docs components date input md details docs components details md error message docs components error message md error summary docs components error summary md fieldset docs components fieldset md file upload docs components file upload md inset text docs components inset text md notification banner docs components notification banner md pagination docs components pagination md panel docs components panel md phase banner docs components phase banner md radios docs components radios md select docs components select md skip link docs components skip link md summary list docs components summary list md tabs docs components tabs md tag docs components tag md textarea docs components textarea md text input docs components text input md warning text docs components warning text md validators max words validator docs validation maxwords md
asp-net-core gds
os
checked.in
checked in repository for the checked in application for the mobile software development course this application is mainly for a group of friends who intend to go to a public event and keep track of each other s locations from the application you can register groups and send invitations to friends who can in turn accept or decline the invitation each user mainly has the ability to track the members of a group using google maps and will be able to set a check in point for each group when a member of a group gets within 50 metres of the check in location a notification is sent to the whole group the application is currently not user experience friendly just the components plugged in frontend development is still to be done
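The 50-metre check-in rule above is the core piece of logic; the sketch below is only a language-agnostic illustration of it (written in Python, although the app itself is a mobile client), using the haversine great-circle distance. All names are made up for illustration.

```python
# Illustration of the check-in rule: notify the group once a member comes
# within 50 metres of the group's check-in point. Not the app's actual code.
import math

CHECK_IN_RADIUS_M = 50.0
EARTH_RADIUS_M = 6_371_000.0


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in metres."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


def member_checked_in(member_pos, check_in_point):
    # member_pos and check_in_point are (latitude, longitude) tuples
    return haversine_m(*member_pos, *check_in_point) <= CHECK_IN_RADIUS_M
```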
front_end
mathapp_backend
copyright 2019 ser401 project 14 team all rights reserved team members raymond acevedo shawn weiner christopher salazar robert pillitteri shelton lacy unauthorized copying of this file via any medium is strictly prohibited proprietary and confidential mathapp backend backend development for mathapp asu capstone project see below for docker instructions requirements node v10 15 3 npm v6 4 1 mysql current see note on installation below apache ant v1 10 5 optional for ant build scripts java se 1 8 0 202 optional for ant build scripts installation 1 install mysql node js and npm apache ant and java se are optional 1 note mysql version 8 0 and higher has changed authentication methods aws has mysql native password enabled by default please ensure when installing mysql 8 0 or greater that you choose to use the mysql native password authentication if during database installation you recieve an error similar to client does not support authentication protocol requested by server your mysql installation has the new authentication enabled review several fixes here https medium com crmcmullen how to run mysql 8 0 with native password authentication 502de5bac661 2 download source code or clone the repository 3 for local development 1 copy env sample to env 2 update values in the new env file to match the local environment or desired settings 4 if using database setup js please update the admin user and admin password constants in the env file for an account with admin privileges on the mysql db 5 for aws deployment use the ant aws command in the base directory this will assemble the correct package for uploading to aws elastic beanstalk under aws mathapp zip just upload this zip file 1 review the env sample file i ve adjusted the files to use the default rds environment variables supplied by elastic beanstalk 6 in root directory type the following commands 1 yarn install only once on initial install 2 node database setup js only once on initial install if needed 3 npm start 7 application will listen on port indicated by the environment variable port specified in env or by default 8000 docker this application has been updated to work with docker compose and as such if you want to deploy this using docker please use the following information requirements docker 1 java development kit jdk v8 or higher 2 apache ant 3 docker w docker compose settings 1 the docker compose loads all of the relevant environment variables if needed or desired please update the environment variables to match your system settings running the app 1 in the app base directory run the command ant docker start please be aware that there is a large delay in getting the node application running due to the development environment it runs npm install each time it starts the containers please wait until you get the console logging indicating the node application has started successfully 2 to stop the application use ant docker stop api documentation 1 api documentation is served via swagger ui at the base url for the api server it has been formally written in the openapi3 standard
server
azure-search-openai-demo
name chatgpt enterprise data description chat with your data using openai and cognitive search languages azdeveloper typescript python bicep products azure azure cognitive search azure openai azure app service page type sample urlfragment azure search openai demo chatgpt enterprise data with azure openai and cognitive search table of contents features features azure account requirements azure account requirements azure deployment azure deployment cost estimation cost estimation project setup project setup github codespaces github codespaces vs code dev containers vs code dev containers local environment local environment deploying from scratch deploying from scratch deploying with existing azure resources deploying with existing azure resources deploying again deploying again sharing environments sharing environments enabling optional features enabling optional features enabling application insights enabling application insights enabling authentication enabling authentication enabling login and document level access control enabling login and document level access control enabling cors for an alternate frontend enabling cors for an alternate frontend using the app using the app running locally running locally productionizing productionizing resources resources note note faq faq troubleshooting troubleshooting open in github codespaces https img shields io static v1 style for the badge label github codespaces message open color brightgreen logo github https github com codespaces new hide repo select true ref main repo 599293758 machine standardlinux32gb devcontainer path devcontainer 2fdevcontainer json location westus2 open in dev containers https img shields io static v1 style for the badge label dev 20containers message open color blue logo visualstudiocode https vscode dev redirect url vscode ms vscode remote remote containers cloneinvolume url https github com azure samples azure search openai demo this sample demonstrates a few approaches for creating chatgpt like experiences over your own data using the retrieval augmented generation pattern it uses azure openai service to access the chatgpt model gpt 35 turbo and azure cognitive search for data indexing and retrieval the repo includes sample data so it s ready to try end to end in this sample application we use a fictitious company called contoso electronics and the experience allows its employees to ask questions about the benefits internal policies as well as job descriptions and roles rag architecture docs appcomponents png features chat and q a interfaces explores various options to help users evaluate the trustworthiness of responses with citations tracking of source content etc shows possible approaches for data preparation prompt construction and orchestration of interaction between model chatgpt and retriever cognitive search settings directly in the ux to tweak the behavior and experiment with options optional performance tracing and monitoring with application insights chat screen docs chatscreen png azure account requirements important in order to deploy and run this example you ll need azure account if you re new to azure get an azure account for free https azure microsoft com free cognitive search and you ll get some free azure credits to get started azure subscription with access enabled for the azure openai service you can request access with this form https aka ms oaiapply if your access request to azure openai service doesn t match the acceptance criteria https learn microsoft com legal cognitive services openai 
limited access context 2fazure 2fcognitive services 2fopenai 2fcontext 2fcontext you can use openai public api https platform openai com docs api reference introduction instead learn how to switch to an openai instance switching from an azure openai endpoint to an openai instance azure account permissions your azure account must have microsoft authorization roleassignments write permissions such as role based access control administrator https learn microsoft com azure role based access control built in roles role based access control administrator preview user access administrator https learn microsoft com azure role based access control built in roles user access administrator or owner https learn microsoft com azure role based access control built in roles owner if you don t have subscription level permissions you must be granted rbac https learn microsoft com azure role based access control built in roles role based access control administrator preview for an existing resource group and deploy to that existing group existing resource group your azure account also needs microsoft resources deployments write permissions on the subscription level azure deployment cost estimation pricing varies per region and usage so it isn t possible to predict exact costs for your usage however you can try the azure pricing calculator https azure com e 8ffbe5b1919c4c72aed89b022294df76 for the resources below azure app service basic tier with 1 cpu core 1 75 gb ram pricing per hour pricing https azure microsoft com pricing details app service linux azure openai standard tier chatgpt and ada models pricing per 1k tokens used and at least 1k tokens are used per question pricing https azure microsoft com en us pricing details cognitive services openai service form recognizer so standard tier using pre built layout pricing per document page sample documents have 261 pages total pricing https azure microsoft com pricing details form recognizer azure cognitive search standard tier 1 replica free level of semantic search pricing per hour pricing https azure microsoft com pricing details search azure blob storage standard tier with zrs zone redundant storage pricing per storage and read operations pricing https azure microsoft com pricing details storage blobs azure monitor pay as you go tier costs based on data ingested pricing https azure microsoft com pricing details monitor to reduce costs you can switch to free skus for azure app service and form recognizer by changing the parameters file under the infra folder there are some limits to consider for example the free form recognizer resource only analyzes the first 2 pages of each document you can also reduce costs associated with the form recognizer by reducing the number of documents in the data folder or by removing the postprovision hook in azure yaml that runs the prepdocs py script to avoid unnecessary costs remember to take down your app if it s no longer in use either by deleting the resource group in the portal or running azd down project setup you have a few options for setting up this project the easiest way to get started is github codespaces since it will setup all the tools for you but you can also set it up locally local environment if desired github codespaces you can run this repo virtually by using github codespaces which will open a web based vs code in your browser open in github codespaces https img shields io static v1 style for the badge label github codespaces message open color brightgreen logo github https github com codespaces new 
hide repo select true ref main repo 599293758 machine standardlinux32gb devcontainer path devcontainer 2fdevcontainer json location westus2 vs code dev containers a related option is vs code dev containers which will open the project in your local vs code using the dev containers extension https marketplace visualstudio com items itemname ms vscode remote remote containers 1 start docker desktop install it if not already installed 1 open the project open in dev containers https img shields io static v1 style for the badge label dev 20containers message open color blue logo visualstudiocode https vscode dev redirect url vscode ms vscode remote remote containers cloneinvolume url https github com azure samples azure search openai demo 1 in the vs code window that opens once the project files show up this may take several minutes open a terminal window 1 run azd auth login 1 now you can follow the instructions in deploying from scratch deploying from scratch below local environment first install the required tools azure developer cli https aka ms azure dev install python 3 9 3 10 or 3 11 https www python org downloads important python and the pip package manager must be in the path in windows for the setup scripts to work important ensure you can run python version from console on ubuntu you might need to run sudo apt install python is python3 to link python to python3 node js 14 https nodejs org en download git https git scm com downloads powershell 7 pwsh https github com powershell powershell for windows users only important ensure you can run pwsh exe from a powershell terminal if this fails you likely need to upgrade powershell then bring down the project code 1 create a new folder and switch to it in the terminal 1 run azd auth login 1 run azd init t azure search openai demo note that this command will initialize a git repository and you do not need to clone this repository deploying from scratch execute the following command if you don t have any pre existing azure services and want to start from a fresh deployment 1 run azd up this will provision azure resources and deploy this sample to those resources including building the search index based on the files found in the data folder important beware that the resources created by this command will incur immediate costs primarily from the cognitive search resource these resources may accrue costs even if you interrupt the command before it is fully executed you can run azd down or delete the resources manually to avoid unnecessary spending you will be prompted to select two locations one for the majority of resources and one for the openai resource which is currently a short list that location list is based on the openai model availability table https learn microsoft com azure cognitive services openai concepts models model summary table and region availability and may become outdated as availability changes 1 after the application has been successfully deployed you will see a url printed to the console click that url to interact with the application in your browser it will look like the following output from running azd up assets endpoint png note it may take 5 10 minutes for the application to be fully deployed if you see a python developer welcome screen or an error page then wait a bit and refresh the page deploying with existing azure resources if you already have existing azure resources you can re use those by setting azd environment values existing resource group 1 run azd env set azure resource group name of existing resource 
group 1 run azd env set azure location location of existing resource group existing openai resource azure openai 1 run azd env set azure openai service name of existing openai service 1 run azd env set azure openai resource group name of existing resource group that openai service is provisioned to 1 run azd env set azure openai chatgpt deployment name of existing chatgpt deployment only needed if your chatgpt deployment is not the default chat 1 run azd env set azure openai emb deployment name of existing gpt embedding deployment only needed if your embeddings deployment is not the default embedding when you run azd up after and are prompted to select a value for openairesourcegrouplocation make sure to select the same location as the existing openai resource group openai com openai 1 run azd env set openai host openai 2 run azd env set openai organization your openai organization 3 run azd env set openai api key your openai api key 4 run azd up you can retrieve your openai key by checking your user page https platform openai com account api keys and your organization by navigating to your organization page https platform openai com account org settings learn more about creating an openai free trial at this link https openai com pricing do not check your key into source control when you run azd up after and are prompted to select a value for openairesourcegrouplocation you can select any location as it will not be used existing azure cognitive search resource 1 run azd env set azure search service name of existing azure cognitive search service 1 run azd env set azure search service resource group name of existing resource group with acs service 1 if that resource group is in a different location than the one you ll pick for the azd up step then run azd env set azure search service location location of existing service 1 if the search service s sku is not standard then run azd env set azure search service sku name of sku the free tier won t work as it doesn t support managed identity see other possible values https learn microsoft com azure templates microsoft search searchservices pivots deployment language bicep sku 1 if you have an existing index that is set up with all the expected fields then run azd env set azure search index name of existing index otherwise the azd up command will create a new index you can also customize the search service new or existing for non english searches 1 to configure the language of the search query to a value other than en us run azd env set azure search query language name of query language see other possible values https learn microsoft com python api azure search documents azure search documents models querylanguage view azure python preview 1 to turn off the spell checker run azd env set azure search query speller none see other possible values https learn microsoft com python api azure search documents azure search documents models queryspellertype view azure python preview 1 to configure the name of the analyzer to use for a searchable text field to a value other than en microsoft run azd env set azure search analyzer name name of analyzer name see other possible values https learn microsoft com dotnet api microsoft azure search models field analyzer view azure dotnet legacy viewfallbackfrom azure dotnet other existing azure resources you can also use existing form recognizer and storage accounts see infra main parameters json for list of environment variables to pass to azd env set to configure those existing resources provision remaining 
resources now you can run azd up following the steps in deploying from scratch deploying from scratch above that will both provision resources and deploy the code deploying again if you ve only changed the backend frontend code in the app folder then you don t need to re provision the azure resources you can just run azd deploy if you ve changed the infrastructure files infra folder or azure yaml then you ll need to re provision the azure resources you can do that by running azd up sharing environments to give someone else access to a completely deployed and existing environment either you or they can follow these steps 1 install the azure cli https learn microsoft com cli azure install azure cli 1 run azd init t azure search openai demo or clone this repository 1 run azd env refresh e environment name they will need the azd environment name subscription id and location to run this command you can find those values in your azure env name env file this will populate their azd environment s env file with all the settings needed to run the app locally 1 set the environment variable azure principal id either in that env file or in the active shell to their azure id which they can get with az ad signed in user show 1 run scripts roles ps1 or scripts roles sh to assign all of the necessary roles to the user if they do not have the necessary permission to create roles in the subscription then you may need to run this script for them once the script runs they should be able to run the app locally enabling optional features enabling application insights to enable application insights and the tracing of each request along with the logging of errors set the azure use application insights variable to true before running azd up 1 run azd env set azure use application insights true 1 run azd up to see the performance data go to the application insights resource in your resource group click on the investigate performance blade and navigate to any http request to see the timing data to inspect the performance of chat requests use the drill into samples button to see end to end traces of all the api calls made for any chat request tracing screenshot docs transaction tracing png to see any exceptions and server errors navigate to the investigate failures blade and use the filtering tools to locate a specific exception you can see python stack traces on the right hand side enabling authentication by default the deployed azure web app will have no authentication or access restrictions enabled meaning anyone with routable network access to the web app can chat with your indexed data you can require authentication to your azure active directory by following the add app authentication https learn microsoft com azure app service scenario secure app authentication app service tutorial and set it up against the deployed web app to then limit access to a specific set of users or groups you can follow the steps from restrict your azure ad app to a set of users https learn microsoft com azure active directory develop howto restrict your app to a set of users by changing assignment required option under the enterprise application and then assigning users groups access users not granted explicit access will receive the error message aadsts50105 your administrator has configured the application app name to block users unless they are specifically granted assigned access to the application enabling login and document level access control by default the deployed azure web app allows users to chat with all your indexed data 
you can enable an optional login system using azure active directory to restrict access to indexed data based on the logged in user enable the optional login and document level access control system by following this guide loginandaclsetup md enabling cors for an alternate frontend by default the deployed azure web app will only allow requests from the same origin to enable cors for a frontend hosted on a different origin run 1 run azd env set allowed origin https your domain com 2 run azd up for the frontend code change backend uri in api ts to point at the deployed backend url so that all fetch requests will be sent to the deployed backend running locally you can only run locally after having successfully run the azd up command if you haven t yet follow the steps in azure deployment azure deployment above 1 run azd auth login 2 change dir to app 3 run start ps1 or start sh or run the vs code task start app to start the project locally using the app in azure navigate to the azure webapp deployed by azd the url is printed out when azd completes as endpoint or you can find it in the azure portal running locally navigate to 127 0 0 1 50505 once in the web app try different topics in chat or q a context for chat try follow up questions clarifications ask to simplify or elaborate on answer etc explore citations and sources click on settings to try different options tweak prompts etc productionizing this sample is designed to be a starting point for your own production application but you should do a thorough review of the security and performance before deploying to production here are some things to consider openai capacity the default tpm tokens per minute is set to 30k that is equivalent to approximately 30 conversations per minute assuming 1k per user message response you can increase the capacity by changing the chatgptdeploymentcapacity and embeddingdeploymentcapacity parameters in infra main bicep to your account s maximum capacity you can also view the quotas tab in azure openai studio https oai azure com to understand how much capacity you have azure storage the default storage account uses the standard lrs sku to improve your resiliency we recommend using standard zrs for production deployments which you can specify using the sku property under the storage module in infra main bicep azure cognitive search the default search service uses the standard sku with the free semantic search option which gives you 1000 free queries a month assuming your app will experience more than 1000 questions you should either change semanticsearch to standard or disable semantic search entirely in the app backend approaches files if you see errors about search service capacity being exceeded you may find it helpful to increase the number of replicas by changing replicacount in infra core search search services bicep or manually scaling it from the azure portal azure app service the default app service plan uses the basic sku with 1 cpu core and 1 75 gb ram we recommend using a premium level sku starting with 1 cpu core you can use auto scaling rules or scheduled scaling rules and scale up the maximum minimum based on load authentication by default the deployed app is publicly accessible we recommend restricting access to authenticated users see enabling authentication enabling authentication above for how to enable authentication networking we recommend deploying inside a virtual network if the app is only for internal enterprise use use a private dns zone also consider using azure api management apim 
for firewalls and other forms of protection for more details read azure openai landing zone reference architecture https techcommunity microsoft com t5 azure architecture blog azure openai landing zone reference architecture ba p 3882102 loadtesting we recommend running a loadtest for your expected number of users you can use the locust tool https docs locust io with the locustfile py in this sample or set up a loadtest with azure load testing resources revolutionize your enterprise data with chatgpt next gen apps w azure openai and cognitive search https aka ms entgptsearchblog azure cognitive search https learn microsoft com azure search search what is azure search azure openai service https learn microsoft com azure cognitive services openai overview comparing azure openai and openai https learn microsoft com en gb azure cognitive services openai overview comparing azure openai and openai clean up to clean up all the resources created by this sample 1 run azd down 2 when asked if you are sure you want to continue enter y 3 when asked if you want to permanently delete the resources enter y the resource group and all the resources will be deleted note note the pdf documents used in this demo contain information generated using a language model azure openai service the information contained in these documents is only for demonstration purposes and does not reflect the opinions or beliefs of microsoft microsoft makes no representations or warranties of any kind express or implied about the completeness accuracy reliability suitability or availability with respect to the information contained in this document all rights reserved to microsoft faq details a id ingestion why chunk a summary why do we need to break up the pdfs into chunks when azure cognitive search supports searching large documents summary chunking allows us to limit the amount of information we send to openai due to token limits by breaking up the content it allows us to easily find potential chunks of text that we can inject into openai the method of chunking we use leverages a sliding window of text such that sentences that end one chunk will start the next this allows us to reduce the chance of losing the context of the text details details a id ingestion more pdfs a summary how can we upload additional pdfs without redeploying everything summary to upload more pdfs put them in the data folder and run scripts prepdocs sh or scripts prepdocs ps1 to avoid reuploading existing docs move them out of the data folder you could also implement checks to see whats been uploaded before our code doesn t yet have such checks details details a id compare samples a summary how does this sample compare to other chat with your data samples summary another popular repository for this use case is here https github com microsoft sample app aoai chatgpt that repository is designed for use by customers using azure openai studio and azure portal for setup it also includes azd support for folks who want to deploy it completely from scratch the primary differences this repository includes multiple rag retrieval augmented generation approaches that chain the results of multiple api calls to azure openai and acs together in different ways the other repository uses only the built in data sources option for the chatcompletions api which uses a rag approach on the specified acs index that should work for most uses but if you needed more flexibility this sample may be a better option this repository is also a bit more experimental in other ways since 
it s not tied to the azure openai studio like the other repository feature comparison feature azure search openai demo sample app aoai chatgpt rag approach multiple approaches only via chatcompletion api data sources vector support yes yes data ingestion yes pdf yes pdf txt md html persistent chat history no browser tab only yes in cosmosdb technology comparison tech azure search openai demo sample app aoai chatgpt frontend react react backend python quart python flask vector db azure cognitive search azure cognitive search deployment azure developer cli azd azure portal az azd details details a id switch gpt4 a summary how do you use gpt 4 with this sample summary in infra main bicep change chatgptmodelname to gpt 4 instead of gpt 35 turbo you may also need to adjust the capacity above that line depending on how much tpm your account is allowed details details a id chat ask diff a summary what is the difference between the chat and ask tabs summary the chat tab uses the approach programmed in chatreadretrieveread py https github com azure samples azure search openai demo blob main app backend approaches chatreadretrieveread py it uses the chatgpt api to turn the user question into a good search query it queries azure cognitive search for search results for that query optionally using the vector embeddings for that query it then combines the search results and original user question and asks chatgpt api to answer the question based on the sources it includes the last 4k of message history as well or however many tokens are allowed by the deployed model the ask tab uses the approach programmed in retrievethenread py https github com azure samples azure search openai demo blob main app backend approaches retrievethenread py it queries azure cognitive search for search results for the user question optionally using the vector embeddings for that question it then combines the search results and user question and asks chatgpt api to answer the question based on the sources details details a id azd up explanation a summary what does the azd up command do summary the azd up command comes from the azure developer cli https learn microsoft com en us azure developer azure developer cli overview and takes care of both provisioning the azure resources and deploying code to the selected azure hosts the azd up command uses the azure yaml file combined with the infrastructure as code bicep files in the infra folder the azure yaml file for this project declares several hooks for the prepackage step and postprovision steps the up command first runs the prepackage hook which installs node dependencies and builds the react js based javascript files it then packages all the code both frontend and backend into a zip file which it will deploy later next it provisions the resources based on main bicep and main parameters json at that point since there is no default value for the openai resource location it asks you to pick a location from a short list of available regions then it will send requests to azure to provision all the required resources with everything provisioned it runs the postprovision hook to process the local data and add it to an azure cognitive search index finally it looks at azure yaml to determine the azure host appservice in this case and uploads the zip to azure app service the azd up command is now complete but it may take another 5 10 minutes for the app service app to be fully available and working especially for the initial deploy related commands are azd provision for just 
provisioning if infra files change and azd deploy for just deploying updated app code details details a id appservice logs a summary how can we view logs from the app service app summary you can view production logs in the portal using either the log stream or by downloading the default docker log file from advanced tools the following line of code in app backend app py configures the logging level python logging basicconfig level os getenv app log level default level to change the default level either change default level or set the app log level environment variable to one of the allowed log levels https docs python org 3 library logging html logging levels debug info warning error critical if you need to log in a route handler use the the global variable current app s logger python async def chat current app logger info received chat request otherwise use the logging module s root logger python logging info system message s system message if you re having troubles finding the logs in app service see this blog post on tips for debugging app service app deployments http blog pamelafox org 2023 06 tips for debugging flask deployments to html or watch this video about viewing app service logs https www youtube com watch v f0 ayuvws54 details troubleshooting here are the most common failure scenarios and solutions 1 the subscription azure subscription id doesn t have access to the azure openai service please ensure azure subscription id matches the id specified in the openai access request process https aka ms oai access 1 you re attempting to create resources in regions not enabled for azure openai e g east us 2 instead of east us or where the model you re trying to use isn t enabled see this matrix of model availability https aka ms oai models 1 you ve exceeded a quota most often number of resources per region see this article on quotas and limits https aka ms oai quotas 1 you re getting same resource name not allowed conflicts that s likely because you ve run the sample multiple times and deleted the resources you ve been creating each time but are forgetting to purge them azure keeps resources for 48 hours unless you purge from soft delete see this article on purging resources https learn microsoft com azure cognitive services manage resources tabs azure portal purge a deleted resource 1 you see certificate verify failed when the prepdocs py script runs that s typically due to incorrect ssl certificates setup on your machine try the suggestions in this stackoverflow answer https stackoverflow com questions 35569042 ssl certificate verify failed with python3 43855394 43855394 1 after running azd up and visiting the website you see a 404 not found in the browser wait 10 minutes and try again as it might be still starting up then try running azd deploy and wait again if you still encounter errors with the deployed app consult these tips for debugging app service app deployments http blog pamelafox org 2023 06 tips for debugging flask deployments to html or watch this video about downloading app service logs https www youtube com watch v f0 ayuvws54 please file an issue if the logs don t help you resolve the error
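The logging snippets quoted in the FAQ above are hard to read in this flattened form, so here is a minimal sketch of what they amount to, assuming the Quart backend named in the feature comparison. `DEFAULT_LEVEL` and the `/chat` route shown here are illustrative placeholders rather than the repository's exact code.

```python
import logging
import os

from quart import Quart, current_app

# Placeholder default; the app reads the desired level from APP_LOG_LEVEL.
DEFAULT_LEVEL = "WARNING"
logging.basicConfig(level=os.getenv("APP_LOG_LEVEL", DEFAULT_LEVEL))

app = Quart(__name__)

@app.route("/chat", methods=["POST"])
async def chat():
    # Inside a route handler, log through the application's logger.
    current_app.logger.info("Received /chat request")
    return {"status": "ok"}

# Outside a request context, the logging module's root logger also works.
logging.info("System message: %s", "example system message")
```

Setting the APP_LOG_LEVEL environment variable (for example to DEBUG) in the App Service configuration, or locally before starting the app, should then surface the extra detail in the log stream.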
azure azurecognitivesearch chatgpt openai azureopenai azd-templates
ai
FreeRTOS-Kernel-Partner-Supported-Ports
freertos partner supported ports this repository contains freertos ports supported by freertos partners follow the steps below to contribute a freertos port to this repository 1 write the freertos port for your compiler and architecture 2 create a project in the freertos partner supported demos repository https github com freertos freertos partner supported demos tree main for your hardware for running tests as mentioned here https github com freertos freertos blob main freertos demo thirdparty template readme md 3 make sure all the tests pass add the test results in the pull request description 4 add a readme file with the following information 1 how to use this port 2 link to the test project created in step 2 3 any other relevant information 5 raise a pr to merge the freertos port 6 raise another pr to merge the test project in the freertos partner supported demos repository https github com freertos freertos tree main freertos demo thirdparty community supported license this repository contains multiple directories each individually licensed please see the license file in each directory
os
rise-node
rise node version 1 2 0 rise build status https travis ci org risevision rise node svg branch development https travis ci org risevision rise node coverage status https coveralls io repos github risevision rise node badge svg branch development https coveralls io github risevision rise node branch development installation an automatic install script for ubuntu is available wiki for detailed information on node installation and management please refer to the wiki of this repository https github com risevision rise node wiki quick start rise installation mainnet first perform some basic checks make sure not to run as root or with sudo your user will need sudo privileges though postgressql must not be installed on your server then go home cd home download the installer wget https raw githubusercontent com risevision rise build master scripts install sh install rise bash install sh install r mainnet u https downloads rise vision core mainnet latest tar gz the installer will start the node automatically if installation was successful optional fast sync from a snapshot cd rise wget https downloads rise vision snapshots mainnet latest o latestsnap gz manager sh restorebackup latestsnap gz basic node management installer will create a rise folder in your homedir make sure to cd to this dir when managing your node check the status of your node with manager sh status stop node with manager sh stop node insert your passphrase so you can forge nano etc node config json and change this section to include your passphrase fileloglevel error forging secret my secret access whitelist 127 0 0 1 and finally restart your node to apply the changes manager sh reload node quick start rise installation testnet the same as above only the install step is different bash install sh install r testnet u https downloads rise vision core testnet latest tar gz optional fast sync from a snapshot wget https downloads rise vision snapshots testnet latest o latestsnap gz manager sh restorebackup latestsnap gz authors andrea b vekexasia crypto gmail com jan lepetitjan icloud com mariusz serek mariusz serek net goldeneye shift team ralfs shift team joey shiftcurrency gmail com boris povod boris crypti me pavel nekrasov landgraf paul gmail com sebastian stupurac stupurac sebastian gmail com oliver beddows oliver lightcurve io isabella dell isabella lightcurve io marius serek mariusz serek net maciej baj maciej lightcurve io license copyright 2017 rise br copyright 2016 2017 shift br copyright 2016 2017 lisk foundation this program is free software you can redistribute it and or modify it under the terms of the gnu general public license as published by the free software foundation either version 3 of the license or at your option any later version this program is distributed in the hope that it will be useful but without any warranty without even the implied warranty of merchantability or fitness for a particular purpose see the gnu general public license for more details you should have received a copy of the gnu general public license https github com risevision rise node src master license along with this program if not see http www gnu org licenses this program also incorporates work previously released with lisk 0 7 0 and earlier versions under the mit license https opensource org licenses mit to comply with the requirements of that license the following permission notice applicable to those parts of the code only is included below copyright 2017 rise br copyright 2016 2017 shift br copyright 2016 2017 lisk foundation br 
copyright 2015 crypti permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software
blockchain rise typescript
blockchain
BlockChainVoting
blockchainvoting a blockchain based e voting system created as the final year project of shri bhagubhai mafatlal polytechnic teammates include me sayyam gada and charmee mehta the application is mit licensed build setup install dependencies npm install serve with hot reload at localhost 3000 npm start create your own .env file and the file should contain email your email id password your password for email id install metamask extension https metamask io download html and make sure to have some ether to test the application locally ether can be fetched from the rinkeby faucet https faucet rinkeby io note make sure to install node js v11 14 0 so the app runs fine testing on other node versions is yet to be done mongodb must be running in the background on localhost 27017 please star the repo if it helped you in any way tech stack solidity web3 for writing and connecting the blockchain contract next js semantic ui react front end mongodb expressjs node js back end ipfs file storage for images screenshots of the app homepage of the application screenshots homepage png company registers logs in screenshots company login png company creates an election if not created screenshots create election png dashboard on successful election creation screenshots dashboard png list of candidates for the election here you can add candidates screenshots candidate list png candidate is notified by mail screenshots candidate registration mail png list of voters for the election here you can add voters screenshots voterlist png voters are sent their secure usernames and passwords by mail screenshots voter registration mail png voter login page screenshots voter login png successful voting scenario screenshots successful voting png unsuccessful voting scenario screenshots unsuccessful voting png notification to each candidate and voter announcing the winning candidate screenshots winner candidate mail png
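A hedged sketch of the .env file mentioned above; the exact key names should be checked against the project's code, and the values are placeholders for the account presumably used to send the e-mail notifications shown in the screenshots:

```
email=your_email_id
password=your_password_for_that_email
```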
blockchain-voting blockchain-technology solidity blockchain election e-voting
blockchain
blockchain-certificates
blockchain certificates this project allows an institution to issue digital certificates it creates pdf certificate files or uses existing ones and records a hash representing those files on the bitcoin network s blockchain litecoin is now also supported more information on creating certificates https github com verifiable pdfs blockchain certificates blob master docs create certificates md more information on issuing existing certificates https github com verifiable pdfs blockchain certificates blob master docs issue certificates md more information on the cred protocol and its implementation https github com verifiable pdfs blockchain certificates wiki
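The core primitive here is fingerprinting each certificate file so that the fingerprint can be anchored on the blockchain. A minimal sketch of that hashing step, assuming a SHA-256 digest (the file name is a placeholder, and the project's actual issuing pipeline does considerably more than this):

```python
import hashlib

def file_hash(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

print(file_hash("certificate.pdf"))  # the value that would represent this file on-chain
```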
blockchain certificates credentials bitcoin pdf validation
blockchain
Payroll-System
hr easy mobile payroll app software engineering csc 430 this app is an extension to the web application we ve designed and developed in csc 430 software engineering at the college of staten island overview as part of our midterm and final project in our senior level course in software engineering with a team of five developers we ve managed to build a large scale payroll program in the form of a web app that uses a backend to process store edit delete and update user information this information is used to calculate medical benefits tax deductions and other items needed to compute employee payroll based on the specifications defined earlier in the semester the final output of the web app is a generated paycheck for the employee with all the calculations complete and only the final salary shown as part of a team of developers and using my skills in ios development i took the opportunity to create a lightweight mobile version of the web app to facilitate paying employees through an easy to use interface that allows hr staff to pay individual employees based on data fetched from the server indicating whether an employee has been paid or not the app was designed using storyboards rather than a programmatic approach due to time constraints the app relies on a rest api written in php to talk to the database fetch data from the server and serve it as json objects this data is then converted to swift objects that the app uses to present the data shown below screenshots image1 png sample user to be paid screenshots image2 png sample user paid app details os ios 12 1 device used ipad air 2 languages swift 4 php json interface design and programming neftali samarey q morales neftali samarey
os
MOOC-Coursera-Advanced-Machine-Learning
advanced machine learning coursera mooc specialization national research university higher school of economics yandex img src logo png width 800 height 200 coursera webpage https www coursera org specializations aml syllabus this specialization gives an introduction to deep learning reinforcement learning natural language understanding computer vision and bayesian methods top kaggle machine learning practitioners and cern scientists will share their experience of solving real world problems and help you to fill the gaps between theory and practice upon completion of 7 courses you will be able to apply modern machine learning methods in enterprise and understand the caveats of real world data and settings you will master your skills by solving a wide variety of real world problems like image captioning and automatic game playing throughout the course projects you will gain the hands on experience of applying advanced machine learning techniques that provide the foundation to the current state of the art in ai br table of contents 1 introduction to deep learning certified completion x week 1 optimization x week 2 multilayer perceptron and introduction to tensorflow keras x week 3 convolutional neural networks x week 4 autoencoders and generative adversarial networks x week 5 recurrent neural networks x final project image captioning 2 how to win a data science competition learn from top kagglers certified completion x week 1 feature preprocessing and engineering x week 2 exploratory data analysis validation strategies and data leakages x week 3 metric optimization and advanced feature engineering i x week 4 hyperparameter optimization advanced feature engineering ii and ensembling x final project kaggle competition predict future sales 3 bayesian methods for machine learning certified completion x week 1 refresher on bayesian probability theory x week 2 expectation maximization algorithm and gaussian mixture models x week 3 variational inference and latent dirichlet allocation x week 4 markov chain monte carlo x week 5 bayesian neural networks and variational autoencoders x week 6 gaussian processes and bayesian optimization x final project forensics to generate images of suspects 4 natural language processing on hold x week 1 text classification with linear models x week 2 language modelling with probabilistic graphical models and neural networks x week 3 word embeddings and topic models x week 4 machine translation and sequence to sequence models final project stackoverflow task oriented chatbot 5 practical reinforcement learning certified completion x week 1 introduction to reinforcement learning x week 2 model based reinforcement learning dynamic programming x week 3 model free reinforcement learning sarsa monte carlo q learning x week 4 approximate and deep reinforcement learning deep q learning x week 5 policy gradient reinforcement learning x week 6 advanced topics on exploration and planning br future courses 6 addressing large hadron collider challenges by machine learning on hold 7 deep learning in computer vision on hold
ai
coursera-angular-js
front end javascript frameworks angularjs the hong kong university of science and technology this course concentrates mainly on javascript based front end frameworks and in particular angularjs the most popular among them we will review the model view controller mvc design pattern in the context of angularjs you will be introduced to various aspects of angularjs including two way data binding and angular directives and filters you will then be introduced to angular controllers and scopes ui routing and templates will then be reviewed finally we will look at angular modules and services single page application spa development using angular will also be explored you must have either completed the previous course in the specialization on bootstrap or have a working knowledge of bootstrap to be able to navigate this course at the end of this course you will be familiar with client side javascript frameworks and the mvc design pattern be able to implement single page applications in angularjs be able to use various angular features including directives filters controllers scope and routing be able to implement a functional front end web application using angularjs https www coursera org learn angular js
front_end
iot-predictive-analytics
equipment failure prediction using iot sensor data data science experience is now watson studio although some images in this code pattern may show the service as data science experience the steps and processes will still work this ibm pattern is intended for anyone who wants to experiment learn enhance and implement a new method for predicting equipment failure using iot sensor data sensors mounted on devices like iot devices automated manufacturing like robot arms process monitoring and control equipment etc collect and transmit data on a continuous basis which is time stamped the first step would be to identify if there is any substantial shift in the performance of the system using time series data generated by a single iot sensor for a detailed flow on this topic you can refer to the change point detection ibm pattern https developer ibm com code journey detect change points in iot sensor data once a change point is detected in one key operating parameter of the iot equipment then it makes sense to follow it up with a test to predict if this recent shift will result in a failure of an equipment this pattern is an end to end walk through of a prediction methodology that utilizes multivariate iot data to predict any failure of an equipment bivariate prediction algorithm logistic regression https simple wikipedia org wiki logistic regression is used to implement this prediction predictive packages in python 2 0 software is used in this pattern with sample sensor data loaded into the data science experience cloud all the intermediary steps are modularized and all code open sourced to enable developers to use modify the modules sub modules as they see fit for their specific application when you have completed this pattern you will understand how to read iot sensor data stored in the data base configure the features and target variables for prediction model split the multivariate data into train and test datasets by configuring the ratio train the model using logistic regression and measure the prediction accuracy score the test data and measure prediction accuracy evaluate the model s predictive performance further by computing a confusion matrix rerun experiments by changing the configuration parameters png doc images ipredict arch flow png steps 1 user signs up for ibm watson studio 2 user loads the sample iot sensor time series data to database 3 a configuration file holds all the key parameters for running the iot time series prediction algorithm 4 the prediction algorithm written in python 2 0 jupyter notebook uses the configuration parameters and sensor data from db 5 python notebook runs on spark in ibm watson studio to ensure performance and scalability 6 the outputs of the prediction algorithm is saved in object storage for consumption developers can reuse all components that support the above steps like 1 reading iot sensor data from db 2 function to split test and train datasets build logistic regression models score models compute accuracy metrics like confusion matrix 3 user configurable features and target variables for predicting equipment failures test and train data sets 4 computations of key statistics that help evaluate the predictive capability of the models 5 repeat the experiment by altering the configuration parameters by rerunning the models included components ibm watson studio https www ibm com cloud watson studio analyze data using python jupyter notebook and rstudio in a configured collaborative environment that includes ibm value adds such as managed spark db2 
warehouse on cloud https console bluemix net catalog services db2 warehouse on cloud ibm db2 warehouse on cloud is a fully managed enterprise class cloud data warehouse service powered by ibm blu acceleration ibm cloud object storage https console ng bluemix net catalog services object storage cm sp dw bluemix code devcenter an ibm cloud service that provides an unstructured cloud data store to build and deliver cost effective apps and services with high reliability and fast speed to market featured technologies analytics https developer ibm com code technologies analytics cm ibmcode featured technologies analytics finding patterns in data to derive information data science https developer ibm com code technologies data science cm ibmcode featured technologies data science systems and scientific methods to analyze structured and unstructured data in order to extract knowledge and insights watch the video http img youtube com vi k8upyd3jufs 0 jpg https youtu be k8upyd3jufs steps follow these steps to setup and run this ibm code pattern the steps are described in detail below 1 sign up for watson studio 1 sign up for watson studio 2 create ibm cloud services 2 create ibm cloud services 3 create the jupyter notebook 3 create the jupyter notebook 4 add the data and configuraton file 4 add the data and configuration file 5 run the notebook 5 run the notebook 6 view the results 6 view the results 1 sign up for watson studio sign up for ibm s watson studio https dataplatform cloud ibm com by signing up for watson studio an object storage service will be created in your ibm cloud account png doc images ipredict dsx experience create png 2 create ibm cloud services 2 1 download sample data download the sample data file https github com ibm iot predictive analytics blob master data iot sensor dataset csv from github and store it in your a local folder this will be used to upload to database in the next steps once you are familiar with the entire flow of this pattern you can use your own data for analysis but ensure that your data format is exactly same as provided in the sample data file 2 2 create a db2 warehouse on ibm cloud if you are not already familiar with how to create access data from data store in watson studio get yourself familiarised by following this documentation add data to project https datascience ibm com docs content manage data add data project html topics related to data creation and access that will be specifically helpful in this pattern are as below create connections to databases https datascience ibm com docs content manage data dw08 html load and access data in a notebook https datascience ibm com docs content analyze data load and access data html linkinpage true i click on db2 warehouse on cloud service in the ibm cloud dashboard click open to launch the dashboard db2 warehouse on cloud https console bluemix net catalog services db2 warehouse on cloud png doc images ipredict db2 whse oncloud png note data will loaded into a db2 database instead of reading directly from the csv file this is done to ensure end to end consistency of solution architecture when combined with other iot ibm patterns ii choose an appropriate name for the db2 warehouse service name and choose free pricing plan click on create png doc images ipredict db2 service create png iii click on db2 warehouse on cloud instance on ibm cloud dashboard you must be able to see the db2 warehouse service you created in the previous step click on the service name from the list once you are in the service details 
page click on open button png doc images ipredict db2 object storage png iv load data which is downloaded in step 5 2 1 into a db2 warehouse table by selecting the sample data from my computer browse files png doc images ipredict db2 browse file png v click on next from the panel choose schema and then create a new table png doc images ipredict db2 create table1 png the screenshot above shows dash100002 as the schema name select an appropriate schema name for which you have read write access it is important to specify the name of the db2 table as iot sensor data as it will be referred in data science experience to read data from in later steps 2 3 create db2 warehouse connection in watson studio we need to link the data we just uploaded into the db2 warehouse database with watson studio in order to run the analysis below are the steps to add a connection to access the data in watson studio python jupyter notebook i navigate to watson studio project viewall project pick your project ii choose data services connections menu iii click on the create connection button png doc images ipredict db2 create conn1 png iv give a name for your watson studio data connection v choose service instance as the name of the db2 warehouse service name you created earlier and click create png doc images ipredict db2 create conn2 png vi navigate back to project viewall project pick your project vii click on the find and add data icon 1010 on top right viii click on connection tab the check box next to the db2 warehouse data connection you just created and click apply ix now the new connection is added to your watson studio iot predictive project png doc images ipredict db2 create conn3 png 3 create the jupyter notebook first create a new project in watson studio follow the detailed steps provided in the ibm online documentation for watson studio project creation https datascience ibm com docs content analyze data creating notebooks html or watch a video on using watson studio to create a project https youtu be qsttejchtl0 in watson studio http dataplatform ibm com use the menu on the top to select projects and then default project click on add notebooks upper right to create a notebook select the from url tab enter a name for the notebook optionally enter a description for the notebook enter this notebook url https github com ibm iot predictive analytics blob master notebook watson iotfailure prediction ipynb select the free anaconda runtime click the create button upload the sample json txt watson studio configuration file to watson studio object storage from url below https github com ibm iot predictive analytics blob master configuration iotpredict config json https github com ibm iot predictive analytics blob master configuration iotpredict config txt to upload these files in watson studio object storage go to my projects your project name click on the find and add data icon on top ribbon select the file and upload one by one png doc images ipredict upload file sample png now you must be able to see the uploaded files listed under my projects your project name assets tab png doc images ipredict dsx fileassets png 4 add the data and configuraton file fix up configuration parameter json file name and values go to the notebook in watson studio by navigating to my projects iot predictive under assets tab under notebooks section you will find the notebook you just imported click on the click to edit and lock icon to edit the notebook in jupyter notebook in watson studio for more details on creating editing and 
sharing notebooks in ibm watson studio refer to notebooks watson studio documentation https datascience ibm com docs content analyze data notebooks parent html you can now update the variables that refer to the json configuration file name in the r jupyter notebook this step is necessary only if you had changed the name of the sample json configuration file you had uploaded earlier for any reason png doc images ipredict set json filename png the default json configuration file you uploaded earlier works without any changes with the sample data supplied but if you have a data file with different column names and wanted to customise the model to use these column names you can do so below are the steps to configure the json configuration file to train the predictive models using your custom data file 1 download the json configuration file https github com ibm iot predictive analytics blob master configuration iotpredict config json to your computer local folder 2 open a local copy of the json file in text editor like notepad and edit the watson studio configuration json file https github com ibm iot predictive analytics blob master configuration iotpredict config json 3 update the paramvalue only underlined in red in image below to suit your requirements and save the json file retain the rest of the format and composition of the json file 4 delete the copy of iotpredict config json in watson studio data store if one is already uploaded by you earlier 5 now upload your local edited copy of iotpredict config json by following the steps in section 5 3 above png doc images ipredict json file sample png the descriptions of the parameters that can be configured are as below i features list of variable names that are independent x variables for prediction ii target target variable name that needs to be predicted y with values in binary 1 or 0 form with 1 indicating a failure iii data size percentage of sample data to be reserved for testing in decimal form example 0 7 indicates 70 of the data will be used for training the model and 30 will be used as test data the cell 3 1 2 of the jupyter notebook has a function definition which is shown for illustration purposes these details that have user specific security details are striked out in the screenshots shown below this function will need to be recreated with your user specific access credentials ang target data object in order to do that first delete all pre existing code in cell 3 1 2 of the notebook note the pynb file that you imported have code with dummy credentials for illustration purposes this needs to be replaced by your user specific function with your own access credentials the steps below explain that in section 3 1 2 of jupyter notebook not this readme file insert replace your own object storage file credentials to read the iotpredict config txt configuration file png doc images ipredict insert jsonconn png this step will auto generate a function that reads the data followed by a call to the function as below 1 def get object storage file with credentials alphanumeric characters container filename 2 data 1 get object storage file with credentials alphanumeric characters iotpredictive iotpredict config txt rename the function by removing the alphanumeric characters to get object storage file with credentials container filename delete the second part that calls the function and reads the data this is done elsewhere in the code you have imported go to section 3 2 cell in 7 and do the following 1 update the name of the function in section 3 
2 of the jupyter notebook also to get object storage file with credentials 2 the container name used in the sample code is iotpredictive change the container name if it is different for you png doc images ipredict insert filecreds png 3 update the second parameter iotpredict config txt in sample code to v sampleconfigfilename the modified code in this cell should look like below inputfo get object storage file with credentials iotpredictive v sampleconfigfilename d json load inputfo refer to screen shot above for details for more details revisit the documentation help links provided in beginning of section 5 2 2 add the data and configuration to the notebook use find and add data look for the 10 01 icon and its connections tab you must be able to see your database connection created earlier from there you can click insert to code under the data connection list and add ibm dbr code with connection credentials to the flow png doc images ipredict insert dataconn png note if you don t have your own data and configuration files you can reuse our example in the read iot sensor data from database section look in the data iot sensor dataset csv directory for data file png doc images ipredict insert read data func png 5 run the notebook when a notebook is executed what is actually happening is that each code cell in the notebook is executed in order from top to bottom each code cell is selectable and is preceded by a tag in the left margin the tag format is in x depending on the state of the notebook the x can be a blank this indicates that the cell has never been executed a number this number represents the relative order this code step was executed a this indicates that the cell is currently executing there are several ways to execute the code cells in your notebook one cell at a time select the cell and then press the play button in the toolbar batch mode in sequential order from the cell menu bar there are several options available for example you can run all cells in your notebook or you can run all below that will start executing from the first cell under the currently selected cell and then continue executing all cells that follow at a scheduled time press the schedule button located in the top right section of your notebook panel here you can schedule your notebook to be executed once at some future time or repeatedly at your specified interval 6 view the results the notebook outputs the results in the notebook which can be copied to clipboard the training model prediction accuracy is output in section 5 2 the overall prediction accuracy is output as a percentage png doc images ipredict train model png if you are satisfied with the training model accuracy you can proceed further for scoring the test data using the trained model and analyze the results the confusion matrix is computed on the results of the testing for a dep dive understanding of the model performance png doc images ipredict confusion matrix png overall accuracy percentage gives the overall prediction performance of the model sensitivity and specificity of the model is also calculated along with absolute values of false positives and false negatives to give the data scientist analyst an idea of predictive accuracy in the model it can be checked if these are within thresholds for the specific application of the model or iot equipment troubleshooting see debugging md debugging md license this code pattern is licensed under the apache software license version 2 separate third party code objects invoked within this code pattern 
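To make the train/score/evaluate flow described above concrete, here is a minimal sketch using pandas and scikit-learn. The file name and the feature/target column names are placeholders rather than the actual sensor dataset schema, and this is a conceptual illustration of the pattern's logistic regression approach, not the notebook's own code.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split

# Parameters analogous to those in the iotpredict-config file (names are placeholders).
config = {
    "features": ["sensor_1", "sensor_2", "sensor_3"],  # independent X variables
    "target": "fail",                                   # 1 = failure, 0 = normal
    "data_size": 0.7,                                   # fraction used for training
}

df = pd.read_csv("iot_sensor_dataset.csv")  # or read the table from Db2 Warehouse

X = df[config["features"]]
y = df[config["target"]]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=config["data_size"], random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
y_pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, y_pred))

# Confusion matrix and the derived sensitivity/specificity discussed below.
tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
print("sensitivity:", tp / (tp + fn))
print("specificity:", tn / (tn + fp))
```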
are licensed by their respective providers pursuant to their own separate licenses contributions are subject to the developer certificate of origin version 1 1 dco https developercertificate org and the apache software license version 2 http www apache org licenses license 2 0 txt apache software license asl faq http www apache org foundation license faq html whatdoesitmean
ibmcode sensor-data data-science jupyter-notebook prediction-model iot
server
My_roomate
my roomate https i imgur com weszog4 png about the app my roomate is an app for students workers or anyone looking for a roommate to share rent whether because they have a small budget for living in a good place want to save money or want to meet new people collaborate on development if you want to collaborate clone the repo and open a pull request to get in touch git clone https github com megajjks my roomate git test our mvp your feedback is very important to us try our app at this link https drive google com open id 1mms avb psbuwb7dpflenotsggvlcbk4 https drive google com open id 1mms avb psbuwb7dpflenotsggvlcbk4 app note the app is still in a building stage so you have to grant permission to install it since it is not in the app
front_end
iot-hub-device-update
what is device update for iot hub device update for iot hub is a service that enables you to deploy over the air updates ota for your iot devices device update for iot hub is an end to end platform that customers can use to publish distribute and manage over the air updates for everything from tiny sensors to gateway level devices device update for iot hub also provides controls on how to manage the deployment updates so you are always in control of when and how devices are updated device update for iot hub also provides reporting capabilities so you are always up to date on the state of your devices via integration with iot hub device update for iot hub features provide a powerful and flexible experience including update management ux integrated with azure iot hub gradual update rollout through device grouping and update scheduling controls programmatic apis to enable automation and custom portal experiences at a glance update compliance and status views across heterogenous device fleets support for resilient device updates a b to deliver seamless rollback subscription and role based access controls available through the azure com portal on premise content cache and nested edge support to enable updating cloud disconnected devices detailed update management and reporting tools reference agent build status ubuntu 18 04 amd64 ubuntu 18 04 build status https dev azure com azure device update adu linux client apis build status azure iot hub device update branchname main https dev azure com azure device update adu linux client build latest definitionid 27 branchname main getting started device update for iot hub https aka ms iot hub device update docs getting started with device update agent docs agent reference
server
machine
codecov https codecov io gh sillsdev machine graph badge svg token jssqlmzucu https codecov io gh sillsdev machine machine for net machine is a natural language processing library it is specifically focused on providing tools and techniques that are useful for processing languages that are very resource poor the library is also useful as a foundation for building more advanced language processing techniques the library currently only provides a basic set of algorithms but the goal is to include many more in the future features features installation installation tutorials tutorials features translation machine provides a set of translation engines it currently includes a smt engine based on a fork of thot https github com sillsdev thot and a rule based engine based on the hermitcrab morhphological parser word alignment machine provides implementations of many common statistical word alignment models such as ibm models 1 4 hmm and fastalign these models are implemented in the thot https github com sillsdev thot library morphology machine contains a rule based morphological phonological parser called hermitcrab feature structures machine provides a flexible implementation of feature structures with efficient unification subsumption and priority union operations feature values can be atomic symbols strings or variables annotations an annotation is a tagged portion of data with its associated metadata the metadata for an annotation is represented as a feature structure which is essentially a set of feature value pairs annotations can also be hierarchical an annotation can contain other annotations annotations are normally used on textual data but machine can support annotations on any type of data patterns machine contains a regex like pattern matching engine machine is different than most pattern matching engines which specify patterns that match strings of characters instead machine can specify patterns that match annotations on data an annotation describes the metadata for a part of the data data can be tagged in any way that is desired for example all the words in a document can be tagged with their part of speech because machine works on metadata instead of the underlying data it provides a very powerful flexible pattern matching capability that is difficult to duplicate with normal regular expressions machine compiles patterns in to a format that allows for efficient matching in most cases linear to the number of annotations on the input a pattern in machine supports many of the features that normal regular expressions support such as alternation repetition kleene star optionality capturing groups etc it does not support backtracking as mentioned earlier the patterns are not matched against characters but instead against feature structures since this is how annotations are represented machine does not check for exact matches between feature structures but uses an operation called unification unification is a way of combining two feature structures but only if they are compatible two feature structures are not compatible if they have contradictory values for the same feature an annotation matches a feature structure constraint in a pattern if the feature structures can be unified machine patterns handle matching of hierarchical annotations by searching for matches in a depth first manner patterns are represented as finite state automata fsa fsas provide a natural model for the type of regular languages that machine patterns represent in addition fsas can be determinized so that pattern 
matching can be performed efficiently rules machine also provides a rules module which can be used to specify rules for manipulating annotated data pattern rules provide a mechanism for modifying parts of data that match the specified pattern rule application behavior is specified as code pattern rules can be applied iteratively or simultaneously rules can be aggregated using rule batches and rule cascades rule batches can be used to apply a set of rules disjunctively rule cascades can be used to apply multiple rules in successive order statistical methods probability distributions machine includes various methods for estimating probability distributions from observed data the current discounting techniques include witten bell simple good turing maximum likelihood and lidstone n gram model machine includes a generic n gram model implementation the n gram model is smoothed using modified kneser ney smoothing clustering machine provides implementations of various clustering algorithms these include density based algorithms such as dbscan and optics and hierarchical algorithms such as upgma and neighbor joining sequence alignment pairwise pairwise sequence alignment is implemented using a dynamic programming approach similar to most common implementations of the levenshtein distance it supports substitution insertion deletion expansion and compression it also supports the following alignment modes global local half local and semi global multiple the implementation of multiple sequence alignment is based on the clustal w algorithm https www bimas cit nih gov clustalw clustalw html stemming machine provides an unsupervised stemming algorithm specifically designed for resource poor languages the stemmer is trained using a list of words either derived from a corpus or a lexicon the algorithm can also be used to identify possible affixes it is based on the unsupervised stemming algorithm proposed in harald hammarstr m s doctoral dissertation http aflat org files phd pdf installation machine is available as a set of nuget packages sil machine https www nuget org packages sil machine core library sil machine translation thot https www nuget org packages sil machine translation thot statistical machine translation and word alignment sil machine morphology hermitcrab https www nuget org packages sil machine morphology hermitcrab rule based morphological parsing sil machine webapi https www nuget org packages sil machine webapi asp net core web api middleware machine is also available as a command line tool that can be installed as a net tool dotnet tool install g sil machine tool tutorials if you would like to find out more about how to use machine check out the tutorial jupyter notebooks tokenization samples tokenization ipynb text corpora samples corpora ipynb word alignment samples word alignment ipynb machine translation samples machine translation ipynb development csharpier all c code should be formatted using csharpier https csharpier com the best way to enable support for csharpier is to install the appropriate ide extension https csharpier com docs editors and configure it to format on save development locally install mongodb 6 0 and mongodbcompass and run it on localhost 27017 create the following folders c var lib machine data c var lib machine machine set the following environment variables aspnetcore environment development open machine sln and debug the apiserver now you are running the complete environment where everything is being debugged and the mongodb is exposed develop with serval 
install https github com sillsdev serval in an adjacent folder follow the instructions in serval for development to debug machine and machine job together launch dockercomb in vscode
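The pairwise aligner described above belongs to the C#/.NET Machine library itself; purely as an illustration of the Levenshtein-style dynamic programming it builds on, here is a minimal Python sketch covering only substitution, insertion and deletion. The function name and unit costs are made up for the example and are not the library's API.

```python
# Illustrative sketch only: a plain global (Levenshtein-style) dynamic-programming
# alignment cost, to show the kind of computation a pairwise aligner performs.
# This is NOT the Machine library's API (Machine is a C#/.NET library).

def global_alignment_cost(source, target, sub_cost=1, ins_cost=1, del_cost=1):
    """Return the minimum edit cost of aligning two token sequences."""
    m, n = len(source), len(target)
    # dp[i][j] = cost of aligning source[:i] with target[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        dp[i][0] = dp[i - 1][0] + del_cost
    for j in range(1, n + 1):
        dp[0][j] = dp[0][j - 1] + ins_cost
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            match = 0 if source[i - 1] == target[j - 1] else sub_cost
            dp[i][j] = min(
                dp[i - 1][j - 1] + match,   # substitution / match
                dp[i - 1][j] + del_cost,    # deletion from source
                dp[i][j - 1] + ins_cost,    # insertion into source
            )
    return dp[m][n]

print(global_alignment_cost("kitten", "sitting"))  # 3
```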
language-translation machine-translation natural-language-processing
ai
Mobile-App-Development-Project-Front-end
mobile app development project front end now available in youtube h3 https www youtube com watch v fwrgvingr54 t 4s h3 mad geeks 12345678 https github com se laps mobile app development project front end assets 87580847 613139e5 1a7f 4acc a0ca 85a579bbdf68 introduction our focused solution for a problem in our university is waiting que in the canteen we have 4 canteens in our university all the students get lunch break at 12 00p m when we go for a canteen at that time there is a huge line in the cashier to order the food so we come up with a solution for reduce the cashier que and increase the productive sales by introducing an online food ordering app our food ordering app have special features when we compared it with normal restaurant app only the people who have permission to enter the university premises can use the application we also focus on how to minimize food wastage in our canteens by launching this app all the students lecturers staff and everyone inside the university will be able to select a meal without walking far and order online if the university allows to add a delivery service we also can add a delivery function to our app as well with a high secured payment gateway all the users including students and lecturers can easily place through the app and pick their meals from canteens without wasting a second in a canteen que features user authentication to ensure that only university members can use the app user authentication will be implemented using university email addresses or other secure methods online food ordering users will be able to place food orders online through the app providing them with the convenience of ordering from anywhere on campus payment options the app will offer multiple payment options including debit card payments and the ability to pay when the food is delivered giving users flexibility and convenience real time menu updates canteens will have the capability to add update or remove dishes from the menu in real time through a secure admin panel users will receive notifications about these changes ensuring they always have access to the latest menu information feedback and suggestions users will have a dedicated section in the app to provide feedback and suggestions creating a direct line of communication between customers and canteen management this feedback can be used to improve the quality of service and menu offerings order tracking users can track the status of their orders in real time reducing wait times and uncertainty promotions and special offers canteens can use the app to promote special offers discounts and events to attract more customers benefits for university students apps for ordering food online are simple to use and make it simple for customers to browse menus choose dishes and place orders by using this mobile application to place a food order you can significantly reduce your waiting time in order to make sure customers are aware of the total cost before placing their orders apps frequently display menu prices discounts special dishes and total fees upfront customers can choose the top rated restaurants and foods by reading user generated reviews and ratings online food delivery services offer a variety of payment options including cash and credit debit cards to accommodate different customer preferences feedback from customers on their orders can help restaurants improve their offerings they can also contact customer service for support with any problems activity diagram activity drawio https github com se laps mobile app 
development project front end assets 97075043 c59bf9ec 0641 4f52 82f4 396cb41aa298 technology used the dine delish app is developed using the following technologies frontend flutter backend firebase team members lahiru senavirathna se laps https github com se laps project lead frontend backend developer thevindu ransara trsrathnayaka https github com trsrathnayaka ui ux designer frontend backend developer wasana muthumali muthumaliperera https github com muthumaliperera ui ux designer frontend developer prasitha samaarachchi prasitha7 https github com prasitha7 frontend developer oshadi savidya oshadisavidya https github com oshadisavidya frontend developer ishanki nipunika ishanki88 https github com ishanki88 frontend developer deshan narayana deshanbsn https github com deshanbsn frontend developer conclution our application nfcourt enables a platform for students lecturers and all the other nsbm users to order their meals without wasting time in a line nfcourt has a secured payment gateway so all the users can add their bank account details without any doubts by launching the app we hope to increase the sales of canteens and reduce the waiting que as one second is important to academic students and lecturers to more than waiting in a food line in first launch we will introduce features including an ai bot in the meantime we will add more advanced features with the feedbacks and suggestions that receives to the app from users thank you thank you for your interest in the dine delish mobile app we hope you find it valuable in enhancing your university experience if you have any quections feedback or suggestions please dont t hesitate to contact us if you like this project just click and share it with others
front_end
mini-project-ty
wceachievo mini project ii a platform for wce to showcase activities and achievements under the guidance of dr p k kharat contributors mr shah prajwal utkarsh 21610009 ms patil yashashwi kailas 21610020 ms umare shravani ashok 21610075 ms ladda vaishnavi laxmikant 21610076 ms bhosale swarada pravin 21610077 mr choudhary gopal jalaram 21610080 introduction introducing wceachievo an innovative web based application aimed at improving how department activities and achievements are showcased built on the robust mern stack wceachievo streamlines the process of generating weekly reports that highlight new student accomplishments teacher achievements and departmental activities this platform serves as a dynamic bridge connecting students faculty and authorities in a cohesive and engaging manner
cloud
CVBookExercise
cvbookexercise my attempted answers to the exercises from programming computer vision with python by jan erik solem as i was not able to find examples to these exercises anywhere while reading through the book i thought it may be a good idea to share my answers even though they are not perfect that s all if it is actually not a good idea for any reason please let me know and i will try to fix it although i have tried my best these answers may not be correct also they are not optimal either so don t use them as answers to check against remember they were done by someone who s just trying to learn this topic the scripts require external modules which are explained in the book please download them from the support page of the link http programmingcomputervision com some scripts need test data not described in the book so i downloaded them from here and there as these files belong to someone else i did not include those files in this repository please download them from the url described in each file the files are in jupyter notebook format the python version used is 2 7 because that s the version used in the book
ai
Machine-Learning-by-Andrew-Ng-in-Python
machine learning by andrew ng in python documenting my python implementation of andrew ng s machine learning course linear regression https github com benlau93 machine learning by andrew ng in python tree master linearregression br logistic regression https github com benlau93 machine learning by andrew ng in python tree master logisticregression br neural networks https github com benlau93 machine learning by andrew ng in python tree master neuralnetworks br bias vs variance https github com benlau93 machine learning by andrew ng in python tree master bias vs variance br support vector machines https github com benlau93 machine learning by andrew ng in python tree master supportvectormachines br unsupervised learning https github com benlau93 machine learning by andrew ng in python tree master kmeansclustering pca br anomaly detection https github com benlau93 machine learning by andrew ng in python tree master anomaly 20detection br
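As a quick taste of the first topic in the list, here is a minimal sketch, not taken from this repository, of batch gradient descent for univariate linear regression; the learning rate, iteration count and toy data are illustrative assumptions.

```python
# Batch gradient descent for linear regression (illustrative sketch only).
import numpy as np

def gradient_descent(X, y, alpha=0.01, iterations=1500):
    """Fit theta for h(x) = X @ theta by minimizing mean squared error."""
    m = len(y)
    theta = np.zeros(X.shape[1])
    for _ in range(iterations):
        error = X @ theta - y              # predictions minus targets
        gradient = (X.T @ error) / m       # derivative of the cost J(theta)
        theta -= alpha * gradient          # simultaneous parameter update
    return theta

# toy data: y is roughly 2 + 3x plus noise
x = np.linspace(0, 10, 50)
X = np.column_stack([np.ones_like(x), x])  # add the intercept column
y = 2 + 3 * x + np.random.default_rng(0).normal(0, 0.5, size=x.shape)
print(gradient_descent(X, y, alpha=0.02, iterations=5000))  # roughly [2, 3]
```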
ai
NLP-Papers
nlp papers distributed word representations distributed word representations distributed sentence representations distributed sentence representations entity recognition sequence tagging entity recognition language model lm for pre training language model machine translation machine translation question answering machine reading comprehension question answering recommendation systems recommendation systems relation extraction relation extraction sentences matching natural language inference textual entailment sentences matching text classification sentiment classification text classification materials toolkits corpus materials papers and notes distributed word representations 2017 11 faruqui and dyer 2014 improving vector space word representations using multilingual correlation pdf http repository cmu edu lti 31 note distributed 20representations 2017 11 faruqui 20and 20dyer 20 202014 20 20improving 20vector 20space 20word 20representations 20using 20multilingual 20correlation note md maaten and hinton 2008 visualizing data using t sne pdf http www jmlr org papers v9 vandermaaten08a html pdf annotated distributed 20representations 2017 11 maaten 20and 20hinton 20 202008 20 20visualizing 20data 20using 20t sne maaten 20and 20hinton 20 202008 20 20visualizing 20data 20using 20t sne pdf note distributed 20representations 2017 11 maaten 20and 20hinton 20 202008 20 20visualizing 20data 20using 20t sne note md ling et al 2015 finding function in form compositional character models for open vocabulary word representation pdf https arxiv org abs 1508 02096 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations 2017 11 finding 20function 20in 20form 20compositional 20character 20models finding 20function 20in 20form 20compositional 20character 20models pdf note https github com llhthinker nlp papers blob master distributed 20representations 2017 11 finding 20function 20in 20form 20compositional 20character 20models note md bojanowski et al 2016 enriching word vectors with subword information pdf https arxiv org abs 1607 04606 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations 2017 11 enriching 20word 20vectors 20with 20subword 20information enriching 20word 20vectors 20with 20subword 20information pdf note https github com llhthinker nlp papers blob master distributed 20representations 2017 11 enriching 20word 20vectors 20with 20subword 20information note md 2017 12 bengio and sen cal 2003 quick training of probabilistic neural nets by importance sampling pdf http www iro umontreal ca lisa pointeurs senecal aistats2003 pdf pdf annotated https github com llhthinker nlp papers blob master distributed 20representations 2017 12 quick 20training 20of 20probabilistic 20neural 20nets 20by 20importance 20sampling quick 20training 20of 20probabilistic 20neural 20nets 20by 20importance 20sampling pdf note https github com llhthinker nlp papers blob master distributed 20representations 2017 12 quick 20training 20of 20probabilistic 20neural 20nets 20by 20importance 20sampling note md references word2vec tensorflow https github com llhthinker udacity deeplearning blob master 5 word2vec ipynb subword based word vector https github com facebookresearch fasttext chinese word vectors https github com embedding chinese word vectors tencent ai lab embedding corpus for over 8 million chinese words and phrases https ai tencent com ailab nlp en embedding html distributed sentence representations 2017 11 le and mikolov 2014 distributed 
representations of sentences and documents pdf http proceedings mlr press v32 le14 pdf pdf annotated https github com llhthinker nlp papers blob master distributed 20representations 2017 11 distributed 20representations 20of 20sentences 20and 20documents distributed 20representations 20of 20sentences 20and 20documents pdf note https github com llhthinker nlp papers blob master distributed 20representations 2017 11 distributed 20representations 20of 20sentences 20and 20documents note md 2018 12 li and hovy 2014 a model of coherence based on distributed sentence representation pdf http www aclweb org anthology d14 1218 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding a 20model 20of 20coherence 20based 20on 20distributed 20sentence 20representation pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md a model of coherence based on distributed sentence representation kiros et al 2015 skip thought vectors pdf http papers nips cc paper 5950 skip thought vectors pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding skip thought 20vectors pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md skip thought vectors hill et al 2016 learning distributed representations of sentences from unlabelled data pdf https arxiv org abs 1602 03483 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding learning 20distributed 20representations 20of 20sentences 20from 20unlabelled 20data pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md learning distributed representations of sentences from unlabelled data arora et al 2016 a simple but tough to beat baseline for sentence embeddings pdf https openreview net forum id syk00v5xx pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding a 20simple 20but 20tough to beat 20baseline 20for 20sentence 20embeddings pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md a simple but tough to beat baseline for sentence embeddings pagliardini et al 2017 unsupervised learning of sentence embeddings using compositional n gram features sent2vec pdf https arxiv org abs 1703 02507 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding unsupervised 20learning 20of 20sentence 20embeddings 20using 20compositional 20n gram 20features pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md unsupervised learning of sentence embeddings using compositional n gram features logeswaran et al 2018 an efficient framework for learning sentence representations quick thought vectors pdf https arxiv org abs 1803 02893 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding an 20efficient 20framework 20for 20learning 20sentence 20representations pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md an efficient framework for learning sentence representations 2019 01 wieting et al 2015 towards universal paraphrastic sentence embeddings pdf https arxiv org abs 1511 08198 pdf 
annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding towards 20universal 20paraphrastic 20sentence 20embeddings pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md towards universal paraphrastic sentence embeddings adi et al 2016 fine grained analysis of sentence embeddings using auxiliary prediction tasks pdf https arxiv org abs 1608 04207 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding fine grained 20analysis 20of 20sentence 20embeddings 20using 20auxiliary 20prediction 20tasks pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md fine grained analysis of sentence embeddings using auxiliary prediction tasks conneau et al 2017 supervised learning of universal sentence representations from natural language inference data infersent pdf https arxiv org abs 1705 02364 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding supervised 20learning 20of 20universal 20sentence 20representations 20from 20natural 20language 20inference 20data pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md supervised learning of universal sentence representations from natural language inference data cer et al 2018 universal sentence encoder pdf https arxiv org abs 1803 11175 pdf annotated https github com llhthinker nlp papers blob master distributed 20representations sentence embedding universal 20sentence 20encoder pdf note https github com llhthinker nlp papers blob master distributed 20representations sentence embedding note md universal sentence encoder references awesome sentence embedding a curated list of pretrained sentence and word embedding models https github com separius awesome sentence embedding senteval evaluation toolkit for sentence embeddings https github com facebookresearch senteval doc2vec gensim https github com jhlau doc2vec skip thought vectors https github com tensorflow models tree master research skip thoughts sif sentence embedding by smooth inverse frequency weighting scheme https github com princetonml sif quick thought vectors https github com lajanugen s2v sent2vec https github com epfml sent2vec infersent https github com facebookresearch infersent entity recognition 2018 10 lample et al 2016 neural architectures for named entity recognition pdf https arxiv org abs 1603 01360 ma and hovy 2016 end to end sequence labeling via bi directional lstm cnns crf pdf https arxiv org abs 1603 01354 yang et al 2017 transfer learning for sequence tagging with hierarchical recurrent networks pdf https arxiv org abs 1703 06345 peters et al 2017 semi supervised sequence tagging with bidirectional language models pdf https arxiv org abs 1705 00108 shang et al 2018 learning named entity tagger using domain specific dictionary pdf https arxiv org abs 1809 03599 references chinesener tensorflow https github com zjy ucas chinesener flair pytorch https github com zalandoresearch flair language model 2017 11 bengio et al 2003 a neural probabilistic language model pdf http www jmlr org papers v3 bengio03a html press and wolf 2016 using the output embedding to improve language model pdf https arxiv org abs 1608 05859 2019 02 peters et al 2018 deep contextualized word representations elmo pdf https arxiv org abs 1802 05365 note https 
zhuanlan zhihu com p 38254332 howard and ruder 2018 universal language model fine tuning for text classification ulmfit pdf http www aclweb org anthology p18 1031 radford et al 2018 improving language understanding by generative pre training pdf https www cs ubc ca amuham01 ling530 papers radford2018improving pdf devlin et al 2018 bert pre training of deep bidirectional transformers for language understanding pdf https arxiv org abs 1810 04805 references blog the illustrated bert elmo and co how nlp cracked transfer learning http jalammar github io illustrated bert elmo elmo allennlp https allennlp org elmo pre trained elmo representations for many languages https github com hit scir elmoformanylangs quick start training an imdb sentiment model with ulmfit https docs fast ai text html quick start training an imdb sentiment model with ulmfit finetune transformer lm code and model for the paper improving language understanding by generative pre training https github com openai finetune transformer lm bert google research bert https github com google research bert officical tensorflow code and pre trained models for bert huggingface transformers https github com huggingface transformers provides state of the art general purpose architectures bert gpt 2 roberta xlm distilbert xlnet ctrl for nlu and nlg using tensorflow 2 0 and pytorch awesome bert https github com jiakui awesome bert bert nlp papers applications and github resources bert github machine translation 2017 12 oda et al 2017 neural machine translation via binary code predict pdf https arxiv org abs 1704 06918 note machine 20translation oda 20et 20al 20 202017 20 20neural 20machine 20translation 20via 20binary 20code 20prediction note md kalchbrenner et al 2016 neural machine translation in linear time pdf https arxiv org abs 1610 10099 pdf annotated machine 20translation kalchbrenner 20et 20al 20 202016 20 20neural 20machine 20translation 20in 20linear 20time kalchbrenner 20et 20al 20 202016 20 20neural 20machine 20translation 20in 20linear 20time pdf note machine 20translation kalchbrenner 20et 20al 20 202016 20 20neural 20machine 20translation 20in 20linear 20time note md 2018 05 sutskever et al 2014 sequence to sequence learning with neural networks pdf http papers nips cc paper 5346 sequence to sequence learning with neural cho et al 2014 learning phrase representations using rnn encoder decoder for nmt pdf https arxiv org abs 1406 1078 bahdanau et al 2014 nmt by jointly learning to align and translate pdf https arxiv org abs 1409 0473 luong et al 2015 effective approaches to attention based nmt pdf https arxiv org abs 1508 04025 2018 06 gehring et al 2017 convolutional sequence to sequence learning pdf https arxiv org abs 1705 03122 vaswani et al 2017 attention is all you need pdf https arxiv org abs 1706 03762 note1 the illustrated transformer http jalammar github io illustrated transformer note2 the annotated transformer http nlp seas harvard edu 2018 04 03 attention html references opennmt py in pytorch https github com opennmt opennmt py nmt in tensorflow https github com tensorflow nmt mt reading list https github com thunlp mt mt reading list question answering 2018 03 wang and jiang 2016 machine comprehension using match lstm and answer pointer pdf https arxiv org abs 1608 07905 seo et al 2016 bidirectional attention flow for machine comprehension pdf https arxiv org abs 1611 01603 cui et al 2016 attention over attention neural networks for reading comprehension pdf https arxiv org abs 1607 04423 2018 04 clark and 
gardner 2017 simple and effective multi paragraph reading comprehension pdf https arxiv org abs 1710 10723 wang et al 2017 gated self matching networks for reading comprehension and question answering pdf http www aclweb org anthology p17 1018 yu et al 2018 qanet combining local convolution with global self attention for reading comprehension pdf https arxiv org abs 1804 09541 references dureader https github com baidu dureader squad https rajpurkar github io squad explorer ms marco http www msmarco org leaders aspx https zhuanlan zhihu com p 22671467 rcpapers must read papers on machine reading comprehension https github com thunlp rcpapers recommendation systems 2019 05 rendle s 2010 factorization machines pdf https ieeexplore ieee org stamp stamp jsp tp arnumber 5694074 note https www cnblogs com pinard p 6370127 html cheng et al 2016 wide deep learning for recommender systems pdf https dl acm org citation cfm id 2988454 guo et al 2017 deepfm a factorization machine based neural network for ctr prediction pdf https arxiv org abs 1703 04247 he and chua 2017 neural factorization machines for sparse predictive analytics pdf https dl acm org citation cfm id 3080777 references 10 ctr https zhuanlan zhihu com p 63186101 relation extraction 2018 08 mintz et al 2009 distant supervision for relation extraction without labeled data pdf https dl acm org citation cfm id 1690287 zeng et al 2015 distant supervision for relation extraction via piecewise convolutional neural networks pdf http www aclweb org anthology d15 1203 zhou et al 2016 attention based bidirectional long short term memory networks for relation classification pdf http www aclweb org anthology p16 2034 lin et al 2016 neural relation extraction with selective attention over instances pdf http www aclweb org anthology p16 1200 2018 09 ji et al 2017 distant supervision for relation extraction with sentence level attention and entity descriptions pdf http www aaai org ocs index php aaai aaai17 paper download 14491 14078 levy et al 2017 zero shot relation extraction via reading comprehension pdf https arxiv org abs 1706 04115 references opennre https github com thunlp opennre nrepapers must read papers on neural relation extraction nre https github com thunlp nrepapers awesome relation extraction https github com roomylee awesome relation extraction entity relation extraction https github com yuanxiaosc entity relation extraction sentences matching 2017 12 hu et al 2014 convolutional neural network architectures for matching natural language sentences pdf https papers nips cc paper 5550 convolutional neural network architectures for matching natural language sentences pdf pdf annotated https github com llhthinker nlp papers blob master sentences 20matching 2017 12 20convolutional 20matching 20model 20 convolutional 20neural 20network 20architectures 20for 20matching 20natural 20language 20sentences pdf note https github com llhthinker nlp papers blob master sentences 20matching 2017 12 convolutional 20matching 20model note md 2018 07 nie and bansal 2017 shortcut stacked sentence encoders for multi domain inference pdf https arxiv org abs 1708 02312 note sentences 20matching note md shortcut stacked sentence encoders for multi domain inference wang et al 2017 bilateral multi perspective matching for natural language sentences pdf https arxiv org abs 1702 03814 note sentences 20matching note md bilateral multi perspective matching for natural language sentences tay et al 2017 a compare propagate architecture with alignment factorization 
for natural language inference pdf https arxiv org abs 1801 00102 chen et al 2017 enhanced lstm for natural language inference pdf https arxiv org abs 1609 06038 note sentences 20matching note md enhanced lstm for natural language inference ghaeini et al 2018 dr bilstm dependent reading bidirectional lstm for natural language inference pdf https arxiv org abs 1802 05577 references the stanford natural language inference snli corpus https nlp stanford edu projects snli a curated list of papers dedicated to neural text semantic matching https github com ntmc community awesome neural models for semantic match matchzoo tensorflow https github com ntmc community matchzoo pytorch https github com ntmc community matchzoo py anyq faq simnet https github com baidu anyq kaggle quora question pairs https github com houjp kaggle quora question pairs text classification 2017 09 joulin et al 2016 bag of tricks for efficient text classification pdf https arxiv org abs 1607 01759v3 pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 09 bag 20of 20tricks 20for 20efficient 20text 20classification bag 20of 20tricks 20for 20efficient 20text 20classification pdf note https github com llhthinker nlp papers blob master text 20classification 2017 09 bag 20of 20tricks 20for 20efficient 20text 20classification note md 2017 10 kim 2014 convolutional neural networks for sentence classification pdf https arxiv org abs 1408 5882 pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 10 convolutional 20neural 20networks 20for 20sentence 20classification convolutional 20neural 20networks 20for 20sentence 20classification pdf note https github com llhthinker nlp papers blob master text 20classification 2017 10 convolutional 20neural 20networks 20for 20sentence 20classification note md zhang and wallace 2015 a sensitivity analysis of and practitioners guide to convolutional neural networks for sentence classification pdf https arxiv org abs 1510 03820 pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 10 a 20sensitivity 20analysis 20of 20 and 20practitioners e2 80 99 20guide 20to 20convolutional a 20sensitivity 20analysis 20of 20 and 20practitioners e2 80 99 20guide 20to 20convolutional pdf note https github com llhthinker nlp papers blob master text 20classification 2017 10 a 20sensitivity 20analysis 20of 20 and 20practitioners e2 80 99 20guide 20to 20convolutional note md zhang et al 2015 character level convolutional networks for text classification pdf http papers nips cc paper 5782 character level convolutional networks fo pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 10 character level 20convolutional 20networks 20for 20text 20classification character level 20convolutional 20networks 20for 20text 20classification pdf note https github com llhthinker nlp papers blob master text 20classification 2017 10 character level 20convolutional 20networks 20for 20text 20classification note md lai et al 2015 recurrent convolutional neural networks for text classification pdf http www nlpr ia ac cn cip liukang liukangpagefile recurrent 20convolutional 20neural 20networks 20for 20text 20classification pdf pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 10 recurrent 20convolutional 20neural 20networks 20for 20text 20classification recurrent 20convolutional 20neural 20networks 20for 20text 20classification pdf note https github 
com llhthinker nlp papers blob master text 20classification 2017 10 recurrent 20convolutional 20neural 20networks 20for 20text 20classification note md yang et al 2016 hierarchical attention networks for document classification pdf https www aclweb org anthology n16 1174 2017 11 iyyer et al 2015 deep unordered composition rivals syntactic methods for text classification pdf http www aclweb org anthology p15 1162 pdf annotated https github com llhthinker nlp papers blob master text 20classification 2017 11 deep 20unordered 20composition 20rivals 20syntactic 20methods 20for 20text 20classification deep 20unordered 20composition 20rivals 20syntactic 20methods 20for 20text 20classification pdf note https github com llhthinker nlp papers blob master text 20classification 2017 11 deep 20unordered 20composition 20rivals 20syntactic 20methods 20for 20text 20classification note md 2019 04 aspect level sentiment classification wang et al 2016 attention based lstm for aspect level sentiment classification pdf https www aclweb org anthology d16 1058 tang et al 2016 aspect level sentiment classification with deep memory network pdf https arxiv org abs 1605 08900 chen et al 2017 recurrent attention network on memory for aspect sentiment analysis pdf https www aclweb org anthology d17 1047 xue and li 2018 aspect based sentiment analysis with gated convolutional networks pdf https arxiv org abs 1805 07043 references fasttext https github com facebookresearch fasttext text classification https github com brightmart text classification pytorchtext https github com chenyuntc pytorchtext materials neural networks for nlp cs11 747 spring 2019 cmu http www phontron com class nn4nlp2019 deep learning an overview of gradient descent optimization algorithms http ruder io optimizing gradient descent focal loss https zhuanlan zhihu com p 49981234 nlp progress https github com sebastianruder nlp progress awesome chinese nlp https github com crownpku awesome chinese nlp stateoftheart ai https www stateoftheart ai funnlp https github com fighting41love funnlp stanfordnlp official stanford nlp python library for many human languages https github com stanfordnlp stanfordnlp browse state of the art https paperswithcode com sota corpus chinesenlpcorpus https github com insanelife chinesenlpcorpus nlp chinese corpus https github com brightmart nlp chinese corpus clue https github com cluebenchmark clue
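A common simple baseline in the sentence-embedding work listed above is to represent a sentence as the average of its word vectors and compare sentences by cosine similarity; the toy sketch below illustrates only that generic baseline, not any specific paper's method, with random vectors standing in for pretrained word2vec, GloVe or fastText embeddings.

```python
# Toy illustration of the averaged-word-vector sentence embedding baseline.
import numpy as np

rng = np.random.default_rng(0)
words = "a the quick brown fox dog jumps sleeps".split()
vectors = {w: rng.normal(size=50) for w in words}   # stand-ins for pretrained vectors

def sentence_vector(sentence):
    tokens = [t for t in sentence.lower().split() if t in vectors]
    return np.mean([vectors[t] for t in tokens], axis=0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine(sentence_vector("the quick brown fox"),
             sentence_vector("a quick brown dog")))  # higher for overlapping sentences
```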
nlp deep-learning
ai
ph-ee-operations-web
payment hub ee ui ui component for the payment hub ee application this project is based on the openmf web app to provide the same ux as we have for fineract 1 x getting started 1 ensure you have the following installed in your system git https git scm com downloads npm https nodejs org en download 2 install angular cli https github com angular angular cli globally npm install g angular cli 16 2 3 3 clone the project locally into your system git clone https github com openmf ph ee operations web git 4 cd into project root directory and make sure you are on the master branch 5 install the dependencies npm install 6 before to run the app set the environment variables as you need it please see the environment variable details above 7 to preview the app run the following command and navigate to http localhost 4200 ng serve the application is using the demo server with basic authentication by default the credentials for the same are username mifos password password development server run ng serve for a dev server navigate to http localhost 4200 the app will automatically reload if you change any of the source files code scaffolding run ng generate component component name to generate a new component you can also use ng generate directive pipe service class guard interface enum module build run ng build to build the project the build artifacts will be stored in the dist directory use the prod flag for a production build environment variables you can set the parameters now using environment variables please modify them accordingly your needs serverurl authserverurl etc the environment variables to be set are ph ops backend server url setting for the payment hub server url to operations backend services ph vou backend server url setting for the payment hub server url to vouchers backend services ph act backend server url setting for the payment hub server url to account management backend services ph platform tenant id setting for the platform tenant identifier used in the apis calls default value phdefault ph oauth enabled boolean value to enable or disable the oauth authentication ph oauth server url setting for the server url to oauth services ph oauth basic auth boolean value to enable or disable the basic authentication for oauth ph oauth basic auth token setting the authentication token for oauth authentication ph oauth type set the oauth authentication type currently is only supported keycloak ph oauth realm for the oauth authentication with keycloak we need to define the realm ph oauth client id for the oauth authentication with keycloak we need to define the client identifier to be used ph oauth client secret for the oauth authentication with keycloak we could to define the client secret to be used ph default language setting for languages i18n still under development default language to be used by default en english us ph supported languages language list of available languages splited by colon like en fr es profiles there are 3 profiles at the moment dev default environment ts prod environment prod ts kubernetes environment kubernetes ts you can define various settings based on these profiles usage npm build server configuration prod kubernetes to build the application with the kubernetes profile npm build configuration kubernetes docker compose it is possible to do a one touch installation of mifos x web app using containers aka docker fineract now packs the mifos community app web ui in it s docker deploy as prerequisites you must have docker and docker compose installed on your 
machine see docker install https docs docker com install and docker compose install https docs docker com compose install now to run a new mifosx web app instance you can simply 1 git clone https github com openmf ph ee operations web git cd ph ee operations web 1 for windows use git clone https github com openmf ph ee operations web git config core autocrlf input cd ph ee operations web 2 docker compose up d 3 access the webapp on http localhost 4200 in your browser mocked backend to use mocked responses please do the following modifications authentication service change login logincontext logincontext this alertservice alert type authentication start message please wait this rememberme logincontext remember this storage this rememberme localstorage sessionstorage this authenticationinterceptor settenantid logincontext tenant let httpparams new httpparams httpparams httpparams set username logincontext username httpparams httpparams set password logincontext password httpparams httpparams set tenantidentifier logincontext tenant if environment oauth enabled true httpparams httpparams set grant type password if environment oauth basicauth this authenticationinterceptor setauthorization basic environment oauth basicauthtoken return this http disableapiprefix post environment oauth serverurl oauth token params httpparams pipe map tokenresponse oauth2token todo fix userdetails api this storage setitem this oauthtokendetailsstoragekey json stringify tokenresponse this onloginsuccess username logincontext username accesstoken tokenresponse access token authenticated true tenantid logincontext tenant as any return of true else return this http post authentication params httpparams pipe map credentials credentials this onloginsuccess credentials return of true to login logincontext logincontext this alertservice alert type authentication start message please wait this rememberme logincontext remember this storage this rememberme localstorage sessionstorage this authenticationinterceptor settenantid logincontext tenant let httpparams new httpparams httpparams httpparams set username logincontext username httpparams httpparams set password logincontext password httpparams httpparams set tenantidentifier logincontext tenant this onloginsuccess as any return of true auto
front_end
Flutter-Tutorial-1.3-WebView-Navigation-Controls-Javascript-communication
flutter tutorial 1 3 webview navigation controls javascript communication introduction welcome to himdeve development where we are preparing the best tutorials to make your mobile app development easier and more efficient goal 1 we add navigation buttons go back and forward in your browsing history and refresh button to reload the page 2 in addition we will learn how to communicate directly with the javascript page loaded in webview how to write and call javascript for webview in flutter tutorial en https himdeve com flutter tutorials flutter tutorial 1 3 webview navigation controls javascript communication tutorial sk cz https himdeve com flutter sk cz tutorialy flutter tutorial 1 3 webview navigation controls javascript communication
front_end
SpamClassifierOfficeHours
spamclassifierofficehours the code from my codementor office hours on an introduction to machine learning and natural language processing prerequisites nltk weka http www cs waikato ac nz ml weka to extract the features run python feature extract py this will generate an arff file to be opened in weka i encourage all of you to fork this repo and add your own features to see if you can improve our results feel free to get in contact with me via codementor https www codementor io benjamincohen with any questions comments or concerns thanks for attending my office hours ben
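For readers who want a feel for what the feature extraction step produces, here is a minimal sketch, not the repository's feature_extract.py, that turns a few made-up messages into numeric features and writes them to an ARFF file Weka can open; the feature set, word list and file name are assumptions for illustration.

```python
# Minimal feature extraction sketch writing an ARFF file for Weka (illustrative only).

MESSAGES = [
    ("win a free prize now", "spam"),
    ("are we still meeting for lunch", "ham"),
    ("free free free click here", "spam"),
]

SPAM_WORDS = {"free", "win", "prize", "click"}

def features(text):
    tokens = text.lower().split()              # nltk.word_tokenize could be used instead
    return [
        len(tokens),                           # message length in tokens
        sum(t in SPAM_WORDS for t in tokens),  # count of "spammy" words
    ]

with open("messages.arff", "w") as f:
    f.write("@relation spam\n")
    f.write("@attribute num_tokens numeric\n")
    f.write("@attribute spam_word_count numeric\n")
    f.write("@attribute class {spam,ham}\n")
    f.write("@data\n")
    for text, label in MESSAGES:
        f.write(",".join(str(v) for v in features(text)) + f",{label}\n")
```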
ai
Designing-Advanced-Data-Architectures-for-Business-Intelligence
designing advanced data architectures for business intelligence every assignemnt has it s own readme for detailed explanation assignment1 sakila database description 1 formulated advanced sql queries on sakila database 2 implemented data visualizations on power bi assignment 2 chinook database description 1 converted er model to dimensional model listed facts dimensions made a list of tables to be combined created date and calendar dimensions applied scd s slowly changing dimension on respective tables created tables with surrogate sks nks fks determined table attributes and performed source to target mappings 2 created data model in er studio data architect 3 generated ddl created tables and uploaded data in the target database viz sql server mysql postgresql and oracle 11g 4 implemented visualization reports and dashboards in power bi assignment 3 nypd description 1 created dimensional data model using er studio data architect 2 generated ddl and created schema in nypd database mysql 3 performed data profiling and cleansing processes 4 loaded data into data model from respective csv s using talend 5 implemented visualization reports in power bi and tableau using nypd database project1 adventure works purchasing dw description 1 analyzed data for source target mappings and performed data profiling on adventure works 2017 database 2 created dimensional data model in er studio data architect and performed ddl operations on the target database 3 implemented error handling and inserted the unwanted data into reject tables along with their respective reject codes and reject reasons 4 performed data cleansing and loaded data into adventure works purchasing dw using alteryx and talend 5 implemented visualization reports and interactive dashboards on power bi and tableau answering business questions to gain better insights final project imdb data and analysis description 1 designed dimensional data models in er studio data architect 2 created staging tables using sql scripts on microsoft sql server 3 maintained sor system of record table to maintain authorized sources for a particular data subject 4 performed data profiling and loaded data from tsv csv files into the staging tables using talend 5 used these staging tables to populate the dimensional tables using talend 6 implemented dashboards and visualizations on tableau and power bi for answering business questions and to gain better insights
cloud
Blockchain
blockchain a simple blockchain by hawshemi live website https blockchain up railway app video demo https youtu be ieposdoyipg this project was submitted as a cs50 final project website features registration and logins with password hashing password validation check handle errors with codes check the blockchain integrity make a transaction on the blockchain review the transaction history table ability to export to xls ability to reset the blockchain logout the blockchain stores the block information in a json file read the json and view the blocks validate the blocks using their hash store the json and blocks in a database database and tables the database consists of 3 tables the users table containing userid username and hashed password the sequence table that stores the number of users and transactions the transaction table contains all block information including number receiver sender amount timestamp and transaction id users can see the blockchain integrity on the main page after login but they can only see their transactions in the history tab tech stack the site was built using flask python sqlite3 for the database css for styling and some javascript for button actions exports and background particles how to run this repo is hosted on blockchain up railway app https blockchain up railway app for a local run 0 clone the github project 1 go to main py and at the end of the file edit if name main app run debug true host 0 0 0 0 port os getenv port default 5000 to if name main app run debug true 2 create a python virtual environment and then run pip install r requirements txt and then python main py how this works first we have a blockchain folder which contains the block json files the blocks are created in the block py file def write block the genesis block is file number 1 in the folder do not delete this file for hashing it uses the hashlib library hashlib md5 content hexdigest when a new block is created it generates a hash and then the old hash is compared to the new one for an integrity check in main py the flask app runs and handles all the routing for html for more information on how blockchains work https www investopedia com terms b blockchain asp
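A hedged sketch of the write block idea described above, not the project's exact block.py: each block is a json file holding the transaction data plus the md5 hash of the previous block's file, which is what makes the integrity check possible; field names and the example transaction are assumptions.

```python
# Illustrative sketch: append a block as a JSON file that records the md5 hash
# of the previous block's file, so the chain can later be checked for integrity.
import hashlib
import json
import os

BLOCKCHAIN_DIR = "blockchain"

def hash_file(filename):
    with open(os.path.join(BLOCKCHAIN_DIR, filename), "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

def write_block(sender, receiver, amount):
    os.makedirs(BLOCKCHAIN_DIR, exist_ok=True)
    blocks = [int(name) for name in os.listdir(BLOCKCHAIN_DIR) if name.isdigit()]
    prev_number = max(blocks) if blocks else 0
    block = {
        "sender": sender,
        "receiver": receiver,
        "amount": amount,
        # the genesis block (file 1) has no predecessor to hash
        "prev_hash": hash_file(str(prev_number)) if prev_number else None,
    }
    with open(os.path.join(BLOCKCHAIN_DIR, str(prev_number + 1)), "w") as f:
        json.dump(block, f, indent=2)

write_block("alice", "bob", 5)
```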
blockchain cs50x flask python webapp
blockchain
Image-Filtering-Project-With-Microservices
image filtering project with microservices cloud application developed alongside the udacity cloud engineering nanodegree it allows users to register and log into a web client post photos to the feed and process photos using an image filtering microservice the project is split into two parts 1 frontend angular web application built with ionic framework 2 backend restful api node express application getting started chosen architecture aws architecture screenshots architecture png prerequisite 1 the depends on the node package manager npm you will need to download and install node from https nodejs com en download https nodejs org en download this will allow you to be able to run npm commands 2 environment variables will need to be set these environment variables include database connection details that should not be hard coded into the application code 3 this project uses a continuous integration tool such as travis travis yml docker and kubernetes environment script set the config values for environment variables through kubernetes s secrets kubectl create secret generic prod db secret from literal username produser from literal password y4nys7f11 don t forget to update the deployment yaml specifying the environment variables declared as secrets database create a postgresql database either locally or on aws rds set the config values for environment variables through kubernetes s secrets kubectl create secret generic prod db secret from literal username produser from literal password y4nys7f11 don t forget to update the deployment yaml specifying the environment variables declared as secrets s3 create an aws s3 bucket set the config values for environment variables through kubernetes s secrets kubectl create secret generic prod db secret from literal username produser from literal password y4nys7f11 don t forget to update the deployment yaml specifying the environment variables declared as secrets backend api to download all the package dependencies run the command from the directory udagram api bash npm install to run the application locally run bash npm run dev you can visit http localhost 8080 api v0 feed in your web browser to verify that the application is running you should see a json payload feel free to play around with postman to test the api s frontend app to download all the package dependencies run the command from the directory udagram frontend bash npm install install ionic framework s command line tools for us to build and run the application bash npm install g ionic prepare your application by compiling them into static files bash ionic build run the application locally using files created from the ionic build command bash ionic serve you can visit http localhost 8100 in your web browser to verify that the application is running you should see a web interface
microservices cloud udacity instagram image filtering aws angular ionic kubernetes docker travis
cloud
BloomBot
bloombot this is a chat bot made with the bloom llm https huggingface co bigscience bloom check out the live version here https scyberj github io bloombot
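One way such a bot can talk to BLOOM is through the hosted Hugging Face Inference API; the sketch below shows that generic pattern and is not necessarily how BloomBot itself is wired up. The HF_API_TOKEN environment variable and the generation parameters are assumptions you would supply yourself.

```python
# Illustrative sketch: query bigscience/bloom through the hosted HF Inference API.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/bigscience/bloom"
HEADERS = {"Authorization": f"Bearer {os.environ['HF_API_TOKEN']}"}  # assumed token

def ask_bloom(prompt, max_new_tokens=50):
    payload = {"inputs": prompt, "parameters": {"max_new_tokens": max_new_tokens}}
    response = requests.post(API_URL, headers=HEADERS, json=payload, timeout=60)
    response.raise_for_status()
    # the API returns a list of generations for this model
    return response.json()[0]["generated_text"]

if __name__ == "__main__":
    print(ask_bloom("Hello, how are you today?"))
```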
ai chatbot
ai
iotivity
the iotivity project iotivity is an open source software framework enabling seamless device to device connectivity to address the emerging needs of the internet of things the project is described on https iotivity org and https wiki iotivity org the iotivity project is licensed under the apache 2 0 license license md components used by the project are described in notice md notice md to become a contributor start by reading https www iotivity org get involved which describes project governance contribution guidelines and obtaining an account the master git location for iotivity projects is gated by an instance of the gerrit reviewing system https gerrit iotivity org such that pushing a change in git is intercepted by gerrit and presented as a review page the process of setting up and using gerrit for iotivity is documented in a pair of wiki pages https wiki iotivity org how to use gerrit https wiki iotivity org submitting to gerrit the issue tracker for the project lives at https jira iotivity org note if you are reading this file from a source other than the official repository https gerrit iotivity org gerrit gitweb p iotivity git a summary for example on the github mirror please be aware that work happens relative to the official location for example github pull requests and issue filings are very unlikely to be acted upon or even seen by project participants
server
uswds-sf-lightning-community
us web design system salesforce lightning community theme uswds v2 13 2 https img shields io badge uswds v2 13 2 252f3e style for the badge https github com uswds uswds code style prettier https img shields io badge code style prettier ff69b4 svg style for the badge https github com prettier prettier an implementation of the us web design system for salesforce lightning communities us web design system salesforce community template example img desktop png us web design system salesforce community theme example us web design system salesforce lightning community theme us web design system salesforce lightning community theme implementing this theme implementing this theme installation instructions installation instructions contributing contributing accessibility accessibility licenses and attribution licenses and attribution implementing this theme if you have not utilized the theme before deploying the demo community to a sandbox or scratch org is a great place to start the demo community is a full site implementation enabling you to see various features quickly in addition to the demo community a wiki https github com gsa uswds sf lightning community wiki is maintained to highlight components https github com gsa uswds sf lightning community wiki uswds components theme settings https github com gsa uswds sf lightning community wiki theme settings and more installation instructions see installation https github com gsa uswds sf lightning community blob master installation md contributing see contributing https github com gsa uswds sf lightning community blob master contributing md accessibility this template follows the us web design system markup as much as possible within the salesforce interface where deviations from the original are required salesforce lightning design system is sought as a first alternative licenses and attribution a few parts of this project are not in the public domain attribution and licensing information for those parts are described in detail in license md license md the rest of this project is in the worldwide public domain released under the cc0 1 0 universal public domain dedication https creativecommons org publicdomain zero 1 0
salesforce-lightning salesforce-community uswds gov salesforce salesforce-community-theme
os
AIO
all in one general multimodal large language model this repository introduces a new visual information incorporation strategy referred to as a recall mechanism for multimodal language models what s more the model additionally supports all cv tasks including detection segmentation and more with additional special tokens figures flow png overview of the proposed aio framework as shown in the figure global visual information is first concatenated with the textual embedding to provide a coarse cue when generating a token low level patches are recalled and used as context to generate the next token
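A schematic numpy sketch of the recall mechanism as read from the figure description above, not the actual AIO implementation: the pooled global image embedding is prepended to the text embeddings as the coarse cue, and at each decoding step the low-level patch embeddings most similar to the current hidden state are recalled and appended as extra context. All shapes, the similarity measure and the pooling are assumptions.

```python
# Schematic illustration of the recall idea (assumed shapes and similarity measure).
import numpy as np

rng = np.random.default_rng(0)
d = 64
global_visual = rng.normal(size=(1, d))      # one pooled embedding for the whole image
patch_bank = rng.normal(size=(196, d))       # low-level patch embeddings (e.g. 14x14)
text_embeds = rng.normal(size=(10, d))       # embeddings of the prompt tokens

# coarse cue: concatenate global visual information with the textual embeddings
decoder_input = np.concatenate([global_visual, text_embeds], axis=0)

def recall_patches(hidden_state, k=8):
    """Return the k patch embeddings most similar to the current hidden state."""
    scores = patch_bank @ hidden_state
    top = np.argsort(scores)[-k:]
    return patch_bank[top]

# one decoding step: recall patches and extend the context before predicting a token
hidden_state = decoder_input.mean(axis=0)    # stand-in for the LM's current state
context = np.concatenate([decoder_input, recall_patches(hidden_state)], axis=0)
print(context.shape)                          # (1 + 10 + 8, 64)
```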
ai
iavl
iavl tree version https img shields io github tag cosmos iavl svg https github com cosmos iavl releases latest license https img shields io github license cosmos iavl svg https github com cosmos iavl blob master license api reference https camo githubusercontent com 915b7be44ada53c290eb157634330494ebe3e30a 68747470733a2f2f676f646f632e6f72672f6769746875622e636f6d2f676f6c616e672f6764646f3f7374617475732e737667 https pkg go dev github com cosmos iavl lint https github com cosmos iavl workflows lint badge svg branch master test https github com cosmos iavl workflows test badge svg branch master discord chat https img shields io discord 669268347736686612 svg https discord gg cosmosnetwork note requires go 1 18 a versioned snapshottable immutable avl tree for persistent data benchmarks https dashboard bencher orijtech com graphs repo https 3a 2f 2fgithub com 2fcosmos 2fiavl git the purpose of this data structure is to provide persistent storage for key value pairs say to store account balances such that a deterministic merkle root hash can be computed the tree is balanced using a variant of the avl algorithm http en wikipedia org wiki avl tree so all operations are o log n nodes of this tree are immutable and indexed by their hash thus any node serves as an immutable snapshot which lets us stage uncommitted transactions from the mempool cheaply and we can instantly roll back to the last committed state to process transactions of a newly committed block which may not be the same set of transactions as those from the mempool in an avl tree the heights of the two child subtrees of any node differ by at most one whenever this condition is violated upon an update the tree is rebalanced by creating o log n new nodes that point to unmodified nodes of the old tree in the original avl algorithm inner nodes can also hold key value pairs the avl algorithm note the plus modifies the avl algorithm to keep all values on leaf nodes while only using branch nodes to store keys this simplifies the algorithm while keeping the merkle hash trail short in ethereum the analog is patricia tries http en wikipedia org wiki radix tree there are tradeoffs keys do not need to be hashed prior to insertion in iavl trees so this provides faster iteration in the key space which may benefit some applications the logic is simpler to implement requiring only two types of nodes inner nodes and leaf nodes on the other hand while iavl trees provide a deterministic merkle root hash it depends on the order of transactions in practice this shouldn t be a problem since you can efficiently encode the tree structure when serializing the tree contents iavl x cosmos sdk iavl db interface cosmos sdk v0 19 x https github com cosmos iavl tree release v0 19 x tm db https github com tendermint tm db v0 45 x v0 46 x v0 20 x https github com cosmos iavl tree release v0 20 x cometbft db https github com cometbft cometbft db v0 47 x v1 x x https github com cosmos iavl tree release v1 x x cosmos db https github com cosmos cosmos db note in the past a v0 21 x release was published but never used in production it was retracted to avoid confusion
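To make the persistence idea concrete — an update copies only the O(log n) nodes on the search path and shares every untouched subtree with the old version — here is a toy Python sketch of an immutable AVL insert. It keeps keys in every node rather than using the leaf-only AVL+ layout, and it illustrates path copying only; it is not the Go implementation.

```python
# Toy persistent (path-copying) AVL insert, for illustration only.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Node:
    key: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    height: int = 1

def height(n): return n.height if n else 0

def make(key, left, right):
    return Node(key, left, right, 1 + max(height(left), height(right)))

def rotate_right(n):
    return make(n.left.key, n.left.left, make(n.key, n.left.right, n.right))

def rotate_left(n):
    return make(n.right.key, make(n.key, n.left, n.right.left), n.right.right)

def balance(n):
    bf = height(n.left) - height(n.right)
    if bf > 1:                                   # left-heavy
        if height(n.left.left) < height(n.left.right):
            n = make(n.key, rotate_left(n.left), n.right)
        return rotate_right(n)
    if bf < -1:                                  # right-heavy
        if height(n.right.right) < height(n.right.left):
            n = make(n.key, n.left, rotate_right(n.right))
        return rotate_left(n)
    return n

def insert(n, key):
    if n is None:
        return Node(key)
    if key < n.key:
        return balance(make(n.key, insert(n.left, key), n.right))
    if key > n.key:
        return balance(make(n.key, n.left, insert(n.right, key)))
    return n  # key already present; old tree is reused unchanged

root = None
for k in [3, 1, 4, 1, 5, 9, 2, 6]:
    root = insert(root, k)
old = root
root = insert(root, 7)
print(old.left is root.left)  # True: the untouched left subtree is shared
```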
merkle-tree blockchain cryptography
blockchain
IODTNepal
iodtnepal information technology for all
server
Mobile-App-Project
project title this app is designed to connect traditional doctors with users in a seamless and efficient way with this app users can filter traditional doctors based on their rating and location they can then navigate through the possible options of traditional doctors and schedule an appointment with just a few clicks the app also allows users to view the profiles of traditional doctors and read ratings from other patients to help them make an informed decision on the other hand traditional doctors can also see their requests for appointments this app provides a convenient and hassle free experience for both users and traditional doctors group members members name section id no yohannes desta 1 ugr 1364 13 surafel workayehu 1 ugr 9701 13 salahadin juhar 2 ugr 8613 13 betselot kidane 4 ugr 8473 13 fikremariam anteneh 4 ugr 9301 13
front_end
operationalize-machinelearning-service
operationalize machinelearning service cloud devops engineering nanodegree project 4 operationalize a machine learning microservice api using docker and kubernetes circleci https circleci com gh ifyy operationalize machinelearning service svg style svg https github com ifyy operationalize machinelearning service project summary in this project i applied the skills learnt in the course to operationalise a machine learning microservice api the ml model is a pre trained sklearn model that can predict housing prices in boston according to several features such as average rooms in a home and data about highway access teacher to pupil ratios and so on you can read more about the data which was initially taken from kaggle on the data source site https www kaggle com c boston housing the api is a python flask app app py that serves out predictions inference about housing prices through api calls this project could be extended to any pre trained machine learning model such as those for image recognition and data labeling setup the environment create a virtualenv and activate it make setup install the necessary dependencies make install running app py 1 standalone python app py 2 run in docker run docker sh 3 run in kubernetes run kubernetes sh kubernetes steps setup and configure docker locally setup and configure kubernetes locally create flask app in container run via kubectl
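A hedged sketch of what the prediction endpoint in app.py can look like: a flask route that loads a pre-trained sklearn model and returns a prediction for a posted JSON payload. The model file name, feature names and port are assumptions, not the project's exact code.

```python
# Illustrative Flask prediction endpoint for a pre-trained sklearn model.
from flask import Flask, jsonify, request
import joblib
import pandas as pd

app = Flask(__name__)
model = joblib.load("boston_housing_prediction.joblib")  # assumed model file name

@app.route("/predict", methods=["POST"])
def predict():
    # expects a JSON object of feature name -> value, e.g. {"RM": 6.5, "PTRATIO": 15.3, ...}
    payload = request.get_json()
    features = pd.DataFrame([payload])
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=80)
```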
cloud
frontend-developer-joyofenergy
introducing joi energy joi energy is a new start up in the energy industry they provide their customers with smart meters that record their energy usage enabling them to save both money and the environment the smart meters also record the energy that a customer feeds back into the power grid via solar panels installed on their property you have been placed into the development team whose current goal is to build a dashboard which will display the information gathered from the smart meters unfortunately two members of the team are on annual leave and another one has called in sick you are left with another thoughtworker to progress with the current user stories this is your chance to make an impact on the business improve the code base and deliver value requirements the project requires node v14 16 1 https nodejs org en download or higher styling the application uses a utility css pattern from basscss https basscss com run the application console npm start the application will launch at http localhost 8080 run the tests console npm test typescript typescript support is out of box you can create ts files or import it into any of the existing files
front_end
web-development-course
web development course web development course jakub piskorowski on 26 08 2022 repository description web development course contains a collection of materials used in a curriculum for building websites from scratch the course focuses mainly on building site functionality in javascript and php table of contents 1 frontend 1 1 html 1 1 01 introduction to html 1 frontend 1 1 html 1 1 01 wprowadzenie html readme md 1 1 02 tables and lists 1 frontend 1 1 html 1 1 02 tabele listy readme md 1 1 03 text formatting 1 frontend 1 1 html 1 1 03 tekst obraz readme md 1 1 04 graphics and multimedia 1 frontend 1 1 html 1 1 04 grafika multimedia readme md 1 1 05 links and references in html 1 frontend 1 1 html 1 1 05 linki readme md 1 1 06 creating forms 1 2 css 1 3 frameworks 1 3 01 bootstrap 1 frontend 1 3 frameworki 1 3 01 bootstrap readme md 2 backend 2 1 javascript 2 1 01 characteristics of the javascript language 2 backend 2 1 javascript 2 1 01 charakterystyka jezyka readme md 2 1 02 data types 2 backend 2 1 javascript 2 1 02 typy danych readme md 2 1 03 variable scope 2 backend 2 1 javascript 2 1 03 zasieg zmiennych readme md 2 1 04 control statements 2 backend 2 1 javascript 2 1 04 instrukcje sterujace readme md 2 1 05 dialog boxes 2 backend 2 1 javascript 2 1 05 okna dialogowe readme md 2 1 06 loops 2 backend 2 1 javascript 2 1 06 petle readme md 2 1 07 functions 2 backend 2 1 javascript 2 1 07 funkcje readme md 2 1 08 debugging javascript code 2 1 09 event handling 2 backend 2 1 javascript 2 1 09 obsluga zdarzen readme md 2 1 10 form handling 2 backend 2 1 javascript 2 1 10 obsluga formularzy readme md 2 1 11 selected properties and methods of the dom model part 1 2 backend 2 1 javascript 2 1 11 dom cz1 readme md 2 1 12 selected properties and methods of the dom model part 2 2 backend 2 1 javascript 2 1 12 dom cz2 readme md 2 2 php 2 2 01 installing the xamp package 2 backend 2 2 php 2 2 01 instalowanie xamp readme md 2 2 02 php syntax 2 backend 2 2 php 2 2 02 skladnia php readme md 2 2 03 variables and operators 2 backend 2 2 php 2 2 03 zmienne operatory readme md 2 2 04 conditional expressions 2 backend 2 2 php 2 2 04 wyrazenia warunkowe readme md 2 2 05 loops 2 backend 2 2 php 2 2 05 petle readme md 2 2 06 arrays 2 backend 2 2 php 2 2 06 tablice readme md 2 2 07 using functions to work with arrays 2 backend 2 2 php 2 2 07 funkcje do tablic readme md 2 2 08 form handling 2 backend 2 2 php 2 2 08 obsluga formularzy readme md 2 2 09 functions 2 backend 2 2 php 2 2 09 funkcje readme md 2 2 10 sessions and cookies 2 backend 2 2 php 2 2 10 sesja ciasteczka readme md 2 2 11 connecting to a mysql database via php 2 backend 2 2 php 2 2 11 mysql z php readme md 2 3 testing 2 3 01 introduction to writing unit tests 2 backend 2 3 tests 2 3 01 wstep testy readme md 2 3 02 phpunit installation and first test 2 backend 2 3 tests 2 3 02 php unit instalacja readme md 3 preparing the site for publication 3 1 preparing the site 3 1 01 validation and optimization 3 przygotowanie publikacja 3 1 przygotowanie strony 3 1 01 walidacja optymalizacja readme md 3 1 02 securing the site 3 przygotowanie publikacja 3 1 przygotowanie strony 3 1 02 zabezpieczanie witryny readme md 3 1 03 monitoring site traffic 3 przygotowanie publikacja 3 1 przygotowanie strony 3 1 03 monitorowanie ruchu readme md 3 2 publishing the site 3 przygotowanie publikacja 3 2 publikacja strony readme md
front_end
components
sanoma learning design system a design system with web components for the various products of sanoma learning getting started ensure you have the lts version of node js installed see https nodejs org en if you are using windows make sure to use the windows subsystem for linux wsl run yarn in the root of the project to download and install all the dependencies local development to launch a local version of the storybook deploy which runs all storybooks at the same time run yarn start watch you don t need to run a separate yarn build watch to build the components separately website to run the documentation website locally run yarn workspace sl design system website start site from the project root
design-system web-components
os
GCP-data-engineer-training-course
notes for data engineering on google cloud platform v1 1 this four day instructor led class provides participants a hands on introduction to designing and building data processing systems on google cloud platform through a combination of presentations demos and hand on labs participants will learn how to design data processing systems build end to end data pipelines analyze data and carry out machine learning the course covers structured unstructured and streaming data day 1 leveraging unstructured data module 1 introduction to cloud dataproc cloud dataproc automation helps you create clusters quickly manage them easily and save money by turning clusters off when you don t need them unstructured data accounts for 90 of enterprise data it s hard to analyzed even with google e g google has lots of street view data it contatins bunch of valuable information mapreduce split big data so each compute node process data local to it operating and adminitraing takes a lot of times dataproc ease hadoop management scaling takes less than 5 mins dataproc zone is important match your data location with your compute location data in google cloud storage gcs is replicated across zones so you can pick any zone within the region where your data resides but cross region cause performace issue standard ha mode for master node for worker node disk performance scale with size don t store input output data in hdfs you want to delete your cluster after your job done preemptible workers can be a good deal 50 cost reduction best practice is 50 50 of on demand and preemptible vm create dataproc via scripts console rest api e g shell gcloud dataproc clusters create my second cluster zone us central1 a master machine type n1 standard 1 master boot disk size 50 num workers 2 worker machine type n1 standard 1 worker boot disk size 50 delete clusters after your job finished shell gcloud dataproc clusters delete my second cluster labs execute pyspark in master node ssh to your master node and type pyspark command to open pyspark shell python data 0 1 2 3 4 5 range 6 distdata sc parallelize data squares distdata map lambda x x x res squares reduce lambda a b a b print res python import numpy as np data range 1000 distdata sc parallelize data terms distdata map lambda k 8 0 2 k 1 2 k 1 res np sqrt terms sum print res submitting pyspark jobs without copying anything code or data to the cluster input dog noir dog bree dog pickles dog sparky cat tom cat alley cat cleo frog kermit pig bacon pig babe dog gigi cat george frog hoppy pig tasty dog fred cat suzy code python usr bin env python from pyspark import sparkcontext sc sparkcontext local file sc textfile gs your bucket name unstructured lab2 input txt datalines file map lambda s s split map lambda x x 0 x 1 print datalines take 100 output u dog u noir u dog u bree u dog u pickles u dog u sparky u cat u tom u cat u alley u cat u cleo u frog u kermit u pig u bacon u pig u babe u dog u gigi u cat u george u frog u hoppy u pig u tasty u dog u fred u cat u suzy databykey datalines reducebykey lambda a b a b print databykey take 100 output u cat u tom u alley u cleo u george u suzy u dog u noir u bree u pickles u sparky u gigi u fred u frog u kermit u hoppy u pig u bacon u babe u tasty countbykey databykey map lambda k v k len v print countbykey take 100 output u cat 5 u dog 6 u frog 2 u pig 3 run job using cli shell gcloud dataproc jobs submit pyspark cluster my cluster gs your bucket name unstructured lab2 py module 2 running dataproc jobs all the serverless services are 
stateless separation of storage and compute is what enables serverless to work compute cloud dataflow bigquery analytics cloud dataproc storage cloud storage file bigquery storage tables cloud bigtable nosql only change hdfs to gs module 3 leveraging gcp use dataproc to run oss on gcp but about oss that s not already installed install software on dataproc cluster to install software on dataproc cluster upload it to cloud storage specify gcs location in dataproc creation command e g shell bin bash apt get update true apt get install y python numpy python scipy python matplotlib python pandas notes the script is run as root there is no need to use sudo can only install software on master or worker node shell bin bash apt get update true role usr share google get metadata value attributes dataproc role if role master then apt get install y vim else something that goes only on worker fi things that go on both apt get install y python numpy python scipy python matplotlib python pandas specify gcs location when creating cluster shell gcloud dataproc clusters create mycluster initialization actions gs mybucket init actions my init sh initialization action timeout 3m module 4 analyzing unstructured data ml api and case study module 5 bigquery no ops data warehousing and analytics billing model day 2 serverless data analysis module 6 autoscaling data processing pipelines day 3 serverless machine learning day 4 streaming data processing reference data engineering on google cloud platform https cloud google com training courses data engineering data engineering on google cloud platform in taipei https events withgoogle com data engin 422792 class outline content labs code on github https github com googlecloudplatform training data analyst 20180306 deongcp taipei documents questions refs https goo gl s7ur8y dataproc initialization scripts https github com googlecloudplatform dataproc initialization actions
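to illustrate the only change hdfs to gs point above here is a hedged pyspark snippet that reads its input directly from cloud storage instead of hdfs the bucket and object names are placeholders and the cluster is assumed to have the gcs connector available as dataproc clusters normally do

# hedged example: point the job at cloud storage (gs://) instead of hdfs://;
# bucket and object names below are placeholders
from pyspark import SparkContext

sc = SparkContext(appName="gcs-example")
lines = sc.textFile("gs://your-bucket-name/unstructured/lab2/input.txt")  # was hdfs://...
pairs = lines.map(lambda s: s.split(",")).map(lambda x: (x[0], 1))
counts = pairs.reduceByKey(lambda a, b: a + b)   # e.g. number of animals per species
print(counts.take(10))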
gcp training notes
cloud
NWOIT
nwoit nwoit new ways of information technology
nextjs13 tailwindcss typescript
server
design-tokens-plugin
alt text assets readme cover jpg support end of life read more about sketch tokens https www sketch com blog 2022 03 17 color tokens design tokens delivering consistent design system a sketch plugin that exports design tokens to json format you can export colors typography icons and utilis a must have tool for design system project features 1 design tokens in single source of truth file as sketch library 2 color tokens as json object 3 typography tokens as json object 4 icon tokens as json object 5 utils tokens as json object 1 design tokens in single source of truth file as sketch library use example design tokens sketch file in example file folder you can fill in the content you want in the example file make sure that you use layer style and text styles when you create tokens all tokens must be at the subfolder in sketch named by color typography icons and utils 2 color tokens as json object to define color tokens must be inside group call colors and layer named by color name importan also to make layer to layer style to create a layer style select a layer and choose layer create new layer style learn here https www sketch com docs styling shared styles alt text assets token color sketch jpg sketch naming colors color name json output color index json json color name value ffd59e type color 3 typography tokens as json object to define text tokens must be inside group call typography and layer named by font name importan also to make layer to text style to create a layer style select a layer and choose layer create new text style learn here https www sketch com docs text text styles alt text assets token typography sketch jpg sketch naming typography font name json output typography index json json typography name font family value helvetica font size value 12 weight letter spacing value 0 line height value 16 type typography 4 icon tokens as json object to define icons tokens must be inside group call icons and layer named by icon name important to keep the svg icon to union format alt text assets token icons sketch jpg sketch naming icons icon name json output icons index json json icon icon eye solid value svg id icon eye solid width 24px height 16px viewbox 0 0 24 16 version 1 1 xmlns http www w3 org 2000 svg xmlns xlink http www w3 org 1999 xlink generator sketch 59 1 86144 https sketch com title shape title g id design tokens stroke none stroke width 1 fill none fill rule evenodd g id tokens transform translate 120 000000 767 000000 fill 000000 fill rule nonzero g id icons transform translate 24 000000 763 000000 g id icon eye solid transform translate 96 000000 4 000000 path d m23 8549772 7 39166667 c21 5953939 2 98291667 17 1220605 0 11 9999772 0 c6 87789388 0 2 40331055 2 985 0 144977215 7 39208333 c 0 0483257383 7 77445825 0 0483257383 8 22595841 0 144977215 8 60833333 c2 40456055 13 0170833 6 87789388 16 11 9999772 16 c17 1220605 16 21 5966439 13 015 23 8549772 8 60791667 c24 0482802 8 22554175 24 0482802 7 77404159 23 8549772 7 39166667 z m11 9999772 14 c8 68626872 14 5 99997721 11 3137085 5 99997721 8 c5 99997721 4 6862915 8 68626872 2 11 9999772 2 c15 3136857 2 17 9999772 4 6862915 17 9999772 8 c18 0007514 9 59153616 17 368859 11 1181095 16 2434729 12 2434957 c15 1180867 13 3688818 13 5915134 14 0007742 11 9999772 14 l11 9999772 14 z m11 9999772 3 99998885 c11 6429486 4 00498962 11 2882238 4 05810724 10 9453939 4 15791667 c11 529363 4 95150825 11 4460613 6 05266558 10 749352 6 74937482 c10 0526428 7 44608405 8 95148546 7 52938577 8 15789388 6 94541667 c7 70949625 8 59740065 
8 36454792 10 3513572 9 78606474 11 3049826 c11 2075816 12 2586081 13 0789344 12 1994915 14 4374264 11 1580449 c15 7959183 10 1165982 16 3389584 8 3247936 15 7871857 6 70440527 c15 2354129 5 08401694 13 7117293 3 9959707 11 9999772 3 99998885 l11 9999772 3 99998885 z id shape path g g g g svg type icon 5 utils tokens as json object to define text tokens must be inside group call utils there is different naming convation for other tokens utils has multiple token types types are space radius and shadow see how to name layer from below list alt text assets token utils sketch jpg sketch naming utils space name sketch naming utils radius name sketch naming utils shadow name json output utils index json json utils space name spacer 4 type utils radius name radius 16 16 16 16 type utils shadow name shadows blur 24 x 0 y 8 spread 0 color 473f4f29 enabled true blur 16 x 0 y 4 spread 0 color 2e293314 enabled true type utils border all border 1 type utils how to install download the design tokens sketchplugin zip https github com design meets development design tokens plugin releases download 1 1 2 design tokens sketchplugin zip extract zip and double click the design tokens sketchplugin file download the example file zip https github com design meets development design tokens plugin releases download 1 1 2 example file zip extract zip and double click the design tokens sketch file roadmap github project https github com vjandrei design tokens projects 1 shout out koodiklinikka javascript channel for support to answering my stupid questions https koodiklinikka fi https koodiklinikka fi niki ahlskog for helping me to start the development github https github com shnigi family to support me to having the time to making this contact feedback mail andreas koutsoukos gmail com mailto andreas koutsoukos gmail com license this project is licensed under the terms of the mit license
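as a hedged illustration of how the exported json could be consumed downstream here is a small python sketch that turns color tokens shaped like the sample output above into css custom properties the file name and exact json layout are assumptions based only on the examples shown

# hedged sketch: convert exported color tokens (shaped like the sample json above)
# into css custom properties; file name and json layout are assumptions
import json

with open("color/index.json") as f:               # hypothetical export location
    tokens = json.load(f)

css_lines = [":root {"]
for group, entries in tokens.items():             # e.g. {"color": {"name": {"value": "ffd59e", "type": "color"}}}
    for name, token in entries.items():
        if token.get("type") == "color":
            css_lines.append(f"  --color-{name}: #{token['value']};")
css_lines.append("}")
print("\n".join(css_lines))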
sketch sketch-plugin design-systems
os
Polling-Website
polling website implemented a polling website for my internet engineering course with python programming language using django framework for back end bootstrap 4 for front end and mongodb for the database screen shot screenshot png
server
Cloud-Engineer-23
cloud engineer 23 cloud engineering project
cloud
mongoose-os
license https img shields io badge license apache 202 0 blue svg https opensource org licenses apache 2 0 gitter https badges gitter im cesanta mongoose os svg https gitter im cesanta mongoose os utm source badge utm medium badge utm campaign pr badge mongoose os an iot firmware development framework over the air firmware updates and remote management reliable updates with rollback on failures remote device access infrastructure security built in flash encryption crypto chip support arm mbedtls optimized for small memory footprint device management dashboard service https mdash net supported microcontrollers cc3220 cc3200 esp32 esp8266 stm32f4 stm32l4 stm32f7 recommended dev kits esp32 devkitc for aws iot https mongoose os com aws iot starter kit esp32 kit for google iot core https mongoose os com gcp built in integration for aws iot google iot core microsoft azure adafruit io generic mqtt servers code in c or javascript ready to go apps and libraries embedded javascript engine mjs https github com cesanta mjs trusted and recommended by amazon aws amazon aws technology partner https aws amazon com blogs apn partner sa roundup may 2017 google iot core mongoose os is a google cloud iot core partner https cloud google com iot partners ibm watson iot mongoose os is a ready for ibm watson iot validated solution https www 356 ibm com partnerworld gsd solutiondetails do solution 55930 lc en statecd p tab 1 microsoft azure iot mongoose os is recommended by microsoft azure iot https azure microsoft com en us blog azure iot automatic device management helps deploying firmware updates at scale texas instruments an official partner of texas instruments http www ti com ww en internet of things iot cloudsolution html stmicroelectronics an official partner of stmicroelectronics https www st com content st com en partner partner program partnerpage cesanta html espressif systems an official partner of espressif systems http espressif com en support download sdk docs support mongoose os documentation https mongoose os com docs mongoose os quickstart setup md support forum ask your technical questions here https community mongoose os com video tutorials https www youtube com channel ucz9lq7b 4bdblolpkwjpsaw featured commercial licensing https mongoose os com licensing html and support available https mongoose os com support html licensing mongoose os is open source and dual licensed mongoose os community edition apache license version 2 0 mongoose os enterprise edition commercial license community vs enterprise edition community edition enterprise edition license apache 2 0 https www apache org licenses license 2 0 commercial contact us https mongoose os com contact html allows to close end product s source code yes yes price free paid see details https mongoose os com licensing html source code functionality limited https mongoose os com docs mongoose os userguide licensing md full technical support community support via forum https forum mongoose os com and chat https gitter im cesanta mongoose os commercial support by mongoose os development team see details https mongoose os com support html how to contribute if you have not done it already sign cesanta cla https cesanta com cla html and send github pull request make a pull request pr against this repo please follow google coding style https google github io styleguide cppguide html send pr to one of the core team member pimvanpelt https github com pimvanpelt nliviu https github com nliviu drbomb https github com drbomb kzyapkov https github com 
kzyapkov rojer https github com rojer cpq https github com cpq responsibilities of the core team members review and merge pr submissions create new repos in the https github com mongoose os apps and https github com mongoose os libs organisations for new app library contributions create mongoose os releases
iot aws aws-iot esp8266 esp32 stm32 cc3200 firmware iot-platform
server
lollms-webui
lollms web ui div align center img src https github com parisneo lollms blob main lollms assets logo png alt logo width 200 height 200 div github license https img shields io github license parisneo lollms webui github issues https img shields io github issues parisneo lollms webui github stars https img shields io github stars parisneo lollms webui github forks https img shields io github forks parisneo lollms webui discord https img shields io discord 1092918764925882418 color 7289da label discord logo discord logocolor ffffff https discord gg 4rr282wjb6 follow me on twitter https img shields io twitter follow spacenerduino style social https twitter com spacenerduino follow me on youtube https img shields io badge follow 20me 20on youtube red style flat logo youtube https www youtube com user parisneo lollms core library download statistics downloads https static pepy tech badge lollms https pepy tech project lollms downloads https static pepy tech badge lollms month https pepy tech project lollms downloads https static pepy tech badge lollms week https pepy tech project lollms lollms webui download statistics downloads https img shields io github downloads parisneo lollms webui total style flat square https github com parisneo lollms webui releases downloads https img shields io github downloads parisneo lollms webui latest total style flat square https github com parisneo lollms webui releases welcome to lollms webui lord of large language models one tool to rule them all the hub for llm large language model models this project aims to provide a user friendly interface to access and utilize various llm models for a wide range of tasks whether you need help with writing coding organizing data generating images generating music or seeking answers to your questions lollms webui has got you covered as an all encompassing tool with access to over 300 ai expert conditionning across diverse domains and more than 500 fine tuned models over multiple domains you now have an immediate resource for any problem whether your car needs repair or if you need coding assistance in python c or javascript feeling down about life decisions that were made wrongly yet unable see how ask lollms need guidance on what lies ahead healthwise based on current symptoms presented our medical assistance ai can help you get a potential diagnosis and guide you to seek the right medical care if stuck with legal matters such contract interpretation feel free reach out to lawyer personality to get some insight at hand all without leaving comfort home not only does it aid students struggling through those lengthy lectors but provides them extra support during assessments too so they are able grasp concepts properly rather then just reading along lines which could leave many confused afterward want some entertainment then engage laughter botand let yourself go enjoy hysterical laughs until tears roll from eyes while playing dungeons dragonsor make up crazy stories together thanks to creative story generator need illustration work done no worries artbot got us covered there and last but definitely not least lordofmusic here for music generation according to individual specifications so essentially say goodbye boring nights alone because everything possible can be achieved within one single platform called lollms features choose your preferred binding model and personality for your tasks enhance your emails essays code debugging thought organization and more explore a wide range of functionalities such as searching data 
organization image generation and music generation easy to use ui with light and dark mode options integration with github repository for easy access support for different personalities with predefined welcome messages thumb up down rating for generated answers copy edit and remove messages local database storage for your discussions search export and delete multiple discussions support for image video generation based on stable diffusion support for music generation based on musicgen support for multi generation peer to peer network through lollms nodes and petals support for docker conda and manual virtual environment setups star history a href https star history com parisneo lollms webui parisneo lollms parisneo lollms cpp client parisneo lollms bindings zoo parisneo lollms personalities zoo date picture source media prefers color scheme dark srcset https api star history com svg repos parisneo lollms webui parisneo lollms parisneo lollms cpp client parisneo lollms bindings zoo parisneo lollms personalities zoo type date theme dark source media prefers color scheme light srcset https api star history com svg repos parisneo lollms webui parisneo lollms parisneo lollms cpp client parisneo lollms bindings zoo parisneo lollms personalities zoo type date img alt star history chart src https api star history com svg repos parisneo lollms webui parisneo lollms parisneo lollms cpp client parisneo lollms bindings zoo parisneo lollms personalities zoo type date picture a thank you for all users who tested this tool and helped making it more user friendly installation automatic installation ui if you are using windows just visit the release page download the windows installer and install it automatic installation console download the installation script from scripts folder and run it the installation scripts are win install bat for windows linux install sh for linux mac install sh for mac manual install with anaconda miniconda if you don t have anaconda or miniconda installed please install it install miniconda https docs conda io projects miniconda en latest miniconda install html make sure to add it to your path so that you can run it easily from a terminal if you don t have git installed please install it install git https git scm com book en v2 getting started installing git make sure to add it to your path so that you can run it easily from a terminal run a terminal and create a new environment called lollms with python 3 10 bash conda create name lollms python 3 10 activate the environment bash conda activate lollms if you want to use an nvidia gpu install cuda toolkit 11 8 bash conda install c nvidia label cuda 11 8 0 cuda toolkit clone the project bash git clone https github com parisneo lollms webui git enter the lollms webui folder bash cd lollms webui download submodules lollms core zoos and safe store library bash git submodule init git submodule update cd zoos bindings zoo git checkout main cd personalities zoo git checkout main cd extensions zoo git checkout main cd models zoo git checkout main cd lollms core git checkout main pip install e cd utilities safe store git checkout main pip install e cd install dependancies bash pip install upgrade r requirements txt run the application bash python app py manual install with virtual env make sure you install python 3 10 and git install python https www python org downloads release python 31013 make sure to add it to your path so that you can run it easily from a terminal if you don t have git installed please install it install git https 
git scm com book en v2 getting started installing git make sure to add it to your path so that you can run it easily from a terminal to use your gpu you may need to install nvidia cuda toolkit https developer nvidia com cuda toolkit run a terminal and install pip bash python m ensurepip upgrade install virtual environment bash pip install venv clone the project bash git clone https github com parisneo lollms webui git enter the lollms webui folder bash cd lollms webui create a virtual environment bash python m venv env activate the virtual environment on windows env scripts activate on linux env bin activate on macos env bin activate download submodules lollms core zoos and safe store library bash git submodule init git submodule update cd zoos bindings zoo git checkout main cd personalities zoo git checkout main cd extensions zoo git checkout main cd models zoo git checkout main cd lollms core git checkout main pip install e cd utilities safe store git checkout main pip install e cd install dependancies bash pip install upgrade r requirements txt run the application bash python app py once installed you need to activate the environment then run the app code of conduct by using this tool users agree to follow these guidelines this tool is not meant to be used for building and spreading fakenews misinformation you are responsible for what you generate by using this tool the creators will take no responsibility for anything created via this lollms you can use lollms in your own project free of charge if you agree to respect the apache 2 0 licenseterms please refer to https www apache org licenses license 2 0 you are not allowed to use lollms to harm others directly or indirectly this tool is meant for peacefull purposes and should be used for good never for bad users must comply with local laws when accessing content provided by third parties like openai api etc including copyright restrictions where applicable disclaimer large language models are amazing tools that can be used for diverse purposes lollms was built to harness this power to help the user inhance its productivity but you need to keep in mind that these models have their limitations and should not replace human intelligence or creativity but rather augment it by providing suggestions based on patterns found within large amounts of data it is up to each individual how they choose use them responsibly the performance of the system varies depending on the used model its size and the dataset on whichit has been trained the larger a language model s training set the more examples generally speaking better results will follow when using such systems as opposed those with smaller ones but there is still no garantee that the output generated from any given prompt would always be perfect and it may contain errors due various reasons so please make sure you do not use it for serious matters like choosing medications or making financial decisions without consultating an expert first hand license this repository uses code under apachelicense version 2 0 see license https github com parisneo lollms webui blob main license file for details about rights granted with respect to usage distribution copyright parisneo 2023
ai llm text-generation
ai
pkernel
pkernel a small rtos for arm cortex m3 4 pkernel is a very small multitasking kernel rtos for cortex m3 m4 microcontrollers written mostly in c with as little assembly as possible features some of the features of pkernel are supports privileged and unprivileged processes processes can exit a process can create processes provides a very basic unix like cron capability for process time scheduling supports services services are small functions pkernel calls from inside the systick isr in a strictly periodic manner supports sleep and stop modes supports a service running mode in which pkernel runs without calling any processes and only keeps services alive in sleep mode supports syscalls like exit sleep wait signal lock unlock provides very basic memory management via malloc free use with care a small example c include pkernel h void pr 1 void process 1 code while 1 usually inside a loop void pr 2 void process 2 code while 1 usually inside a loop int main void init with 320 bytes kernel stack tick freq for system frequency and mcu clock clock kinit size t 320 clock tick freq knew pr 1 size t 320 1 0 320 byte stack for pr 1 set nice flag and not set fit flag knew pr 2 size t 320 1 0 320 byte stack for pr 2 set nice flag and not set fit flag krun start pkernel while 1 unreachable in the above example we create 2 process functions pr 1 and pr 2 in the main function we initialize pkernel with the desired memory size for the privileged process stack the idle process for example the desired pkernel frequency tick freq and the hardware clock of the mcu next we register the two processes with 320 bytes of stack each and run the kernel after that point we never return to main
os
interior
interior culmination of research and development of grid systems and how to create layouts using them it started here https codepen io morganfeeney post dont build bootstrap style grid systems with flexbox license attribution noncommercial 4 0 international cc by nc 4 0 acknowledgements this project would not be possible without css grid https www w3 org tr css grid 1 grid systems in graphic design by josef müller brockmann https en wikipedia org wiki josef m c3 bcller brockmann firefox grid inspector https developer mozilla org en us docs tools page inspector how to examine grid layouts grid by example http gridbyexample com the ongoing support from yuanyuan zhang
css layout scss nunjucks design-systems grid-layout grid typography design-system grid-systems interior css-grid vertical-rhythm
os
Built-in-SQL-functions
built in sql functions part of a peer graded assignment coursera ibm data engineering databases and sql for data science with python in this assignment i worked with multiple real world datasets for the city of chicago learning objectives invoke sql queries using python retrieve sql query results and analyze data interpret and translate data analysis questions into sql queries
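a hedged minimal example of invoking built in sql functions from python is shown below it uses sqlite3 from the standard library instead of the course s ibm db2 connection and a made up table

# hedged example of built-in sql aggregate functions invoked from python; uses the
# standard-library sqlite3 module rather than the course's db2 setup, with made-up data
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crime (community TEXT, arrests INTEGER)")
conn.executemany("INSERT INTO crime VALUES (?, ?)",
                 [("Austin", 12), ("Lincoln Park", 3), ("Austin", 7)])

# COUNT, SUM, AVG and MAX are built-in sql functions
row = conn.execute(
    "SELECT COUNT(*), SUM(arrests), AVG(arrests), MAX(arrests) FROM crime"
).fetchone()
print(row)   # (3, 22, 7.33..., 12)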
server
FEE-Bootcamp-May-2018
fee bootcamp front end engineering bootcamp this repo is going to hold all the artifacts generated during the live coding sessions the topics you have to learn and the assignments to be completed as a part of this bootcamp what is the objective to make you learn the fundamentals of web development the practical way and get you on par with a junior front end developer 1 year experience in 6 weeks learning would happen through taking part in discussions on various web development topics discussion based learning where you learn by listening and teaching listening and observing how we build web pages and web apps from scratch live coding sessions doing practicing on your own and working on assignments working on assignments team projects building a full blown web app as a team in a real time project environment capstone project how a day in the bootcamp would look first 30 minutes for queries from the last day s assignment discussions next 90 minutes for discussing different topics related to web development these topics would be published the day before so that the trainees could read about them and come prepared for the discussion the discussion would be driven by the trainees 15 minute break 90 minutes live coding session where we would build some stuff 45 minutes lunch break 60 minutes live coding session rest of the day for the trainees to work on assignments
frontend-development bootcamp
front_end
rtos-embedded-system
real time embedded system project a real time embedded system with 4 tasks to respond to the events below tags salvo rtos c programming real time embedded program main file https github com thesmitchawda rtos embedded system blob master rtosmain c salvortos header file https github com thesmitchawda rtos embedded system blob master salvocfg h salvortos config files https github com thesmitchawda rtos embedded system blob master salvomem c project report after clicking the link click on view raw to download the word document https github com thesmitchawda rtos embedded system blob master projectreport smitchawda docx task description task1 task read temp pot p to read the instantaneous values of on board potentiometer temperature sensor task2 task knight rider p to execute the sequential running of the previous project project knight rider until direction changing trigger is pressed task3 task run led p to blink the led at a specified interval every 3s in the project task4 task print temp pot use the uart and send the data received from task1 note all the tasks are given equal priority of 10 as required by the project description and control flow created by smit chawda submitted to sheridan college f a s t
os
SuperPay
superpay e wallet app java backend development it has all the basic functions of paytm wallet such as user login transactions etc java backend application with the following services user service register a new user maintain all the information get all users find a user by a given id wallet service create a new wallet update a wallet add balance to a wallet check the current balance send money to any other user get all the transaction history for a given email technologies used spring boot hibernate mysql restful apis swagger actuator postman
server
aws-machine-learning-university-accelerated-tab
logo data mlu logo png machine learning university accelerated tabular data class this repository contains slides notebooks and datasets for the machine learning university mlu accelerated tabular data class our mission is to make machine learning accessible to everyone we have courses available across many topics of machine learning and believe knowledge of ml can be a key enabler for success this class is designed to help you get started with tabular data spreadsheet like tables learn about widely used machine learning techniques for tabular data and apply them to real world problems youtube watch all tabular data class video recordings in this youtube playlist https www youtube com playlist list pl8p z6c4gcuvqzcyf znmoiwllkgx9mi2 from our youtube channel https www youtube com channel uc12lqyqtqybxatys9aa7nuw playlists playlist https img youtube com vi kj spc6pai4 0 jpg https www youtube com playlist list pl8p z6c4gcuvqzcyf znmoiwllkgx9mi2 course overview there are three lectures and one final project for this class lecture 1 title studio lab introduction to ml sample ml model model evaluation open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day1 model ipynb exploratory data analysis open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day1 eda ipynb k nearest neighbors knn open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day1 knn ipynb final project open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day1 final ipynb lecture 2 title studio lab feature engineering open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day2 text process ipynb tree based models open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day2 tree ipynb bagging hyperparameter tuning aws ai ml services open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day2 sagemaker ipynb lecture 3 title studio lab optimization regression models boosting neural networks nn open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day3 nn ipynb br mxnet open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day3 mxnet ipynb automl open in studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github aws samples aws machine learning university accelerated tab blob master notebooks mla tab day3 automl ipynb final project 
practice working with a real world tabular dataset for the final project final project dataset is in the data final project folder https github com aws samples aws machine learning university accelerated tab tree master data final project for more details on the final project check out this notebook https github com aws samples aws machine learning university accelerated tab blob master notebooks mla tab day1 final ipynb interactives visuals interested in visual interactive explanations of core machine learning concepts check out our mlu explain articles https mlu explain github io to learn at your own pace contribute if you would like to contribute to the project see contributing contributing md for more information license the license for this repository depends on the section data set for the course is being provided to you by permission of amazon and is subject to the terms of the amazon license and access https www amazon com gp help customer display html nodeid 201909000 you are expressly prohibited from copying modifying selling exporting or using this data set in any way other than for the purpose of completing this course the lecture slides are released under the cc by sa 4 0 license the code examples are released under the mit 0 license see each section s license file for details
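as a hedged taste of the k nearest neighbors topic listed in lecture 1 here is a minimal scikit learn snippet on a toy tabular dataset it is not taken from the course notebooks

# hedged, minimal k-nearest-neighbors example on a toy tabular dataset; an
# illustration of the lecture 1 topic, not code from the course notebooks
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, knn.predict(X_test)))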
machine-learning tabular-data deep-learning python gluon mxnet sklearn
ai
wmd-course-materials
web mobile development course materials this collection of slides forms the course materials for the web mobile development course part of the professional bachelor ict http www ikdoeict be study programme taught at kaho sint lieven http www kahosl be ghent belgium the materials were developed by bramus van damme who blogs over at bram us http www bram us and tweets as bramus http twitter com bramus the materials may be used freely as long as credit to the authors is present and the top right graphical link to ikdoeict be http www ikdoeict be remains in place the engine powering the slide decks is a customized reveal js http lab hakim se reveal js more info on this can be found in the first set of slides
front_end